Azure Functions are great! HTTP triggered Azure Functions are also great, but there’s one downside: all HTTP triggered Azure Functions are publicly available. While this might be useful in a lot of scenarios, it’s also quite possible you don’t want ‘strangers’ hitting your public endpoints all the time.

One way you can solve this is by adding a small bit of authentication on your Azure Functions.

For HTTP triggered Functions you can specify the level of authority one needs to have in order to execute them. There are five levels to choose from: `Anonymous`, `Function`, `Admin`, `System` and `User`. When using C# you can specify the authorization level in the HttpTrigger-attribute; you can, of course, also specify it in the function.json file. If you want a Function to be accessible by anyone, the following piece of code will work because the authorization is set to Anonymous.

[FunctionName("Anonymous")]
public static HttpResponseMessage Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", Route = null)]
    HttpRequestMessage req,
    ILogger logger)
{
    // your code

    return req.CreateResponse(HttpStatusCode.OK);
}
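
For reference, a rough sketch of what the same trigger could look like in a function.json file (a sketch based on the standard HTTP trigger binding schema; adjust it to your runtime version):

{
    "bindings": [
        {
            "type": "httpTrigger",
            "direction": "in",
            "name": "req",
            "authLevel": "anonymous",
            "methods": [ "get" ]
        },
        {
            "type": "http",
            "direction": "out",
            "name": "res"
        }
    ]
}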

If you want to use any of the other levels, just change the AuthorizationLevel enum to the value corresponding to the level of access you want. I’ve created a sample project on GitHub containing several Azure Functions with different authorization levels, so you can test out the differences in the authorization levels yourself. Keep in mind: when running the Azure Functions locally, the authorization attribute is ignored and you can call any Function, no matter which level is specified.

Whenever a level other than Anonymous is used, the caller has to specify an additional parameter in the request in order to be authorized to use the Azure Function. Adding this additional parameter can be done in two different ways.
The first is adding a `code`-parameter to the querystring whose value contains the secret necessary for calling the Azure Function. When using this approach, a request can look like this: https://[myFunctionApp].azurewebsites.net/api/myFunction?code=theSecretCode

Another approach is to add this secret value in a header of your request. If this is your preferred way, you can add a header called `x-functions-key` whose value has to contain the secret code to access the Azure Function.

There are no real pros or cons to either of these approaches; it depends on your solution’s needs and how you want to integrate these Functions.
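
To illustrate, a minimal sketch of both options using `HttpClient` (the URL and key are placeholders):

// Option 1: pass the key via the `code` querystring parameter.
var client = new HttpClient();
var viaQueryString = await client.GetAsync(
    "https://myFunctionApp.azurewebsites.net/api/myFunction?code=theSecretCode");

// Option 2: pass the same key via the `x-functions-key` header instead.
var request = new HttpRequestMessage(
    HttpMethod.Get, "https://myFunctionApp.azurewebsites.net/api/myFunction");
request.Headers.Add("x-functions-key", "theSecretCode");
var viaHeader = await client.SendAsync(request);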

Which level should I choose?

Well, it depends.

The Anonymous level is pretty straightforward, so I’ll skip this one.

The User level is also fairly simple, as it’s not supported (yet). From what I understand, this level can and will be used in the future in order to support custom authorization mechanisms (EasyAuth). There’s an issue on GitHub which tracks this feature.

The Function level should be used if you want to give some other system (or user) access to this specific Azure Function. You will need to create a Function Key which the end-user/system will have to specify in the request they are making to the Azure Function. If this Function Key is used for a different Azure Function, it won’t get accepted and the caller will receive a 401 Unauthorized.

The only ones left are the Admin and System levels. Both are fairly similar: they work with so-called Host Keys. These Host Keys work on all Azure Functions deployed on the same system (Function App). This is different from the earlier mentioned Function Keys, which only work for one specific function.
The main difference between the two is that System only works with the _master host key, while the Admin level should also work with all of the other defined host keys. I’m writing ‘should’ because I couldn’t get this level to work with any of the other host keys; both only appeared to work with the defined _master key. This might have been a glitch at the time, but it’s good to investigate once you get started with this.

How do I set up those keys?

Sadly, you can’t specify them via an ARM template (yet?). My guess is this will never be possible, as it’s something you want to manage yourself per environment. So how to proceed? Well, head to the Azure Portal and check out the Azure Function you want to see or create keys for.

You can manage the keys for each Azure Function in the portal and even create new ones if you like.
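
If you’d rather script this than click through the portal, the runtime also exposes a key management API. A sketch of listing the keys of a single function (the endpoint shape below is based on the documented key management API and requires the _master key; verify it against your runtime version):

// Lists the keys of 'myFunction'; the app name, function name and _master key are placeholders.
var client = new HttpClient();
var response = await client.GetAsync(
    "https://myFunctionApp.azurewebsites.net/admin/functions/myFunction/keys?code=myMasterKey");
var keysJson = await response.Content.ReadAsStringAsync();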


I probably don’t have to mention this, but just to be on the safe side: you don’t want to distribute these keys to a client-side application. The reason for this is pretty obvious: the key is sent in the request, so it isn’t a secret anymore. Anyone can inspect the request and see which key is sent to the Azure Function.
You only want to use these keys (both Function Keys and Host Keys) when making requests between server-side applications. This way your keys will never be exposed to the outside world, which minimizes the chance of a key getting compromised. If for some reason a key does become compromised, you can Renew or Revoke it.

One gotcha!

There’s one gotcha when creating an HTTP triggered Function with the Admin authorization level: you can’t prefix these Functions with the term ‘Admin’. For example, when creating a function called ‘AdministratorActionWhichDoesSomethingImportant’ you won’t be able to deploy and run it. You’ll receive an error stating there’s a routing conflict.

[21-8-2018 19:07:41] The following 1 functions are in error:
[21-8-2018 19:07:41] AdministratorActionWhichDoesSomethingImportant : The specified route conflicts with one or more built in routes.

Or when navigating to the Function in the portal, an error message will pop up telling you the same.


Probably something you want to know in advance, before designing your complete API with Azure Functions.

There’s a relatively new feature available in Azure called Managed Service Identity (MSI). What it does is create an identity for a service instance in the Azure AD tenant, which in turn can be used to access other resources within Azure. This is a great feature, because now you don’t have to maintain and create identities for your applications yourself anymore. All of this management is handled for you when using a System Assigned Identity. There’s also an option to use User Assigned Identities, which work a bit differently.

Because I’m an Azure Function fanboy and want to store my secrets within Azure Key Vault, I was wondering if I was able to configure MSI via an ARM template and access the Key Vault from an Azure Function without specifying an identity by myself.

As with most things, setting this up is rather easy once you know what to do.

The ARM template

The documentation states you can add an `identity` property to your Azure App Service in order to enable MSI.

"identity": {
    "type": "SystemAssigned"
}

This setting is everything you need in order to create a new service principal (identity) within the Azure Active Directory. This new identity has the exact same name as your App Service, so it should be easy to identify.
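
To give a bit more context, a trimmed-down sketch of where this property lives in a `Microsoft.Web/sites` resource (a sketch, not the full resource definition):

{
    "apiVersion": "2018-02-01",
    "type": "Microsoft.Web/sites",
    "name": "[parameters('webSiteName')]",
    "location": "[resourceGroup().location]",
    "identity": {
        "type": "SystemAssigned"
    },
    "properties": {
        "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('hostingPlanName'))]"
    }
}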

If you want to check for yourself whether everything worked, you can check the AAD audit log. It should contain a couple of lines stating a new service principal has been created.


You can also check out the details of what has happened by clicking on the lines.


Not very interesting, until something is broken or needs debugging.

An easier method to check if your service principal has been created is by checking the Enterprise Applications within your AAD tenant. If your deployment has been successful, there’s an application with the same name as your App Service.


Step two in your ARM template

After having added the identity to the App Service, you now have access to the `tenantId` and `principalId` which belong to this identity. These properties are necessary in order to give your App Service identity access to the Azure Key Vault. If you’re familiar with Key Vault, you probably know there are some Access Policies you have to define in order to get access to specific areas in the Key Vault.

Figuring out how to retrieve these new App Service properties was, for me, the hardest part of this entire post. Eventually I figured out how to access them, thanks to an answer on Stack Overflow. What I ended up doing is retrieving a reference to the App Service in the `accessPolicies` block of the Key Vault resource and using the `identity.tenantId` and `identity.principalId` properties.

"accessPolicies": [
{
  "tenantId": "[reference(concat('Microsoft.Web/sites/', parameters('webSiteName')), '2018-02-01', 'Full').identity.tenantId]",
  "objectId": "[reference(concat('Microsoft.Web/sites/', parameters('webSiteName')), '2018-02-01', 'Full').identity.principalId]",
  "permissions": {
    "keys": [],
    "secrets": [
      "get"
    ],
    "certificates": [],
    "storage": []
  }
}],

Easy, right? Well, if you’re an ARM-template guru probably.

Now deploy your template again and you should be able to see your service principal being added to the Key Vault access policies.


Because we’ve specified the identity has access to retrieve (GET) secrets, in theory we are now able to use the Key Vault.

Retrieving data from the Key Vault

This is actually the easiest part. There’s a piece of code you can copy from the documentation pages, because it just works!

// Authenticate using the App Service's managed identity.
var azureServiceTokenProvider = new AzureServiceTokenProvider();
var keyvaultClient = new KeyVaultClient(new KeyVaultClient.AuthenticationCallback(azureServiceTokenProvider.KeyVaultTokenCallback));

// Retrieve the secret named 'MyFunctionSecret' from the specified vault.
var secretValue = await keyvaultClient.GetSecretAsync($"https://{myVault}.vault.azure.net/", "MyFunctionSecret");

return req.CreateResponse(HttpStatusCode.OK, $"Hello World! This is my secret value: `{secretValue.Value}`.");

The above piece of code retrieves a secret from the Key Vault and shows it in the response of the Azure Function.


The `KeyVaultTokenCallback` is exclusively for use with Key Vault (hence the name). If you want to use MSI with other Azure services, you will need the `GetAccessTokenAsync` method to retrieve an access token for the other Azure service.
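
As a hedged sketch, retrieving a token for the ARM API could look like the following (the resource URI is the standard ARM endpoint; swap it for whichever service you’re targeting):

// Use MSI to get a token for another Azure service, in this case the ARM API.
var tokenProvider = new AzureServiceTokenProvider();
var accessToken = await tokenProvider.GetAccessTokenAsync("https://management.azure.com/");

// The token is then sent as a Bearer token on subsequent requests.
var client = new HttpClient();
client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);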

So, that’s all there is to it in order to make your Azure Function or Azure environment a bit safer with these managed identities.
If you want to check out the complete source code, it’s available on GitHub.

I totally recommend using MSI, because it’ll make your code, software and templates much safer and more secure.

I’m in the process of adding an ARM template to an open source project I’m contributing to. All of this was pretty straightforward, until I needed to add some secrets and connection strings to the project.

While it’s totally possible to integrate these secrets in your ARM parameter file or in your continuous deployment pipeline, I wanted to do something a bit more advanced and secure. Of course, Azure Key Vault comes to mind! I’ve already used this in some of my other ASP.NET projects and Azure Functions, so nothing new here.

The thing is, the projects I’ve worked on, always retrieved the secrets from Key Vault like the following example:

"adminPassword": {
    "reference": {
        "keyVault": {
        "id": "/subscriptions/<subscription-id>/resourceGroups/examplegroup/providers/Microsoft.KeyVault/vaults/<vault-name>"
        },
        "secretName": "examplesecret"
    }
}

While this isn’t a bad thing per se, I don’t like having the `subscription-id` hardcoded in this configuration, especially when doing open source development, mainly because other people can’t access my Key Vault and will therefore run into trouble when deploying this template. So I started investigating whether this subscription id can be added dynamically.

Introducing the Dynamic Id

Lucky for us, the ARM team has us covered! By changing the earlier mentioned configuration a bit, you’re able to use the function `subscription().subscriptionId` to get your own subscription id.

"adminPassword": {
    "reference": {
        "keyVault": {
        "id": "[resourceId(subscription().subscriptionId,  parameters('vaultResourceGroup'), 'Microsoft.KeyVault/vaults', parameters('vaultName'))]"
        },
        "secretName": "[parameters('secretName')]"
    }
},

Downside though, this doesn’t work in your parameter file!

It also doesn’t work in your normal ARM template!

So what’s left? Well, using ARM templates in combination with nested templates! Nested templates are the key to using this dynamic id. Nested templates aren’t something I enjoy using, because it’s easy to get lost in all of those open files.

Well, enough sample configuration for now; let’s see what this looks like in an actual file.

{
    "apiVersion": "2015-01-01",
    "name": "nestedTemplate",
    "type": "Microsoft.Resources/deployments",
    "properties": {
        "mode": "Incremental",
        "templateLink": {
            "uri": "[concat(parameters('templateBaseUri'), 'my-nested-template.json')]",
            "contentVersion": "1.0.0.0"
        },
        "parameters": {
            "resourcegroup": {
                "value": "[parameters('resourcegroup')]"
            },
            "hostingPlanName": {
                "value": "[parameters('hostingPlanName')]"
            },
            "skuName": {
                "value": "[parameters('skuName')]"
            },
            "skuCapacity": {
                "value": "[parameters('skuCapacity')]"
            },
            "websiteName": {
                "value": "[parameters('websiteName')]"
            },
            "vaultName": {
                "value": "[parameters('vaultName')]"
            },
            "mySuperSecretValueForTheAppService": {
                "reference": {
                    "keyVault": {
                        "id": "[resourceId(subscription().subscriptionId,  parameters('resourcegroup'), 'Microsoft.KeyVault/vaults', parameters('vaultName'))]"
                    },
                    "secretName": "MySuperSecretValueForTheAppService"
                }
            }
        }
    }
}

In order to use the dynamic id, you have to add it to the `parameters`-section of the nested template resource. Anywhere else in the process is too early or too late to retrieve those values. Ask me how I know…

The observant reader might also notice me using the `templateLink` property with a URI inside.

"templateLink": {
    "uri": "[concat(parameters('templateBaseUri'), 'my-nested-template.json')]",
    "contentVersion": "1.0.0.0"
}

This is because you can only use these functions when the nested template is located at a (public) remote location. Another reason why I don’t really like this approach: linking to a remote location means you can’t use the templates located inside the artifact package you are deploying. There is an issue on the feedback portal asking to support local file locations, but it’s not implemented (yet).

For now we just have to copy the template(s) to a remote location during the CI build (or do some template-extraction-and-publication magic in the deployment pipeline). Whenever the CD pipeline runs, it’ll use the templates pushed to this remote location. Sounds like a lot of work? That’s because it is!

You might wonder what this nested template looks like. Well, it looks a lot like a ‘normal’ template.

{
    "$schema": "https://schema.management.azure.com/schemas/2015-01-01/deploymentTemplate.json#",
    "contentVersion": "1.0.0.0",
    "parameters": {
        "resourcegroup": {
            "type": "string"
        },
        "hostingPlanName": {
            "type": "string",
            "minLength": 1
        },
        "skuName": {
            "type": "string",
            "defaultValue": "F1",
            "allowedValues": [
                "F1",
                "D1",
                "B1",
                "B2",
                "B3",
                "S1",
                "S2",
                "S3",
                "P1",
                "P2",
                "P3",
                "P4"
            ],
            "metadata": {
                "description": "Describes plan's pricing tier and instance size. Check details at https://azure.microsoft.com/en-us/pricing/details/app-service/"
            }
        },
        "skuCapacity": {
            "type": "int",
            "defaultValue": 1,
            "minValue": 1,
            "metadata": {
                "description": "Describes plan's instance count"
            }
        },
        "websiteName": {
            "type": "string"
        },
        "vaultName": {
            "type": "string"
        },
        "mySuperSecretValueForTheAppService": {
            "type": "securestring"
        }
    },
    "variables": {},
    "resources": [{
            "apiVersion": "2015-08-01",
            "name": "[parameters('hostingPlanName')]",
            "type": "Microsoft.Web/serverfarms",
            "location": "[resourceGroup().location]",
            "tags": {
                "displayName": "HostingPlan"
            },
            "sku": {
                "name": "[parameters('skuName')]",
                "capacity": "[parameters('skuCapacity')]"
            },
            "properties": {
                "name": "[parameters('hostingPlanName')]"
            }
        },
        {
            "apiVersion": "2015-08-01",
            "name": "[parameters('webSiteName')]",
            "type": "Microsoft.Web/sites",
            "location": "[resourceGroup().location]",
            "dependsOn": [
                "[resourceId('Microsoft.Web/serverFarms/', parameters('hostingPlanName'))]"
            ],
            "tags": {
                "[concat('hidden-related:', resourceGroup().id, '/providers/Microsoft.Web/serverfarms/', parameters('hostingPlanName'))]": "empty",
                "displayName": "Website"
            },
            "properties": {
                "name": "[parameters('webSiteName')]",
                "serverFarmId": "[resourceId('Microsoft.Web/serverfarms', parameters('hostingPlanName'))]"
            },
            "resources": [{
                "name": "appsettings",
                "type": "config",
                "apiVersion": "2015-08-01",
                "dependsOn": [
                    "[resourceId('Microsoft.Web/Sites/', parameters('webSiteName'))]"
                ],
                "tags": {
                    "displayName": "appSettings"
                },
                "properties": {
                    "MySuperSecretValueForTheAppService": "[parameters('mySuperSecretValueForTheAppService')]"
                }
            }]
        }
    ],
    "outputs": {}
}

This nested template is responsible for creating an Azure App Service with an Application Setting containing the secret we retrieved from Azure Key Vault in the main template. Pretty straightforward, especially if you’ve worked with ARM templates before.

If you want to see the complete templates & solution, check out my GitHub repository with these sample templates.

The deployment

All of this configuration is fun and games, but does it actually do the job?

Only one way to find out, and that’s setting up a proper deployment pipeline! I’m most familiar with VSTS, so that’s the tool I’ll be using.

Create a new Release, add a new artifact pointing to the location of your templates and create a new environment.

For testing purposes, this environment only needs to have a single step based on the `Create or Update Resource Group`-task.

In this task you will need to select the ARM template file, along with the parameters file you want to use. Of course, all of the secrets I don’t want to specify, or want to override, I’m placing in the `Override template parameters`-section. Most important is the parameter for the `templateBaseUri`. This parameter contains the base URI to the location where the nested template(s) are stored.
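
As an illustration, the override for this parameter looks something like this (the URL is a placeholder for wherever your nested templates end up):

-templateBaseUri "https://[yourStorageAccount].blob.core.windows.net/templates/"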



It makes sense to override this setting as it’s quite possible you don’t want to use the GitHub location over here, but some location on a public blob container created by your CI-build.

Now save this pipeline and queue a release.

If all goes well, the deployment will fail with a `KeyVaultParameterReferenceNotFound` error.

At least one resource deployment operation failed. Please list deployment operations for details. Please see https://aka.ms/arm-debug for usage details.
Details:
BadRequest: {
"error": {
"code": "KeyVaultParameterReferenceNotFound",
"message": "The specified KeyVault '/subscriptions/[subscription-id]/resourceGroups/nested-template-sample/providers/Microsoft.KeyVault/vaults/nested-template-vault' could not be found. Please see https://aka.ms/arm-keyvault for usage details."
}
} undefined
Task failed while creating or updating the template deployment.


This makes sense as we’re trying to retrieve a secret from the Azure Key Vault which doesn’t exist yet!

If you head down to the Azure Portal and check out the resource group, you’ll notice both the resource group and the Key Vault have been created.


The only thing we need to do is add the `MySuperSecretValueForTheAppService` secret to the Key Vault.


Once it’s added we can try the release again. All steps should be green now.


In the resource group you can verify both the hosting plan and the App Service have now been created.


Zooming in on the Application Settings of the App Service you’re also able to see the secret value which has been retrieved from Azure Key Vault!


Proof the dynamic id works when combined with a nested template!

Too bad a `securestring` is still rendered in plain text on the portal, but that’s a completely different issue.


It has taken me quite some time to figure out all of the above steps, probably because I’m no CI/CD expert. I hope this post helps others who aren’t experts on the matter either.

So you might remember me posting about using the Let’s Encrypt site extension for Azure App Services, called Azure Let’s Encrypt, created by SJKP.

This has worked quite well for over a year now and even works for Function Apps.

However, last month I got notified my SSL certificate had expired on one of my sites. Strange, as an automated job should just handle this for me. I thought the job probably hadn’t executed because of some glitch in the matrix, so I logged in manually, started the site extension wizard again and promptly got stuck.


The reason I was stuck was that the ClientId and ClientSecret didn’t work anymore. As these settings hadn’t changed since I started using this extension, I found this quite strange.

Apparently, the service principal which I had created last year had somehow changed and I didn’t know how to change it back. Lucky for me, managing AAD isn’t very hard to do nowadays. With a bit of trial and error I was able to create a new service principal and use its details in the Let’s Encrypt site extension.

Creating a new application in AAD

First thing you need to do is add a new Application to your AAD. Be sure to pick the App registrations option over here and press New application registration.


When creating an application you have to specify a name (I chose `LetsEncrypt`) and which type it is. Just choose the `Web app / API` option over here. The other mandatory field, called `Sign-on URL`, isn’t used in our scenario, so you can use any URL you like.

When your application is created you’ll be navigated to the overview page of this application. Be sure to copy the Application ID from over here, as you’ll need it later on. This value has to be used as the ClientId in the site extension.


Next thing we need to do is add a Key to this application. You can add new keys via the Settings link, choosing the Keys option. This key will be used as the ClientSecret. Be sure to copy the value after saving, as this is the only time you’ll be able to see it.


Also note the Expires option.

The default expiration date is set to one year from now. This led me to believe the ClientSecret of my earlier service principal had probably expired. In hindsight I could probably have updated the value in my old service principal and been done with it.

We now have everything we need from our application, so the next thing is to set up the resource group.

Set up your resource group

We need the newly created application to do stuff inside our resource group. Therefore we need to add some permissions to it.

To do so, head down to the resource group which contains your app service(s) and pick the Access control (IAM) option.


From over here you can select your newly created application and grant it the Contributor role.


If everything goes well you’ll see the application added to the list of contributors of this resource group.
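
If you’d rather automate this step as well, a role assignment can also be expressed in an ARM template. A sketch, assuming the service principal’s object id is passed in as a parameter (the GUID is the id of the built-in Contributor role definition):

{
    "apiVersion": "2015-07-01",
    "type": "Microsoft.Authorization/roleAssignments",
    "name": "[guid(resourceGroup().id, parameters('principalObjectId'))]",
    "properties": {
        "roleDefinitionId": "[concat(subscription().id, '/providers/Microsoft.Authorization/roleDefinitions/b24988ac-6180-42a0-ab88-20f7382dd24c')]",
        "principalId": "[parameters('principalObjectId')]"
    }
}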


Running the wizard again

Everything should be set up correctly now so you can head back to the wizard of the site extension. Be sure to fill out the ClientId and ClientSecret with the newly retrieved values from the application.

After doing so and trying to proceed to the next screen, I was prompted with the message `The ClientId registered under application settings [guid] does not match the ClientId you entered here [guid]`.


The first time I ran this wizard (a year ago), it was able to create and update the application settings of the App Service. Apparently this has changed, and I had to change the Application Settings myself in the App Service before I was able to continue in the Let’s Encrypt site extension.
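
For reference, these are the settings I ended up adding by hand. The `letsencrypt:` prefix is what the site extension reads its configuration from; double-check the exact names against the extension’s documentation, and the values are obviously placeholders.

letsencrypt:Tenant = [yourTenant].onmicrosoft.com
letsencrypt:SubscriptionId = [yourSubscriptionId]
letsencrypt:ClientId = [theApplicationId]
letsencrypt:ClientSecret = [theKeyYouJustCreated]
letsencrypt:ResourceGroupName = [yourResourceGroup]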

For completeness’ sake: if you’re running a Function App, you can find these settings under All settings, which will navigate you to the App Service settings.


After you’ve changed these settings you should be able to proceed and continue with requesting your SSL certificates.

That’s all there is to it!

Hope it helps whenever you run into problems because your service principal doesn’t work anymore. As I already mentioned, in hindsight it would probably have been much easier to just update the Key of my original service principal, which I’ll probably need to do two years from now when this new secret expires.

Using certificates to secure, sign and validate information has become a common practice in the past couple of years. Therefore, it makes sense to use them in combination with Azure Functions as well.

As Azure Functions are hosted on top of an Azure App Service this is quite possible, but you do have to configure something before you can start using certificates.

Adding your certificate to the Function App

Let’s just start at the beginning, in case you are wondering how to add these certificates to your Function App. Adding certificates is ‘hidden’ on the SSL blade in the Azure portal. Over here you can add SSL certificates, but also regular certificates.


Keep in mind though, if you are going to use certificates in your own project, please just add them to Azure Key Vault in order to keep them secure. Using the Key Vault is the preferred way to work with certificates (and secrets).

For the purpose of this post I’ve just pressed the Upload Certificate-link, which will prompt you with a new blade from which you can upload a private or public certificate.


You will be able to see the certificate’s thumbprint, name and expiration date on the SSL blade if it has been added correctly.


There was a time when you couldn’t use certificates if your Azure Functions were located on a Consumption plan. Luckily this issue has been resolved, which means we can now use our uploaded certificates in both a Consumption and an App Service plan.

Configure the Function App

As I’ve written before, in order to use certificates in your code there is one little configuration matter which has to be addressed. By default the Function App (read: App Service) is locked down quite nicely, which results in not being able to retrieve certificates from the certificate store.

The code I’m using to retrieve a certificate from the store is shown below.

private static X509Certificate2 GetCertificateByThumbprint()
{
    // Certificates uploaded to an App Service end up in the CurrentUser/My store.
    var store = new X509Store(StoreName.My, StoreLocation.CurrentUser);
    store.Open(OpenFlags.ReadOnly | OpenFlags.OpenExistingOnly);
    var certificateCollection = store.Certificates.Find(X509FindType.FindByThumbprint, CertificateThumbprint, false);

    store.Close();

    foreach (var certificate in certificateCollection)
    {
        if (certificate.Thumbprint == CertificateThumbprint)
        {
            return certificate;
        }
    }
    throw new CryptographicException("No certificate found with thumbprint: " + CertificateThumbprint);
}

Note, if you upload a certificate to your App Service, Azure will place this certificate inside the `CurrentUser/My` store.

Running this code right now will result in an empty `certificateCollection`, and therefore a `CryptographicException` is thrown. In order to get access to the certificate store, we need to add an Application Setting called `WEBSITE_LOAD_CERTIFICATES`. The value of this setting can be any certificate thumbprint you want (comma separated), or just an asterisk (*) to allow any certificate to be loaded.
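
If you’re provisioning through ARM templates anyway, this setting can live next to your other app settings. A sketch, reusing the kind of appsettings config resource shown earlier in this series (the site name parameter is an assumption):

{
    "name": "appsettings",
    "type": "config",
    "apiVersion": "2015-08-01",
    "dependsOn": [
        "[resourceId('Microsoft.Web/Sites/', parameters('webSiteName'))]"
    ],
    "properties": {
        "WEBSITE_LOAD_CERTIFICATES": "*"
    }
}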

After having added this single application setting the above code will run just fine and return the certificate matching the thumbprint.

Using the certificate

Using certificates to sign or validate values isn’t rocket science, but strange things can occur! This was also the case when I wanted to use my own self-signed certificate in a function.

I loaded my private key from the store and used it to sign a message, as in the code below.

private static string SignData(X509Certificate2 certificate, string message)
{
    using (var csp = (RSACryptoServiceProvider)certificate.PrivateKey)
    {
        var hashAlgorithm = CryptoConfig.MapNameToOID("SHA256");
        var signature = csp.SignData(Encoding.UTF8.GetBytes(message), hashAlgorithm);
        return Convert.ToBase64String(signature);
    }
}
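
Wiring the two helpers from this post together is then a one-liner (a sketch; the message is a placeholder):

// Load the certificate from the store and use it to sign a message.
var certificate = GetCertificateByThumbprint();
var signature = SignData(certificate, "my important message");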

This code works perfectly, until I started running it inside an Azure Function (or any other App Service for that matter). When running this piece of code I was confronted with the following exception:

System.Security.Cryptography.CryptographicException: Invalid algorithm specified.
    at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
    at System.Security.Cryptography.Utils.SignValue(SafeKeyHandle hKey, Int32 keyNumber, Int32 calgKey, Int32 calgHash, Byte[] hash, Int32 cbHash, ObjectHandleOnStack retSignature)
    at System.Security.Cryptography.Utils.SignValue(SafeKeyHandle hKey, Int32 keyNumber, Int32 calgKey, Int32 calgHash, Byte[] hash)
    at System.Security.Cryptography.RSACryptoServiceProvider.SignHash(Byte[] rgbHash, Int32 calgHash)
    at System.Security.Cryptography.RSACryptoServiceProvider.SignData(Byte[] buffer, Object halg)

So, an `Invalid algorithm specified` exception? Sounds strange, as this code runs perfectly fine on my local system and any other system I’ve run it on.

After having done some research on the matter, it appears the underlying Crypto API is choosing the wrong Cryptographic Service Provider (CSP). From what I’ve read, the framework picks CSP number 1 instead of CSP 24, which is necessary for SHA-256. Apparently there have been some changes on this matter back in the Windows XP SP3 era, so I don’t know why this is still a problem with our (new) certificates. Then again, I’m no expert on the matter.

If you are experiencing the above problem, the best solution is to request new certificates created with the `Microsoft Enhanced RSA and AES Cryptographic Provider` (CSP 24). If you aren’t in the position to request or use these new certificates, there is a way to overcome the issue.

You can still load and use the current certificate, but you need to export all of its properties and create a new `RSACryptoServiceProvider` with the contents of this certificate. This way you can specify which CSP you want to use along with your current certificate.
The necessary code is shown in the block below.

private static string SignData(X509Certificate2 certificate, string message)
{
    using (var csp = (RSACryptoServiceProvider)certificate.PrivateKey)
    {
        var hashAlgorithm = CryptoConfig.MapNameToOID("SHA256");

        // Export the full key material (including the private key!) and import it
        // into a new provider which explicitly uses CSP 24
        // (the Microsoft Enhanced RSA and AES Cryptographic Provider).
        var privateKeyBlob = csp.ExportCspBlob(true);
        var cp = new CspParameters(24);
        var newCsp = new RSACryptoServiceProvider(cp);
        newCsp.ImportCspBlob(privateKeyBlob);

        var signature = newCsp.SignData(Encoding.UTF8.GetBytes(message), hashAlgorithm);
        return Convert.ToBase64String(signature);
    }
}

Do keep in mind, this is something you want to use with caution. Being able to export all properties of a certificate, including the private key, isn’t something you want to expose to your code very often. So if you are in need of such a solution, please consult with your security officer(s) before implementing!

As I mentioned, the code block above works fine inside an App Service, and also when running inside an Azure Function on an App Service plan. If you are running your Azure Functions on the Consumption plan, you are out of luck!
Running this code will result in the following exception message:

Microsoft.Azure.WebJobs.Host.FunctionInvocationException: Exception while executing function: Sign ---> System.Security.Cryptography.CryptographicException: Key not valid for use in specified state.
   at System.Security.Cryptography.CryptographicException.ThrowCryptographicException(Int32 hr)
   at System.Security.Cryptography.Utils.ExportCspBlob(SafeKeyHandle hKey, Int32 blobType, ObjectHandleOnStack retBlob)
   at System.Security.Cryptography.Utils.ExportCspBlobHelper(Boolean includePrivateParameters, CspParameters parameters, SafeKeyHandle safeKeyHandle)
   at Certificates.Sign.SignData(X509Certificate2 certificate, String xmlString)
   at Certificates.Sign.Run(HttpRequestMessage req, String message, TraceWriter log)
   at lambda_method(Closure , Sign , Object[] )
   at Microsoft.Azure.WebJobs.Host.Executors.MethodInvokerWithReturnValue`2.InvokeAsync(TReflected instance, Object[] arguments)
   at Microsoft.Azure.WebJobs.Host.Executors.FunctionInvoker`2.d__9.MoveNext()

My guess is this has something to do with the nature of the Consumption plan and it being a ‘real’ serverless implementation. I haven’t looked into the specifics yet, but not having access to server resources makes sense.

It has taken me quite some time to figure this out, so I hope it helps you a bit!