Getting Started with Azure Functions

I was investigating Azure Functions not so long ago when I came across the Azure Function challenge – a set of tasks posted by Microsoft for people to try out and learn about Azure Functions. Azure Functions is a relatively new (at the time of writing) service offered by Microsoft, and is Microsoft’s answer to AWS Lambda. It lets you write a snippet of code that is triggered by certain events, and you’re charged by the number of times your code is executed and by memory consumption × time (measured in GB-seconds; for example, an execution that uses 512 MB for two seconds is billed as 1 GB-second). Here’s how I got started.

Editing tools

The first method of actually getting the code into Azure is directly through portal.azure.com. The second method is GitHub combined with Visual Studio Tools for Azure Functions; the tooling is still in preview, has a particular set of prerequisites to install, and has a few known issues, but its biggest advantages are that you get IntelliSense and you can run the code locally in an Azure simulator.

Triggers

Triggers are what cause your Azure Function to run. You can select a trigger either through the Azure Portal, or by editing the file function.json in the Azure Function’s project directory; function.json is also where data bindings are configured. A few example triggers are BlobTrigger, HttpTrigger, QueueTrigger, and TimerTrigger.
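
For instance, a minimal function.json for a timer trigger might look something like this (the binding name and the every-five-minutes schedule are just examples):

{
  "bindings": [
    {
      "type": "timerTrigger",
      "direction": "in",
      "name": "myTimer",
      "schedule": "0 */5 * * * *"
    }
  ],
  "disabled": false
}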

Input and output binding

Once you get the trigger set up, you’ll most likely want to get at the input parameters, and also to read from or write to Azure Storage. For HTTP triggers, the HTTP request is bound by default to a parameter called req, and the HTTP response is the return value. The example code provided by Microsoft does a good job of illustrating how to use both of them.
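
Here’s a minimal sketch in that spirit, in the v1 run.csx style (TraceWriter is available without a using in run.csx, and the name query parameter is just an example, not something Azure requires):

using System;
using System.Linq;
using System.Net;
using System.Net.Http;

public static HttpResponseMessage Run(HttpRequestMessage req, TraceWriter log)
{
    log.Info("Processing an HTTP-triggered request");

    // The request is bound to req; pull an example value out of the query string
    string name = req.GetQueryNameValuePairs()
        .FirstOrDefault(q => string.Equals(q.Key, "name", StringComparison.OrdinalIgnoreCase))
        .Value;

    // The HTTP response is simply the function's return value
    return req.CreateResponse(HttpStatusCode.OK, $"Hello, {name ?? "world"}");
}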

Accessing Azure Storage

Sometimes you want to access Azure Storage to… well, to store something from the request. I found the documentation here kind of lacking (Azure Functions hit v1.0 in November 2016) and figured this out through a lot of trial and error, so I hope someone finds this useful.

To write out to an Azure Table, first you need an actual Azure Storage Table… Assuming you have one set up, you need the account name, account key, the storage account’s connection string, and the target table name. Give the table parameter a name like tableBinding, which you will use in code. I configured mine through the Azure portal; this is what the corresponding function.json looks like:

{
  "bindings": [
    {
      "type": "httpTrigger",
      "direction": "in",
      "webHookType": "genericJson",
      "name": "req"
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    },
    {
      "type": "table",
      "name": "tableBinding",
      "tableName": "AzureChallengeTable",
      "connection": "azurefunctionsc99a8a83_STORAGE",
      "direction": "out"
    }
  ],
  "disabled": false
}

Note the “connection” property, which is set to the name of an app setting that contains the connection string for the storage account. The setting can be defined locally in your appsettings.json file, or in the Azure Functions portal under Function app settings > Configure app settings > App settings. This is what allows you to access the Azure Storage account when you’re running locally:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "AzureWebJobsDashboard": "",
    "azurefunctionsc99a8a83_STORAGE": "ConnectionStringHere"
  }
}

We configured the table binding to be called tableBinding, so our function signature now looks like this:

public static HttpResponseMessage Run(HttpRequestMessage req, CloudTable tableBinding, TraceWriter log)

You will also need to add #r "Microsoft.WindowsAzure.Storage" to the top of the run.csx file and add using Microsoft.WindowsAzure.Storage.Table; (and install the NuGet package if you’re coding from VS). Now we can insert objects that are a subclass of TableEntity (make sure you define how PartitionKey and RowKey are calculated) with a line like this:

tableBinding.Execute(TableOperation.InsertOrReplace(myEntity));
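
For reference, here’s a minimal sketch of such an entity; MyEntity, its properties, and the key scheme are all hypothetical:

using Microsoft.WindowsAzure.Storage.Table;

public class MyEntity : TableEntity
{
    // A parameterless constructor is required so entities can be
    // materialized when you read them back
    public MyEntity() { }

    // Decide how PartitionKey and RowKey are calculated; here they
    // come straight from two example fields
    public MyEntity(string category, string id)
    {
        PartitionKey = category;
        RowKey = id;
    }

    public string Payload { get; set; }
}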

To read:

var tableResult = tableBinding.Execute(TableOperation.Retrieve<MyEntity>(partitionKey, rowKey));
var myEntity = tableResult.Result as MyEntity;
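
Note that if no entity matches the given keys, tableResult.Result comes back null (and therefore so does the as cast), so check myEntity for null before using it.
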
Deploying

If you wrote your code in VS, you can upload the project to GitHub, then use the Azure Functions portal to deploy the code from GitHub. One very important thing I noticed here is that the *.json and run.csx files must be at the top level of your repo; if you have multiple functions residing in multiple directories in the repo, Azure won’t find the code. So this basically means you need one repo per Azure Function.
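
In other words, Azure expects something like the following layout; the repo name is hypothetical, and project.json only applies if you pull in NuGet packages:

MyFunctionRepo/
├── function.json    (trigger and binding configuration)
├── run.csx          (the function code)
└── project.json     (NuGet dependencies, if any)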

One other quirk I noticed was that sometimes I got an HTTP 4xx response when testing, which was caused by an Azure bug with function keys. The workaround for this is to call the Azure Function with the key for the function, not the admin key (which is shared by all your Azure Functions).
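
Concretely, you pass the function’s own key in the code query-string parameter (or in the x-functions-key header); every name below is a placeholder:

https://<your-function-app>.azurewebsites.net/api/<your-function>?code=<function-key>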

In conclusion

Azure Functions (like AWS Lambda) is promising because it is simple, covers some very useful and common scenarios, and scales easily, but I found the documentation kind of spotty and hope it will improve in the coming months.
