# Using Azure Functions to empower your Teams

In today’s world we’re receiving an enormous amount of e-mail.
A lot of the e-mail I receive during the day (and night) is about errors happening in our cloud environment, and sometimes someone needs to follow up on them.

At the moment this is a real pain, because there are a lot of false positives in those e-mails due to the limited configuration options in our monitoring software. The number of e-mails is so painful that most of us have created e-mail rules so these monitoring e-mails ‘go away’ and we only check them once per day. Not really an ideal solution.

But what if I told you all of this pain can go away with some serverless magic and the power of Microsoft Teams? Sounds great, right?

# How to integrate with Microsoft Teams?

This is actually the easiest part if you’re a developer.

If you’re already running Microsoft Teams on your Office 365 tenant, you can add a channel to a team to which you belong and add a Webhook connector to it. I’ve created a channel called Alerts on which I added an Incoming Webhook connector.

After having saved the connector you’ll be able to copy the actual webhook URL which you need to use in order to POST messages to the channel.

In order to test this webhook, you can spin up Postman and send a POST request to the webhook URL.

The only thing you need to specify is the text property, but in most cases adding a title makes the message a bit prettier.

```json
{
  "title": "The blog demo",
  "text": "Something has happened and I want you to know about it!"
}
```


When opening up Teams you’ll see the message added to the channel.

That’s all there is to setting up the integration from an external system to your team.

# How will this help me?

Well, think about it. By sending a POST to a webhook, you can alert one (or more) teams inside your organization. If there’s an event which someone needs to respond to, like an Application Insights event or some business logic failing for a non-obvious reason, you can send a message in real time to the team responsible for the service.

Did you also know you can create so-called ‘actionable messages’ within Teams? An actionable message can be created with a couple of buttons which will invoke a URL when pressed. In Teams this looks a bit like so:

By pressing either one of those buttons, a specified URL gets invoked (GET) and, as you can probably imagine, those URLs can be implemented to automatically resolve the event which triggered the message in the first place.
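Messages posted through an Incoming Webhook use the legacy ‘MessageCard’ format, which also supports these actions via the `potentialAction` property. As a rough sketch (the action name and URL below are placeholders I made up, not values from an actual alert), a card with a button could look like this:

```json
{
  "@type": "MessageCard",
  "@context": "http://schema.org/extensions",
  "title": "The blog demo",
  "text": "Something has happened and I want you to know about it!",
  "potentialAction": [
    {
      "@type": "OpenUri",
      "name": "View details",
      "targets": [
        { "os": "default", "uri": "https://example.com/alerts/42" }
      ]
    }
  ]
}
```

The `OpenUri` action shown here opens the URL in a browser; the MessageCard format has a couple of other action types as well, so check the connector card reference for the one matching your scenario.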

A schematic view on how you can implement such a solution is shown below.

Over here you’re seeing Event Grid, which contains events for everything happening in your overall Azure solution. An Azure Function is subscribed to a specific topic and, once it’s triggered, a message is posted on the Teams channel. This can be an actionable message or a plain message.
If it’s an actionable message, a button can be pressed, which in turn sends a GET request to a different Azure Function. You want this Function to be fast, so the only thing it does is validate the request and store the message (command) on a (Service Bus) queue. A different Azure Function will be triggered by this queue and will make sure the command is executed properly by invoking an API/service which is responsible for ‘solving’ the issue.
Of course, you can also implement the resolving logic inside the last Azure Function; this depends a bit on your overall solution architecture and your opinion on decoupling systems.
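The flow itself is language-agnostic. As a minimal sketch (in Python for brevity, with an in-memory queue standing in for Service Bus and all names invented for illustration), the fast ‘receiver’ function only validates and enqueues, while a separate ‘worker’ executes the command:

```python
import queue

command_queue = queue.Queue()  # stand-in for a Service Bus queue


def receive_action(request: dict) -> str:
    """Fast HTTP-triggered function: validate the request and enqueue, nothing more."""
    if "alert_id" not in request or "action" not in request:
        return "400 Bad Request"
    command_queue.put({"alert_id": request["alert_id"], "action": request["action"]})
    return "202 Accepted"


def process_commands(resolve) -> int:
    """Queue-triggered worker: hands each command to the 'resolving' service."""
    handled = 0
    while not command_queue.empty():
        command = command_queue.get()
        resolve(command)  # call the API/service responsible for solving the issue
        handled += 1
    return handled


resolved = []
print(receive_action({"alert_id": 42, "action": "restart-service"}))  # 202 Accepted
print(process_commands(resolved.append))  # 1
```

The point of the split is that the HTTP function can acknowledge the button press immediately, while the actual (possibly slow) resolution happens asynchronously.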

# How will my Azure Function post to Teams?

In order to send messages to Teams via an Azure Function, you have to POST a message to the Teams webhook. This works exactly the same as making an HTTP request to any other service. An example is shown below.

```csharp
private static readonly HttpClient HttpClient = new HttpClient();

[FunctionName("InvokeTeamsHook")]
public static async Task<IActionResult> Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "InvokeTeams")]
    HttpRequestMessage req,
    ILogger log)
{
    var message = await req.Content.ReadAsAsync<IncomingTeamsMessage>();

    var plainTeamsMessage = new PlainTeamsMessage { Title = message.Title, Text = message.Text };
    var content = new StringContent(JsonConvert.SerializeObject(plainTeamsMessage), Encoding.UTF8, "application/json");

    // The webhook URL is stored in the application settings.
    var webhookUrl = Environment.GetEnvironmentVariable("TeamsWebhookUrl");
    await HttpClient.PostAsync(webhookUrl, content);

    return new OkResult();
}

public class IncomingTeamsMessage
{
    public string Title { get; set; }
    public string Text { get; set; }
}

private class PlainTeamsMessage
{
    public string Title { get; set; }
    public string Text { get; set; }
}
```


This sample creates a ‘plain’ message in Teams. You POST a piece of JSON in the IncomingTeamsMessage format to the Azure Function, for example:

```json
{
  "title": "My title in Teams",
  "text": "The message which is being posted."
}
```


It will show up as the following message within Teams.

Of course, this is a rather simple example. You can extend this by also creating actionable messages. In such a case, you need to extend the model with the appropriate properties and POST it in the same way to Teams.

Even though Teams isn’t something I develop for a lot (read: never), I will spend the upcoming weeks investigating how to bring our DevOps work into the 21st century. By leveraging the power of Teams, I’m pretty sure a lot of ‘manual’ operations can be made easier, if not automated completely.

# Create your own custom bindings with Azure Functions

The default Azure Functions runtime comes with quite a lot of bindings and triggers which enable you to create a highly scalable solution within the Azure environment. You can connect to service buses, storage accounts, Event Grid, Cosmos DB, HTTP calls, etc.

However, sometimes this isn’t enough.
That’s why the Azure Functions team has released functionality which enables you to create your own custom bindings. This should make it easy for you to read and write data to any service or location you need to, even if it’s not supported out of the box.

There is some documentation available on how to create a custom binding at this time, and even a nice sample on GitHub to get you started. The thing is… this documentation and these samples are written for Version 1 of the Azure Functions runtime. If you want to use custom bindings in Azure Functions V2, you need to do some additional work. There are still changes being made on this subject, so it’s quite possible the current workflow will break in the future.

For this post, I’ve created a sample binding which is capable of reading data from a local disk. Nothing fancy and definitely not something you want in production, but it’s easy to test and shows you how the stuff has to be set up.

The first step is to create a new class library (.NET Standard 2.0) in which you will add all the files necessary to create the custom binding. This class library is necessary because it’s loaded inside the runtime via reflection magic.
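For reference, the project file of such a class library can stay minimal; a sketch could look like the following (the package version is just an example, check NuGet for the current one):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFramework>netstandard2.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- Provides IExtensionConfigProvider, IWebJobsStartup, the binding attributes, etc. -->
    <PackageReference Include="Microsoft.Azure.WebJobs" Version="3.0.0" />
  </ItemGroup>
</Project>
```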

Once you’ve created this class library, you can continue creating a Binding, which is also mentioned in the docs. A binding can look like this.

```csharp
[Extension("MySimpleBinding")]
public class MySimpleBinding : IExtensionConfigProvider
{
    public void Initialize(ExtensionConfigContext context)
    {
        var rule = context.AddBindingRule<MySimpleBindingAttribute>();
        rule.BindToInput<MySimpleModel>(BuildItemFromAttribute);
    }

    private MySimpleModel BuildItemFromAttribute(MySimpleBindingAttribute arg)
    {
        string content = default(string);
        if (File.Exists(arg.Location))
        {
            content = File.ReadAllText(arg.Location);
        }

        return new MySimpleModel
        {
            FullFilePath = arg.Location,
            Content = content
        };
    }
}
```


Implement the IExtensionConfigProvider interface and specify a proper binding rule.

And of course, we shouldn’t forget to add an attribute.

```csharp
[Binding]
[AttributeUsage(AttributeTargets.Parameter | AttributeTargets.ReturnValue)]
public class MySimpleBindingAttribute : Attribute
{
    [AutoResolve]
    public string Location { get; set; }
}
```

Because we’re using a self-defined model over here, called MySimpleModel, it makes sense to add this to your class library as well. I like to keep it simple, so the model only has two properties.

```csharp
public class MySimpleModel
{
    public string FullFilePath { get; set; }
    public string Content { get; set; }
}
```


According to the docs, this is enough to use the new custom binding in your Azure Functions like so.

```csharp
[FunctionName("CustomBindingFunction")]
public static IActionResult RunCustomBindingFunction(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = "custombinding/{name}")]
    HttpRequest req,
    string name,
    [MySimpleBinding(Location = "%filepath%\\{name}")]
    MySimpleModel simpleModel)
{
    return (ActionResult)new OkObjectResult(simpleModel.Content);
}
```


But, this doesn’t work. Or at least, not at this moment.

When starting the Azure Function emulator you’ll see something similar to the following.

```text
[3-1-2019 08:51:37] Error indexing method 'CustomBindingFunction.Run'
[3-1-2019 08:51:37] Microsoft.Azure.WebJobs.Host: Error indexing method 'CustomBindingFunction.Run'. Microsoft.Azure.WebJobs.Host: Cannot bind parameter 'simpleModel' to type MySimpleModel. Make sure the parameter Type is supported by the binding. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).
[3-1-2019 08:51:37] Function 'CustomBindingFunction.Run' failed indexing and will be disabled.
[3-1-2019 08:51:37] No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).
```

Not what you’d expect when following the docs line by line.

The errors do give a valid pointer, though. They tell us we should have registered the type on startup via the IWebJobsBuilder builder. That makes sense if you’re used to Azure App Service WebJobs.
Seeing as Azure Functions is built on top of Azure App Services, it also makes sense there’s quite a lot of shared logic between Azure Functions and Azure WebJobs.

So, what do you need to do now?
Well, add an IWebJobsStartup implementation and make sure to add your extension to the IWebJobsBuilder. The startup class should look a bit like this.

```csharp
[assembly: WebJobsStartup(typeof(MySimpleBindingStartup))]
namespace MyFirstCustomBindingLibrary
{
    public class MySimpleBindingStartup : IWebJobsStartup
    {
        public void Configure(IWebJobsBuilder builder)
        {
            builder.AddMySimpleBinding();
        }
    }
}
```


To make stuff pretty, I’ve created an extension method to add my simple binding.

```csharp
public static IWebJobsBuilder AddMySimpleBinding(this IWebJobsBuilder builder)
{
    if (builder == null)
    {
        throw new ArgumentNullException(nameof(builder));
    }

    builder.AddExtension<MySimpleBinding>();
    return builder;
}
```


Having added these classes to your class library will make sure the binding gets picked up via reflection when starting up the Azure Function. Don’t forget to add the assembly attribute at the top of the startup class. If you forget it, the binding won’t get resolved (ask me how I know…).

If you want to see all of the code and how this interacts with each other, please check out my GitHub repository on this subject. Or, if this post has helped you feel free to add a ‘Thank you’-comment or upvote my question (and answer) on Stack Overflow.

# It’s a new year!

Happy 2019 all!

Just like every other blogger in the world, I also want to write a small retrospective on 2018 and a prospective on 2019.

Let’s start with the retrospective. As I mentioned last year, we were expecting a daughter somewhere in January of 2018. As it happens, she was born on January 24th, very healthy. Even though this was still early in the year, I knew for a fact this day would be the best one of the entire year.
Having 2 small kids growing up in the house is hard, but so gratifying. Being able to work remotely is also a great way to see your kids grow up. I think this is very important, because everything is happening just too fast!

Aside from blogging quite a lot last year, I’ve also been active in the speaking world. Speaking at a meetup, user group or conference is a great way to spread knowledge to a group of people, interact with them and make new friends. I really love doing this and am grateful that my employer, 4DotNet, is very keen on providing support whenever needed. The best thing about speaking at conferences in different countries is meeting new people and cultures. It’s something I can recommend to everyone.

Because I’ve spent a lot of time blogging, speaking, contributing to open source projects, etc., I’ve also been awarded the Microsoft MVP award in the Microsoft Azure category. I already mentioned the best day of the year was when my daughter was born, but being awarded the Microsoft MVP award was certainly a great day as well! Thanks to Tina for awarding me and Gerald for nominating me!

## So what’s up for 2019?

I’m not sure yet.

There’s still lots of stuff I want to learn and investigate further in the serverless & Azure Functions space. But, just as last year, I also want to spend some time on Cosmos DB. From a technical perspective, there’s enough to learn in the Azure space and being awarded the Microsoft MVP award really helps me with this learning.

Of course, there’s the global MVP Summit in March. I’m certainly going to be there. It’s the first time I’ll be at the Microsoft campus and meet hundreds of other MVPs and Microsoft FTEs.

As mentioned, I love speaking in front of a group. However, I have the feeling I’ve been away from home a lot this past year: speaking abroad, living in hotels for the day job, etc. In 2019 I’m focusing on speaking a bit more in the Benelux, so I won’t have to be away from home for long (if at all). Hopefully, this will work out better with having 2 small kids at home.

For leisure? There are a lot of books I still have to read. I’m still in book 3 of the 14-book Wheel of Time series. I’ve also bought a couple of new games for my Xbox One, which will take up about 800 hours of gameplay. Then there’s also my other hobby, cooking and barbecuing, which I want to spend some more time on as well.

So lots of stuff to do in 2019 and so little time.

What are you all planning to do and have you finished everything you wanted to do in 2018?

# My new desktop build

I’ve been using my current desktop for almost 8 years now and it’s still running quite fine! In order to support 3 monitors, including at least one 4K screen, the graphics card did get an upgrade to a GTX 950 a while back. But other than that it’s still exactly the same and quite performant. Development is snappy enough, browsing still superb and doing some light modifications in Lightroom or Photoshop is doable.

So, why do I want to upgrade? Well, I want to focus a bit more on semi-pro photography. While it is quite doable to import and modify photos on the current system, it’s too slow if you need to do this often.

Being a proper nerd, I have read almost every news article, announcement and review of all the hardware which was released or will be released in the near future, all in order to make the ‘best’ build I can afford.
From the information I gathered, the ‘smart’ choice for a processor appears to be AMD. Those processors are rather cheap, have quite a few threads and a decent clock speed. The main problem with them is… they aren’t Intel. You can call me stupid, but I still prefer to have an Intel processor in my system.
Doing stuff in Lightroom and Photoshop requires a high (single-core) clock speed. The Intel processors range as high as 5GHz, which is amazing in my mind. However, for development purposes, having multiple cores/threads is also rather useful.
Taking a look at the latest generation of Intel processors automatically pointed me to the Intel i9 9900K. A rather expensive processor (at the moment), but it has all the features I want for now: a clock speed of 5GHz (Turbo or overclocked) and, with the hyperthreading feature enabled, 16 threads to use.

While reading up on the i9 9900K processor, I read they can become quite warm. Stock coolers aren’t something I’d advise anyone, and water cooling seemed appropriate for this system. All-in-one systems are also rather affordable nowadays, so I felt the need to buy one. One advantage of having an AIO cooler: they hardly make any sound!

Storage is rather cheap, so going full-SSD is recommended for both development and photography. In my current workflow I don’t need more than 1TB of storage space.
The current NVMe drives aren’t that expensive anymore, which pointed me to a nice 500GB NVMe drive and a 500GB SATA SSD. I like having 2 (or more) separate drives. Installing a single 1TB drive might give me more of a performance boost, but also more ‘risk’ of losing *everything* when it crashes. Now I only lose half of my files (yeah, I do create backups, and yes, I know this isn’t very sane reasoning).

Graphics aren't very important to me. My only demand is that the graphics card supports at least 3 4K monitors (preferably Cinema 4K) at 60Hz in Windows. Having support for higher resolutions and more monitors is nice, but not necessary at the moment.

A motherboard is just one of those pieces of hardware I don't care about much. Having the availability for multiple NVME drives and overclocking features are nice-to-haves.

Memory is still rather expensive, especially if you want those kits which have the potential to overclock a bit. Because my funding is limited, I went for a 16GB kit which should be able to run at 4266MHz. Having 32GB available would only be useful for large Photoshop edits, but for now I hope 16GB will be enough. For development purposes 16GB will certainly be enough (for me), as I’m not creating virtual machines anymore. If I do, I just spin them up in Azure or use Docker containers locally.

Putting it all together, this is what I came up with in the end.

| Part | Product |
| --- | --- |
| CPU | Intel i9 9900K |
| CPU cooler | NZXT Kraken X62 |
| Motherboard | Asus ROG STRIX Z390-E GAMING |
| Graphics card | GeForce® GTX 1060 G1 Gaming 6G |
| PSU | BitFenix Formula Gold 550W |
| Hard drives | SSD 970 EVO NVMe M.2 500GB |
| | Samsung 860 EVO 500GB |
| Case | Antec P110 Luce |
| Memory | G.Skill F4-4266C19D-16GTZSW |

After having assembled the system, the first thing I noticed was how quiet it is! It doesn’t make any sound. This also has a downside, because now I hear my NAS all the time, which is standing about 4 meters away from me.

As mentioned, the memory should be able to run at 4266MHz with the XMP profile loaded. So far I haven’t been able to run the system stably at this speed. In order to get this working I might have to start tweaking the voltages a bit, but I haven’t tried this yet.
For now I want this system to run stable for some time, and once it has proven itself, I might start tweaking the settings a bit more.

So, how does it perform, you wonder?
Well, let’s run the real-world performance test I read about in Scott Hanselman’s post some time ago.

From what I’ve read in the comment section of Scott’s post, this might get a bit better when I start tweaking and overclocking the system a bit more. But, for development & photo editing purposes this is fast enough for now.

I don’t really have performance tests for my Lightroom & Photoshop work, but after having imported and edited a couple of photos I can say the performance gain is real! I never have to wait anymore, so doing all of this photo editing work is finally fun again.

All in all, quite a happy camper! Let’s see how long this system will be able to keep up with all of my work.

# Deploying your ARM templates via PowerShell

You might have noticed I’ve been doing quite a bit of stuff with ARM templates as of late. ARM templates are THE way to go if you want to deploy your Azure environment in a professional and repeatable fashion.

Most of the time these templates get deployed in your release pipeline to the Test, Acceptance or Production environment. Of course, I’ve set this up for all of my professional projects along with my side projects. The thing is, when using the Hosted VS2017 build agent, it can take a while to complete both the Build and Release jobs via Azure DevOps.
Being a reformed SharePoint developer, I’m quite used to waiting on the job. However, waiting all night to check whether you didn’t create a booboo inside your ARM template is something which became quite boring, quite fast.

So what else can you do? Well, you can do some PowerShell!

The Azure PowerShell cmdlets offer quite a lot of useful commands in order to manage your Azure environment.

One of them is called New-AzureRmResourceGroupDeployment. According to the documentation, this command “adds an Azure deployment to a resource group”. Exactly what I want to do, most of the time.

So, how do you call it? Well, you only have to specify the name of your deployment, the resource group you want to deploy to and, of course, the ARM template itself, along with the parameter file.

```powershell
New-AzureRmResourceGroupDeployment `
    -Name LocalDeployment01 `
    -ResourceGroupName my-resource-group `
    -TemplateFile C:\path\to\my\template\myTemplate.json `
    -TemplateParameterFile C:\path\to\my\template\myParameterFile.test.json
```


This works for deployments of templates which are stored locally. If your template is located somewhere on the web, use the TemplateUri and TemplateParameterUri parameters instead.
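For example (the URIs below are placeholders, not real template locations):

```powershell
# Deploy a template which is hosted somewhere online, e.g. a storage account or GitHub.
New-AzureRmResourceGroupDeployment `
    -Name RemoteDeployment01 `
    -ResourceGroupName my-resource-group `
    -TemplateUri "https://example.com/templates/myTemplate.json" `
    -TemplateParameterUri "https://example.com/templates/myParameterFile.test.json"
```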

Keep in mind though: if there’s a parameter in the template with the same name as a named parameter of this cmdlet, you have to specify it manually after executing the cmdlet. In my case, I had to specify the value of the resourcegroup parameter of my template manually.

```text
cmdlet New-AzureRmResourceGroupDeployment at command pipeline position 1
Supply values for the following parameters:
(Type !? for Help.)
resourcegroupFromTemplate: my-resource-group
```


As you can see, the parameter name gets postfixed with FromTemplate, to make clear it originates from the template.
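Because the cmdlet exposes template parameters as dynamic parameters, you should also be able to supply the renamed parameter inline and skip the prompt altogether; a sketch:

```powershell
New-AzureRmResourceGroupDeployment `
    -Name LocalDeployment02 `
    -ResourceGroupName my-resource-group `
    -TemplateFile C:\path\to\my\template\myTemplate.json `
    -TemplateParameterFile C:\path\to\my\template\myParameterFile.test.json `
    -resourcegroupFromTemplate my-resource-group
```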

When you’re done, don’t forget to run Remove-AzureRmResourceGroupDeployment a couple of times in order to remove all of your manual test deployments.