Stuff I like from this year’s Build conference

.NET Core 3 Preview 4 can be downloaded right now. This new preview brings the Chart control to .NET Core and some WPF improvements. What I like most are the versioning change and Tiered Compilation. I’m still reading up on the latter, but it sounds like a major reason to upgrade to the latest & greatest! Some more information on Tiered Compilation can be found in a Microsoft blog post dedicated to it: https://devblogs.microsoft.com/dotnet/tiered-compilation-preview-in-net-core-2-1/
The Docker improvements are nice too, but I’m not using Docker a lot these days, so I don’t have much of an opinion on them.

Something which struck me as odd is the announcement of .NET 5, which is .NET Core vNext. The explanation of this version number clears things up a bit though.

We’re skipping the version 4 because it would confuse users that are familiar with the .NET Framework, which has been using the 4.x series for a long time. Additionally, we wanted to clearly communicate that .NET 5 is the future for the .NET platform.

From what I gather, we can all move (from both .NET Framework and .NET Core) to .NET 5 without much hassle. I don’t know if this is true, but it would make sense from my perspective.

Another announcement was the integration of GitHub and Azure DevOps, which is great! Where Azure DevOps shines at being a great enterprise/company working environment, GitHub shines in collaboration with external people. With this integration, we’ll be able to leverage the power of both platforms where needed more easily. Also, signing in with your GitHub account on Azure DevOps or the Azure Portal is big! You can now add your GitHub identity to the Azure Active Directory and be done with it. Awesome!

The same post mentions being able to do deployment pipelines via YAML now. It’s called Unified Pipelines, which is a great improvement!
By chance, we were also looking into Azure Artifacts on the project I’m currently working on, and with yesterday’s announcement this appears to be a no-brainer, as the service is now included in the Basic license!

Then there’s the Kubernetes news, which. is. the. best!
I’ve never felt much for the container ecosystem. I do understand what it does and how it’s an improvement to IaaS, but I live mostly in a PaaS, FaaS and SaaS environment, so moving back to ‘the virtual metal’ feels like a step back.
But the serverless Kubernetes and Kubernetes-based Event-driven Autoscaling (KEDA) announcements are amazing! While it might not be fully serverless, it’s still something which can bring a lot of value to our customers and, if we’re honest, that’s what matters the most. As KEDA is a partnership between Microsoft & Red Hat, it will also be available on the OpenShift platform!
It’s also open source and can be found on GitHub: https://github.com/kedacore/keda

Are you using Windows Subsystem for Linux? Well, version 2 has been announced as well, which is even better! It’s faster, has more system calls available, and Docker images run right away!
Something which goes hand in hand with WSL is the command-line terminal, and it has had some major improvements! Scott Hanselman has written a nice post on it and how you can tune the terminal. While I was just getting used to ConEmu, I think I’ll just use the new Terminal experience right from the start.

There were also a lot of announcements on machine learning, blockchain, artificial intelligence, mixed reality, etc. While all of this sounds great, it’s not stuff I work with on a daily basis. If you’re interested in this stuff, I’d advise you to check out all of the news posts on these other subjects.
I’ll be sure to check out KEDA once I get around to it; it’s just too awesome not to play with.

Update

While scrolling through my Twitter feed today I saw a lot of other stuff which piqued my interest.

Apparently dependency injection in Azure Functions is a thing now, and has been for quite a while. I probably missed the announcement back in March, as the documentation dates from a couple of months ago. Still, very useful!
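
For those who, like me, missed it: you wire up dependency injection via a startup class in your function app. Below is a minimal sketch, assuming the `Microsoft.Azure.Functions.Extensions` package; `IGreetingService` and `GreetingService` are made-up names, just to show the registration.

[assembly: FunctionsStartup(typeof(MyFunctionApp.Startup))]
namespace MyFunctionApp
{
    public class Startup : FunctionsStartup
    {
        public override void Configure(IFunctionsHostBuilder builder)
        {
            // Services registered here get constructor-injected into your (non-static) function classes.
            builder.Services.AddSingleton<IGreetingService, GreetingService>();
        }
    }
}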

The Durable Functions extension has also gotten quite a bit of love in the form of stateful, actor-like capabilities. While I haven’t worked with the Actor model (yet), it is something I would love to dig into. With this announcement, working with the Actor model will be even more awesome (a rough sketch follows below)!
And of course, the API Management integration of Azure Functions is great! There was already some basic proxy functionality in Azure Functions, but with the full integration of APIM this will work even better!
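
Going back to the Durable Functions announcement: below is a rough sketch of what such an entity function could look like, based on my current understanding of the preview bits (the `IDurableEntityContext` API may still change). The `Counter` entity and its operations are just made-up examples.

[FunctionName("Counter")]
public static void Counter([EntityTrigger] IDurableEntityContext ctx)
{
    // Every invocation handles one operation on this entity's durable state.
    switch (ctx.OperationName)
    {
        case "add":
            ctx.SetState(ctx.GetState<int>() + ctx.GetInput<int>());
            break;
        case "reset":
            ctx.SetState(0);
            break;
    }
}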

For App Services we now have a Free Linux tier. Even though I don’t do a lot of development which will be published to a Linux App Service, I do recognize this is a great improvement for a lot of projects. Connecting to a vNet is now also supported for Linux App Services, which is great if you’re working in a hybrid scenario, or have some ‘secured’ resources somewhere.
From the UX improvements it also looks like Docker integration can be found everywhere nowadays. So if you haven’t looked into containers & Docker yet, now is the time to do so. It’ll become very important for us developers to at least know the basics of it.

Let’s not forget one major improvement in the security space. We now have support for passwords with a length of 256 characters! Finally!

Update 2

It appears the full Entity Framework will become available on .NET Core 3.0! This is major, because the full Entity Framework has a lot more capabilities compared to Entity Framework Core. People still stuck on the .NET Framework because of EF should now be able to upgrade. Still, the packages are in preview, so don’t migrate just yet.

Something I like very much is the enhanced syntax highlighting for ARM templates in VS Code. Also, the enhanced schema ‘intellisense’ is very useful when working with these large JSON files.

Have you also noticed the improved Azure Portal? Lots of small improvements which make navigating the portal a lot more slick. Especially the improved resource graphs are useful. Now I don’t have to create my own dashboards in Power BI or something like it anymore.

With the announced GitHub integration, we also get Azure Boards integration to GitHub. This will make managing your code & project a lot easier when working with both systems on a single project.

And last, but certainly not least, the React Native for Windows repository. I’m not sure if this was announced at //build/, but I saw it appear in my timeline and it’s awesome enough to mention over here. Now you can create a React Native application and deploy it as a Windows 10 application, which makes it supported on devices like PCs, tablets, Xbox, HoloLens, etc. Great stuff indeed!

I’ve written about empowering your Teams with Azure Functions a while back, but this isn’t the only way to create value. You can also use Azure Logic Apps.

Logic Apps are a way to express powerful integrations with (several different) systems in a visual, workflow-based way. They have a lot of similarities with other (Microsoft) workflow systems from the past, so they should feel very familiar to most (enterprise) developers.

Being a visual workflow solution, it doesn’t warm the hearts of most developers. However, the world doesn’t consist solely of developers, and the visual nature of this solution is a very big advantage if you’re not a coder, or if you’d rather deliver value instead of just more code.

First step

The first step you need (or actually, WANT) to take is to create a Webhook connector on a channel. You can check my previous post on how to do this.

Posting to this channel has to be done in a similar way. You will still need to post some JSON in a predefined format to this webhook.

Next step: Setting up Alerts

In order to make your DevOps process a bit easier, it’s very useful to leverage the power of Application Insights and Alerts. For this to work, you need to know what metrics you actually want to be alerted for. I’m going to assume you already have some monitoring in place with appropriate metrics. If not, you should definitely define some. They can be tuned afterward.

Adding new alerts or modifying existing ones is as easy as clicking the `Alerts` option in your service.

On this blade, you’ll see an overview of all alerts which are already defined, and you can create new ones.

When creating new alert rules you have to specify which signal type you want to create an alert for. At the moment there are three different types you can choose from: Metrics, Log Search and Activity Log.

The other filter you can use is the Monitor Service.

If you leave both options set to `All`, you’ll see all possible types of signals to create alert rules for.

In my case, I like to receive an alert when my service plan is hitting its limits, like high CPU or memory usage, or slow response times. You can configure all of this, and more, on this page. For example, you can create an alert which fires when the average CPU usage is at least 60%.

By selecting the proper resource group and condition you want to get alerted for, you can specify one or more so-called Action Groups.

A single action group is responsible for handling one specific action, like sending data to a webhook.

Keep in mind though, you should NOT fill out the webhook from the Teams channel over here. Posting a message to a Teams channel requires a specific JSON message, which isn’t compatible with the JSON sent via an Alert. The webhook you want to specify over here is the location to the handler of your JSON message, like the logic app we’ll create in the next step.

Handling the Alerts

After having set up your alerts and made sure they actually work, it’s time to handle them. To check whether the alerts worked, I’ve lowered the thresholds a bit so alerts get triggered quickly.

You can view which alerts have been triggered via the Alerts page in Azure Monitor.

What we need to do now is receive the JSON of the alert and send it, in a different format, to our Teams webhook. As I already mentioned, the easiest way to do this is via Azure Logic Apps. You can even make external calls to other systems, Azure Functions, etc.

The first thing you need to do in your logic app is to specify the JSON schema which is sent to the app. There is quite a bit of documentation available on this, but I find the easiest way is to fail fast and go from there.
What I mean by this is: create the Logic App without a proper schema and save it.
By doing so, you will have the webhook address of your Logic App. You can then go back to your Alert Action Group and fill out this address in the webhook textbox.

Going back to the Logic App, you will now probably see a couple of failed events.

If not, make sure to trigger one or two Alerts in order to get these failed events. What’s great about this is the complete context of this event is stored, including the JSON message.

For now, the only thing you need to do is to copy the contents of the `body` element.

This content can be pasted inside your `When a HTTP request is received` step via the link `Use sample payload to generate schema`.

This saves you from going through the docs, only to discover something is missing, or worse. The schema of your message is now auto-created.

This enables us to create a new HTTP step in order to POST a message to our Teams channel.

Of course, you can make this message as fancy as you’d like, but this is about all the basics you need in order to create a basic alert on Teams.

This all looks very complex

Well, at first glance it might look complex. Especially if you know there are also out-of-the-box Teams actions which you can leverage in a Logic App.

The ‘downside’ (or maybe it’s an upside) to these actions is they need an Identity known in Teams (= your Office 365 tenant). While it’s possible to create a special identity for this, it’s not something I like much for this specific case.

One other thing: you still need to do everything yourself when using the default Teams actions. The only thing they make a bit easier is POSTing the message to Teams. While the JSON body might look a bit hard at first, it’ll grow on you and will enable you to create messages with a bit more flexibility.

But if you’re not a developer or operations person, the out of the box actions might be good enough for you.


That’s it for now. I’ll continue this series with some other posts on how to use all of this in your production environment and save you some time on repetitive operational work.

In today’s world we’re receiving an enormous amount of e-mail.
A lot of the e-mail I’m receiving during the day (and night) is about errors happening in our cloud environment and sometimes someone needs to follow up on this.

At the moment this is a real pain because there are a lot of false positives in those e-mails, due to the lack of configuration options and possibilities in our monitoring software. The amount of e-mail is so painful that most of us have created e-mail rules so these monitoring e-mails ‘go away’ and we only check them once per day. Not really an ideal solution.

But what if I told you all of this pain can go away with some serverless magic and the power of Microsoft Teams? Sounds great, right?

How to integrate with Microsoft Teams?

This is actually the easiest part if you’re a developer.

If you’re already running Microsoft Teams on your Office 365 tenant, you can add a channel to a team to which you belong and add a Webhook connector to it. I’ve created a channel called `Alerts` on which I added an `Incoming Webhook` connector.

After having saved the connector you’ll be able to copy the actual webhook URL which you need to use in order to POST messages to the channel.

In order to test this webhook, you can spin up Postman and send a POST request to the webhook URL.

The only thing you need to specify is the `text` property, but in most cases adding a `title` makes the message a bit prettier.

{
	"title": "The blog demo",
	"text": "Something has happened and I want you to know about it!"
}

When opening up Teams, you’ll see the message added to the channel.

That’s all there is to it in order to set up integration from an external system to your Team.

How will this help me?

Well, think about it. By sending a POST to a webhook, you can alert one (or more) teams inside your organization. If there’s an event which someone needs to respond to, like an Application Insights event or some business logic which is failing for a non-obvious reason, you can send this message real-time to the team responsible for the service.

Did you also know you can create so-called ‘actionable messages’ within Teams? An actionable message can be created with a couple of buttons which will invoke a URL when pressed. In Teams this shows up as a card with the buttons underneath the message text.

By pressing one of those buttons, a specified URL gets invoked (GET) and, as you can probably imagine, those URLs can be implemented to automatically resolve the event which triggered the message in the first place.

A schematic view of how you can implement such a solution is described below.

Picture an Event Grid, which contains events of stuff happening in your overall Azure solution. An Azure Function is subscribed to a specific topic and, once it’s triggered, posts a message to the Teams channel. This can be an actionable message or a plain message.
If it’s an actionable message, a button can be pressed which in turn sends a GET request to a different Azure Function. You want this Function to be fast, so the only thing it does is validate the request and store the message (command) on a (Service Bus) queue. A different Azure Function is then triggered, which makes sure the command is executed properly by invoking an API/service that is responsible for ‘solving’ the issue.
Of course, you can also implement the resolving logic inside the last Azure Function; this depends a bit on your overall solution architecture and your opinion on decoupling systems.
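
To make the middle part of that flow a bit more concrete, here is a minimal sketch of what those two Functions could look like. The queue name, route, and connection setting name are assumptions for the sake of the example; the actual validation and resolving logic obviously depends on your own services.

public static class ResolveEventFunctions
{
    // Invoked by the button in the actionable message; it only validates the request and enqueues a command.
    [FunctionName("AcceptResolveCommand")]
    public static IActionResult AcceptResolveCommand(
        [HttpTrigger(AuthorizationLevel.Function, "get", Route = "resolve/{eventId}")] HttpRequest req,
        string eventId,
        [ServiceBus("resolve-commands", Connection = "ServiceBusConnection")] out string command,
        ILogger log)
    {
        command = null;

        if (string.IsNullOrWhiteSpace(eventId))
        {
            return new BadRequestObjectResult("An event id is required.");
        }

        command = JsonConvert.SerializeObject(new { EventId = eventId });
        log.LogInformation("Queued resolve command for event {EventId}.", eventId);

        return new OkObjectResult("Your request has been queued.");
    }

    // Picks up the command from the queue and calls the API/service responsible for actually solving the issue.
    [FunctionName("ProcessResolveCommand")]
    public static async Task ProcessResolveCommand(
        [ServiceBusTrigger("resolve-commands", Connection = "ServiceBusConnection")] string command,
        ILogger log)
    {
        log.LogInformation("Processing command: {Command}", command);

        // Invoke the service which is responsible for 'solving' the issue over here.
        await Task.CompletedTask;
    }
}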

How will my Azure Function post to Teams?

In order to send messages to Teams via an Azure Function, you will have to POST a message to a Teams webhook. This works exactly the same as making an HTTP request to any other service. An example is shown below.

private static readonly HttpClient HttpClient = new HttpClient();

[FunctionName("InvokeTeamsHook")]
public static async Task Run(
    [HttpTrigger(AuthorizationLevel.Anonymous, "post", Route = "InvokeTeams")]
    HttpRequestMessage req,
    ILogger log)
{
    var message = await req.Content.ReadAsAsync<IncomingTeamsMessage>();

    var address = Environment.GetEnvironmentVariable("WebhookUrl", EnvironmentVariableTarget.Process);
    var plainTeamsMessage = new PlainTeamsMessage { Title = message.Title, Text = message.Text };
    var content = new StringContent(JsonConvert.SerializeObject(plainTeamsMessage), Encoding.UTF8, "application/json");
    
    await HttpClient.PostAsync(address, content);
}

public class IncomingTeamsMessage
{
    public string Title { get; set; }
    public string Text { get; set; }
}

private class PlainTeamsMessage
{
    public string Title { get; set; }
    public string Text { get; set; }
}

This sample creates a ‘plain’ message in Teams when POSTing a piece of JSON in the `IncomingTeamsMessage` format to the Azure Function, for example the following:

{
	"title": "My title in Teams",
	"text": "The message which is being posted."
}

Teams will render it as a simple message in the channel, showing the title and text.

Of course, this is a rather simple example. You can extend this by also creating actionable messages. In such a case, you need to extend the model with the appropriate properties and POST it in the same way to Teams.
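
To give an idea of what such an extension could look like, below is a hedged sketch of a model based on the (legacy) connector card format with a `potentialAction` array. Treat the exact property names and card schema as assumptions and verify them against the current Teams documentation before relying on this.

public class ActionableTeamsMessage
{
    [JsonProperty("@type")]
    public string CardType { get; set; } = "MessageCard";

    [JsonProperty("@context")]
    public string Context { get; set; } = "http://schema.org/extensions";

    [JsonProperty("title")]
    public string Title { get; set; }

    [JsonProperty("text")]
    public string Text { get; set; }

    // Each entry becomes a button underneath the message.
    [JsonProperty("potentialAction")]
    public List<TeamsMessageAction> PotentialAction { get; set; } = new List<TeamsMessageAction>();
}

public class TeamsMessageAction
{
    [JsonProperty("@type")]
    public string ActionType { get; set; } = "OpenUri";

    [JsonProperty("name")]
    public string Name { get; set; }

    // For an OpenUri action, pressing the button opens (GETs) one of these targets.
    [JsonProperty("targets")]
    public List<TeamsMessageActionTarget> Targets { get; set; } = new List<TeamsMessageActionTarget>();
}

public class TeamsMessageActionTarget
{
    [JsonProperty("os")]
    public string OperatingSystem { get; set; } = "default";

    [JsonProperty("uri")]
    public string Uri { get; set; }
}

You’d serialize and POST this in exactly the same way as the plain message above.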

Even though Teams isn’t something I develop for a lot (read: never), I will spend the upcoming weeks investigating how to bring our DevOps work into the 21st century. By leveraging the power of Teams, I’m pretty sure a lot of ‘manual’ operations can be made easier, if not automated completely.

The default Azure Functions runtime comes with quite a lot of bindings and triggers which enable you to create a highly scalable solution within the Azure environment. You can connect to service buses, storage accounts, Event Grid, Cosmos DB, HTTP calls, etc.

However, sometimes this isn’t enough.
That’s why the Azure Functions team has released functionality which enables you to create your own custom bindings. This should make it easy for you to read and write data to any service or location you need to, even if it’s not supported out of the box.

There is some documentation available on how to create a custom binding at this time and even a nice sample on GitHub to get you started. The thing is…this documentation and samples are written for Version 1 of the Azure Functions runtime. If you want to use custom bindings in Azure Functions V2, you need to do some additional stuff. There are still changes being made on this subject, so it’s quite possible the current workflow will be broken in the future.

For this post, I’ve created a sample binding which is capable of reading data from a local disk. Nothing fancy and definitely not something you want in production, but it’s easy to test and shows you how the stuff has to be set up.

The first step you need to take is to create a new class library (.NET Standard 2.0) in which you will add all the files necessary to create a custom binding. This class library is necessary because it’s loaded inside the runtime via reflection magic.

Once you’ve created this class library, you can continue creating a `Binding`, which is also mentioned in the docs. A binding can look like this.

[Extension("MySimpleBinding")]
public class MySimpleBinding : IExtensionConfigProvider
{
    public void Initialize(ExtensionConfigContext context)
    {
        var rule = context.AddBindingRule<MySimpleBindingAttribute>();
        rule.BindToInput<MySimpleModel>(BuildItemFromAttribute);
    }

    private MySimpleModel BuildItemFromAttribute(MySimpleBindingAttribute arg)
    {
        string content = default(string);
        if (File.Exists(arg.Location))
        {
            content = File.ReadAllText(arg.Location);
        }

        return new MySimpleModel
        {
            FullFilePath = arg.Location,
            Content = content
        };
    }
}

Implement the `IExtensionConfigProvider` and specify a proper `BindingRule`.

And of course, we shouldn’t forget to add an attribute.

[Binding]
[AttributeUsage(AttributeTargets.Parameter | AttributeTargets.ReturnValue)]
public class MySimpleBindingAttribute : Attribute
{
    [AutoResolve]
    public string Location { get; set; }
}

Because we’re using a self-defined model over here called `MySimpleModel` it makes sense to add this to your class library as well. I like to keep it simple, so the model only has 2 properties.

public class MySimpleModel
{
    public string FullFilePath { get; set; }
    public string Content { get; set; }
}

According to the docs, this is enough to use the new custom binding in your Azure Functions like so.

[FunctionName("CustomBindingFunction")]
public static IActionResult RunCustomBindingFunction(
    [HttpTrigger(AuthorizationLevel.Anonymous, "get", "post", Route = "custombinding/{name}")]
    HttpRequest req,
    string name,
    [MySimpleBinding(Location = "%filepath%\\{name}")]
    MySimpleModel simpleModel)
{
    return (ActionResult) new OkObjectResult(simpleModel.Content);
}

But, this doesn’t work. Or at least, not at this moment.

When starting the Azure Function emulator you’ll see something similar to the following.

[3-1-2019 08:51:37] Error indexing method 'CustomBindingFunction.Run'

[3-1-2019 08:51:37] Microsoft.Azure.WebJobs.Host: Error indexing method 'CustomBindingFunction.Run'. Microsoft.Azure.WebJobs.Host: Cannot bind parameter 'simpleModel' to type MySimpleModel. Make sure the parameter Type is supported by the binding. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).

[3-1-2019 08:51:37] Function 'CustomBindingFunction.Run' failed indexing and will be disabled.

[3-1-2019 08:51:37] No job functions found. Try making your job classes and methods public. If you're using binding extensions (e.g. Azure Storage, ServiceBus, Timers, etc.) make sure you've called the registration method for the extension(s) in your startup code (e.g. builder.AddAzureStorage(), builder.AddServiceBus(), builder.AddTimers(), etc.).

Not what you’d expect when following the docs line by line.

The errors do give a valid pointer though. They’re telling us we should have registered the `Type` on startup via the `IWebJobsBuilder builder`. Makes sense, if you’re using Azure App Service WebJobs.
Seeing as Azure Functions is based on Azure App Service, it kind of makes sense there’s a lot of shared logic between Azure Functions and Azure WebJobs.

So, what do you need to do now?
Well, add an `IWebJobsStartup` implementation and make sure to add your extension to the `IWebJobsBuilder`. The startup class should look a bit like this.

[assembly: WebJobsStartup(typeof(MySimpleBindingStartup))]
namespace MyFirstCustomBindingLibrary
{
    public class MySimpleBindingStartup : IWebJobsStartup
    {
        public void Configure(IWebJobsBuilder builder)
        {
            builder.AddMySimpleBinding();
        }
    }
}

To make stuff pretty, I’ve created an extension method to add my simple binding.

public static IWebJobsBuilder AddMySimpleBinding(this IWebJobsBuilder builder)
{
    if (builder == null)
    {
        throw new ArgumentNullException(nameof(builder));
    }

    builder.AddExtension<MySimpleBinding>();
    return builder;
}

Adding these classes to your class library makes sure the binding gets picked up via reflection when the Azure Function starts up. Don’t forget to add the assembly attribute at the top of the startup class. If you forget it, the binding won’t get resolved (ask me how I know…).

If you want to see all of the code and how this interacts with each other, please check out my GitHub repository on this subject. Or, if this post has helped you feel free to add a ‘Thank you’-comment or upvote my question (and answer) on Stack Overflow.

Happy 2019 all!

Just like every other blogger in the world, I also want to write a small retrospective on 2018 and a prospective on 2019.

Let’s start with the retrospective. As I mentioned last year, we were expecting a daughter somewhere in January of 2018. As it happens, she was born on January 24th and is very healthy. Even though this was still early in the year, I knew for a fact this day would be the best one of the entire year.
Having 2 small kids growing up in the house is hard, but so gratifying. Being able to work remotely is also a great way to see your kids grow up. I think this is very important because everything is happening just too fast!

Aside from blogging quite a lot last year, I’ve also been active in the speaking world. Speaking at a meetup, user group or conference is a great way to spread knowledge to a group of people, interact with them and make new friends. I really love doing this and am grateful that my employer, 4DotNet, is very keen on providing support whenever needed. The best thing about speaking at conferences in different countries is meeting new people and experiencing new cultures. It’s something I can recommend to everyone.

Because I’ve spent a lot of time blogging, speaking, contributing to open source projects, etc. I’ve also been awarded the Microsoft MVP award in the Microsoft Azure category. I already mentioned the best day of the year was when my daughter was born, but being awarded the Microsoft MVP award certainly was a great day also! Thanks to Tina for awarding me and Gerald for nominating me!

So what’s up for 2019?

I’m not sure yet.

There’s still lots of stuff I want to learn and investigate further in the serverless & Azure Functions space. But, just as last year, I also want to spend some time on Cosmos DB. From a technical perspective, there’s enough to learn in the Azure space and being awarded the Microsoft MVP award really helps me with this learning.

Of course, there’s the global MVP Summit in March. I’m certainly going to be there. It’s the first time I’ll be at the Microsoft campus and meet hundreds of other MVPs and Microsoft FTEs.

As mentioned, I love speaking in front of a group. However, I have had the feeling I’ve been gone from home a lot the past year. Speaking abroad, living in hotels for the day job, etc. In 2019 I’m focusing on speaking a bit more in the Benelux so I won’t have to be gone from home for long (if at all). Hopefully, this will work out better with having 2 small kids at home.

For leisure? There’s a lot of books I still have to read. I’m still in book 3 of the 14-book Wheel of Time series. I’ve also bought a couple of new games for my Xbox One, which will take up about 800 hours of gameplay. Then there’s also my other hobby, cooking and barbecuing, which I also want to spend some more time with.

So lots of stuff to do in 2019 and so little time.

What are you all planning to do and have you finished everything you wanted to do in 2018?