In the past couple of years the software industry has come a long way in professionalizing the development environment. One of the things which has improved significantly is build automation and the ability to deploy software continuously.

Having a continuous integration and deployment environment is the norm nowadays, which means I (and probably you as a reader too) want to have this when creating Azure Functions as well!

There are dozens of build servers and deployment tools available, but because Azure Functions will most likely be deployed to Microsoft Azure, it makes sense to use Visual Studio Team Services with Release Management. I’m not saying you can’t pull this off with any of the other deployment environments, but for me it doesn’t make sense because I already have a VSTS environment and this integrates quite well.

In order to deploy your Function App, the first thing you have to make sure of is having an environment (resource group) in your Azure subscription to deploy to. The advised way to set this up is via ARM templates. There is one big problem with ARM templates though: I genuinely dislike them. It’s something about the JSON, the long list of variables and the ‘magic’ values you have to write down all over the place.
For this reason I first started checking out how to deploy Azure Functions using PowerShell scripts. In the past (3 to 4 years ago) I used a lot of PowerShell scripts to automatically set up and deploy my Azure environments. They are easy to debug, understand and extend. A quick search on the internet showed me the ‘new’ cmdlets you have to use nowadays to spin up a nice resource group and app service. Even though this looked like a very promising deployment strategy, it did feel a bit dirty and hacky.
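For reference, getting a resource group in place with these cmdlets boils down to something like the following (the subscription id and names are placeholders of course):

```
# Log in and select the subscription to work in
Login-AzureRmAccount
Select-AzureRmSubscription -SubscriptionId "<your-subscription-id>"

# Create the resource group which will hold the Function App resources
New-AzureRmResourceGroup -Name "my-functions-rg" -Location "West Europe"
```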
In the end I decided to use ARM templates. Just because I dislike ARM templates doesn’t mean they are a bad thing per se. I also noticed these templates have become first-class citizens for deploying software to Azure.

Creating your ARM template

If you are some kind of Azure wizard, you can probably create the templates yourself. Most of us don’t have that level of expertise though, so there’s an easier way to get started.

What I do is head over to the portal, create a resource group with everything which is necessary, like the Function App, and extract the ARM template afterwards. Downloading the ARM template is somewhat hidden in the portal, but luckily for us, someone has already asked on Stack Overflow where to find this feature. Once you know where the functionality resides, it makes a bit more sense why the portal team has decided to put it there.

First of all, you have to navigate to the resource group for which you want to extract an ARM template.


On this overview page you’ll see a link beneath the headline Deployments. Click on it and you’ll be navigated to a page listing all the deployments which have occurred on your resource group.

Just pick the one you are interested in. In our case it’s the deployment which has created and populated our Function App.

On the detail page of this deployment you’ll see some information which you have specified yourself while creating the Function App. There’s also the option to view the template which Azure has used to create your Function App.


Just click on this link and you will see the complete template, along with the parameters used and, most importantly, the option to download the template!


After downloading the template you’ll see a lot of files in the zip file. You won’t be needing most of them, as they are helper files for deploying the template to Azure. Because we will be using VSTS, we only need the parameters.json and template.json files.

The template.json file contains all the information which is necessary for, in our case, the Function App. Below is the one used for my deployment.

    {
        "$schema": "",
        "contentVersion": "",
        "parameters": {
            "name": {
                "type": "String"
            },
            "storageName": {
                "type": "String"
            },
            "location": {
                "type": "String"
            },
            "subscriptionId": {
                "type": "String"
            }
        },
        "resources": [
            {
                "type": "Microsoft.Web/sites",
                "kind": "functionapp",
                "name": "[parameters('name')]",
                "apiVersion": "2016-03-01",
                "location": "[parameters('location')]",
                "properties": {
                    "name": "[parameters('name')]",
                    "siteConfig": {
                        "appSettings": [
                            {
                                "name": "AzureWebJobsDashboard",
                                "value": "[concat('DefaultEndpointsProtocol=https;AccountName=',parameters('storageName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2015-05-01-preview').key1)]"
                            },
                            {
                                "name": "AzureWebJobsStorage",
                                "value": "[concat('DefaultEndpointsProtocol=https;AccountName=',parameters('storageName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2015-05-01-preview').key1)]"
                            },
                            {
                                "name": "FUNCTIONS_EXTENSION_VERSION",
                                "value": "~1"
                            },
                            {
                                "name": "WEBSITE_CONTENTAZUREFILECONNECTIONSTRING",
                                "value": "[concat('DefaultEndpointsProtocol=https;AccountName=',parameters('storageName'),';AccountKey=',listKeys(resourceId('Microsoft.Storage/storageAccounts', parameters('storageName')), '2015-05-01-preview').key1)]"
                            },
                            {
                                "name": "WEBSITE_CONTENTSHARE",
                                "value": "[concat(toLower(parameters('name')), 'b342')]"
                            },
                            {
                                "name": "WEBSITE_NODE_DEFAULT_VERSION",
                                "value": "6.5.0"
                            }
                        ]
                    },
                    "clientAffinityEnabled": false
                },
                "dependsOn": [
                    "[resourceId('Microsoft.Storage/storageAccounts', parameters('storageName'))]"
                ]
            },
            {
                "type": "Microsoft.Storage/storageAccounts",
                "name": "[parameters('storageName')]",
                "apiVersion": "2015-05-01-preview",
                "location": "[parameters('location')]",
                "properties": {
                    "accountType": "Standard_LRS"
                }
            }
        ]
    }

A fairly readable JSON file, aside from all the magic api versions, types, etc.

The contents of the parameters.json file are a bit more understandable. It contains the key-value pairs which are being referenced in the template file.

    {
        "$schema": "",
        "contentVersion": "",
        "parameters": {
            "name": {},
            "storageName": {},
            "location": {},
            "subscriptionId": {}
        }
    }

The template file uses the format parameters('name') to reference a parameter from the parameters.json file.
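Note the downloaded parameters file only contains empty placeholders. When you actually deploy, each parameter carries a value property; a filled-in version could look like this (the values are just examples):

```
"parameters": {
    "name": {
        "value": "my-function-app"
    },
    "storageName": {
        "value": "myfunctionsstorage"
    },
    "location": {
        "value": "West Europe"
    }
}
```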

These files are important, so you want to add them somewhere next to or inside the solution where your functions reside. Be sure to add them to source control, because you’ll need these files in VSTS later on.

For now the above template file is fine, but it’s more awesome to add a personal touch to it. I’ve done this by adding a new appSetting to the file.

"appSettings": [
    // other entries
    {
        "name": "MyValue",
        "value": "[parameters('myValue')]"
    }
]

Also, don’t forget to add myValue to the parameters file and in the header of the template file, otherwise you won’t be able to use it.
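As a sketch, assuming the setting is called myValue, the two additions come down to this (a comment indicates which file each snippet belongs in):

```
// template.json – add to the "parameters" section in the header
"myValue": {
    "type": "String"
}

// parameters.json – add to the "parameters" section
"myValue": {}
```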

In short, if you want to use continuous deployment for your solution, use ARM templates and get started by downloading them from the portal. Now let’s continue to the fun part!

Set up your continuous integration for the Functions!

Setting up the continuous integration of your software solution is actually the easy part! VSTS has matured quite a lot over time, so all you have to do is pick the right template, point it to the right sources and you are (almost) done.

Picking the correct template is the hardest part. You have to pick the ASP.NET Core (.NET Framework) template. If you are unfamiliar with VSTS and choose a different template, you will struggle setting it up.


This template contains all the useful steps and settings you need to build and deploy your Azure Functions.


It should be quite easy to configure these steps. You can integrate VSTS with every popular source control provider. I’m using GitHub, so I’ve configured it so VSTS can connect to the repository.


Note that I’ve also selected the Clean option because I stumbled across some issues when deploying the sources. Those errors were totally my fault though, so you can just keep it turned off.

The NuGet restore step is pretty straightforward and you don’t have to change anything on it.

The next step, Build solution, is the most important one, because it will not only build your solution, but also create an artifact from it. The default setting is already set up properly, but for completeness I’ve added it below. This tells MSBuild to create a deployment package after building the solution.

/p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true /p:SkipInvalidConfigurations=true /p:DesktopBuildPackageLocation="$(build.artifactstagingdirectory)\" /p:DeployIisAppPath="Default Web Site"

The next important step is Publish Artifact.
You don’t really have to change anything here, but it’s good to know where your artifacts get published after the build.


Of course, you can change stuff over here if you really want to.

One thing I neglected to mention is the build agent you want to use. The best agent to build your Azure Function on (at the moment) is the Hosted VS2017 agent.


This agent is hosted by Microsoft, so you don’t have to configure anything for it which makes it very easy to use. Having this build agent hosted by Microsoft also means you don’t have any control over it, so if you want to build something on a .NET framework which isn’t supported (yet), you just have to set up your own build agent.

When you are finished setting up the build tasks, be sure to add a repository trigger to the build.


If you forget to do this the build will not be triggered automatically whenever someone pushes to the repository.

That’s all there is to setting up your continuous integration for Azure Functions. Everything works out of the box if you select the correct template in the beginning.

Deploy your Azure Functions continuously!

Now that we have the continuous integration build in place we are ready to deploy the builds. If you are already familiar with Release Management it will be fairly easy to deploy your Azure Functions to Azure.

I had zero experience with Release Management, so I had to find it out the hard way!

The first thing you want to do when creating a new release pipeline is add the necessary artifacts. This means adding the artifacts from your CI build, where the source type will be Build; all the other options speak for themselves.


The next, not so obvious, artifact to add is the repository where your parameters.json and template.json files are located. These files aren’t stored in the artifact file from the build, so you have to retrieve them in some other way.

Luckily for us, we are using a GitHub repository and Release Management offers a source type called GitHub. Therefore we can just add a new source type and configure it to point to the GitHub location of choice.


This will make sure the necessary template.json and parameters.json files are available when deploying the Azure Functions.

Next up is adding the environments to your pipeline. In my case I wanted a different environment for each slot (develop & production), but I can imagine this will differ per situation. Most of the customers I meet have several Azure subscriptions, each meant to facilitate the work for a specific stage (Dev, Test, Acceptance, Production). That isn’t the case in my setup; everything is nice and cozy in a single subscription.

Adding an environment isn’t very hard, just add a new one and choose the Azure App Service Deployment template.


There are dozens of other templates which are all very useful, but not necessary for my little automated deployment pipeline.

Just fill out the details in the Deploy Azure App Service task and you are almost done.


The main thing to remember is to select the zip file which was created as an artifact by our CI build, and to check the Deploy to slot option, as we want to deploy these Azure Functions to the develop slot.

If you are satisfied with this, good! But remember we still have the ARM template?

Yes, we want to make sure the Azure environment is up and running before we deploy our software. Because of this, you have to add one task to this phase, called Azure Resource Group Deployment.


This is the task where we need our linked artifacts from the GitHub repository.

The paths to the Template and Template parameters are the most important settings in this step, as they make sure your Azure environment (resource group) is set up correctly.

The easiest way to get the correct path is to use the modal dialog which appears when you press the button behind the input box.



One thing you might notice here is the option to Override template parameters. This is the main reason to use VSTS Release Management (or any other deployment server). All this boilerplate exists so we can specify the parameters (secrets) per environment, without having to store them in some kind of repository.

Just to test it, I’ve overridden one of the parameters, myValue, with the text “VSTS Value” to make sure the update actually happens.
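The override box uses a simple -parameterName value syntax; for my template it looks something like this (all values are examples):

```
-name "my-function-app" -storageName "myfunctionsstorage" -myValue "VSTS Value"
```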

Another thing to note is that I’ve set the Deployment mode to Incremental, as I just want to update my deployments, not create a completely new Function App.
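This is essentially the same deployment you could run by hand with the AzureRM cmdlets; a rough local equivalent would be (the resource group name is a placeholder):

```
New-AzureRmResourceGroupDeployment `
    -ResourceGroupName "my-functions-rg" `
    -TemplateFile ".\template.json" `
    -TemplateParameterFile ".\parameters.json" `
    -Mode Incremental
```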

All of the other options don’t need much explanation at this time.

One thing I failed to mention is adding the continuous deployment trigger to the pipeline. In your pipeline, click on the Trigger circle and Enable it, as you can see below.


This makes sure that each time a build succeeds, a new deployment occurs to the Development slot (in my case).

This is all you need to know to deploy your Azure Functions (or any other Azure App Service for that matter). For the sake of completeness it makes sense to add another Environment to your pipeline, call it Production and make sure the same artifacts get deployed to the production slot. This environment and its tasks will look very similar to the Develop environment, so I won’t repeat the steps here. Just remember to choose the correct override parameters when deploying to the production slot. You don’t want to mess this up.

Now what?

The continuous integration & deployment steps are finished, so we can start deploying our software. If you already have some CI builds, you can create a new release in the releases tab.


This will be a manual release, but you can also choose to push some changes to your repository and make everything automated.

I’ve done a couple of releases to the develop environment already, which are all shown in the overview of the specific release.


Over in the portal you will also notice your Azure Functions are in read-only mode, because continuous integration is turned on.


But remember we added the MyValue parameter to our ARM template? It is now also shown in the Application settings of our Function App!


This is an awesome way of storing secrets inside your release pipeline! Or even better: store your secrets in Azure Key Vault and add your Client Id and Client Secret to the Application Settings via the release pipeline, like I described in an earlier post.

I know I’ll keep using VSTS for my Azure Functions deployments from now on. It’s awesome and I can only recommend you do the same!

I’ve just started setting up some continuous deployment for my personal websites. All of the sites are hosted within Azure App Services and the sources are located on either GitHub or BitBucket. With the source code located in a publicly accessible repository (be it private or public), it’s rather easy to connect Azure to these locations.

On my day job I come across a lot of web and desktop applications which also need continuous integration and deployment steps in order to go live. For some of these projects I’ve used Octopus Deploy, and I’m currently looking at Azure Release Management. These are all great systems, but they offer quite a lot of overhead for my personal sites. Currently my most important personal sites are so-called static websites, using MiniBlog (this site) and Hugo. Some of the other websites I have aren’t set up with a continuous deployment path yet.

I don’t really want to set up an Octopus Deploy server or a pipeline in Azure Release Management for these two sites. Luckily for me, the Azure team has come up with a great addition which provides custom deployment steps for your Azure App Service. To set this up, you need to enable automatic deployments via the `Deployment Options` blade in the Azure portal.


Normally, when you have set up your site to be deployed every time a change occurs in a specific branch of your repository, the Azure App Service deployment system tries to build your site and place the output in the `wwwroot` folder on the file system. Because I don’t need any MSBuild steps whatsoever, I need to override this behavior and create my own, custom, deployment step.

Setting up such a thing is quite easy: you just have to create a `.deployment` file in the root of your repository and specify the build/deployment script which should be executed. This functionality is provided by Kudu, which Azure uses to deploy Git repositories to an Azure App Service.

You can specify a custom script in this deployment file; this can either be a ‘normal’ command script (cmd or bat) or a PowerShell script. I have chosen PowerShell as it offers a bit more flexibility compared to a normal command script.

The contents of the deployment file aren’t very exciting. For my scenario it looks like the following:

[config]
command = powershell -NoProfile -NoLogo -ExecutionPolicy Unrestricted -Command "& "$pwd\deploy.ps1" 2>&1 | echo"

This will activate the custom deployment step within the Azure App Service as you can see in the following picture (Running custom deployment command…).



The contents of my PowerShell script, deploy.ps1, aren’t very exciting either. The MiniBlog project is just a normal ASP.NET website, so I only have to copy the contents of the repository folder to the folder of the website.
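Since the script only mirrors the repository to the site folder, a minimal sketch could look like this, using the DEPLOYMENT_SOURCE and DEPLOYMENT_TARGET variables Kudu provides (the exact exclusions are assumptions on my part):

```
# Mirror the repository contents to the web root, skipping the Git metadata
# and the deployment files themselves.
robocopy "$Env:DEPLOYMENT_SOURCE" "$Env:DEPLOYMENT_TARGET" /E `
    /XD "$Env:DEPLOYMENT_SOURCE\.git" `
    /XF ".deployment" "deploy.ps1"

# Robocopy exit codes 0-7 indicate success; anything above is a real error.
if ($LastExitCode -gt 7) { exit 1 } else { exit 0 }
```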


You can do some more advanced stuff in your deployment script. For my Hugo website I had to tell the Hugo executable to build the site, so the contents of that deploy.ps1 script are similar to this.

# 1. Variable substitutions
if ($env:HTTP_HOST -ne "") {
    echo "doing substitutions on $Env:DEPLOYMENT_SOURCE\config.toml"
    gc "$Env:DEPLOYMENT_SOURCE\config.toml" | %{ $_ -replace '%%HTTP_HOST%%', $env:WEBSITE_HOSTNAME } | out-file -encoding ascii "$Env:DEPLOYMENT_SOURCE\"
    mv "$Env:DEPLOYMENT_SOURCE\config.toml" "$Env:DEPLOYMENT_SOURCE\config.old.toml"
    mv "$Env:DEPLOYMENT_SOURCE\" "$Env:DEPLOYMENT_SOURCE\config.toml"
    rm "$Env:DEPLOYMENT_SOURCE\config.old.toml"
} else {
    echo "not doing any substitutions"
}

# 2. Hugo in temporary path
& "$Env:DEPLOYMENT_SOURCE/bin/hugo.exe" -s "$Env:DEPLOYMENT_SOURCE/" -d "$Env:DEPLOYMENT_TARGET/public" --log -v

# 3. Move the web.config to the root
mv "$Env:DEPLOYMENT_SOURCE/web.config" "$Env:DEPLOYMENT_TARGET/public/web.config"

Still not very exciting of course, but it shows a little of what can be achieved. I’m not aware of any limitations for these deployment scripts, so anything can be placed inside them. If you need specific executables, like hugo.exe, you will have to put them in your repository, or in some other location which can be accessed by the script.

You can also view the output of your script in the Azure portal. Everything the script writes to the output (Write-Host, echo, etc.) is shown in the Activity Log.


Useful when debugging your script.

If you have any secrets in your web application/site (like connection strings, private keys, passwords, etc.), it might be a good idea to use this custom deployment step to substitute the committed values with the actual values. If these values are stored in Azure Key Vault, you can access the vault from the script and make sure the correct values are placed within your application before it’s deployed.
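As a sketch of that idea — assuming the committed web.config contains a placeholder token and the real value is available to the script (as an app setting, or retrieved from Key Vault beforehand) — the substitution itself is just a few lines:

```
# Hypothetical placeholder substitution during deployment.
$configPath = "$Env:DEPLOYMENT_TARGET\web.config"
$secret = $env:MY_SECRET  # assumed to be exposed as an app setting or fetched from Key Vault

# Replace the committed token with the actual value before the site goes live.
(Get-Content $configPath) -replace '__MY_SECRET__', $secret | Set-Content $configPath
```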

Using these deployment scripts can help you out in simple scenarios. If your system is a bit more complex, or you are working in a professional environment, I’d advise checking out one of the more sophisticated deployment systems, like Octopus Deploy or Azure Release Management. These systems offer quite a few more options out of the box and make it easier to manage the steps, security and insights of a deployment.

Next I’ll try to update an Umbraco site of mine to make use of this continuous deployment scenario. This should be rather easy too, as it only needs to call MSBuild, which is the default action the Azure App Service deployment option invokes.

In my previous post I talked about creating new projects in Octopus Deploy in order to deploy software to different environments. In this post I’ll explain a bit about how to create Octopus Deploy packages for your Visual Studio projects via TeamCity.

To enable packaging for Octopus Deploy you’ll need to add the OctoPack NuGet package to the project you are packaging. In my case this is the Worker project, since I’m only working with Microsoft Azure solutions at the moment.

Once this package is installed, you will also have to modify the project file to make sure the right files are added to the Octopus package.

The following line will have to be added to the property group of your build configuration:


For reference, a complete property group:

<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
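Assuming the standard OctoPack conventions, such a Release property group typically ends up looking something like this; the RunOctoPack element is the line that enables packaging (the other elements are the usual Visual Studio defaults):

```
<PropertyGroup Condition=" '$(Configuration)|$(Platform)' == 'Release|AnyCPU' ">
	<DebugType>pdbonly</DebugType>
	<Optimize>true</Optimize>
	<OutputPath>bin\Release\</OutputPath>
	<DefineConstants>TRACE</DefineConstants>
	<ErrorReport>prompt</ErrorReport>
	<WarningLevel>4</WarningLevel>
	<RunOctoPack>true</RunOctoPack>
</PropertyGroup>
```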

Your project is now ready for packaging. In my case I also had to add a nuspec file, because the worker project contains an application which has to be deployed on an Azure Cloud Service.

A nuspec file for Octopus Deploy looks exactly the same as one you would create for NuGet itself. Mine looks like this:

<?xml version="1.0"?>
<package xmlns="">
	<metadata>
		<authors>Jan de Vries</authors>
		<owners>Jan de Vries</owners>
		<title>Jan de Vries his cool product</title>
		<description>The cool product used by the Jan de Vries software.</description>
		<copyright>Jan de Vries 2015</copyright>
	</metadata>
	<files>
		<!-- Add the files (.cscfg, .csdef) from your Azure CS project to the root of your solution -->
		<file src="..\MyCustomer.Azure.MyProduct\ServiceDefinition.csdef" />
		<file src="..\MyCustomer.Azure.MyProduct\ServiceConfiguration.Dev.cscfg" />
		<file src="..\MyCustomer.Azure.MyProduct\ServiceConfiguration.Acc.cscfg" />
		<file src="..\MyCustomer.Azure.MyProduct\ServiceConfiguration.Prod.cscfg" />
		<!-- Add the .wadcfg file to the root to get the diagnostics working -->
		<file src="..\MyCustomer.Azure.MyProduct\MyCustomer.Azure.MyProduct.WorkerContent\*.wadcfg" />
		<file src="..\MyCustomer.Azure.MyProduct\MyCustomer.Azure.MyProduct.WorkerContent\*.wadcfgx" />
		<!-- Add the service/console application to the package -->
		<file src="..\MyCustomer.Azure.MyProduct.Worker\Application\*.dll" target="Application"/>
		<file src="..\MyCustomer.Azure.MyProduct.Worker\Application\*.exe" target="Application"/>
		<file src="..\MyCustomer.Azure.MyProduct.Worker\Application\*.config" target="Application"/>
	</files>
</package>

Keep in mind, the nuspec file name should be exactly the same as the project name. If you fail to do so, the file will be ignored.

This is all you’ll have to do in Visual Studio. You can of course create your packages via the command line, but it’s better to let TeamCity handle this.
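Should you want the command line route anyway, packaging comes down to passing the OctoPack properties to MSBuild (the version number is just an example):

```
msbuild MySolution.sln /t:Build /p:Configuration=Release /p:RunOctoPack=true /p:OctoPackPackageVersion=1.0.0.0
```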

In order to use Octopus Deploy from within TeamCity you’ll need the TeamCity plugin, which can be downloaded from the Octopus Deploy download page. After installing the plugin, three new runner types will be available when creating a new build step.


I’m just using the Create release option, since it is also capable of deploying a release to an environment. We don’t use the Promote release option, as we want promotion to be a conscious, manual step in the process.

There’s also a new section in the Visual Studio (sln) runner type which enables you to run OctoPack on the solution/projects.


If you want to use this in a build step, just enable the checkbox and be sure to set a proper build number in the OctoPack package version box.

The image below shows the settings I’m using to create a new release with Octopus Deploy.


As you can see, I’ve added an additional command line argument setting the deployment timeout to 30 minutes.


This is because the current builds take about 12 to 17 minutes to be deployed to a Cloud Service, and the default (configured) TeamCity build timeout is set to a lower value. Without this argument, all deployment builds would be marked as failed.

I’ve also checked the Show deployment progress option. This makes sure all Octopus Deploy output gets printed in TeamCity. If something fails, you’ll be able to check it out within TeamCity, assign someone to the failed build, and he or she will have all the information necessary to fix the failure.

Well, that’s about it for creating a nice starter continuous deployment environment. You can of course expand on this as you like, but these are the basics.

The latest project I was working on didn’t have a continuous integration and continuous deployment environment set up yet. Creating a continuous integration environment was rather easy as we were already using Teamcity and adding a couple of builds isn’t much of a problem. The next logical step would be to start setting up continuous deployment.

I started out by writing a lot of PowerShell scripts to manage our Azure environment: creating all SQL servers, databases, service buses, cloud services, etc. All of this worked quite well, but managing these scripts was quite cumbersome.

A colleague of mine told me about Octopus Deploy. They had started using this deployment automation system on their project and it sounded like the exact same thing I was doing, just a lot easier!

Setting up the main server and its agents (tentacles) is quite easy and doesn’t need much explanation. The pricing of the software isn’t bad either. You can start using it for free, and once you need more than 10 tentacles or 5 projects, you’ll pay $700 for the professional license. Spending $700 is still a lot cheaper than paying the hourly rate of a PowerShell professional to create the same functionality.

One of the first things you want to do when starting out with Octopus Deploy is create your own deployment workflow. Even though you can do this right away, it’s better to look around a bit first and think about how you want to deploy your software.

The basis of any deployment is having deployment environments. So the first thing you need to do is create these environments, like Dev, Test, Acc and Prod, and assign a tentacle to each of them.


Adding a tentacle to an environment can be done by pressing the Add machine button.

After having created these environments, you can start creating your deployment workflow. The default Octopus experience already provides the most basic steps you might want to use, like running a PowerShell script or deploy a NuGet package somewhere.


To make my life easier while deploying to Azure, I’ve created some steps of my own for deploying a Cloud Service, Swapping VIP, checking my current Azure environment and deploying a website using MSDeploy.


Creating your own build steps is fairly easy. When creating a new build step you base it on one of the existing steps. For my own Cloud Service build step I’ve used the default Deploy to Windows Azure step and made sure I didn’t have to copy-paste the generic fields all the time.

The deployment of a website project was a bit harder compared to deploying a Cloud Service. I had already discovered this when deploying the complete environment with just PowerShell, so this wasn’t new to me. The linked article (above) describes in depth which steps you have to take to get this working. For reference, I’ll share the script I’ve used in this build step.


# A collection of functions that can be used by script steps to determine where packages installed
# by previous steps are located on the filesystem.
function Find-InstallLocations {
    $result = @()
    $OctopusParameters.Keys | foreach {
        if ($_.EndsWith('].Output.Package.InstallationDirectoryPath')) {
            $result += $OctopusParameters[$_]
        }
    }
    return $result
}

function Find-InstallLocation($stepName) {
    $result = $OctopusParameters.Keys | where {
        $_.Equals("Octopus.Action[$stepName].Output.Package.InstallationDirectoryPath", [System.StringComparison]::OrdinalIgnoreCase)
    } | select -first 1
    if ($result) {
        return $OctopusParameters[$result]
    }
    throw "No install location found for step: $stepName"
}

function Find-SingleInstallLocation {
    $all = @(Find-InstallLocations)
    if ($all.Length -eq 1) {
        return $all[0]
    }
    if ($all.Length -eq 0) {
        throw "No package steps found"
    }
    throw "Multiple package steps have run; please specify a single step"
}

function Test-LastExit($cmd) {
    if ($LastExitCode -ne 0) {
        Write-Host "##octopus[stderr-error]"
        write-error "$cmd failed with exit code: $LastExitCode"
    }
}

$stepName = $OctopusParameters['WebDeployPackageStepName']

$stepPath = ""
if (-not [string]::IsNullOrEmpty($stepName)) {
    Write-Host "Finding path to package step: $stepName"
    $stepPath = Find-InstallLocation $stepName
} else {
    $stepPath = Find-SingleInstallLocation
}
Write-Host "Package was installed to: $stepPath"

Write-Host "##octopus[stderr-progress]"
Write-Host "Publishing Website"

$websiteName = $OctopusParameters['WebsiteName']
$publishUrl = $OctopusParameters['PublishUrl']

$destBaseOptions = new-object Microsoft.Web.Deployment.DeploymentBaseOptions
$destBaseOptions.UserName = $OctopusParameters['Username']
$destBaseOptions.Password = $OctopusParameters['Password']
$destBaseOptions.ComputerName = "https://$publishUrl/msdeploy.axd?site=$websiteName"
$destBaseOptions.AuthenticationType = "Basic"

$syncOptions = new-object Microsoft.Web.Deployment.DeploymentSyncOptions
$syncOptions.WhatIf = $false
$syncOptions.UseChecksum = $true

$enableAppOfflineRule = $OctopusParameters['EnableAppOfflineRule']
if ($enableAppOfflineRule -eq $true) {
    $appOfflineRule = $null
    $availableRules = [Microsoft.Web.Deployment.DeploymentSyncOptions]::GetAvailableRules()
    if (!$availableRules.TryGetValue('AppOffline', [ref]$appOfflineRule)) {
        throw "Failed to find AppOffline Rule"
    }
    # Add the rule to the sync options, otherwise it is only looked up and never applied
    $syncOptions.Rules.Add($appOfflineRule)
    Write-Host "Enabled AppOffline Rule"
}

$deploymentObject = [Microsoft.Web.Deployment.DeploymentManager]::CreateObject("contentPath", $stepPath)

$deploymentObject.SyncTo("contentPath", $websiteName, $destBaseOptions, $syncOptions)

You can probably figure it out yourself, but these are the parameters used in the script: WebDeployPackageStepName, WebsiteName, PublishUrl, Username, Password and EnableAppOfflineRule.


After having created your own custom steps it’s time to create a deployment workflow. Create a new Project in Octopus Deploy and head down to the Process tab. Over here you can add all necessary steps for your deployment which might look like this in the end.


As you will probably notice, you can configure each step to be executed per environment. In this deployment workflow, we don’t want a manual intervention step before swapping the VIP. We also don’t want to distribute our software across the world, so the East US step is also turned off for development deployments.

The deployment on the Dev environment has run a couple of times and is up to date with the latest software version. When you are happy with the results, you can choose to promote it to a different environment. Depending on the Lifecycle you have configured, you can promote the deployment to any environment or just to the next stage. We have configured the lifecycle so a package has to be installed on Development first, then Acceptance, and if it proves to be good enough there, it is pushed to Production. The current status of my testing project looks like this:


Zooming in on a specific release, you can see a button to promote the package to the next environment.


I hope this helps a bit to see what’s possible with Octopus Deploy. Personally I think it’s a very nice system which really helps you gain insight into your software deployments, and it works a lot better than scripting your own PowerShell deployments from scratch.

If there’s still something which needs a bit more in-depth explanation or detail, let me know so I can add it in an upcoming post. Keep in mind, I’ve only used Octopus Deploy in an Azure environment, but I’m sure an on-premise installation will work about the same.