Dev Intersection 2015 - day 6

Even though I've been back from Las Vegas for a while now and have gotten over the jet lag, I didn't want to withhold my last day at Dev Intersection from you. After all, this was the day I had been looking forward to the most.

Despite my earlier experiments with IoT (dotnetFlix episode 3), in which I made my gas and electricity meter 'talk' to the internet, and some other simple projects, I still didn't really feel like I was doing IoT: I had built a connection to Azure, but I hadn't used the IoT hub, applied Stream Analytics or touched Power BI. That was exactly what my final workshop was about: IoT, Azure, Power BI, all driven by Node.js!

Throughout the day, Doug Seven and his team took us along, starting off with a Particle Photon. This $19 mini-module (see the photo, where it sits on top of a breadboard) can communicate over wifi out of the box and has a number of digital and analog ports to work with. Plug it into your PC (or another USB power source) and you're good to go as soon as you have 'claimed' your Particle through their cloud service.

The workshop explained that there are several ways to deal with your devices: you can have them communicate with the internet directly, or you can work through a gateway device. That is how we did it on this day: via our PCs. Armed with a text editor (I chose Visual Studio Code), the Johnny-Five node module and the particle-cli module, I could get started with Node.js. Since the module has no display, there was no 'hello world' to output, so a blinking LED had to do (let's ignore the fact that my LED still signaled 'hello world' in morse ;-)). Also, be sure to try the particle-cli command 'nyan', and here's a tip up front: you can also run 'particle-cli nyan off' without rebooting.
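The morse-flavored 'hello world' above can be sketched in a few lines of Node.js. The johnny-five and particle-io modules are the real ones used with the Photon, but the pin name, the environment variables and the morse helper are my own illustration, not the workshop's exact code:

```javascript
// Minimal sketch: no display on the Photon, so the LED does the talking.
const MORSE = {
  h: "....", e: ".", l: ".-..", o: "---",
  w: ".--", r: ".-.", d: "-..", " ": "/"
};

// Turn text into a morse string, e.g. "hello" -> ".... . .-.. .-.. ---"
function toMorse(text) {
  return [...text.toLowerCase()].map(c => MORSE[c] || "").join(" ");
}

console.log(toMorse("hello world"));

// On the Photon itself you would wire this up with Johnny-Five, roughly:
//   const five = require("johnny-five");
//   const Particle = require("particle-io");
//   const board = new five.Board({
//     io: new Particle({ token: process.env.PARTICLE_TOKEN,
//                        deviceId: process.env.PARTICLE_DEVICE_ID })
//   });
//   board.on("ready", () => new five.Led("D7").blink(250)); // D7 = on-board LED
```

The toMorse helper runs anywhere; the commented board code needs a claimed Photon and both modules installed.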

As the day progressed, we got further and further with our Particles and hooked them up to a SparkFun weather shield to build a simple weather station. By pushing these metrics to Azure with the Azure IoT node module and funneling them into a Power BI dataset with a Stream Analytics job, you can then put a nice dashboard on top of them in Power BI. Note that in order to select Power BI as an output for your Stream Analytics job, you have to use the old Azure portal!

Below is my result, with time on the horizontal axis and temperature on the vertical axis :-)

All in all, this was an educational workshop in which you take in a lot of information in a short time, and you get the chance to work with the people behind it all and ask them questions. So if you get the chance to attend a workshop by Doug and his team: grab it! Check out their GitHub for the code, guides and workshop schedule.


Dev Intersection 2015 - day 3

Together with Mark Rexwinkel and Edwin van Wijk, I'm attending the Dev Intersection 2015 conference in Las Vegas this week. Through a daily blog we try to keep you up to date on what we see and hear here. After Edwin and Mark, today it's my turn.

The third day of the conference was the first day with 'regular' sessions. After a good breakfast, the day started with a keynote by Scott Guthrie. He mainly talked about the 'Journey to the Cloud' and shared Microsoft's vision on DevOps with Visual Studio Online, some impressive numbers about Azure (how about 777 trillion storage queries per day?!) and the way Microsoft's cloud services such as Office 365 fit into the bigger picture of the modern IT industry.

After the keynote we each went our own way. I attended a session by Julie Lerman (Domain Driven Design for the Database Driven Mind), which is summarized very well in a series of three blog posts, and a session by Steven Murawski (Survive and Thrive in a DevOps World), which boiled down to this: adopting DevOps is mainly a culture shift, in which the 'fear culture' and the 'blame game' have to go. He has quite a few tips on his blog for getting started with DevOps.

In the afternoon I continued with a session by Troy Hunt (Securing ASP.NET in an Azure Environment). After attending his workshop on Monday, I was very curious what he had to say about this subject, and I wasn't disappointed. Although it started out with mostly no-brainers such as the least-privileged-account principle, he eventually got to tips about dynamic data masking in Azure SQL databases, touched on the importance of application settings and connection strings in the Azure portal, and stressed that you should really always enable two-step verification on any account you use to manage an Azure subscription. The latter can be configured through account management.

All in all, this was another successful day, and I'm already looking forward to tomorrow!


Custom build tasks in TFS 2015

Since I upgraded my team’s private TFS instance to TFS 2015 RC1, followed by RC2, the whole team has been working with TFS 2015 quite a lot. Of course one of the major features is the new build engine, and we’ve given that quite a ride. From cross-platform builds on Mac and Linux to custom build tasks, we’ve accomplished quite a lot. Seeing as Brian Harry stated during yesterday’s Visual Studio 2015 launch that it was ‘quite easy’ to build your own tasks, I figured I’d give a short write-up of our experiences with custom tasks.

Preface

From the moment I upgraded our R&D server to RC1, we’ve been working with the new build system. Up until RC2 it was only possible to add custom build tasks; we weren’t able to remove them. On top of that, the whole process isn’t properly documented yet. Seeing as we quite often add NuGet packages to a feed and didn’t want to add a not-very-descriptive PowerShell task to all of our build definitions, we decided to use this scenario for a custom task and see how it would fare.

Prerequisite one: What is a task?

To make a custom build task, we first need to know what it looks like. Luckily Microsoft has open-sourced most of the current build tasks in https://github.com/Microsoft/vso-agent-tasks which gave us a fair idea of what a build task is:

  1. a JSON file describing the plugin
  2. a PowerShell or Node.js file containing the functionality (this post will focus on PowerShell)
  3. an (optional) icon file
  4. optional resources translating the options to another language

Now the only thing we needed to find out was: how to upload these tasks and in what format?

Good to know:

  1. To make sure your icon displays correctly, it must be 32×32 pixels
  2. The task ID is a GUID which you need to create yourself
  3. The task category should be an existing category
  4. Visibility tells you what kind of task it is; possible values are Build, Release and Preview. Currently only Build-type tasks are shown

Prerequisite two: How to upload a task?

We quickly figured out that the tasks were simply .zip files containing the aforementioned items, so creating a zip was easy, but then we needed to get it there. By going through the GitHub repository, we figured out there is a REST API which controls all the tasks, and that by doing a PUT call to said endpoint we could create a new task, but also overwrite existing tasks.

The following PowerShell script enables you to upload tasks:

param(
    [Parameter(Mandatory=$true)][string]$TaskPath,
    [Parameter(Mandatory=$true)][string]$TfsUrl,
    [PSCredential]$Credential = (Get-Credential),
    [switch]$Overwrite = $false
)

# Load task definition from the JSON file
$taskDefinition = (Get-Content $taskPath\task.json) -join "`n" | ConvertFrom-Json
$taskFolder = Get-Item $TaskPath

# Zip the task content
Write-Output "Zipping task content"
$taskZip = ("{0}\..\{1}.zip" -f $taskFolder, $taskDefinition.id)
if (Test-Path $taskZip) { Remove-Item $taskZip }

Add-Type -AssemblyName "System.IO.Compression.FileSystem"
[IO.Compression.ZipFile]::CreateFromDirectory($taskFolder, $taskZip)

# Prepare to upload the task
Write-Output "Uploading task content"
$headers = @{ "Accept" = "application/json; api-version=2.0-preview"; "X-TFS-FedAuthRedirect" = "Suppress" }
$taskZipItem = Get-Item $taskZip
$headers.Add("Content-Range", "bytes 0-$($taskZipItem.Length - 1)/$($taskZipItem.Length)")
$url = ("{0}/_apis/distributedtask/tasks/{1}" -f $TfsUrl, $taskDefinition.id)
if ($Overwrite) {
    $url += "?overwrite=true"
}

# Actually upload it
Invoke-RestMethod -Uri $url -Credential $Credential -Headers $headers -ContentType application/octet-stream -Method Put -InFile $taskZipItem

Good to know:

  1. Currently only ‘Agent Pool Administrators’ are able to add/update or remove tasks.
  2. Tasks are server-wide; this means that you upload to the server, not to a specific collection or project.

Creating the actual task

So, like I said, we’ll be creating a new task that publishes our NuGet packages to a feed. First we need to decide what information we need to pack and push our packages:

  1. The target we want to pack (.csproj or .nuspec file relative to the source-directory)
  2. The package source we want to push to

For this example I’m assuming you’re only building for a single build configuration and single target platform, which we’ll use in the PowerShell-script.

First we’ll make the task definition. As I said, this is simply a JSON file describing the task and its inputs.

{
    "id": "61ed0e1d-efb7-406e-a42b-80f5d22e6d54",
    "name": "NuGetPackAndPush",
    "friendlyName": "Nuget Pack and Push",
    "description": "Packs your output as NuGet package and pushes it to the specified source.",
    "category": "Package",
    "author": "Info Support",
    "version": {
        "Major": 0,
        "Minor": 1,
        "Patch": 0
    },
    "minimumAgentVersion": "1.83.0",
    "inputs": [
        {
            "name": "packtarget",
            "type": "string",
            "label": "Pack target",
            "defaultValue": "",
            "required": true,
            "helpMarkDown": "Relative path to .csproj or .nuspec file to pack."
        },
        {
            "name": "packagesource",
            "type": "string",
            "label": "Package Source",
            "defaultValue": "",
            "required": true,
            "helpMarkDown": "The source we want to push the package to"
        }
    ],
    "instanceNameFormat": "Nuget Pack and Push $(packtarget)",
    "execution": {
        "PowerShell": {
            "target": "$(currentDirectory)\\PackAndPush.ps1",
            "argumentFormat": "",
            "workingDirectory": "$(currentDirectory)"
        }
    }
}

This version of the task will be very rudimentary and doesn’t do much (any) validation, so you might want to add that yourself. Next up is PackAndPush.ps1, the script that does the actual work.

[cmdletbinding()]
param
(
    [Parameter(Mandatory=$true)][string] $packtarget,
    [Parameter(Mandatory=$false)][string] $packagesource
)

####################################################################################################
# 1 Auto Configuration
####################################################################################################

# Stop the script on error
$ErrorActionPreference = "Stop"

# Relative location of nuget.exe to build agent home directory
$nugetExecutableRelativePath = "Agent\Worker\Tools\nuget.exe"

# These variables are provided by TFS
$buildAgentHomeDirectory = $env:AGENT_HOMEDIRECTORY
$buildSourcesDirectory = $Env:BUILD_SOURCESDIRECTORY
$buildStagingDirectory = $Env:BUILD_STAGINGDIRECTORY
$buildPlatform = $Env:BUILDPLATFORM
$buildConfiguration = $Env:BUILDCONFIGURATION

$packagesOutputDirectory = $buildStagingDirectory

# Determine full path of pack target file
$packTargetFullPath = Join-Path -Path $buildSourcesDirectory -ChildPath $packTarget

# Determine full path to nuget.exe
$nugetExecutableFullPath = Join-Path -Path $buildAgentHomeDirectory -ChildPath $nugetExecutableRelativePath

####################################################################################################
# 2 Create package
####################################################################################################

Write-Host "2. Creating NuGet package"

$packCommand = ("pack `"{0}`" -OutputDirectory `"{1}`" -NonInteractive -Symbols" -f $packTargetFullPath, $packagesOutputDirectory)

if ($packTargetFullPath.ToLower().EndsWith(".csproj"))
{
    $packCommand += " -IncludeReferencedProjects"

    # Remove spaces from build platform, so 'Any CPU' becomes 'AnyCPU'
    $packCommand += (" -Properties `"Configuration={0};Platform={1}`"" -f $buildConfiguration, ($buildPlatform -replace '\s',''))
}

Write-Host ("`tPack command: {0}" -f $packCommand)
Write-Host ("`tCreating package...")

$packOutput = Invoke-Expression "&'$nugetExecutableFullPath' $packCommand" | Out-String

Write-Host ("`tPackage successfully created:")

$generatedPackageFullPath = [regex]::match($packOutput,"Successfully created package '(.+(?<!\.symbols)\.nupkg)'").Groups[1].Value
Write-Host `t`t$generatedPackageFullPath

Write-Host ("`tNote: The created package will be available in the drop location.")

Write-Host "`tOutput from NuGet.exe:"
Write-Host ("`t`t$packOutput" -Replace "`r`n", "`r`n`t`t")

####################################################################################################
# 3 Publish package
####################################################################################################

Write-Host "3. Publish package"
$pushCommand = "push `"{0}`" -Source `"{1}`" -NonInteractive"

Write-Host ("`tPush package '{0}' to '{1}'." -f (Split-Path $generatedPackageFullPath -Leaf), $packagesource)
$regularPackagePushCommand = ($pushCommand -f $generatedPackageFullPath, $packagesource)
Write-Host ("`tPush command: {0}" -f $regularPackagePushCommand)
Write-Host "`tPushing..."

$pushOutput = Invoke-Expression "&'$nugetExecutableFullPath' $regularPackagePushCommand" | Out-String

Write-Host "`tSuccess. Package pushed to source."
Write-Host "`tOutput from NuGet.exe:"
Write-Host ("`t`t$pushOutput" -Replace "`r`n", "`r`n`t`t")
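One subtlety in the script above: the regex that extracts the created package path uses a negative lookbehind, (?<!\.symbols), so it picks up the regular package instead of the .symbols package that the -Symbols switch also produces. A quick Node.js illustration of the same pattern (the NuGet output lines below are fabricated for the example):

```javascript
// Demonstrates the package-path regex from PackAndPush.ps1.
// The sample NuGet output is made up; -Symbols yields two packages.
const packOutput = [
  "Attempting to build package from 'MyLib.csproj'.",
  "Successfully created package 'C:\\staging\\MyLib.1.0.0.nupkg'.",
  "Successfully created package 'C:\\staging\\MyLib.1.0.0.symbols.nupkg'."
].join("\r\n");

// The (?<!\.symbols) negative lookbehind rejects the symbols package.
const match = packOutput.match(/Successfully created package '(.+(?<!\.symbols)\.nupkg)'/);
console.log(match[1]); // C:\staging\MyLib.1.0.0.nupkg
```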

To finish up, don’t forget to add a .png logo to your task ;-)
You should now be able to add a custom task to your build pipeline from the “Package” category:

Words of warning

Tasks can be versioned; use this to your advantage. All build definitions use the latest available version of a specific task, and you can’t change this behavior from the web interface, so always assume the latest version is being used.

If you don’t change the version number of your task when updating it, the build agents that have previously used your task will not download the newer version because the version number is still the same. This means that if you change the behavior of your task, you should always update the version number!
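For example, bumping just the Patch component in the version block of task.json is enough to make the agents fetch the update (the values below continue the example task from earlier):

```json
"version": {
    "Major": 0,
    "Minor": 1,
    "Patch": 1
},
```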

When you delete a task, it is not automatically removed from existing build definitions. On top of that, you won’t get a warning when editing such a build definition, but you will get an exception when executing a build based on it.

Tasks are always available for the entire TFS instance, which means you shouldn’t include credentials or anything else you don’t want others to see. Use ‘secret variables’ for this purpose.

Further Reading

If you’ve followed this post so far, I recommend you also check out my team member Jonathan’s post and videos (in Dutch):

Blog Post about Invoke SQLCmd in build vNext
Video on build vNext (in Dutch)


Load testing from the Azure portal

Before you launch a new web application, you make sure you have thoroughly tested it: you have performed unit, integration, usability and load tests. But for some reason, when the application goes into production, it comes to a grinding halt and you’re left puzzled as to why this happened.

Back in 2013, Microsoft released a solution for this issue: Azure-based load testing, which is able to simulate real-world load on your application from Azure with virtually unlimited resources (well, the only real limiting factor is your wallet). The only strange thing was that in order to use it, I had to go to my VSO account to start a test, instead of just starting a load test from the Azure portal where I published my web application.

This has changed now.

Introducing Azure load testing from the portal

Yesterday I stumbled onto this post (which contains way more pictures than this post will) by Charles Sterling, where he revealed that as an ‘nascent feature PM’ he more or less accidentally released a new feature into the wild. As of now it’s possible to start a load test from the Azure portal right from where you control your web application. It’s as easy as adding a tile to your web app and starting the test. Or even better, by enabling a feature flag and simply adding a new load test.

To get started, load up your Azure Portal (the new one!) and navigate to one of your web apps and then follow these steps:

  1. Right-click the space in between any of the tiles already displayed and click ‘Add Tiles’
  2. Now choose the ‘Operations’ category and select ‘Cloud Load Test’
  3. You will now get a new tile in your web app panel
  4. Click ‘Done’ on the top left
  5. Click the tile and add a new Load Test, enter the VSO account you want to use, the URL and a name for the test. Mind you, the test name can’t contain any spaces or non-alphanumeric characters.

In case you don’t want to add a new tile, you can also include the following feature flag in the portal URL: ?websitesextension_cloudloadtest=true turning the URL into something like: https://portal.azure.com/?websitesextension_cloudloadtest=true
After doing so, you will be able to access load testing from your web app’s settings option.

Summarizing

You now have a new way to perform load testing in the Azure portal, snugly in your Web App blade. It is currently lacking some of the features that VSO does offer, such as browser distribution and think times, but who knows, they might just add them before the final version.

All in all it’s a nice time-saver and the tests are now in a place where I’d actually expect them to be.
