Monday, September 28, 2015

PSScriptAnalyzer - Automate code checking for your PowerShell projects as part of TFS build

With WMF 5.0 we can make use of the PowerShell static code checker utility PSScriptAnalyzer to perform code analysis on PowerShell script files and modules. PSScriptAnalyzer checks the quality of Windows PowerShell code by running a set of rules. The rules are based on PowerShell best practices identified by the PowerShell Team and the community. PSScriptAnalyzer generates DiagnosticResults (errors and warnings) to inform users about potential code defects and suggests possible improvements.

You can download the entire module from PowerShell gallery using the Install-Module cmdlet or create a pull request on the project at Github and build it yourself. Once you have the module available on the workstation you can perform code checking on PowerShell files by using the Invoke-ScriptAnalyzer cmdlet.

By making use of this cmdlet and the post-build script option in TFS build, you can perform code checking on PowerShell projects from the build server and choose to fail the build based on the results. Let’s see how this works.

First we need a function that runs the Invoke-ScriptAnalyzer cmdlet and parses the results. The function below describes the process.

function Get-DiagnosticIssue
{
       param
       (
              [Parameter(Mandatory=$true)]
              [ValidateNotNullOrEmpty()]
              [ValidateScript({Test-Path $_})]
              [string] $Path,

              [Parameter(Mandatory=$true)]
              [ValidateNotNullOrEmpty()]
              [string] $Severity,

              [bool] $FailOnIssues = $false
       )

       $logDirectory = Join-Path $Env:TF_BUILD_DROPLOCATION "Logs"
       if(-not (Test-Path $logDirectory -ErrorAction SilentlyContinue))
       {
              $null = New-Item -ItemType Directory -Force -Path $logDirectory
       }
       $logFile = Join-Path $logDirectory "PSScriptAnalyzerLogs.log"

       # Filter the diagnostic records on severity and write them to the log file
       Invoke-ScriptAnalyzer -Path $Path -Recurse |
              Where-Object { $_.Severity -ge $Severity } |
              Format-List |
              Out-File $logFile

       if($FailOnIssues)
       {
              if((Get-Content $logFile) -ne $null)
              {
                     $Host.UI.WriteErrorLine("PSScriptAnalyzer check failed. Please refer to the file $($logFile) for more details.")
              }
       }
}

Export-ModuleMember *

The Get-DiagnosticIssue function accepts a Severity parameter to filter the issues logged and a Path parameter to analyze all the script files and modules under that location. The FailOnIssues parameter decides whether the build should fail when issues are found. This module is called from a script file that is used as the post-build script path in the TFS build definition.

param
(
       [string] $Severity = "Information",
       [string] $Filter,
       [string] $FailOnIssues = "False"
)

$exitCode = 0
trap
{
    $Host.UI.WriteErrorLine($error[0].Exception.Message)
    $Host.UI.WriteErrorLine($error[0].Exception.StackTrace)
    if ($exitCode -eq 0)
    {
        $exitCode = 1
    }
}

$scriptName = $MyInvocation.MyCommand.Name
$scriptPath = Split-Path -Parent (Get-Variable MyInvocation -Scope Script).Value.MyCommand.Path

Push-Location $scriptPath
Import-Module .\PSScriptAnalyzerExtensions.psm1

$source = $env:TF_BUILD_BUILDDIRECTORY

if(-not ([string]::IsNullOrWhiteSpace($Filter)))
{
       $source = Get-ChildItem -Directory $env:TF_BUILD_BUILDDIRECTORY -Filter $Filter -Recurse |
              Where-Object { $_.FullName.ToUpper().Contains("Solution\$Filter".ToUpper()) } |
              Select-Object -ExpandProperty FullName
}

$failBuild = $FailOnIssues -eq "True"

Get-DiagnosticIssue -Path $source -Severity $Severity -FailOnIssues:$failBuild

Pop-Location

exit $exitCode

This script file is configured in the process template as the post-build script path, with the Severity, Filter and FailOnIssues values passed as post-build script arguments.
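As a sketch, the post-build script arguments in the build definition could look something like this (the Filter value is illustrative):

```
-Severity Warning -Filter MySolution -FailOnIssues True
```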



The next time the build is queued, it will trigger a code check using PSScriptAnalyzer and you can see the results in the generated log file.


Docker for Windows on an Azure VM: Securing the host with TLS

In the previous post about Docker on Windows, we looked at creating a Windows Server 2016 TP3 VM in Azure and at managing containers on the host VM running the Docker daemon. In this post we’ll look at the security considerations when running Docker and how to secure the Docker daemon with TLS. We’ll be using OpenSSL to create and manage the certificates.

If you don’t have OpenSSL installed on the machines, download the Windows binaries from http://gnuwin32.sourceforge.net/packages/openssl.htm.

To set up TLS for Docker, we need to follow the steps below.
  1. Create a certificate authority (CA)
  2. Create the server private key
  3. Create a certificate signing request (CSR) for the server
  4. Sign the server CSR with the CA to produce the server certificate
  5. Create the client private key and CSR
  6. Sign the client CSR with the CA to produce the client certificate
  7. Copy the server certificates to the Docker host machine
  8. Add a firewall rule to allow communication on port 2376

Before we start using the openssl executable, we need to ensure that its configuration file is available. Version 1.0 of OpenSSL requires an "openssl.cnf" configuration file, and it reads the file's location from the OPENSSL_CONF environment variable. For this example, we can download and use the configuration file from https://www.tbs-certificates.co.uk/openssl-dem-server-cert-thvs.cnf.

Creating the certificates

  • Download and extract the OpenSSL binaries from http://gnuwin32.sourceforge.net/packages/openssl.htm to the C:\OpenSSL folder
  • Download and copy the OpenSSL configuration file to C:\OpenSSL\openssl.cnf
  • Set up the environment variable for the OpenSSL configuration

param
(
       [string] $Path = "C:\OpenSSL",
       [string] $CertLocation = "C:\Docker\Certs"
)
$opensslExe = Join-Path $Path "openssl.exe"
$opensslCnf = Join-Path $Path "openssl.cnf"

if(-not (Test-Path $opensslExe -ErrorAction SilentlyContinue))
{
       throw "openssl.exe not found at location $Path"
}

$env:OPENSSL_CONF = $opensslCnf

# Create the certificates folder and run the openssl commands from there
# so that the generated .pem files end up at $CertLocation
if(-not (Test-Path $CertLocation -ErrorAction SilentlyContinue))
{
       $null = New-Item -ItemType Directory -Force -Path $CertLocation
}
Set-Location $CertLocation

  • During certificate generation, there is an .rnd file that OpenSSL needs to write to. We need to set the RANDFILE environment variable to a directory at the certificates location

$env:RANDFILE = Join-Path $CertLocation ".rnd"

  • First we need to create the certificate authority private key

& $opensslExe genrsa -aes256 -out ca-key.pem 2048


  • Using the CA private key, create the CA certificate

& $opensslExe req -new -x509 -days 365 -key ca-key.pem -subj "/C=NL/ST=UT/L=Amersfoort/O=Prajeesh" -sha256 -out ca.pem

  • Next we’ll create the server private key

& $opensslExe genrsa -aes256 -out server-key.pem 2048


  • After this we need to create the certificate signing request (CSR) for the server key. Use the host server’s IP address or DNS name in the subject when creating the CSR

& $opensslExe req -subj "/C=NL/ST=UT/L=Amersfoort/O=Prajeesh" -new -key server-key.pem -out server.csr

  • Before we sign the server key we need to define the certificate extension to specify the subjectAltName. The subjectAltName allows us to specify things such as the IP addresses we will allow connections on.

"subjectAltName = IP:10.10.10.20,IP:127.0.0.1,DNS.1:*.cloudapp.net,DNS.2:*.*.cloudapp.azure.com" | Out-File extfile.cnf -Encoding ASCII

  • Now we can sign the server key

& $opensslExe x509 -req -days 365 -in server.csr -CA ca.pem -CAkey ca-key.pem -CAcreateserial -out server-cert.pem -extfile extfile.cnf


  • Next we’ll create the client keys. First we’ll create the client’s private key

& $opensslExe genrsa -out client-key.pem 2048


  • Then we’ll create the client certificate signing request

& $opensslExe req -subj "/CN=client" -new -key client-key.pem -out client.csr

  • To sign the client key we need an extensions config file with the extendedKeyUsage extension in order to make the key suitable for client authentication

"extendedKeyUsage = clientAuth" | Out-File extfile.cnf -Encoding ASCII

  • Now we can sign the client key

& $opensslExe x509 -req -days 365 -in client.csr -CA ca.pem -CAkey ca-key.pem -CAcreateserial -out client-cert.pem -extfile extfile.cnf

Configuring the Docker host

  • We’ve successfully created all the certificates needed for our setup. Now we need to copy the server certificates to the VM where the Docker daemon is running. You can either use the AzCopy utility to copy the certificates from an Azure storage container, or download them from a URI using the Invoke-WebRequest cmdlet on the machine. The server certificates should be copied to the C:\ProgramData\Docker\certs.d folder
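As a sketch, downloading the server certificates to the host with Invoke-WebRequest could look like the following (the base URI is a placeholder for wherever you published the certificates):

```powershell
$certDir = "C:\ProgramData\Docker\certs.d"
if (-not (Test-Path $certDir))
{
       $null = New-Item -ItemType Directory -Force -Path $certDir
}

# Placeholder URI - replace with your own storage location
$baseUri = "https://example.blob.core.windows.net/certs"
foreach ($file in "ca.pem", "server-cert.pem", "server-key.pem")
{
       Invoke-WebRequest -Uri "$baseUri/$file" -OutFile (Join-Path $certDir $file)
}
```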



  • Next we have to open connections on port 2376 using the New-NetFirewallRule cmdlet.
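A rule along these lines would allow inbound TLS traffic to the daemon (the display name is arbitrary):

```powershell
# Allow inbound TCP traffic on the Docker TLS port
New-NetFirewallRule -DisplayName "Docker TLS" -Direction Inbound -Protocol TCP -LocalPort 2376 -Action Allow
```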



  • Restart the Docker service on the host
  • Copy the client certificates to the .docker folder under $Env:USERPROFILE.
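These last two steps can be sketched as follows, assuming the client certificates are still in C:\Docker\Certs on the machine where they were generated. The Docker client expects the files to be named ca.pem, cert.pem and key.pem in the .docker folder:

```powershell
# Restart the Docker service so it picks up the server certificates
Restart-Service Docker

# Copy the client certificates to the names the Docker client expects
$dockerDir = Join-Path $Env:USERPROFILE ".docker"
if (-not (Test-Path $dockerDir))
{
       $null = New-Item -ItemType Directory -Force -Path $dockerDir
}
Copy-Item "C:\Docker\Certs\ca.pem" (Join-Path $dockerDir "ca.pem")
Copy-Item "C:\Docker\Certs\client-cert.pem" (Join-Path $dockerDir "cert.pem")
Copy-Item "C:\Docker\Certs\client-key.pem" (Join-Path $dockerDir "key.pem")
```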

Connecting using TLS

  • Now connect to the Docker host from the client machine using the host’s IP address or DNS name and port 2376 with the --tlsverify option


docker --tlsverify -H tcp://.westeurope.cloudapp.azure.com:2376 ps -a


Sunday, September 20, 2015

PowerShell Gallery - Create a module for the community

With PowerShellGet, the new package manager for PowerShell, you can easily discover and install software packages, PowerShell modules, etc. in a unified way. What makes it more attractive is how easy it is to contribute your modules to the community so that they’re discoverable and usable by anyone who wants them.
All you need is a Microsoft account; log in to the PowerShell Gallery to download the API key needed to publish your module. Let’s see the step-by-step process.
  • Log in to the PowerShell gallery at https://www.powershellgallery.com
  • Once you have signed in, navigate to the account page to download the API key (https://www.powershellgallery.com/account)
  • Once you have the API key, you can use the Publish-Module cmdlet to publish your modules to the gallery.
  • For example, I have a module to install Visual Studio 2015 on my machine.
  • First we need to ensure that the module is available in our module path
Get-Module -ListAvailable
  • Now we can use the Publish-Module cmdlet as given below.
Publish-Module -Name VisualStudio2015 -NuGetApiKey $ApiKey -Verbose


  • That’s it, it’s that simple to contribute!