Update 1/8/2016: You no longer need a folder with the same name as the module .zip file inside your module .zip file. If the .psd1, .psm1, and other files described below are directly inside the zip, module import will now work.
If you’re familiar with Azure Automation, you’re probably aware that PowerShell is the fundamental technology that makes Azure Automation great. But PowerShell itself is great due to its extensibility through PowerShell modules. Since Azure Automation is built on PowerShell, PowerShell modules are of course key to the Azure Automation extensibility story as well. This blog post will guide you through the specifics of Azure Automation’s “spin” on PowerShell modules – “Integration Modules” – and best practices for creating your own PowerShell modules to make sure they work great as Integration Modules within Azure Automation.
What’s a PowerShell Module?
Before we even get into Integration Modules specifically, what are PowerShell modules? Well, if you’ve ever called a cmdlet in PowerShell, such as Get-Date or Copy-Item, you’ve used a PowerShell module. A PowerShell module is a group of PowerShell cmdlets that can be used from the PowerShell console, as well as from PowerShell scripts, workflows, and runbooks. All of the functionality of PowerShell is exposed through cmdlets, and every cmdlet is backed by a PowerShell module, many of which ship with PowerShell itself. For example, the Get-Date cmdlet is part of the Microsoft.PowerShell.Utility PowerShell module, and the Copy-Item cmdlet is part of the Microsoft.PowerShell.Management PowerShell module. Both of these modules ship with PowerShell. But many PowerShell modules do not ship as part of PowerShell, and are instead distributed by the vast PowerShell community to make complex tasks simpler through encapsulated functionality, such as the Windows Update PowerShell module (there is also an Azure Automation runbook that shows how to use this module if you are interested in patching scenarios). You can learn more about PowerShell modules on MSDN.
What’s an Azure Automation Integration Module?
Ok, so now that we’re familiar with regular old PowerShell modules, what are these Azure Automation “Integration Modules”, and how are they different from a standard PowerShell module? Turns out, there really isn’t much difference! An Integration Module is just a PowerShell module that optionally contains one additional file – a metadata file specifying an Azure Automation connection type to be used with this module’s cmdlets in runbooks. Optional file or not, these PowerShell modules can be imported into Azure Automation to make their cmdlets available for use within runbooks. Behind the scenes, Azure Automation stores these modules, and at job execution time loads them into the Azure Automation sandboxes where runbooks are executed. You may have noticed we ship one PowerShell module out of the box in Azure Automation for you to use -- the Azure PowerShell module. Another tiny caveat of PowerShell modules versus Integration Modules – in order to import one of these modules into Azure Automation, you need to zip up the module folder so the module can be imported as a single file. The zip file should have the same name as the module folder it contains. The module folder in the zip needs to contain at least a .psd1, .psm1, or PowerShell module .dll file with the same name as the module folder. For example, for the Twilio module discussed below, the proper structure is:
- Twilio.zip
  - Twilio (module folder)
    - Twilio.psd1, Twilio.psm1, or Twilio.dll (at least one, named to match the folder)
    - Twilio-Automation.json (the optional Integration Module metadata file)
You can find more information on how to package an Integration Module for import into Azure Automation on TechNet, under the “Building an Integration Module” section. Fun fact: we Azure Automation folks often refer to Integration Modules as just “modules” for short, since they are basically just PowerShell modules with an optional extra metadata file. Together, a PowerShell module plus an Integration Module metadata file is conceptually equivalent to the Integration Pack concept of System Center Orchestrator. In fact, the term “Integration Module” comes from a combination of “Integration Pack” from Orchestrator and “PowerShell module” from PowerShell.
The Azure Automation Integration Module Metadata File
So what are the specifics of this extra, optional file, which holds an Azure Automation connection type for the module it is a part of? The file is named after the module, in the form <ModuleName>-Automation.json, and should be placed within the module folder (within the module zip file). For an example of a PowerShell module that includes the Integration Module metadata file, check out the Twilio PowerShell module on Script Center. The content of this file is a JSON-formatted object describing the fields of a “connection” required to connect to the system or service the module represents; importing the module creates a corresponding connection type in Azure Automation. Using this file you can set the field names, types, and whether the fields should be encrypted and/or optional, for the connection type of the module. Requests to Twilio require passing a Twilio AccountSid and authentication token (AuthToken) in order to authenticate, so the connection type for Twilio contains these two fields. After importing the Twilio Integration Module into Azure Automation, when you create a connection, a new connection type will be present: the Twilio connection type. Creating an instance of this connection type (also known as a connection asset) lets you specify the fields needed to connect to Twilio, in this case the AccountSid and AuthToken. Of course, if you have multiple Twilio accounts, you can create a connection asset for each Twilio account so that you can connect to Twilio as any of these accounts (depending on which connection you choose in your runbook for the Twilio cmdlets).
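As a sketch of that format (the property names here follow the Integration Module metadata schema described on TechNet; double-check against the Twilio module download for the exact file), Twilio-Automation.json looks along these lines, with the AuthToken marked as encrypted:

```json
{
    "ConnectionFields": [
        {
            "IsEncrypted": false,
            "IsOptional": false,
            "Name": "AccountSid",
            "TypeName": "System.String"
        },
        {
            "IsEncrypted": true,
            "IsOptional": false,
            "Name": "AuthToken",
            "TypeName": "System.String"
        }
    ],
    "ConnectionTypeName": "Twilio",
    "IntegrationModuleName": "Twilio"
}
```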
Integration Module Authoring - Best Practices
Now, just because Integration Modules are essentially just PowerShell modules doesn’t mean we don’t have a set of best practices around authoring them. There’s still a handful of things we recommend you think about while authoring a PowerShell module to make it most usable in Azure Automation. Some of these are Azure Automation specific, and some are useful just to make your modules work well in PowerShell Workflow, regardless of whether or not you’re using Automation.

1. Include a synopsis, description, and help URI for every cmdlet in the module

In PowerShell, you can define help information for cmdlets so that users can get help on using them with the Get-Help cmdlet. For a PowerShell module written in a .psm1 file, you can define a synopsis, description, and help URI for each cmdlet using comment-based help. Providing this info will not only surface it through the Get-Help cmdlet in the PowerShell console, it will also expose this help within Azure Automation, for example when inserting activities during runbook authoring. Clicking “View detailed help” there opens the help URI in another tab of the web browser you’re using to access Azure Automation.

2. If the module works against a remote system:
a. It should contain an Integration Module metadata file that defines the information needed to connect to that remote system, aka the connection type
You’re an expert on this one already.

b. Each cmdlet in the module should be able to take in a connection object as a parameter
Cmdlets in the module are easiest to use in Azure Automation if you allow passing an object with the fields of the connection type as a single parameter. That way users don’t have to map the fields of the connection asset to the corresponding cmdlet parameters each time they call a cmdlet. For example, a runbook might use a Twilio connection asset called joeTwilio to access Twilio and return all of the account’s Twilio phone numbers; without connection-object support, the runbook has to map each field of the connection to a parameter of the cmdlet on every call. The better way of calling Twilio is to pass the connection object directly to the cmdlet. You can enable this behavior for your cmdlets by allowing them to take a connection object directly as a parameter, instead of just connection fields for parameters. Usually you’ll want a parameter set for each, so that a user not using Azure Automation can call your cmdlets without constructing a hashtable to act as the connection object. A parameter set “SpecifyConnectionFields” is used to pass the connection field properties one by one, while “UseConnectionObject” lets you pass the connection straight through; Send-TwilioSMS allows passing either way.

3. Define output type for all cmdlets in the module

Defining an output type for a cmdlet allows design-time IntelliSense to determine the output properties of the cmdlet, for use during authoring. The OutputType cmdlet attribute gives you “type ahead” functionality on a cmdlet’s output without having to run it. While Azure Automation doesn’t use this data today, in the future we hope to use it to help you construct runbooks more easily.

4. Cmdlets in the module should not take complex object types for parameters

PowerShell Workflow is different from PowerShell in that it stores complex types in deserialized form.
Primitive types stay as primitives, but complex types are converted to their deserialized versions, which are essentially property bags. For example, if you use the Get-Process cmdlet in a runbook (or just PowerShell Workflow for that matter), it returns an object of type [Deserialized.System.Diagnostics.Process], not the expected [System.Diagnostics.Process] type. This type has all the same properties as the non-deserialized type, but none of the methods. And if you try to pass this value to a cmdlet parameter that expects a [System.Diagnostics.Process] value, you’ll get a nasty error:

Cannot process argument transformation on parameter 'process'. Error: "Cannot convert the "System.Diagnostics.Process (CcmExec)" value of type "Deserialized.System.Diagnostics.Process" to type "System.Diagnostics.Process"."

This is of course because there is a type mismatch between the expected [System.Diagnostics.Process] type and the given [Deserialized.System.Diagnostics.Process] type. The way around this issue is to ensure the cmdlets of your module do not take complex types for parameters. The wrong way is to have the cmdlet take a complex type as a parameter; the right way is to take in a primitive that the cmdlet can use internally to grab the complex object and use it. Since cmdlets execute in the context of PowerShell, not PowerShell Workflow, inside the cmdlet the process object becomes the correct [System.Diagnostics.Process] type. For the advanced users out there: you may have noticed that connection assets in runbooks are hashtables, which are a complex type, and yet these hashtables can be passed into cmdlets for their -Connection parameter perfectly, with no cast exception. Technically, some PowerShell types are able to cast properly from their deserialized form back to the original type, and hence can be passed into cmdlets for parameters accepting the non-deserialized type.
Hashtable is one of these. It’s possible for a module author’s own types to be implemented in a way that they deserialize correctly as well, but there are some tradeoffs: the type needs to have a default constructor, have all of its properties public, and have a PSTypeConverter. For already-defined types that the module author does not own, however, there is no way to “fix” them, hence the recommendation to avoid complex types for parameters altogether.

Runbook authoring tip: If for some reason your cmdlets need to take a complex type parameter, or you are using someone else’s module that requires one, the workaround in runbooks and PowerShell Workflow is to wrap the cmdlet that generates the complex type and the cmdlet that consumes it in the same InlineScript activity. Since InlineScript executes its contents as PowerShell rather than PowerShell Workflow, the cmdlet generating the complex type produces the correct type, not the deserialized one.

5. Make all cmdlets in the module stateless

PowerShell Workflow runs every cmdlet called in the workflow in a different session. This means any cmdlets that depend on session state created or modified by other cmdlets in the same module will not work in PowerShell Workflow or runbooks. For example, if Get-GlobalNumTimesTwo depends on a session variable $globalNum set by Set-GlobalNum, this won’t work in workflow, and $globalNum will always be 0. Get-GlobalNumTimesTwo should either take in the number as a parameter, so that it doesn’t depend on session state, or else Set-GlobalNum and Get-GlobalNumTimesTwo should be wrapped in the same InlineScript activity so that both cmdlets run in the same session, in PowerShell context.

6. The module should be fully contained in an Xcopy-able package

Because Azure Automation modules are distributed to the Automation sandboxes when runbooks need to execute, they need to work independently of the host they are running on. You should be able to zip up the module package, move it to any other host with the same or newer PowerShell version, and have it function as normal when imported into that host’s PowerShell environment. For that to happen, the module should not depend on any files outside the module folder (the folder that gets zipped up when importing into Azure Automation), or on any registry settings unique to a host, such as those set by a product’s installer. If this best practice is not followed, the module will not be usable in Azure Automation.
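To make best practice 1 concrete: in a .psm1 module, comment-based help placed immediately before a function supplies the synopsis, description, and help URI that Get-Help and the Azure Automation portal surface. A minimal sketch, with a placeholder function body and help URI:

```powershell
<#
.SYNOPSIS
    Sends an SMS message through the Twilio service.

.DESCRIPTION
    Uses the Twilio REST API to send an SMS message from a Twilio
    phone number to the specified destination number.

.LINK
    http://contoso.com/help/Send-TwilioSMS
#>
function Send-TwilioSMS {
    param (
        [Parameter(Mandatory = $true)]
        [string] $Message
    )
    # Implementation elided for brevity
}
Export-ModuleMember -Function Send-TwilioSMS
```

Running Get-Help Send-TwilioSMS -Full in a console where the module is loaded then shows the synopsis, description, and related link.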
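Best practices 2b and 3 can be sketched together. The parameter set names below match the ones mentioned above, but the Twilio call itself is elided, so treat this as an outline rather than the actual module code:

```powershell
function Send-TwilioSMS {
    [CmdletBinding(DefaultParameterSetName = 'SpecifyConnectionFields')]
    [OutputType([string])]   # best practice 3: declare what the cmdlet emits
    param (
        # Pass the connection fields one by one...
        [Parameter(ParameterSetName = 'SpecifyConnectionFields', Mandatory = $true)]
        [string] $AccountSid,

        [Parameter(ParameterSetName = 'SpecifyConnectionFields', Mandatory = $true)]
        [string] $AuthToken,

        # ...or pass an Azure Automation connection object straight through
        [Parameter(ParameterSetName = 'UseConnectionObject', Mandatory = $true)]
        [hashtable] $Connection,

        [Parameter(Mandatory = $true)]
        [string] $To,

        [Parameter(Mandatory = $true)]
        [string] $Message
    )

    # Unpack the connection object into the individual fields
    if ($PSCmdlet.ParameterSetName -eq 'UseConnectionObject') {
        $AccountSid = $Connection.AccountSid
        $AuthToken  = $Connection.AuthToken
    }

    # Call the Twilio REST API with $AccountSid / $AuthToken here (elided)
}
```

In a runbook, the connection asset can then be passed straight through: $twilio = Get-AutomationConnection -Name 'joeTwilio', followed by Send-TwilioSMS -Connection $twilio -To '...' -Message '...'.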
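For best practice 4, the wrong and right parameter shapes look roughly like this (both cmdlet names here are hypothetical, invented for illustration):

```powershell
# Wrong: a complex parameter type breaks when called from a runbook, because
# workflow hands the cmdlet a Deserialized.System.Diagnostics.Process instead
function Stop-ProcessByObject {
    param (
        [Parameter(Mandatory = $true)]
        [System.Diagnostics.Process] $Process
    )
    $Process.Kill()
}

# Right: take a primitive and resolve the complex object inside the cmdlet,
# where code runs as plain PowerShell and Get-Process returns the live type
function Stop-ProcessByName {
    param (
        [Parameter(Mandatory = $true)]
        [string] $ProcessName
    )
    $process = Get-Process -Name $ProcessName
    $process.Kill()
}
```

If you are stuck with the first shape, the InlineScript workaround applies: InlineScript { Get-Process -Name 'CcmExec' | ForEach-Object { Stop-ProcessByObject -Process $_ } } keeps producer and consumer in the same PowerShell session.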
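And for best practice 5, here is a sketch of the stateful pattern to avoid, using the Set-GlobalNum and Get-GlobalNumTimesTwo names from above, along with a stateless alternative:

```powershell
# What not to do: Get-GlobalNumTimesTwo depends on module session state
$script:globalNum = 0

function Set-GlobalNum {
    param ([int] $Num)
    $script:globalNum = $Num
}

function Get-GlobalNumTimesTwo {
    # In a runbook each cmdlet call runs in its own session, so
    # $script:globalNum is always 0 here no matter what was "set"
    return $script:globalNum * 2
}

# Better: pass the state in as a parameter instead
function Get-NumTimesTwo {
    param ([int] $Num)
    return $Num * 2
}
```

The stateful pair works in a plain console, where both calls share one session, which is exactly why the breakage only shows up once the module is used from a runbook.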
By now you’re fully up to speed on what an Azure Automation Integration Module is, how you’d write one, and what best practices to follow to make your Integration Modules truly shine in Automation. Integration is core to a successful orchestration strategy, and you should now have everything you need to build Azure Automation integrations into any system. Until next time, Keep Calm and Automate On. Just getting started with Azure Automation? Learn about the service here, and follow Azure Automation on Twitter. Want to get in contact with me, personally? Reach out via my blog or follow me on Twitter.