P3.NET

Creating PowerShell Modules

PowerShell has become the de facto scripting language for Windows platforms. It brings the power of .NET to the command line. One of the benefits of PowerShell is creating scripts or functions that can be shared with others. This allows teams to simplify repetitive tasks by putting them into scripts and sharing them with each other. The downside is that PowerShell, by default, makes it hard to run scripts (for security reasons). Additionally, each user has to copy the scripts someplace where PowerShell will find them. In this article I’ll talk about an easier way to deploy scripts such that they can be run just like the built-in commands. It requires a little more effort upfront but once you have the pattern down it is really easy.

Note: Before getting started be aware that this blog is based upon PowerShell 5. Some things in this blog do not work with older versions of PowerShell or behave differently. PowerShell 5 can be installed on any supported Windows machine so older machines can be updated if needed.

Scripts vs Functions

Before digging too far into PowerShell it is important to know the difference between scripts, functions and cmdlets. A cmdlet is a class written in a .NET language like C# or Visual Basic. The generated assembly can then be loaded by PowerShell. A function is a PowerShell script that is wrapped in a function block and given a name. Functions allow the script (or other scripts) to reuse functionality. Other than the language they are written in, functions and cmdlets behave the same and you really cannot tell which they are when you run them.

Scripts are a little different. A script is simply a list of PowerShell statements contained in a .ps1 file. They are most useful for automating a series of steps.

param(
    [Parameter(Mandatory)][string] $text,
    [bool] $decode = $false
)

if ($decode) {
    $bytes = [Convert]::FromBase64String($text)
    $data = [System.Text.Encoding]::Unicode.GetString($bytes)
}
else {
    $bytes = [System.Text.Encoding]::Unicode.GetBytes($text)
    $data = [Convert]::ToBase64String($bytes)
}

$data

This script takes a string as input and, depending upon the parameter, encodes or decodes it to Base64. To execute the script you simply specify the script file name.

# Encode
$data = .\Base64Encode.ps1 "Hello World"

# Decode
.\Base64Encode.ps1 $data -decode $true

Notice that you have to specify a relative path. If you simply specify the filename then PowerShell will look in the standard search paths but not the local directory. Also note that script execution is disabled by default for security. To allow local scripts to run without being signed, change the execution policy.

# Must be run as an administrator
Set-ExecutionPolicy RemoteSigned

The above command requires that remote (downloaded) scripts be signed but allows local scripts to run unsigned. This is a reasonable balance between security and convenience. It only needs to be set once and then persists until changed.
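You can verify the policy at any time with the Get-ExecutionPolicy command:

```powershell
# Show the effective policy for this session
Get-ExecutionPolicy

# Show the policy set at each scope (machine, user, process, etc)
Get-ExecutionPolicy -List
```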

PowerShell Editing

PowerShell versions through 5 come with an integrated environment called PowerShell ISE. This tool can be used to edit and run PowerShell scripts. I recommend that you do not use it. It is not much more useful than Notepad for doing anything beyond simple scripting. Instead I recommend using Visual Studio Code. This free IDE is a ways away from beating Visual Studio but for PowerShell it is great. I use it for PowerShell editing over all other tools. The only issue is getting used to how VSCode works.

For this article create a folder in the file system to store the script you’ll be working with. It doesn’t matter where the folder is. Then go into VSCode and use the Open Folder command to open the folder. Don’t select the script file directly, use the folder. Working with the entire folder makes it much easier to manipulate the multiple files we’ll be using. There is no need to create a solution or anything else. Once the folder is open you can open the script file, add/remove files and folders and do most other file management work you’ll need.

Note: If you have not yet installed the PowerShell extension for VSCode you should be prompted to do so. Install it as you will need the debugger support.

Once the script is open you can debug just like you would for other file types. You can set breakpoints in the code, view variable values, etc. If this is the first time you have loaded a PowerShell file then you may find that debugging does not work. VSCode uses a launch.json file to identify what debugger(s) to use. If you installed the PowerShell extension then it comes with several debuggers that you can choose from. For most debugging choosing the option for the current file is sufficient. You can opt to either create a new session each time or reuse the same session.

Once the debugger is running you can interact with the PowerShell session using the terminal that is available in VSCode. From here you can play around with your script, adjust variables, step through code, etc.

Converting a Script to a Function

Scripts are the drivers of PowerShell. Functions and cmdlets don’t do anything by themselves until they are called. Before we can start packaging up our code into reusable components we need to convert the code to a function. This will allow scripts (or other functions) to use the functionality later. The simplest way to convert the script is to simply wrap it in a function block.

function Base64Encode {
   param(
      [Parameter(Mandatory)][string] $text,
      [bool] $decode = $false
   )

   if ($decode) {
      $bytes = [Convert]::FromBase64String($text)
      $data = [System.Text.Encoding]::Unicode.GetString($bytes)
   }
   else {
      $bytes = [System.Text.Encoding]::Unicode.GetBytes($text)
      $data = [Convert]::ToBase64String($bytes)    
   }

   return $data
}

Beyond the addition of a function block the code was also changed to use the return keyword on the $data variable. This technically isn’t necessary because PowerShell automatically writes any uncaptured expression to the output, but for someone from a procedural background the explicit return makes it a little more readable.

If you run the debugger now it will not do anything. The script now only defines the function; nothing calls it. To execute the function you need to either make an explicit call to it in the script or from the terminal. Either approach works but note that calling it from the terminal will not trigger any breakpoints because the debugger just runs the script file and exits.

# Test
$expected = "Hello World"
$result = Base64Encode $expected

$actual = Base64Encode $result -decode $true

if ($actual -ne $expected) {
   throw "Actual does not equal expected"
} else {
   Write-Host "Test Passed"
}

Converting to a Module

Functions are reusable components inside scripts. But our goal is to create reusable functions that can be imported into PowerShell. For that we need to move up to modules. A PowerShell module is simply a collection of functions (or cmdlets, variables, etc). Modules are the deployable unit of reuse in PowerShell. Converting a function to a module is simple: just rename the file from .ps1 to .psm1. You now have a module.
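The rename can be done in the file explorer or, assuming the file name from earlier, directly from the terminal:

```powershell
# Rename the script (.ps1) to a module file (.psm1)
Rename-Item .\Base64Encode.ps1 Base64Encode.psm1
```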

Unfortunately once you’ve renamed the file you can no longer run it. Modules are not designed to be run; they must be imported first. Trying to run the module will generate an error along the lines of “cannot run a document in the middle of a pipeline”. To work around this we need a driver script. We’ll create a new script (RunTests.ps1) that contains the test code we wrote earlier. This code can be moved from the module to the script. The module should now only contain the original function. We can now debug the test script instead. If we set breakpoints in the module file they will be hit correctly.

Trying to run the test script at this point will generate an error about the function not being recognized. As mentioned earlier, modules must be imported. The test script needs to first import the module using the Import-Module command.

Import-Module .\Base64Encode.psm1 -Force

The Force parameter tells PowerShell to reimport the module even if it is already loaded. Running the debugger now should work.

Naming Functions

For regular functions and scripts the name used does not matter too much. PowerShell does define a set of approved verbs to use for naming functions. This is to help enforce a consistent naming scheme for commands. Since our goal is to generate a reusable module we should use one of the approved verbs. If you use an unapproved verb the module can still be created but PowerShell will warn about the nonstandard name when the module is imported.

In general you scan the list of approved verbs to find the verb closest to what your command does. Our command encodes and decodes text to Base64 so the ConvertTo and ConvertFrom verbs are the most appropriate. We now have a choice to make. Our function is doing 2 different things based upon a parameter. It is probably better to break this out into 2 separate functions: one for encoding and one for decoding. We will also rename the functions to use the approved verbs.
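The list of approved verbs is available directly from PowerShell via the Get-Verb command:

```powershell
# List all approved verbs along with their group
Get-Verb

# Narrow the list down to the conversion verbs
Get-Verb Convert*
```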

At the same time we will also break the module into 2 separate files, one for each function. This is technically not needed but, again, procedural programmers will relate to breaking things up by functionality. For consistency we will name each file to match the function in it. But let’s talk about documentation before we show the final files.

Documenting the Module

Now that the functions are converted to modules, users won’t have direct access to the code anymore. It is important to properly document each function inside the module. PowerShell has a comment-based help convention for this, similar to how C# uses XML documentation comments. The help system in PowerShell will use this documentation to provide help for the command. You can also provide help external to the module if you want it to be updatable, but we’ll just put it in the code. Here is the final version of each of the modules.

<#
.SYNOPSIS
    Converts from Base64 to text.

.DESCRIPTION
    Converts from Base64 to text.

.PARAMETER data
    The Base64 data to convert.

.OUTPUTS
    The text converted from Base64.

.EXAMPLE
    Converts the data from Base64 back to text.

    $text = ConvertFrom-Base64 $data

.LINK
    ConvertTo-Base64
#>
function ConvertFrom-Base64 {
   param(
      [Parameter(Mandatory)][string] $data
   )

   $bytes = [Convert]::FromBase64String($data)
   $text = [System.Text.Encoding]::Unicode.GetString($bytes)

   return $text
}
<#
.SYNOPSIS
    Converts text to Base64.

.DESCRIPTION
    Converts text to Base64.

.PARAMETER text
    The text to convert.

.OUTPUTS
    The Base64 text.

.EXAMPLE
    Encode the string to Base64 and store the result in a variable

    $result = ConvertTo-Base64 "Hello World"

.LINK
    ConvertFrom-Base64
#>
function ConvertTo-Base64 {
   param(
      [Parameter(Mandatory)][string] $text
   )

   $bytes = [System.Text.Encoding]::Unicode.GetBytes($text)
   $data = [Convert]::ToBase64String($bytes)    

   return $data
}

Creating a Manifest

We have the modules (functions) defined that we want to use so now we are ready to generate a manifest to represent them. A manifest (.psd1) is the metadata that PowerShell uses to expose and install the modules. You can create a manifest manually but PowerShell has the New-ModuleManifest command that can generate a skeleton for you automatically.

But before we do that it is important to know a few rules about manifests.

  1. The manifest must match the folder name.
  2. All modules in the folder will install together.
  3. The version of the manifest is important.

Structuring the Module

The first rule is pretty easy to implement. Create a folder for the modules that matches what the modules will be exposed as. The name does not matter too much provided it contains only valid path characters, is descriptive enough so users can find it and is unique. Including the company name can help with this since users can find all company-provided modules by using a wildcard when searching later.

  1. Create a new folder called P3Net.Utility. This will be the module folder.
  2. While not necessary, this utility module may have other modules later not related to Base64 encoding so create a child folder called Base64 to group the modules together based upon functionality.
  3. Copy the 2 modules from earlier (ConvertFrom-Base64.psm1, ConvertTo-Base64.psm1) to the new folder.
  4. [Optional] Copy the RunTests.ps1 file to the module folder so the script can be run later if needed. It won’t actually be part of the module.

Generating the Manifest

Now we can generate the manifest. Go to the module folder and run the New-ModuleManifest command.

New-ModuleManifest P3Net.Utility.psd1

Open the manifest in VSCode. This is not a PowerShell script so be very careful with the syntax. There are a couple of properties to pay attention to.

  • ModuleVersion is the version of the module. This should be incremented each time an update to the module is made. It is strongly recommended that you use 3 part versioning (e.g. 1.2.3) instead of 2 part or 4 part versions. This will be discussed more later.
  • GUID is a unique ID for the module.
  • Author is the author of the module.
  • CompanyName is the company who wrote the module.

The full documentation is available here.
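Rather than editing the generated file by hand you can also supply these properties when generating the manifest. Here is a sketch using the values from this article; adjust them for your own module.

```powershell
New-ModuleManifest -Path .\P3Net.Utility.psd1 `
    -ModuleVersion '1.0.0' `
    -Author 'Michael Taylor' `
    -CompanyName 'P3Net' `
    -Description 'Provides helpful commands.'
```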

Managing Dependencies

A module may have dependencies on other modules or other components. The manifest file has some properties for these. In each case the dependencies are listed in an array and PowerShell will ensure they are available.

  • PowerShellVersion is an optional value that specifies the minimum PowerShell version that the module requires. We’ll set this to 5.0 for our module.
  • DotNetFrameworkVersion is an optional value that specifies the minimum version of .NET that the module requires. This is useful if the module relies on features not available in older versions.
  • RequiredModules lists the modules that have to be imported before this module can be imported.
  • RequiredAssemblies lists the assemblies that must be loaded before this module is imported. This is useful for ensuring assemblies are loaded without explicitly loading them in the module.
  • ScriptsToProcess lists the scripts (.ps1) to run when the module is imported. This is useful for initializing the module.
  • TypesToProcess lists any custom types (.ps1xml) to load when the module is imported. This is useful when defining custom types for the module to use.
  • FormatsToProcess lists the custom formats (.ps1xml) to load when the module is imported.

Managing Exports

Everything that is exposed by the module must be exported. There are a couple of ways of doing this. The easiest way is to use the Export-ModuleMember command inside the module. This will cause the specified function, cmdlet, variable or alias to be exported. This is most useful when mixing exported and non-exported members but requires that you mix your modules with the export commands. It also makes it harder to identify what was exported unless you search the files.

An alternative approach is to list all the exports in the manifest file. This has the advantage of keeping the exports together while still allowing you to decide on which ones to export. The manifest defines a set of properties to control this. The downside is that the manifest must explicitly list each function. While not recommended you can use a wildcard as well.
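As an example of the first approach, here is a sketch of a module file that exports its public function while keeping a helper private. The helper function is hypothetical and exists only to show the pattern.

```powershell
# Inside the .psm1 file

function ConvertTo-Base64 {
   param([Parameter(Mandatory)][string] $text)

   $bytes = (Get-DefaultEncoding).GetBytes($text)
   return [Convert]::ToBase64String($bytes)
}

# Hypothetical private helper; deliberately not exported
function Get-DefaultEncoding {
   return [System.Text.Encoding]::Unicode
}

# Only the public function is visible to importers
Export-ModuleMember -Function ConvertTo-Base64
```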

Nested Modules

It is now time to talk about a rather challenging aspect of modules: nested modules. If your module is contained in a single file (either because you have a single file with all your functions or you only have one function) then you don’t really need to do much more than make sure the module name matches the manifest and things just work. But this is unlikely to be true in most cases so we have built our example such that the default behavior would not work.

The problem we have now is that we have 2 modules (.psm1) that we want to expose as 1. This is where nested modules come in. Nested modules are modules that are nested within a parent module. When the parent module is installed or imported the nested modules are automatically installed and imported with it. Because they are nested the end user sees only the parent module, not the nested ones. This is ideal in most cases. But getting the module manifest to behave correctly is a little challenging. There are quite a few blog posts about how to set things up, but what works for some people does not work for others. Here’s what works for me.

  1. Uncomment the NestedModules property.
  2. Add each of the modules (.psm1) to the array. Use the relative path from the manifest to the file.
  3. Export all the functions by setting the FunctionsToExport property to @('*').
NestedModules = @(
    'Base64\ConvertFrom-Base64.psm1',
    'Base64\ConvertTo-Base64.psm1'
)

FunctionsToExport = @('*')

Note that you could list each module function again but the wildcard is easier.

Testing the Manifest

To test the manifest you can use the Test-ModuleManifest command.

Test-ModuleManifest .\P3Net.Utility.psd1

In my experience this command is mostly useless. Beyond verifying that there are no syntax errors and that any paths are correct it doesn’t actually do anything. The issues you’ll run into with manifests generally revolve around the import not working at all or the commands not being available. The command won’t find those errors.

To better test a module you can go ahead and load it. To do that you’ll want to start a new PowerShell session, navigate to the folder containing the manifest and then import the module.

Import-Module .\P3Net.Utility.psd1 -Force

After the module is imported you should be able to run the commands as though they are built in. Help should work as well.

ConvertTo-Base64 "Hello World"
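The comment-based help written earlier is picked up by the help system automatically:

```powershell
# Show the full help, including parameters and examples
Get-Help ConvertTo-Base64 -Full
```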

Moving to a Repository

Once you have some modules defined you’ll likely want to share them with others. The PowerShell Gallery can be used to publish modules but they would then be publicly visible to everyone. If you have modules you want to keep internal then you’ll have to set up your own gallery. PowerShell works with NuGet v2 (but not v3) so you can use any v2 NuGet feed to host your modules. We’ll talk about that now.

Creating a Package

To create a NuGet package you need a Nuspec file. Creating one is straightforward and works like it would for any NuGet package.

<?xml version="1.0"?>
<package >
  <metadata>
    <id>P3Net.Utility</id>
    <version>1.0.0</version>
    <authors>Michael Taylor</authors>
    <owners>P3Net</owners>    
    <projectUrl>http://www.michaeltaylorp3.net</projectUrl>    
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>Provides helpful commands.</description>
    <releaseNotes></releaseNotes>
    <copyright>Copyright 2018</copyright>
    <tags>powershell</tags>    
  </metadata>
  <files>
      <file src="*.psd1" target="" />
      <file src="**\*.psm1" target="" />
  </files>
</package>

The file should be at the root with the manifest (.psd1). It should have the same name as the module. The files section includes the manifest and any modules in any folders. Because there is no explicit target path the folder structure is maintained. This is important because the structure of the module must be consistent when installed, otherwise PowerShell won’t be able to locate the files properly.

To generate the package use nuget pack.

nuget pack p3net.utility.nuspec

If successful the package is generated and can be uploaded to any NuGet gallery.

Versioning

Versioning is critical when creating the package. When PowerShell installs a module it extracts the contents of the package onto disk. The structure will be <package name>\<package version>. Later when PowerShell tries to import a module it uses the module name and version that is installed. If they don’t line up the module cannot be imported. So here are the rules I recommend.

  1. Ensure the Nuspec file, manifest and parent folder all share the same name. That is the module name.
  2. Ensure that the Nuspec file and the manifest file use the same exact version number.
  3. Use a 3 part version number (e.g. major.minor.patch).

NuGet supports 3 and 4 part versions but it normalizes the version number. If a 4 part version is given and the last number is 0 then it is normalized to 3 parts (e.g. 1.2.3.0 -> 1.2.3). If your module manifest uses 1.2.3.0 then the module will not import because NuGet will use 1.2.3. To avoid issues always use 3 part versions and always ensure they match.

Finally, be sure to increment the version number each time you build the package for deployment. NuGet galleries won’t allow you to add packages that already exist with the same version and most won’t allow you to add older versions. So ensure the number is incrementing each time.

Internal Functions

In more complex modules you may need to share some functions that you do not want exposed: internal functions. They still have to be shipped as part of the package otherwise the modules will fail. You should follow a convention to make it easier to identify them. For example you may put all internal functions inside an Internal folder at the root of the module. Within this folder you may store your functions in separate .ps1 script files. Be sure to include them in your nuspec file as well.

<files>
   <file src="*.psd1" target="" />
   <file src="**\*.psm1" target="" />
   <file src="Internal\**\*.ps1" target="Internal" />
</files>

Since the functions are not referenced in the manifest they will be deployed with the module but not usable outside it.
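The modules themselves are responsible for loading the internal scripts, typically by dot-sourcing them relative to the module file. The Internal folder matches the convention above but the helper file name is hypothetical:

```powershell
# At the top of the .psm1 file, load the shared internal functions
. (Join-Path $PSScriptRoot 'Internal\Helpers.ps1')
```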

PowerShell Repositories

Out of the box PowerShell uses the PowerShell Gallery for modules. You can add your own repositories as well. For testing purposes (or for your company) you can set up your own NuGet repository. Unfortunately a local file share won’t work with PowerShell. Creating a NuGet repository is simple. Refer to NuGet Server for instructions on how to quickly create your own.

To get the list of repositories already registered by PowerShell use the Get-PSRepository command.

Get-PSRepository

Registering the Repository

Once the gallery is set up you can register it with PowerShell. Anyone wanting to use modules in the repository will need to do this and it requires administrative privileges. Using an admin session run the Register-PSRepository command.

Register-PSRepository {repo-name} -SourceLocation {url} -PublishLocation {url} -InstallationPolicy Trusted

The URL is the v2 URL to the repository (e.g. http://myserver/nuget/v2). Setting the installation policy to Trusted marks the repository as trusted so users are not prompted each time they install from it. For internal repositories this makes sense. Don’t do this for public repositories.

Note: In order to actually upload modules to your NuGet gallery you need to register it like you would any NuGet repository. Refer to NuGet Source for instructions.

Searching the Repository

Once you have the PowerShell repository registered you can search it for modules using the Find-Module command.

Find-Module -Name p3net.* -Repository {repo-name}

Using the Name parameter allows you to do wildcard searches. Using the Repository parameter allows you to search a specific repository.

Installing a Module

To install a module you must have administrative privileges. Then use the Install-Module command.

Install-Module -Name p3net.utility -Repository {repo-name} -Force

Once again use the Repository parameter to identify the repository. The Name parameter indicates which module to install. While not required it is recommended that you use the Force parameter as well. If a version of the module is already installed then PowerShell won’t install a newer version by default. The Force parameter forces it to install the latest version.

Viewing Modules

To view all the modules that have been installed, and the commands they provide, use the Get-InstalledModule command.

Get-InstalledModule

To get the list of modules that are imported or can be imported use the Get-Module command.

Get-Module -ListAvailable

Notice the commands available will be displayed as well.

Importing a Module

After a module is installed it can be imported by anyone using the Import-Module command.

Import-Module P3Net.Utility -Force

It is at this point that you can confirm everything is working correctly. If you get any errors then something is wrong with the manifest or your modules. The most common issue is that the version is wrong or there was nothing exported.

Note: Later versions of PowerShell will automatically import a module when one of its commands is used. Therefore you can normally just use a command whether its module has been explicitly imported or not.

Using TFS/VSTS Package Feeds

If you are running TFS or VSTS then you may already be using Package Management. This (paid) option can be used to host PowerShell packages as well if you want. I strongly recommend that if you want to use VSTS for this then create a new package feed specifically for PowerShell. You don’t want to mix these with .NET packages.

Setting up a feed is straightforward and the registration process is the same but there are a couple of issues with using VSTS.

Licensing

The bigger issue is licensing. At this time Microsoft requires that anyone using the feed have a license. If you intend to host PowerShell modules in VSTS and share them with everyone in your company then everyone would need a license for Package Management. This is probably a showstopper for most companies. Hosting your own NuGet gallery with NuGet Server or equivalent is probably the cheaper, better option.

Authentication

Related to licensing but far more intrusive is the authentication requirements. VSTS requires that all package access be authenticated. NuGet v2 requires personal access tokens to connect. Therefore everyone using the package feed must have credentials and a PAT. Each user can create their own PAT but then the PAT must be wired up to credentials.

The following code sets up a credential object in PowerShell.

$pwd = ConvertTo-SecureString '{pat}' -AsPlainText -Force
$cred = New-Object PSCredential '{email-address}', $pwd

Once the credential is created it must be passed to any call that accepts the Repository parameter otherwise the call will fail.

Install-Module P3Net.Utility -Repository {repo-name} -Credential $cred

PowerShell 6

PowerShell 6 is a replatformed version of PowerShell running on .NET Core. While there are some differences from PowerShell 5, almost everything discussed in this article applies to PowerShell 6 as well. In fact you can install modules in PowerShell 5 and they will work with PowerShell 6 as well provided you are sharing the module paths.

In PowerShell 5 modules were generally installed to PSModulePath. By default that doesn’t work in PowerShell 6. But you can install the WindowsPSModulePath module and then add the path to the list of paths that PowerShell 6 references. This allows you to install and use modules in both versions.
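Assuming the WindowsPSModulePath module is available from the PowerShell Gallery, the setup from a PowerShell 6 (pwsh) session looks like this:

```powershell
# Install and import the compatibility module
Install-Module -Name WindowsPSModulePath -Scope CurrentUser
Import-Module WindowsPSModulePath

# Add the Windows PowerShell module paths to this session
Add-WindowsPSModulePath
```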

Note: There is currently one issue with PowerShell 6 to be aware of. Installing a module from a repository other than PowerShell Gallery currently fails. It is a bug in the implementation and will hopefully be fixed. Until then use PowerShell 5 to install the module. You can then use it from either.

Download the code from GitHub.