Exploring the Azure CLI 2.0 with Windows Subsystem for Linux

The new Azure CLI 2.0 was released a few weeks ago, so it was time for me to upgrade and take it for a spin. I [blogged](GHOST_URL/install-and-run-the-azure-cli-on-the-windows-subsystem-for-linux-wsfl) a while ago on how to install the "old" CLI, which was based on Node.js, but this is a whole new beast so let's get started. This is a true 2.0 in so many ways!

Installing

I decided to install it on my WSL because I can take advantage of my very limited Linux skills and showcase to customers the capabilities of Windows 10 and Linux in one box. Unlike its predecessor, the Azure CLI 2.0 has been written in Python instead of Node.js. The Python SDK is generated from the Azure API, and the Python CLI is built around that SDK. All the functionality exposed by the CLI uses the exact same public REST services. There's nothing magical about it, other than the fact that it works great.

To install it, I had to update my WSL to the latest version (I'm not on the Creators Update yet) and make sure that my dependencies were all in place. You can find out how to install it on the OS of your choice [here](https://docs.microsoft.com/en-us/cli/azure/install-azure-cli).

In my case, I had to run the following commands in order:

# Refresh package lists and upgrade existing packages
sudo apt-get update
sudo apt-get upgrade
# Install Python and the build dependencies the CLI relies on
sudo apt-get update && sudo apt-get install -y libssl-dev libffi-dev python-dev
sudo apt-get install python
# Download and run the Azure CLI install script
curl -L https://aka.ms/InstallAzureCli | bash

The last command is the one that downloads and installs the CLI. Once that completes successfully (do check the output for any errors), make sure that you restart Bash for the changes to take effect.
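
A quick sanity check that the install worked is to ask the CLI for its version (any command would do, but this one is harmless):

az --version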

Using the Azure CLI 2.0

First, you need to log in to access your subscription, or one of them. The login command is:

az login

You can choose to use the interactive mode and complete the authentication in the browser or, ideally, use a service principal username and password:

az login --service-principal -u <SP Username> -p <SP Password> --tenant <tenantID>

If you want to know how to create a service principal on Azure, have a look [here](GHOST_URL/service-principals-in-microsoft-azure/).
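
As a quick sketch (the linked post covers this properly), a service principal with a password can be created straight from the CLI; the --name value below is just a placeholder:

az ad sp create-for-rbac --name <AppName>

The output includes the appId, password and tenant values that map to the username, password and tenant arguments of the login command above.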

What's really interesting in the new tool is that the authentication process is much more straightforward. The CLI does all the hard work behind the scenes, exchanging information with Azure and securely storing the credentials. If you use the web login option, you'll notice that there is no need to provide a password at the command line!

Once logged in, make sure you select the right subscription to work against. You can list all your subscriptions with:
az account list

And then select the appropriate one with the following command:
az account set --subscription <subscription NameOrID>
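
You can then confirm which subscription is active with:

az account show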

To find out how to use any command, you can call the built-in help feature (-h or --help). For example, to figure out how to create a new virtual machine, you can start by running the following command:
az vm create -h

Productivity enhancements

One of the coolest features of the new CLI is command auto-completion. For example, you can type a command and press TAB twice to find appropriate values for your parameters. Assuming that I need to create a new VM and assign it to a resource group, I can just run:
az vm create -g (TAB TAB)

The result should look similar to this:

cmatskas@DESKTOP:~$ az vm create -g (TAB TAB)
test      testfunc
cmatskas@DESKTOP:~$ az vm create -g test

The two available resource groups are listed directly below my command, with the first one already filled in at the next prompt.

When you invoke command completion using double TAB, the CLI makes a REST call to Azure to pull the information needed for the command parameters. This information is then cached to save you subsequent round trips for the same type of data. The command completion experience is fairly basic (yet powerful) at the moment, and the team is planning to improve it by providing visual cues, like a spinner.

Working out which arguments a command needs can be hard. Every command has a set of required arguments, and if any are missing the CLI will prompt you to provide them. Optional arguments are shown inside square brackets and can be omitted, since the tool will fall back to defaults instead. If I attempt to create a VM with missing arguments, the CLI prompts me to supply them. In the example below I've missed the VM name (and the OS image, but one thing at a time):

cmatskas@DESKTOP:~$ az vm create -g test
az vm create: error: argument --name/-n is required
usage: az vm create [-h] [--output {json,tsv,table,jsonc}] [--verbose]
                    [--debug] [--query JMESPATH]
                    [--public-ip-address-dns-name PUBLIC_IP_ADDRESS_DNS_NAME]
                    [--image IMAGE] [--no-wait]
                    [--storage-sku {Premium_LRS,Standard_GRS,Standard_LRS,Standard_RAGRS,Standard_ZRS}]
                    [--nsg NSG] [--os-disk-name OS_DISK_NAME]
                    [--storage-container-name STORAGE_CONTAINER_NAME]
                    [--validate] [--size SIZE] [--subnet SUBNET]
                    [--availability-set AVAILABILITY_SET]
                    [--nsg-rule {RDP,SSH}] [--nics NICS [NICS ...]]
                    [--authentication-type {ssh,password}]
                    [--admin-password ADMIN_PASSWORD]
                    [--attach-os-disk ATTACH_OS_DISK]
                    [--storage-caching {ReadWrite,ReadOnly}]
                    [--location LOCATION] --name NAME
                    [--public-ip-address PUBLIC_IP_ADDRESS]
                    [--vnet-name VNET_NAME] [--tags [TAGS [TAGS ...]]]
                    [--ssh-key-value SSH_KEY_VALUE]
                    [--data-disk-sizes-gb DATA_DISK_SIZES_GB [DATA_DISK_SIZES_GB ...]]
                    [--storage-account STORAGE_ACCOUNT] [--use-unmanaged-disk]
                    [--generate-ssh-keys]
                    [--subnet-address-prefix SUBNET_ADDRESS_PREFIX]
                    --resource-group RESOURCE_GROUP_NAME
                    [--public-ip-address-allocation {dynamic,static}]
                    [--secrets SECRETS [SECRETS ...]]
                    [--admin-username ADMIN_USERNAME]
                    [--ssh-dest-key-path SSH_DEST_KEY_PATH]
                    [--vnet-address-prefix VNET_ADDRESS_PREFIX]
                    [--custom-data CUSTOM_DATA]
                    [--private-ip-address PRIVATE_IP_ADDRESS]
                    [--os-type {windows,linux}]
cmatskas@DESKTOP-MK9SQGO:~$

To further enhance productivity, you can pass the --no-wait argument when executing commands. The goal is to embed an asynchronous command execution experience in the CLI itself. This is somewhat similar to using Bash's '&', which spins off a separate process to execute the command in the background. You can use either of these to make your scripting experience more fluid and unblock your workflow from any pending operations.
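
For example, restarting the test VM from my subscription without blocking the shell looks something like this (swap in your own resource group and VM names):

az vm restart -g test -n cmtestvm --no-wait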

By default, all command output is in JSON format. This is great for programmatic purposes but not so nice when you need to see the results in an easy, readable format. The new CLI comes with a choice of output formats, which you can pick by passing the output argument:
-o table|json|tsv|jsonc
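
For example, the subscription list from earlier becomes much more readable as a table:

az account list -o table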

You could, however, take it a step further and define a default output format for the whole CLI instead of having to pass it explicitly every time. You can configure the default settings by typing:
az configure

This will take you into an interactive configuration mode where you can change:

  • the default output format
  • logging to a file
  • whether usage telemetry is sent to Microsoft

This functionality is available at any time, so if you change your mind and want to reset your defaults you can simply run the command again.
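
If you prefer editing files over interactive prompts, the settings end up in a plain INI file (typically ~/.azure/config); a minimal example that makes table the default output looks like this:

[core]
output = table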

Finally, I'll touch upon the clever new querying capabilities within the CLI itself. All commands allow you to pass the --query argument, which takes a JMESPath expression and can be used to filter results and select a subset of properties from the resulting dataset. For example, if I wanted to know only the name and type of the resources in my subscription, I could filter my results with the following query:

az resource list --query "[].{name:name, type:type}" -o tsv

The query pulls all the objects, i.e. [], and then selects the desired properties using the .{displayName: propertyName} projection syntax. The result looks like this:

osdisk_3RLVONJQaS     Microsoft.Compute/disks
cmtestvm              Microsoft.Compute/virtualMachines
cmtestvmVMNic         Microsoft.Network/networkInterfaces
cmtestvmNSG           Microsoft.Network/networkSecurityGroups
cmtestvmPublicIP      Microsoft.Network/publicIPAddresses
cmtestvmVNET          Microsoft.Network/virtualNetworks
function2faa93389fcf  Microsoft.Storage/storageAccounts
cmtestfunc            Microsoft.Web/serverFarms
cmtestfunc            Microsoft.Web/sites

To tidy up the results, I could pipe them through | column -t to align the columns. Finally, you can take advantage of native commands such as grep for more flexibility in how the data is returned. For example:

az resource list --query "[].{name:name, type:type}" -o tsv | column -t | grep cmtest

cmtestvm              Microsoft.Compute/virtualMachines
cmtestvmVMNic         Microsoft.Network/networkInterfaces
cmtestvmNSG           Microsoft.Network/networkSecurityGroups
cmtestvmPublicIP      Microsoft.Network/publicIPAddresses
cmtestvmVNET          Microsoft.Network/virtualNetworks
cmtestfunc            Microsoft.Web/serverFarms
cmtestfunc            Microsoft.Web/sites

You'll notice that the OS disk and the storage account are missing, since grep excluded anything that doesn't match cmtest from the result set.
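
If you'd rather stay entirely within the CLI, a similar filter can be expressed in the --query itself using JMESPath's contains() function; something along these lines should return roughly the same set:

az resource list --query "[?contains(name, 'cmtest')].{name:name, type:type}" -o tsv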

Conclusion

As I work more and more with various customers and requirements, the need to be cross-platform and able to adapt to different tools becomes ever more prevalent. The new Azure CLI 2.0 is an invaluable tool when working with Azure outside the portal, and the productivity enhancements now put it at the front of the race. You can have a look at the CLI documentation here to get a better understanding of how to achieve certain tasks and to find sample scripts you can use straight away. I would highly recommend the new CLI and I hope that you download it and use it for your own scripting needs.

