Introducing Atlas – automate anything

This is a topic I’m really excited about. There’s an OSS project named Atlas, which came out of our team and has recently been made public. It’s still under development but has hit a usable state. I’ll probably be doing a number of these posts as it progresses.

So – what is Atlas? It revolves around the idea that setting up a real production system involves many tasks and procedures – from CI/CD pipelines to test/staging/production cloud environments – but there isn’t a solid way we’re aware of to capture that setup in a form that can be source controlled and executed, reliably and repeatably, in an unattended fashion.

Yes, there are many command-line tools you can script with Bash and PowerShell. And ARM templates are fantastic. But in a lot of cases it still boils down to scripts that are checked into source control yet, at the end of the day, often run from a developer’s workstation. Weeks or months later, when it’s time to update something, there’s the problem of dusting those scripts off and figuring out how they’re used.

And occasionally if there’s something small to change you’ll just tweak something manually through a web admin interface. That always makes me feel like something’s not quite right, and I always make a mental note when that happens — one of these days that system’s configuration really needs to be source controlled and automated.

Enough talk, let’s see Hello World

There is a huge distance between Hello World and automating your production engineering systems. But you’ve got to start somewhere. 🙂

The first thing you’ll need is Atlas on your command line. It’s currently in a pre-release 0.1 state, but the source code and nightly builds are publicly available, and it can be installed as a dotnet global tool. You can do that on Windows, Linux, and Mac by running:

> dotnet tool install -g dotnet-atlas --add-source https://aka.ms/atlas-ci/index.json
You can invoke the tool using the following command: atlas
Tool 'dotnet-atlas' (version '0.1.3488387') was successfully installed.

> atlas
Atlas version 0.1.3488387

Usage:  [options] [command]

Options:
  -h|--help  Show help information

Commands:
  account
  deploy
  generate
  help

Use " [command] --help" for more information about a command.

The primary command we’re interested in is deploy. If you’re familiar with Helm it will seem very similar – the only required argument to deploy is the path to the folder containing the blueprint of what is being deployed. The simplest possible workflow can even be created with an echo command.

> mkdir hello

> echo operations: [{message: Hello World}] >hello\workflow.yaml

> atlas deploy hello
Atlas version 0.1.3488387

  - Hello World

There! You’ve just executed your first workflow with a single operation. The file format is YAML, which is essentially JSON where most of the quotes and curly braces are optional. If this workflow were written in either of these two ways, it would be identical in memory when loaded.

{
  "operations": [
    { "message": "Hello World" }
  ]
}

operations:
- message: Hello World
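If you’d like to convince yourself of that equivalence, here’s a quick sketch using Python’s json module and the PyYAML library (PyYAML is an assumption on my part – any YAML parser behaves the same here):

```python
import json

import yaml  # PyYAML; any compliant YAML parser gives the same result here

json_text = '{"operations": [{"message": "Hello World"}]}'
yaml_text = "operations:\n- message: Hello World\n"

doc_from_json = json.loads(json_text)
doc_from_yaml = yaml.safe_load(yaml_text)

# Both parse to the same in-memory structure.
print(doc_from_json == doc_from_yaml)  # True
```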

Invoking a REST API

So far we’ve seen that a workflow can be a series of operations which write messages to the console. Another thing an operation can do is invoke any of the Azure Resource Manager, Azure Active Directory, and Azure DevOps REST APIs. This is where the capabilities of the tool really come into play.

Let’s mkdir a new folder named report and create a workflow to fetch the list of your Azure subscriptions. First you’ll need to create a YAML file that declares the details of the REST API we’ll be calling.

# docs https://docs.microsoft.com/en-us/rest/api/resources/subscriptions/list
method: GET
url: https://management.azure.com/subscriptions?api-version=2016-06-01
auth:
  tenant: {{ azure.tenant }}
  resource: https://management.azure.com/
  client: 04b07795-8ddb-461a-bbee-02f9e1bf7b46 # Azure CLI

It’s always nice to include the address of the docs for the REST API in a comment, and the method and url are exactly what you’d expect. The auth section tells Atlas what it needs to know to acquire an OAuth bearer token for the resource.

Your AAD tenant id will need to be provided as the azure.tenant value. It can be either the guid or the tenant’s domain name, like your-domain-name.onmicrosoft.com. For now we’ll be passing it in at the command line, but there are other ways that can be done.

Second, you’ll need to create a workflow with an operation that calls this API.

operations:
- message: Listing subscriptions
  request: subscriptions-list.yaml

And that’s all we need to go ahead and run it! If you try this at home you’ll need to look up your own tenant id of course.

> atlas deploy report --set azure.tenant=whereslou.onmicrosoft.com
Atlas version 0.1.3488387

  - Listing subscriptions
GET https://management.azure.com/subscriptions?api-version=2016-06-01
To sign in, use a web browser to open the page https://microsoft.com/devicelogin and enter the code BSxxxxxDW to authenticate.
OK https://management.azure.com/subscriptions?api-version=2016-06-01 18658ms

Atlas will use the device code authentication flow as needed when running interactively. The actual sign-in happens in the browser, so login passwords and 2FA codes never pass through Atlas itself.

Doing something with the response

Even though the request succeeded, none of the information was written as output. To take a closer look at what happened you can read the log files which are written to a _output/logs subdirectory. You should find a file named _output/logs/001---Listing-subscriptions.yaml which contains all of the request and response data for that operation.

Once we know what the structure of the response looks like, the next thing we can do is capture some of that information into working memory. That’s done by declaring any number of JMESPath queries in an output property. If you’ve used the Azure CLI before, this is exactly the same as what you can do with the --query option.

operations:
- message: Listing subscriptions
  request: subscriptions-list.yaml
  output:
    subscriptions: ( result.body.value[].{id:subscriptionId, name:displayName} )

And now when we run this we will see that subscription names and ids are written to the console when the workflow has finished. There is also an _output/output.yaml file written, if you’d like to do further processing on the results after the workflow has completed.

> atlas deploy report --set azure.tenant=whereslou.onmicrosoft.com
Atlas version 0.1.0

  - Listing subscriptions
GET https://management.azure.com/subscriptions?api-version=2016-06-01
OK https://management.azure.com/subscriptions?api-version=2016-06-01 720ms
subscriptions:
- id: 3d2exxx-xxx-xxxx-xxxx-xxxxxxxx3c62
  name: whereslou dev/test
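To make the JMESPath query concrete: value[].{id:subscriptionId, name:displayName} projects each element of the response’s value array into a smaller object. A rough Python equivalent, run against a made-up response body shaped like the subscriptions list result (the id and names below are hypothetical):

```python
# A response body shaped like the subscriptions/list result (values are made up).
body = {
    "value": [
        {
            "subscriptionId": "00000000-0000-0000-0000-000000000001",
            "displayName": "whereslou dev/test",
            "state": "Enabled",
        }
    ]
}

# Rough Python equivalent of the JMESPath projection
#   value[].{id:subscriptionId, name:displayName}
subscriptions = [
    {"id": s["subscriptionId"], "name": s["displayName"]}
    for s in body["value"]
]
print(subscriptions)
```

Everything else in the response (like the state property) is dropped, which is why only the id and name appear in the console output.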

There’s clearly a huge gap between listing subscriptions and automating Azure and Azure DevOps. If you’d like to see more right now there are several examples on GitHub, but as things take shape I plan on doing more blog posts about the different tricks and techniques that are available. Eventually it would be nice to start a common repository of standard workflows, similar to Helm charts, but it’s a bit too soon for that just yet.
