Azure DevOps Pipelines Basics


1. What are pipeline jobs?

1.1. Every pipeline consists of 1 or more jobs

1.2. Each job consists of 1 or more steps

1.3. Jobs can be grouped into stages

1.3.1. A single pipeline can have multiple stages

1.3.2. Each stage can run on a different computing platform. This makes sense when, for example, one job needs to run on a Linux platform while another job needs a Windows platform
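As a sketch of the idea above (stage and job names are illustrative, not from the original), a pipeline with two stages targeting different platforms might look like:

```yaml
# Hypothetical two-stage pipeline; each stage targets a different platform
stages:
- stage: LinuxBuild
  jobs:
  - job: BuildOnLinux
    pool:
      vmImage: 'ubuntu-latest'
    steps:
    - bash: echo "Building on Linux"
- stage: WindowsBuild
  jobs:
  - job: BuildOnWindows
    pool:
      vmImage: 'windows-latest'
    steps:
    - powershell: Write-Host "Building on Windows"
```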

1.4. The language for developing jobs is YAML

1.4.1. A "hello world" sample illustrates the basic structure of an Azure Pipeline job. The job key is only needed when you want to provide additional job-level properties, such as timeoutInMinutes. The pool key, with a nested vmImage key, is needed when you want to run the job against a hosted agent. The steps key is always required. The "-" prefix marks an item in a YAML list. In this very simple example, steps consists of a single step that runs an echo command in Bash
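A minimal sketch of such a "hello world" job (the job name and timeout value are illustrative):

```yaml
# Minimal "hello world" pipeline job
jobs:
- job: HelloWorld            # the job key is used here because we set a job-level property
  timeoutInMinutes: 10       # example job-level property
  pool:
    vmImage: 'ubuntu-latest' # run against a hosted agent
  steps:                     # steps is always required; "-" marks each list item
  - bash: echo "Hello world"
```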

2. Running pipeline jobs

2.1. Agent pool jobs

2.1.1. A pool of physical or virtual machines running a version of Windows or Linux, with each machine in the pool sharing a common hardware and software configuration. Each machine in the pool performs the function of an agent for receiving and running Azure Pipelines jobs

2.1.2. The characteristics of a job determine the required agent configuration. Jobs that require running a Windows executable will require a Windows agent. Jobs that require running a Debian-based application will require a Linux agent. Jobs that require running Bash, PowerShell or Python scripts can run on either a Windows or Linux based agent. In all cases, you need to ensure that not only does the agent have the right host OS but also the necessary runtimes installed for the job
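To illustrate the cross-platform point above with a sketch: a bash step runs on either Windows or Linux hosted agents (Windows images include Git Bash), while a powershell step requires a Windows agent:

```yaml
pool:
  vmImage: 'windows-latest'   # a bash step would also run on a Linux image
steps:
- bash: echo "Runs on Windows or Linux agents"
- powershell: Write-Host "Requires a Windows agent"   # pwsh runs cross-platform
```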

2.1.3. Jobs can only run if the pool has an agent available

2.2. Server jobs

2.2.1. These are also known as agentless jobs, and they run directly on the Azure DevOps (cloud) or TFS (on prem) server

2.2.2. The range of jobs that can run as server jobs is limited

2.2.3. In the YAML template, specify a server job with either pool: server or server: true
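For example, a sketch of an agentless job using pool: server (the job name is illustrative; the Delay task is one of the limited set supported in server jobs):

```yaml
jobs:
- job: AgentlessExample
  pool: server          # marks this as a server (agentless) job
  steps:
  - task: Delay@1       # a built-in task that can run without an agent
    inputs:
      delayForMinutes: '1'
```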

2.3. Agent demands

2.3.1. Demands specify what capabilities the agent must have in order to satisfy the requirements of a job

2.3.2. Demands are linked to OS, applications and versions

2.3.3. Multiple demands can be specified for each job

2.3.4. Demands can be asserted manually or automatically. Manual demands are explicitly specified in the YAML for the job. Automatic demands occur for various built-in tasks; for example, the Visual Studio build task automatically and implicitly asserts demands for msbuild and Visual Studio to be installed on a Windows-based agent

2.3.5. Agents will only proceed with running jobs if all demand assertions are satisfied. If demands are not met, the pipeline job will fail and abort. The server will not attempt to remediate demand assertion failures

2.3.6. See attached example of a YAML pipeline job with demands. The demands list includes two: agent.os -equals Linux, and python3 -equals /usr/bin/python3
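The attached example is not reproduced here, but a job with those two demands might look like the following sketch (the job and pool names are illustrative):

```yaml
jobs:
- job: LinuxPythonJob
  pool:
    name: Default       # hypothetical self-hosted pool
    demands:
    - agent.os -equals Linux
    - python3 -equals /usr/bin/python3
  steps:
  - bash: python3 --version
```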

2.4. Container jobs

2.4.1. Jobs can run inside a Docker container. Both Windows and Linux support Docker containers, so your agent can be either Windows or Linux for running jobs this way

2.4.2. Docker containers provide more control over the job execution environment. The agent requires less initial setup and is easier to maintain

2.4.3. When jobs are submitted to an agent as container jobs, the agent will first fetch and run the specified container before submitting each step of the job to run inside that container. Container images can be retrieved from Docker Hub or private registries
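A sketch of a container job (the job name is illustrative; the image is pulled from Docker Hub before the steps run):

```yaml
jobs:
- job: ContainerJob
  pool:
    vmImage: 'ubuntu-latest'
  container: 'ubuntu:20.04'   # agent fetches and runs this image first
  steps:
  - bash: cat /etc/os-release # runs inside the container
```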

2.4.4. Container jobs are generally considered for introduction in more mature DevOps environments. If you are just starting out, stick to agent jobs for the time being

3. Developing Azure Pipeline jobs

3.1. The original method for developing pipeline jobs was the Classic UI

3.1.1. This provides a GUI-based drag and drop approach

3.1.2. It's useful for learning YAML as you can convert jobs defined via the classic UI to YAML

3.2. YAML pipelines are a relatively recent innovation for Azure DevOps (as of February 2021) but have very rapidly been accepted as the de facto standard for CI/CD pipeline jobs

3.2.1. YAML pipelines enable unified CI/CD pipelines. This means there is no longer a need for separate build and release pipelines (which the classic UI requires; there, the build pipelines produce artifacts that act as inputs to the release pipelines)

3.2.2. Use Visual Studio Code to develop your YAML pipelines and install the Azure Pipelines extension to get syntax highlighting and Intellisense

3.3. As of the time of writing (Feb 2021), Azure DevOps is in a transitional period of moving from the Classic UI to YAML Pipelines, so for some time we will need to understand both

3.3.1. See attached for a comparison of Classic UI vs YAML Pipelines. YAML Pipelines don't yet support all the scenarios the Classic UI does, but already they support some scenarios that the Classic UI does not. For mature DevOps platforms with heavy investment in pipeline jobs built using the Classic UI, you will likely need to use this approach until such time as a decision is made to migrate to YAML Pipelines. The Classic UI is gradually being phased out and will eventually be replaced entirely by YAML Pipelines

4. Extending Azure DevOps functionality

4.1. In addition to the built-in tasks provided by Microsoft, you can extend functionality via DevOps extensions that are published via the Visual Studio Marketplace

4.2. Azure DevOps can also be part of an integrated CI/CD framework that includes other external solutions

4.2.1. This enables you to preserve existing investments in CI/CD solutions like Jenkins or Team City

5. Microsoft-hosted Agents

5.1. These are single-use virtual machines provided by Microsoft

5.2. They have a host OS of a specific version that is automatically patched and upgraded by Microsoft

5.3. Pre-defined software packages necessary to run DevOps pipeline job tasks are installed on each VM

5.4. You may need to wait a while for an agent to become available when you trigger a pipeline job, but normally the wait is no more than 1 or 2 minutes

5.5. All tasks run with the highest level of permissions

5.5.1. This means that on Windows machines all tasks run under local administrator credentials and on Linux they run under superuser (root) credentials

5.5.2. There is no need to factor in permission elevation techniques in any of your tasks - e.g. no need to use sudo in Bash scripts running on Linux

5.5.3. All agents in the pool are wholly isolated from each other, so these high permissions don't constitute a security threat

5.6. Agents are torn down after all the jobs allocated to them in a single pipeline run have completed

5.6.1. This means that any data cached on the agent during a pipeline run is lost and does not persist. For example, data fetched from a Git repo does not persist on any agent beyond the agent's lifecycle (i.e. the job run window)

5.7. Additional packages can be installed

5.7.1. This allows you to install later or earlier versions of a particular package, which might be required for your task(s)

5.7.2. There is no interactive access to the agents, so any installation technique must be implemented via code that does not require any manual intervention

5.7.3. It's generally a bad idea to have your pipelines spend a lot of time downloading and installing software packages. If you find yourself in this position, you should consider using self-hosted agents

5.8. All the details about the latest images for DevOps Agents are available in GitHub

5.8.1. To see the documentation, navigate into the windows or linux folder and open the file for the particular version of interest and it will tell you everything about what's included in that image

6. Agent pools

6.1. Instead of managing each agent individually, you organize agents into agent pools

6.2. In Azure Pipelines, pools are scoped to the entire organization; so you can share the agent machines across projects

6.3. When you configure an agent, it is registered with a single pool, and when you create a pipeline, you specify which pool the pipeline uses

6.4. When you run the pipeline, it runs on an agent from that pool that meets the demands of the pipeline

6.5. By default, each Azure DevOps organisation gets two agent pools

6.5.1. Azure Pipelines is a hosted pool with various Windows, Linux, and macOS images. For details of the images that are available within the hosted pool, see GitHub. The Azure Pipelines hosted pool replaces the previous hosted pools, whose names mapped to the corresponding images. Although the Azure DevOps UI makes it look like you can have multiple hosted agent pools, in effect there is only one hosted pool available, which is the managed service provided by Microsoft in the cloud. In some circumstances you may still see the old pool names, but behind the scenes the hosted jobs run using the Azure Pipelines pool

6.5.2. Default is used to register self-hosted agents that you've set up

6.6. In a YAML pipeline, you specify that you want to use the hosted Azure Pipelines pool by adding the pool tag with a valid vmImage property

6.6.1. pool:
  vmImage: 'windows-latest'

6.6.2. There are currently 10 different valid values for vmImage. The Azure Pipelines hosted agent pool includes many agents spanning all 10 of these VM images

6.6.3. To specify a self-hosted pool in a YAML pipeline, you name it in the pool tag. So, to specify the Default pool: pool: 'Default'. Or, for another self-hosted pool you've provisioned named My Pool: pool: 'My Pool'

6.7. In a classic pipeline, you explicitly specify the hosted pool name and pick the agent specification (VM image) via drop down menus

7. Self-hosted Agents

7.1. Provide much greater control over application binaries

7.1.1. You can add custom applications not part of hosted agent images

7.1.2. You can add older versions of some built-in binary that is needed for some compatibility reasons for a legacy application build/deployment

7.2. Data caches and config persist between runs unlike on hosted agents, which tear down after each job run

7.2.1. This can provide some advantages in certain scenarios and can also create some issues in other scenarios. Often there is a need to explicitly add a step to clear out cached data and config after a job run is done

7.3. Self-hosted agents can run on Windows, Linux, macOS and Docker

7.3.1. Agents can run on non-server versions too, such as Windows 10 instead of Windows Server

7.4. Agents can run either interactively or as a service

7.4.1. Interactive makes sense when running an agent hosted on your own machine during dev, but outside of that running the agent as a service is more likely

7.5. User is responsible for all management and configuration, and all major version agent upgrades

7.6. Networking is an important consideration when considering self-hosted agents and their interaction with the Azure Pipelines cloud service

7.6.1. Microsoft-hosted agents automatically have access to the Azure Pipelines service without any effort needed from you, but self-hosted agents need network firewall configuration and also represent the only secure way to make deployments against on-prem targets. Self-hosted agents need outbound port 443 (for the HTTPS protocol) to be allowed in order to communicate with the Azure Pipelines service that encapsulates the self-hosted agent pool definition. Microsoft-hosted agents have no access by default to any on-prem deployment targets, and best security practice dictates that self-hosted agents with "line of sight" access (i.e. on the same VNet) are used to make those deployments. Self-hosted agents don't have to be on-prem; they could be VMs hosted in an Azure VNet

7.7. Provisioning self-hosted agents

7.7.1. Before installing a self-hosted agent on a host, you should check the prerequisites required for running the agent software. Agents on Windows require minimum versions of the OS, PowerShell and .NET Framework. Agents on Linux support specific flavours of Linux with minimum versions

7.7.2. Identify a user with permission to administer agent pools in your DevOps project. Under Project Settings | Pipelines | Agent pools, you can click the Security button and see the groups with permission to administer the agent pools; for example, the Build Administrators group has this permission. Under Project Settings | Permissions, you can select a group that has permission for agent pool administration and, if required, add a user. Create a Personal Access Token for the user, which will be used by the agent in order to register itself into the DevOps agent pool

7.7.3. You also need to consider networking and local/domain user requirements for the self-hosted agent. The agent needs to be able to make outbound connections via port 443 (HTTPS connections), and in some organisations this may require configuration using an Internet proxy server. If the agent will need to deploy to on-prem targets, then it is going to need to run under user credentials that have sufficient permission to connect to those targets and perform the deployment tasks. If the agent will only need to deploy to Azure-hosted targets, then the local user permissions are not important and it's recommended to run the agent under a local service account

7.7.4. Proof of concept: provision a self-hosted agent using a Docker container. On Windows, we can use Docker for Windows.

Step 1: Run a new container from a community base image. We will use PowerShell to run the Docker container. Ubuntu 20.04 is a suitable base image at the time of writing (Feb 2021), and available via Docker Hub. PowerShell command: docker run -it ubuntu:20.04 bash

Step 2: Create a personal access token in Azure DevOps for the agent to use for authentication. Go to User Settings in Azure DevOps and create a personal access token (PAT), and keep hold of the token value for use later.

Step 3: Add some essentials to the container: mkdir /usr/ian/ ; cd /usr/ian/ ; apt-get update ; apt-get install git ; git config --global "Ian Bradshaw" ; git config --global "[email protected]" ; apt-get install python3.8 ; apt-get install wget

Step 4: Create the agent in an Azure DevOps agent pool. In Azure DevOps go to Organization Settings and, under Agent pools, select the Default pool and click the New agent button.

Step 5: Download the agent onto the Ubuntu Docker container and install its dependencies: wget the agent package URL shown in the New agent dialog ; mkdir agent ; cd agent ; tar xzf ../vsts-agent-linux-x64-2.181.2.tar.gz ; ./bin/

Step 6: Save the container as a new custom image: exit ; docker ps --all ; docker commit <container_name_or_id> <new_image_name> ; docker image ls

Step 7: Configure the agent and run it interactively: docker run -it devop-ubuntu-agent-01 bash ; cd /usr/ian/agent/ ; ./ ; ./

Step 8: Verify that the Azure DevOps agent pool recognises the self-hosted agent. Go to Azure DevOps Organization Settings and verify that the new self-hosted agent shows online in the Default agent pool. If you look at the details of the agent, you should see that its system capabilities have been automatically reported to DevOps and are listed.

Step 9: Verify that the self-hosted agent is able to successfully execute a simple pipeline. Create a very simple pipeline for test purposes, specifying that it should run on the Default agent pool and demand an agent running Linux. On the very first pipeline job submitted to a new self-hosted agent you may see the job remain queued; if so, click on it and you will be prompted to grant permission. Verify that the pipeline job completes successfully.

8. Agent capabilities and demands

8.1. Every self-hosted agent has a set of capabilities that indicate what it can do

8.2. Capabilities are name-value pairs that are either automatically discovered by the agent software, in which case they are called system capabilities, or those that you define, in which case they are called user capabilities

8.3. The agent software automatically determines various system capabilities such as the name of the machine, type of operating system, and versions of certain software installed on the machine

8.3.1. Also, environment variables defined in the machine automatically appear in the list of system capabilities

8.4. When you author a pipeline you specify certain demands of the agent

8.4.1. The system sends the job only to agents that have capabilities matching the demands specified in the pipeline. If a demand is specified that cannot be met by any available agent in the agent pool, the job will fail immediately with a message to that effect

8.5. When using Microsoft-hosted agents, you select an image for the agent that matches the requirements of the job. So although it is possible to add capabilities to a Microsoft-hosted agent, you don't need to use capabilities with Microsoft-hosted agents

9. Azure DevOps REST API

9.1. Microsoft provides a comprehensive REST API for the Azure DevOps service

9.2. Microsoft also provides a reference for properly forming URLs when working with the REST APIs