Azure DevOps Handbook!

Azure DevOps is an offering from Microsoft that accelerates and simplifies the DevOps process. The service comes with version control, reporting, requirements management, project management, automated builds, testing, and release management capabilities. It covers the entire application life cycle and enables DevOps capabilities end to end. In this post I cover some important points that can help you while working on pipelines. Before you read further, I must warn you that this post assumes basic knowledge of Azure DevOps pipelines.
Important Predefined variables -
Agent.BuildDirectory [Pipeline.Workspace] — The location on the agent machine where all folders for a given build pipeline are created.
Agent.TempDirectory — A temporary folder that is cleaned after each pipeline job. Use it for files such as downloaded secure files that should be removed automatically once the job finishes.
Build.ArtifactStagingDirectory — Use it for staging artifacts before publishing; this directory is purged before each new build.
System.DefaultWorkingDirectory — As the name says, the default working directory, where your source code is downloaded on the agent.
Build.BuildId or Build.BuildNumber — Build.BuildId is the immutable numeric ID of the build record, while Build.BuildNumber is the human-readable build name; the choice is yours!
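As a small illustration, a minimal job can print these variables (the pool image is an assumption; the actual paths differ per agent):

```yaml
# Sketch: print the predefined variables on a hosted agent.
trigger: none

pool:
  vmImage: 'ubuntu-latest'

steps:
- script: |
    echo "Pipeline workspace : $(Pipeline.Workspace)"
    echo "Temp directory     : $(Agent.TempDirectory)"
    echo "Artifact staging   : $(Build.ArtifactStagingDirectory)"
    echo "Default working dir: $(System.DefaultWorkingDirectory)"
    echo "Build ID / number  : $(Build.BuildId) / $(Build.BuildNumber)"
  displayName: Show predefined variables
```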
Container Jobs -
Sometimes it is important to run jobs in a container. This helps reduce your dependency on what is installed on the agent pool nodes, since containers offer a lightweight abstraction over the host operating system.
container:
  image: xyz
  options: --hostname xyz-host --ip 192.168.0.1

steps:
- script: echo hello

You can also reference an image and tag directly:

container: xyz:tag

steps:
- script: echo hello
Agent Node Pool
In some cases you may need to select a specific node from the pool. This is helpful when you need to investigate node issues, or when a specific node has a package that your job requires.
pool:
  name: MY-WIN
  demands: 'Agent.Name -equals myhost-01'

steps:
- script: echo hello world

Reference: https://docs.microsoft.com/en-us/azure/devops/pipelines/process/phases?view=azure-devops&tabs=yaml
Runtime Parameters
parameters:
- name: configs
  type: string
  default: 'x86,x64'

trigger: none

jobs:
- ${{ if contains(parameters.configs, 'x86') }}:
  - job: x86
    steps:
    - script: echo Building x86...
- ${{ if contains(parameters.configs, 'x64') }}:
  - job: x64
    steps:
    - script: echo Building x64...
- ${{ if contains(parameters.configs, 'arm') }}:
  - job: arm
    steps:
    - script: echo Building arm...
Conditionally Include a Stage
parameters:
- name: runPerfTests
  type: boolean
  default: false

trigger: none

stages:
- stage: Build
  displayName: Build
  jobs:
  - job: Build
    steps:
    - script: echo running Build

- stage: UnitTest
  displayName: Unit Test
  dependsOn: Build
  jobs:
  - job: UnitTest
    steps:
    - script: echo running UnitTest

- ${{ if eq(parameters.runPerfTests, true) }}:
  - stage: PerfTest
    displayName: Performance Test
    dependsOn: Build
    jobs:
    - job: PerfTest
      steps:
      - script: echo running PerfTest

- stage: Deploy
  displayName: Deploy
  dependsOn: UnitTest
  jobs:
  - job: Deploy
    steps:
    - script: echo running Deploy
Parameter Data Types
The supported parameter data types are: string, number, boolean, object, step, stepList, job, jobList, deployment, deploymentList, stage, and stageList.

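As a sketch of the object type (the parameter name and values here are placeholders), an object parameter can hold a list that a template expression loops over:

```yaml
parameters:
- name: listOfStrings   # object-typed parameter holding a YAML list
  type: object
  default:
  - one
  - two

steps:
# ${{ each }} expands one script step per list entry at template time.
- ${{ each value in parameters.listOfStrings }}:
  - script: echo ${{ value }}
```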
Pipeline Variables
The agent creates a workspace on disk to hold the source code, artifacts, and outputs used in the run. You can also set pipeline variables at runtime from a script using logging commands:
Bash: echo '##vso[task.setVariable variable=myVar]myValue'
PowerShell: Write-Host "##vso[task.setVariable variable=myVar]myValue"
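A variable set this way becomes available to subsequent steps in the same job, for example (step names are illustrative):

```yaml
steps:
- bash: echo '##vso[task.setVariable variable=myVar]myValue'
  displayName: Set myVar
# Later steps can read the variable with macro syntax.
- bash: echo "myVar is $(myVar)"
  displayName: Read myVar
```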
Other items that the agent can upload include artifacts and test results; these remain available after the pipeline completes. If you want to publish an artifact:
steps:
- task: PublishPipelineArtifact@1
  inputs:
    targetPath: $(System.DefaultWorkingDirectory)/bin/WebApp
    artifactName: WebApp
Download artifact
steps:
- task: DownloadPipelineArtifact@2
  inputs:
    artifact: WebApp
By default, files are downloaded to $(Pipeline.Workspace)/{artifact}, where {artifact} is the name of the artifact.
Classic Pipeline vs YAML based
The Classic pipeline is Azure DevOps' original mechanism for building pipelines. Microsoft later introduced the newer and improved concept of YAML-based pipelines, which let you treat pipelines as code and keep them in your repository, allowing greater flexibility and control.

Both mechanisms have their pros and cons, but Classic pipelines are receiving less and less attention, while Microsoft is putting its effort into YAML-based pipelines. So I believe it is time to move on.

A more detailed comparison can be found here:
Share Variables Across Stages
YAML pipelines have a powerful mechanism of stages, but variables cannot be accessed directly across stages; you have to perform a few steps to achieve that. Check the URL below for detailed steps.
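In short, the usual approach (stage, job, and step names below are placeholders) is to mark the variable as an output in one stage and reference it in a later stage through stageDependencies:

```yaml
stages:
- stage: A
  jobs:
  - job: A1
    steps:
    # isOutput=true makes the variable visible outside this job.
    - bash: echo "##vso[task.setvariable variable=myVar;isOutput=true]myValue"
      name: setVarStep

- stage: B
  dependsOn: A
  variables:
    # Map the output variable from stage A into this stage.
    myVarFromA: $[ stageDependencies.A.A1.outputs['setVarStep.myVar'] ]
  jobs:
  - job: B1
    steps:
    - script: echo $(myVarFromA)
```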
Release Notes for Azure DevOps features-
I will keep updating this post with new and updated details!