1. Ready the source code
  2. Upload the source code to an Azure Repo
  3. Create a build pipeline in Azure – creation of a YAML pipeline
  4. Create a service connection for the project
  5. Build a release pipeline for deployment
  6. Compliance checks for the build and release pipelines

There are two kinds of pipelines: build and release. There is usually a single build pipeline and multiple release pipelines. Multiple config files are appended to the release pipeline; these are YAML files organised by stage. The artifact itself contains no environment or DB config details. Those are picked up from the stages, each of which has YAML files describing the various environments and configs, and they are appended to the artifact at deployment time.
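The idea of one environment-agnostic artifact with per-stage config can be sketched as below. The stage names and variable-template paths are illustrative assumptions, not taken from the project:

```yaml
# Sketch: the same artifact is deployed in each stage; environment- and
# DB-specific values come from a per-stage YAML variables file.
stages:
- stage: DeployDev
  variables:
  - template: config/dev.yml       # assumed path: Dev env and DB config
  jobs:
  - job: deploy
    steps:
    - script: echo "Deploying artifact with Dev config"
- stage: DeployProd
  variables:
  - template: config/prod.yml      # assumed path: same artifact, Prod config
  jobs:
  - job: deploy
    steps:
    - script: echo "Deploying artifact with Prod config"
```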

Creating a New Build Pipeline for Project

  1. Create a new repository and add a readme.txt file, which creates the master branch. Add a simple Spring Boot project.
  2. Create a new pipeline. While creating the pipeline it asks you to select a repo. On successful creation of the pipeline, a new azure-pipeline.yml file is created and added to the repo alongside the project files.
  3. Make the changes below in the azure-pipeline.yml file (applicable to a basic Spring Boot project):
    1. Step to create a revision number, mostly from environment variables
    2. Step to build the Spring Boot project
    3. Step to copy the JAR file and manifest.yml created at the end of the build
    4. Step to publish the artifact and put it in the drop location
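The four steps above can be sketched as an azure-pipeline.yml along these lines. This is a hedged sketch for a basic Spring Boot project; the pom.xml and manifest.yml paths and the revision-number convention are assumptions:

```yaml
trigger:
- master

pool:
  vmImage: ubuntu-latest

steps:
# 1. Create a revision number, here derived from the built-in build ID
- script: echo "##vso[task.setvariable variable=revision]$(Build.BuildId)"
  displayName: 'Set revision number'

# 2. Build the Spring Boot project
- task: Maven@3
  inputs:
    mavenPomFile: 'pom.xml'
    goals: 'package'

# 3. Copy the JAR and manifest.yml produced by the build
- task: CopyFiles@2
  inputs:
    Contents: |
      target/*.jar
      manifest.yml
    TargetFolder: '$(Build.ArtifactStagingDirectory)'

# 4. Publish the artifact to the "drop" location
- task: PublishBuildArtifacts@1
  inputs:
    PathtoPublish: '$(Build.ArtifactStagingDirectory)'
    ArtifactName: 'drop'
```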

Creating a New Release Pipeline for Project

  1. The release pipeline picks the files from the drop location. This is configured in manifest.yml. The name of the JAR created should be the same as the one specified in the manifest, or else it fails with a file-not-found error.
  2. A release pipeline contains two things: Artifact and Stages.
  3. The artifact is the one copied from the build pipeline. Azure DevOps is configured to pick the latest artifact from the branch.
  4. The trigger attached to the artifact tells which branch the artifact should be copied from and whether a new release should be created.
  5. Stages contain jobs and tasks. Running jobs requires an agent, which is again configurable. By default it is set to an Ubuntu Linux agent.
  6. The artifact made available in the previous step now needs to be pushed to PCF, which is done by creating a new task. For this, a Cloud Foundry endpoint and commands are defined. If you are using PCF you can use the Cloud Foundry CLI. The location of manifest.yml should be specified in the arguments. Reading this manifest helps locate the name of the JAR file to be pushed into the cloud environment. For the same reason we copy both the JAR and the manifest in step 3(3) of the build pipeline; they are now picked from the drop location.
  7. There is a pre-deployment condition which checks for the availability of the artifact. This is again similar to a trigger which runs, checking for the availability of a new release (artifact) for deployment.
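A minimal manifest.yml along these lines ties the JAR name to the push. The app name, path, and sizing are illustrative assumptions:

```yaml
# Sketch: the path here must match the JAR the build actually produces,
# otherwise cf push fails with a file-not-found error.
applications:
- name: demo-spring-app                    # assumed app name
  path: target/demo-spring-app-0.0.1.jar   # assumed JAR name from the build
  memory: 1G
  instances: 1
```

The release task then runs something like `cf push -f manifest.yml` against the configured Cloud Foundry endpoint.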

Irrespective of the DevOps tool you use, the following are the generic steps carried out from
code changes to deployment in an environment.

Build Pipeline

  1. Step 1 – Preparation of the environment for the build. This includes initializing the build number, commit ID, and build name, which are used internally by the DevOps tool.
  2. Step 2 – Build script for creating the artifact.
  3. Step 3 – Copy script for copying the artifact from the build output directory (where the new artifact is created after the build) to the staging directory (from which the artifact is published to the drop location).
  4. Step 4 – Publishing script, to publish the artifact to the DROP location. This is a common place which is referred to by the release pipeline.

Release Pipeline

  1. The artifact is picked from the DROP location. A trigger should be used for this.
  2. The trigger should detect the availability of a new release and know which branch should be checked for new releases.
  3. The new artifact should be pushed to the deployment environment (Dev, Testing, or Prod). A push command should be used for this.
  4. There should be an agent to carry out the above tasks.
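The release side can also be sketched as a YAML deployment stage. This is a hedged sketch; the environment name and the drop path under the pipeline workspace are assumptions:

```yaml
stages:
- stage: Deploy
  jobs:
  - deployment: push_to_pcf
    environment: dev              # Dev, Testing, or Prod
    pool:
      vmImage: ubuntu-latest
    strategy:
      runOnce:
        deploy:
          steps:
          # Pipeline artifacts from the build stage are downloaded
          # automatically into $(Pipeline.Workspace); push via the CF CLI.
          - script: cf push -f $(Pipeline.Workspace)/drop/manifest.yml
            displayName: 'Push artifact to PCF'
```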

Simple Pipeline

 Build -> Tests(QA) -> Deployment(Prod)

Trigger – the thing that starts an action. On commit, it should trigger a pipeline.
Stage – a pipeline contains multiple stages. Stages are logical boundaries; Build, Test, and Deployment above are all stages.
Job – a stage contains multiple jobs. The build stage contains multiple jobs, such as requesting a VM agent or copying the target file to the drop location.
Steps – a job contains multiple steps. A step can be a task or a script to be run.

Agents are used to build artifacts in the pipeline. Agents may be based on Windows, Linux, or macOS images, which are made available on the fly when the build stage starts during pipeline execution.

trigger:
- master

pool:
  vmImage: ubuntu-latest

stages:
- stage: Build
  jobs:
  - job: build
    steps:
    - script: echo Displayed from Script
    - task: Npm@1
      inputs:
        command: 'install'    

Stages

stages:
- stage: string  # name of the stage, A-Z, a-z, 0-9, and underscore
  displayName: string  # friendly name to display in the UI
  dependsOn: string | [ string ]
  condition: string
  pool: string | pool
  variables: { string: string } | [ variable | variableReference ] 
  jobs: [ job | templateReference]
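Filling the schema above with concrete values gives something like the following. The stage names, condition, and scripts are illustrative assumptions:

```yaml
stages:
- stage: Build
  displayName: 'Build the artifact'
  pool:
    vmImage: ubuntu-latest
  jobs:
  - job: build
    steps:
    - script: echo "building"

- stage: Deploy
  displayName: 'Deploy to Dev'
  dependsOn: Build                # runs only after the Build stage
  condition: succeeded('Build')   # and only if Build succeeded
  jobs:
  - job: deploy
    steps:
    - script: echo "deploying"
```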