This documentation provides the steps to integrate the AWS CodePipeline, Azure DevOps, SonarQube, and JFrog services.
The following prerequisites must be completed before setting up the AWS CodePipeline:
- Azure DevOps Repository
- Sonarqube Project
- Sonarqube Configuration
- Azure DevOps WebHooks with AWS Services
- Configure Azure DevOps Repo WebHook Trigger
In this section, we will set up the Azure DevOps repository by cloning it from an existing GitHub Spring Boot project.
- Log into the Azure DevOps portal at https://dev.azure.com/
- Click the `New Organization` link to create an organization
- Enter the organization name, select `Central US` from the list under `We'll host your projects in`, and click `Next`
- In `Create a project to get started`, enter the project name and click the `+ Create Project` button
- Import the existing GitHub project into the new repo following the steps in the link
- Click `Files` under the repo; you should see the project files listed
After the repository is created, you must create a personal access token so that AWS CodePipeline can pull the latest code from the Azure DevOps repository whenever a code change triggers the pipeline. Follow the steps below to generate the token.
- Click the `User Settings` icon and select `Personal Access Token` to create the token AWS CodePipeline will use to download the repo codebase as a zip file
- Click the `+ New Token` button and provide a user-friendly name (for example, `aws-codepipeline-access-token`)
- Under Scopes, select `Read` under the Code section to grant read access to the token consumer
- Click the `Create` button to complete the setup
Now the Azure DevOps repository configuration is complete. Let's move to the next section to set up the SonarQube project for the same repository.
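For reference, code calling the Azure DevOps REST API authenticates by sending the PAT as the password of an HTTP Basic credential with an empty username. A minimal sketch in Python; the token value and the URL in the comment are placeholders, not values from this guide:

```python
import base64

def azure_devops_auth_header(pat: str) -> dict:
    """Build the Basic auth header Azure DevOps expects: empty username, PAT as password."""
    credential = base64.b64encode(f":{pat}".encode("ascii")).decode("ascii")
    return {"Authorization": f"Basic {credential}"}

# This header would accompany, for example, a GET to
# https://dev.azure.com/{organization}/{project}/_apis/git/repositories/{repo}/items?$format=zip
headers = azure_devops_auth_header("<personal access token>")
```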
In this section, we will use SonarCloud, the online version of SonarQube, to create an account and set up the project to capture the repository's code quality details.
- Go to https://sonarcloud.io
- Log in using GitHub, Bitbucket, or another supported identity provider
After the account setup is completed, create the SonarQube project by following the steps below:
- Click the `+` icon in the top-right corner
- Select the `Analyse New Project` option
- Select the repo from the list and click the `Setup` button
- Under the `Configure` tab, select the `With Other CI Tools` option from the Choose another analysis method options
- Select the appropriate codebase `language` under 'What option best describes your build?'
- Select the operating system name from the list under 'What is your OS?'
- Click the icon next to `+` and select the organization name under `My Organizations`
- Click `Administration` and select `Organization settings`
- In the right corner, copy the value of the `Key:` label and store it somewhere (this is the organization key)
After the project setup is completed, we need to generate a token for the pipeline to publish the code analysis result files for the code quality analysis.
- Click the `Profile` icon and select the `account name`
- Click the `Security` tab
- Enter a friendly name for the repo access token in the `Generate Token` field and click `Generate`
- Copy the token string by clicking the `copy` button and save it somewhere; you cannot retrieve it again
Let's store the SonarCloud endpoint configuration details in AWS Secrets Manager. It is a best practice to keep sensitive details in Secrets Manager to prevent hard-coding and leaking credentials.
In this section, we will create a new secret to store the SonarCloud endpoint details: the URL, the token, and the organization.
- Log into the AWS Management Console and select `Secrets Manager`
- Click the `Store a new secret` button
- Select `Other types of secrets` under Select secret type
- Under the secret key/value, add the following key/value pairs:
  - key: `token`, value: the SonarQube token from the section `SonarQube Token Setup`, step 4
  - key: `host`, value: https://sonarcloud.io
  - key: `organization`, value: the key obtained in the section `Sonarqube Project Setup`, step 9
- Click the `Next` button
- Enter the secret name `dev/sonarcloud` and click the `Next` button
- Click the `Next` button
- Click the `Store` button to complete the setup
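The buildspecs later in this guide pull these values through their `secrets-manager` mapping, but code can also read the secret directly. A hedged sketch: the live call (left as a comment) assumes boto3 and `secretsmanager:GetSecretValue` permission; the parsing helper itself is plain JSON handling:

```python
import json

def parse_sonarcloud_secret(secret_string: str) -> dict:
    """Parse the JSON SecretString stored under dev/sonarcloud into its three fields."""
    secret = json.loads(secret_string)
    # Keys match the key/value pairs entered in the Secrets Manager console above.
    return {key: secret[key] for key in ("token", "host", "organization")}

# Live retrieval (requires boto3 and AWS credentials):
#   import boto3
#   raw = boto3.client("secretsmanager").get_secret_value(SecretId="dev/sonarcloud")["SecretString"]
#   config = parse_sonarcloud_secret(raw)
```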
- Log into AWS Management Console
- Click on this link
- Click the `Next` button
- Enter the Output S3 Bucket Name as `azure-repo-codebase`
- In the Allowed IPs, enter the `Azure DevOps Services IPs for the Regional Identity Service - Central United States` value: 13.89.236.72,52.165.41.252,52.173.25.16,13.86.38.60,20.45.1.175,13.86.36.181,52.158.209.56 (refer to the link for other regions)
- In the Git Personal Access Token, paste the Azure DevOps personal access token value created in the sub-section `Generate Personal Access Token`, step 7
- In the Quick Start S3 Bucket Name, enter the value `Azure-DevOps-WebHooks`
- In the Quick Start S3 Key Prefix, enter the value `Assets/`
- Click `Next`
- Click `Next`
- In the Review screen, under the Capabilities section, select the checkbox for `I acknowledge that AWS CloudFormation might create IAM resources.`
- Click `Create stack` to complete the setup
- After the stack creation is completed, go to the `Outputs` tab and copy the value of the key `ZipDownloadWebHookApi`
- Go to the section `Configure Azure DevOps Repo WebHook Trigger` and follow steps 1-12
- Go to the `Lambda` service and select the `AzureRepo-to-Amazon-S3-ZipDlLambda` function to edit
- In the code editor, replace the following existing code:
```python
if 'X-Hub-Signature' in event['params']['header'].keys():
    hostflavour = 'githubent'
elif 'X-Gitlab-Event' in event['params']['header'].keys():
    hostflavour = 'gitlab'
elif 'User-Agent' in event['params']['header'].keys():
    if event['params']['header']['User-Agent'].startswith('Bitbucket-Webhooks'):
        hostflavour = 'bitbucket'
    elif event['params']['header']['User-Agent'].startswith('GitHub-Hookshot'):
        hostflavour = 'github'
    elif 'Bitbucket-' in event['params']['header']['User-Agent']:
        hostflavour = 'bitbucket-server'
elif event['body-json']['publisherId'] == 'tfs':
    hostflavour = 'tfs'
```
with this new code snippet:

```python
if event['body-json']['publisherId'] == 'tfs':
    hostflavour = 'tfs'
elif 'X-Hub-Signature' in event['params']['header'].keys():
    hostflavour = 'githubent'
elif 'X-Gitlab-Event' in event['params']['header'].keys():
    hostflavour = 'gitlab'
elif 'User-Agent' in event['params']['header'].keys():
    if event['params']['header']['User-Agent'].startswith('Bitbucket-Webhooks'):
        hostflavour = 'bitbucket'
    elif event['params']['header']['User-Agent'].startswith('GitHub-Hookshot'):
        hostflavour = 'github'
    elif 'Bitbucket-' in event['params']['header']['User-Agent']:
        hostflavour = 'bitbucket-server'
```
- Similarly, replace the following line

```python
archive_url = event['body-json']['resourceContainers']['account']['baseUrl'] + 'DefaultCollection/' + event['body-json']['resourceContainers']['project']['id'] + '/_apis/git/repositories/' + event['body-json']['resource']['repository']['id'] + '/items'
```

with this code:

```python
archive_url = event['body-json']['resourceContainers']['account']['baseUrl'] + event['body-json']['resourceContainers']['project']['id'] + '/_apis/git/repositories/' + event['body-json']['resource']['repository']['id'] + '/items'
```
- Click the `Save` button to complete the code change
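The reordered snippet checks the body's `publisherId` (`tfs` for Azure DevOps) before the header-based branches, because an Azure DevOps push would otherwise fall into the generic `User-Agent` branch without matching any sub-case. A standalone sketch of the resulting detection logic; the sample event below is a trimmed, illustrative shape, not a full payload:

```python
def detect_hostflavour(event: dict) -> str:
    """Mirror of the reordered Lambda logic: body-based Azure DevOps check first."""
    headers = event.get("params", {}).get("header", {})
    if event.get("body-json", {}).get("publisherId") == "tfs":
        return "tfs"
    if "X-Hub-Signature" in headers:
        return "githubent"
    if "X-Gitlab-Event" in headers:
        return "gitlab"
    if "User-Agent" in headers:
        user_agent = headers["User-Agent"]
        if user_agent.startswith("Bitbucket-Webhooks"):
            return "bitbucket"
        if user_agent.startswith("GitHub-Hookshot"):
            return "github"
        if "Bitbucket-" in user_agent:
            return "bitbucket-server"
    return "unknown"

# An Azure DevOps push event is identified by its body, not its headers:
azure_event = {"params": {"header": {"User-Agent": "VSServices/1.0"}},
               "body-json": {"publisherId": "tfs"}}
assert detect_hostflavour(azure_event) == "tfs"
```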
- Log into the Azure DevOps portal and select the Repo
- Click `Project Settings` at the bottom of the left navigation
- Select `Service hooks`
- Click `+` to add a new webhook
- Select `Web Hooks` from the list of services and click `Next`
- Select `Code Pushed` from the list under `Trigger on this type of event`
- Under `Repository`, select the repo name from the list
- Select `Master` from the list under the branch
- Leave the default value `[Any]` for the Pushed by member of group and click the `Next` button
- In the `Action` screen, paste the value obtained from the section Azure DevOps WebHooks with AWS Services, step 14
- Click the `Test` button to test the webhook
- Click the `Finish` button to complete the setup
- Log into the AWS account and select the `CodePipeline` service
- Click the `Create Pipeline` button
- Under `Pipeline Settings`, enter the pipeline name
- Expand the `Advanced settings`
- Make sure the `Default Location` option is selected under `Artifact Store` and the `Default AWS Managed Key` option under the `Encryption key` section
- Under `Source`, select Amazon S3, enter the bucket name `azure-repo-codebase` (specified in step 4 under the section `Azure DevOps WebHooks with AWS Services`), and enter the S3 object key as `<Azure Repo Organization Name>/<repo name>/master/<repo name>.zip`
- Under `Build`, select the `AWS CodeBuild` option
- Click the `Create project` button (follow the steps under the section `CodeBuild Project Setup for Unit Test`)
- Click the `Next` button
- Click the `Skip deploy stage` button to skip the deployment step
- Review the pipeline details and click the `Create Pipeline` button to complete the initial pipeline setup
- Go to the IAM service and search for the service role associated with the Unit Test build project
- Click `+ Add inline policy` to add an inline policy granting read access to the Secrets Manager key
- Click the JSON tab and paste the following JSON snippet:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "<secret ARN>"
        }
    ]
}
```
- To verify the Unit Test setup, go to the Azure repo, edit and update the readme.md file, and click commit
- Confirm that the pipeline is triggered by the code change
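The Source stage polls a fixed S3 object key, so the key entered in the Source stage must match the path the webhook Lambda uploads to. A sketch of how that key is composed; the organization and repo names below are illustrative:

```python
def source_object_key(organization: str, repo: str, branch: str = "master") -> str:
    """Compose the S3 object key the pipeline's S3 source watches:
    <Azure Repo Organization Name>/<repo name>/<branch>/<repo name>.zip"""
    return f"{organization}/{repo}/{branch}/{repo}.zip"

assert source_object_key("myorg", "spring-boot-app") == "myorg/spring-boot-app/master/spring-boot-app.zip"
```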
- Select the pipeline name link from the list
- In the pipeline screen, select `Edit` to add additional stages
- Below the Unit Test stage, click the `+ Add Stage` button to add the `Quality-Gate` stage
- In the Add Stage dialog, enter the stage name `Quality-Gate`
- Click the `+ Add action group` button to add the steps
- Enter `code-quality` in the `Action Name`
- Select `AWS CodeBuild` in the `Action provider`
- Select the region where the pipeline S3 bucket is located
- Select `OutputArtifact` from the list under `Input Artifacts`
- Click `Create project` to create the build project (see the section `CodeBuild Project Setup for Quality Gate`)
- Enter the build project name (preferably the pipeline name suffixed with `-unit-test`)
- Under the `Environment` section, select `Managed Image`
- Select `Ubuntu` for the operating system
- Select `Standard` for the runtime
- Select `aws/codebuild/standard:3.0` for the image (refer to the link for which OS and image to select based on the language version)
- Under Buildspec, select the `insert build commands` option and click the `switch to editor` link
- In the `build commands` text editor, update with the following code:
```yaml
version: 0.2
env:
  variables:
    Project: "<Sonarqube project name goes here>"
  secrets-manager:
    LOGIN: dev/sonarcloud:token
    HOST: dev/sonarcloud:host
    Organization: dev/sonarcloud:organization
phases:
  install:
    # If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
    # If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
    runtime-versions:
      java: openjdk8
  pre_build:
    commands:
      - apt-get update
      - wget https://binaries.sonarsource.com/Distribution/sonar-scanner-cli/sonar-scanner-cli-4.3.0.2102-linux.zip
      - unzip ./sonar-scanner-cli-4.3.0.2102-linux.zip
      - export PATH=$PATH:/sonar-scanner-cli-4.3.0.2102-linux/bin/
  build:
    commands:
      - mvn clean install
      - mvn sonar:sonar -Dsonar.login=$LOGIN -Dsonar.host.url=$HOST -Dsonar.projectKey=$Project -Dsonar.organization=$Organization -Dsonar.jacoco.reportPath=target/coverage-reports/jacoco-unit.exec
artifacts:
  files:
    - '**/*'
  base-directory: 'target'
```
- Click the `continue to pipeline` button. It will take you back to the section `CodePipeline Setup`, step 21.
- Enter the build project name (the pipeline name suffixed with `-quality-gate`)
- Under the `Environment` section, select `Managed Image`
- Select `Ubuntu` for the operating system
- Select `Standard` for the runtime
- Select `aws/codebuild/standard:4.0` for the image (refer to the link for which OS and image to select based on the language version)
- Under Buildspec, select the `insert build commands` option and click the `switch to editor` link
- In the `build commands` text editor, update with the following code, setting the `Project` variable to the SonarQube project name:
```yaml
version: 0.2
env:
  variables:
    Project: "<Sonarqube project name goes here>"
phases:
  install:
    # If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
    # If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
    runtime-versions:
      java: corretto8
  build:
    commands:
      - curl https://sonarcloud.io/api/qualitygates/project_status?projectKey=$Project > result.json
      # Fail the build when the quality gate reports ERROR
      - if [ $(jq -r '.projectStatus.status' result.json) = "ERROR" ]; then exit 1; fi
```
- Click the `continue to pipeline` button. It will take you back to the section `CodePipeline Setup`, step 8.
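The quality-gate shell one-liner can be hard to read; the equivalent check, sketched in Python, parses the `project_status` response and fails when SonarCloud reports an ERROR gate (the response shape matches the API call in the buildspec above):

```python
import json

def quality_gate_passed(result_json: str) -> bool:
    """Return False when /api/qualitygates/project_status reports an ERROR quality gate."""
    status = json.loads(result_json)["projectStatus"]["status"]
    return status != "ERROR"

# Sample responses in the documented shape:
assert quality_gate_passed('{"projectStatus": {"status": "OK"}}')
assert not quality_gate_passed('{"projectStatus": {"status": "ERROR"}}')
```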
- Log into the Azure DevOps portal and select the Repo
- Select any file (for example, readme.md) and click the `Edit` button
- Make some changes (any change will do)
- Click the `Save` button
- Click the `Commit` button in the commit dialog screen to trigger the pipeline
- Go to AWS CodePipeline and select the pipeline name
- Verify that the build is triggered
In this section, follow the steps to set up a JFrog instance using the Open Source JFrog Artifactory listing from the AWS Marketplace and start the Artifactory instance.
- Log into the AWS Account and select the Region where the AWS CodePipeline project will be created.
- Go to the AWS Marketplace and click `Discover products`
- Search for `JFrog Open Source` and select the listing from the publisher `Miri Infotech`
- Click the `continue to subscribe` button
- Click the `continue to configuration` button
- Leave the default selections under `Delivery Method` and `Software Version`
- Select the desired `region` from the list and click the `Continue to Launch` button. Wait for the instance status to change to READY
- Select the EC2 instance and copy the private IP address and host name
- Click the `Connect` button and follow the chmod and ssh commands (use `ubuntu` instead of `root`) to remotely log into the EC2 instance
- Type the command `sudo vi /etc/hosts` to open the hosts file
- Below the first line, add the private IP and host name separated by a tab, then save and exit vi
- Start Artifactory by entering the following commands:

```sh
sudo su
cd /home/ubuntu/artifactory-oss-6.8.2/bin
./artifactory.sh
```

- After the `Artifactory successfully started` message appears, open a browser and go to http://ec2-instance-public-hostname:8081/artifactory (replace ec2-instance-public-hostname with the actual value). Log in with the admin credentials, admin/password
- Open the EC2 instance security group and verify the inbound rules: port 22 must be allowed from a specific IP, and port 8081 must be open to all (0.0.0.0/0)
After JFrog Artifactory is set up and running, a Maven repository needs to be set up to resolve and deploy artifacts and plugins.
- Log in with the admin credentials, admin/password
- Click the `Welcome, admin` link and select `Quick Setup` to create the Maven repository
- Select `Maven` from the repositories and click the `create` button
- Under the `Set Me Up` section, you should see the following repository keys for snapshot and release:
  - libs-snapshot
  - libs-snapshot-local
  - libs-release
  - libs-release-local
- The Maven repository setup is now complete.
Maven project configuration is stored in the pom.xml file. In this section, we will configure the Maven sections needed to integrate with JFrog Artifactory.
First, we will configure the build plugins and distribution settings for Artifactory.
- Go to the Azure DevOps repo, select the pom.xml file, and click the `Edit` button
- Add the following Maven configuration inside the `<properties>` tag to specify the JDK 1.8 version for the build and configure the JaCoCo code coverage plugin:
```xml
<maven.compiler.source>1.8</maven.compiler.source>
<maven.compiler.target>1.8</maven.compiler.target>
<jacoco.version>0.8.3</jacoco.version>
<sonar.java.coveragePlugin>jacoco</sonar.java.coveragePlugin>
<sonar.dynamicAnalysis>reuseReports</sonar.dynamicAnalysis>
<sonar.jacoco.reportPath>${project.basedir}/../target/jacoco.exec</sonar.jacoco.reportPath>
<sonar.language>java</sonar.language>
```
- Add the plugin configurations below for the jacoco-maven-plugin and maven-compiler-plugin under the `<plugins>` section:
```xml
<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <configuration>
    <source>1.8</source>
    <target>1.8</target>
  </configuration>
</plugin>
<plugin>
  <groupId>org.jacoco</groupId>
  <artifactId>jacoco-maven-plugin</artifactId>
  <version>${jacoco.version}</version>
  <configuration>
    <skip>${maven.test.skip}</skip>
    <destFile>${basedir}/target/coverage-reports/jacoco-unit.exec</destFile>
    <dataFile>${basedir}/target/coverage-reports/jacoco-unit.exec</dataFile>
    <output>file</output>
    <append>true</append>
    <excludes>
      <exclude>*MethodAccess</exclude>
    </excludes>
  </configuration>
  <executions>
    <execution>
      <id>jacoco-initialize</id>
      <goals>
        <goal>prepare-agent</goal>
      </goals>
      <phase>test-compile</phase>
    </execution>
    <execution>
      <id>jacoco-site</id>
      <phase>verify</phase>
      <goals>
        <goal>report</goal>
      </goals>
    </execution>
  </executions>
</plugin>
```
- Add the following XML configuration below the `</properties>` line to configure the Artifactory publishing details for the SNAPSHOT and RELEASE stages:
```xml
<distributionManagement>
  <snapshotRepository>
    <id>snapshots</id>
    <name>ip-172-31-42-114-snapshots</name>
    <url>http://${internal.repo.server.url}/artifactory/libs-snapshot-local</url>
  </snapshotRepository>
  <repository>
    <id>central</id>
    <name>ip-172-31-42-114-releases</name>
    <url>http://${internal.repo.server.url}/artifactory/libs-release-local</url>
  </repository>
</distributionManagement>
```
- As a final step, create a new file named `settings.xml` in the root directory of the repository and add the following XML configuration to specify the server and repository settings:
```xml
<settings>
  <servers>
    <server>
      <username>${internal.repo.username}</username>
      <password>${internal.repo.password}</password>
      <id>central</id>
    </server>
    <server>
      <username>${internal.repo.username}</username>
      <password>${internal.repo.password}</password>
      <id>snapshots</id>
    </server>
  </servers>
  <profiles>
    <profile>
      <repositories>
        <repository>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
          <id>central</id>
          <name>libs-release</name>
          <url>http://${internal.repo.server.url}/artifactory/libs-release</url>
        </repository>
        <repository>
          <snapshots />
          <id>snapshots</id>
          <name>libs-snapshot</name>
          <url>http://${internal.repo.server.url}/artifactory/libs-snapshot</url>
        </repository>
      </repositories>
      <pluginRepositories>
        <pluginRepository>
          <snapshots>
            <enabled>false</enabled>
          </snapshots>
          <id>central</id>
          <name>libs-release</name>
          <url>http://${internal.repo.server.url}/artifactory/libs-release</url>
        </pluginRepository>
        <pluginRepository>
          <snapshots />
          <id>snapshots</id>
          <name>libs-snapshot</name>
          <url>http://${internal.repo.server.url}/artifactory/libs-snapshot</url>
        </pluginRepository>
      </pluginRepositories>
      <id>artifactory</id>
    </profile>
  </profiles>
  <activeProfiles>
    <activeProfile>artifactory</activeProfile>
  </activeProfiles>
</settings>
```
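The `${internal.repo.username}`, `${internal.repo.password}`, and `${internal.repo.server.url}` placeholders above are not stored in the repo; Maven resolves them from `-D` system properties passed on the command line (as the publish buildspec does later with `mvn -Dinternal.repo.username=$USER ... clean deploy`). A small illustration of that substitution — the helper below is ours, not part of Maven:

```python
import re

def resolve_placeholders(text: str, props: dict) -> str:
    """Substitute Maven-style ${key} placeholders, as `mvn -Dkey=value` would at build time."""
    return re.sub(r"\$\{([^}]+)\}", lambda m: props.get(m.group(1), m.group(0)), text)

url = resolve_placeholders(
    "http://${internal.repo.server.url}/artifactory/libs-release",
    {"internal.repo.server.url": "ec2-54-146-7-13.compute-1.amazonaws.com:8081"},
)
assert url == "http://ec2-54-146-7-13.compute-1.amazonaws.com:8081/artifactory/libs-release"
```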
In this section, a new stage for building and publishing artifacts to JFrog Artifactory is added to the existing pipeline.
- Log into the AWS Management Console and select the CodePipeline service
- Select the pipeline name link from the pipelines list and click the `Edit` button
- Go to the Code-Quality stage and click the `+ Add Stage` button below it to add a new build stage
- Enter the stage name `Publish-Artifacts` and click the `Add Stage` button
- Click the `+ Add Action Group` button
- Enter the action name `publish-artifacts`, select `AWS CodeBuild` under Action Provider, select `Source Artifact` under Input Artifacts, and click the `Create Project` button to create the build project (see the section `CodeBuild Project Setup for Publish Artifacts`)
- Enter the build project name (the pipeline name suffixed with `-publish-artifacts`)
- Under the `Environment` section, select `Managed Image`
- Select `Ubuntu` for the operating system
- Select `Standard` for the runtime
- Select `aws/codebuild/standard:3.0` for the image (refer to the link for which OS and image to select based on the language version)
- Under Buildspec, select the `insert build commands` option and click the `switch to editor` link
- In the `build commands` text editor, update with the following code:
```yaml
version: 0.2
env:
  secrets-manager:
    USER: dev/artifactory:user
    PASSWORD: dev/artifactory:password
    URL: dev/artifactory:url
phases:
  install:
    # If you use the Ubuntu standard image 2.0 or later, you must specify runtime-versions.
    # If you specify runtime-versions and use an image other than Ubuntu standard image 2.0, the build fails.
    runtime-versions:
      java: openjdk8
  pre_build:
    commands:
      - cp settings.xml ~/.m2
  build:
    commands:
      - mvn -Dinternal.repo.username=$USER -Dinternal.repo.password=$PASSWORD -Dinternal.repo.server.url=$URL clean deploy
artifacts:
  files:
    - '**/*'
  base-directory: 'target'
```
In this section, we create a Secrets Manager secret to store the JFrog Artifactory configuration.
First, follow the steps below to get the admin user's encrypted password to use in the secret configuration.
- Log into Artifactory using the admin credentials
- Click the `Welcome, admin` link in the right corner and select `Edit Profile`
- Enter the admin password in the `Current Password` field and click the `Unlock` button
- Under the `Authentication Settings`, copy the text in the Encrypted Password field and save it somewhere; it is needed during the Secrets Manager configuration
- Log into the AWS Management Console and select `Secrets Manager`
- Click the `Store a new secret` button
- Select `Other types of secrets` under Select secret type
- Under the secret key/value, add the following key/value pairs:
  - key: `url`, value: the JFrog Artifactory EC2 Public DNS (IPv4) name and port 8081 (for example, ec2-54-146-7-13.compute-1.amazonaws.com:8081)
  - key: `user`, value: admin
  - key: `password`, value: the admin Encrypted Password field value copied earlier
- Click the `Next` button
- Enter the secret name `dev/artifactory` and click the `Next` button
- Click the `Next` button
- Click the `Store` button to complete the setup
- Go to the IAM service and search for the service role associated with the `Publish Artifacts` build project
- Click `+ Add inline policy` to add an inline policy granting read access to the Secrets Manager key
- Click the JSON tab and paste the following JSON snippet:
```json
{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": "secretsmanager:GetSecretValue",
            "Resource": "<dev/artifactory secret ARN>"
        }
    ]
}
```