Today I’m giving a talk on GreenOps in the cloud at the Bristol DevOps & Cloud Native Meetup; the slides of my presentation are here.
DevOpsDays London 2023
Today I’m giving a talk on GreenOps in the cloud at DevOpsDays London; the slides of my presentation are here.
It’s a great conference with very interesting presentations and a relaxed atmosphere. Kudos to the organisers for making this conference such an inclusive event!
Update: the organisers published the videos of the conference on their YouTube channel; you can watch my presentation on YouTube.
DevOpsDays Geneva 2022
Today I’m giving a talk on Docs-as-Code at DevOpsDays Geneva; the slides of my presentation are here. It’s a great conference with lots of interesting presentations on DevOps culture and tooling, and I’m very glad to be part of it. Kudos to the organisers for making it such a friendly event!
Update: the organisers published the videos of the conference on their YouTube channel; you can find my presentation here.
How to Validate a Jenkinsfile
As I’m using Jenkins Pipelines more and more often, I found the need to validate a Jenkinsfile in order to fix syntax errors before committing the file into version control and running the pipeline job in Jenkins. This saves time during development and helps follow best practices when writing a Jenkinsfile, as validation returns both errors and warnings.
There are several ways to validate a Jenkinsfile, and some editors like VS Code even have a built-in linter. Personally, the easiest way I found to validate a Jenkinsfile is to run the following command from the command line (provided you have Jenkins running somewhere):
curl --user username:password -X POST -F "jenkinsfile=<Jenkinsfile" http://jenkins-url:8080/pipeline-model-converter/validate
Note the following:
- If your Jenkins instance authenticates users, you need to pass a username and password; otherwise you can omit that part.
- By default, this command expects your Jenkinsfile to be called Jenkinsfile. If not, change the name in the command.
- Replace jenkins-url and possibly port 8080 with the URL and port where you are running Jenkins. You can also use localhost as the URL if you are running Jenkins on your machine.
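To run this check automatically before every commit, the command can be wrapped in a small script, for example called from a Git pre-commit hook. This is a sketch, assuming the environment variables JENKINS_USER, JENKINS_PASS, and JENKINS_URL are set for your installation:

```shell
# Hypothetical pre-commit helper: JENKINS_USER, JENKINS_PASS, and JENKINS_URL
# are assumed to be set in the environment for your Jenkins installation.
validate_jenkinsfile() {
  file="${1:-Jenkinsfile}"
  result=$(curl --silent --user "$JENKINS_USER:$JENKINS_PASS" -X POST \
    -F "jenkinsfile=<$file" "$JENKINS_URL/pipeline-model-converter/validate")
  echo "$result"
  # the endpoint answers HTTP 200 even on failure, so check the message text
  case "$result" in
    *"successfully validated"*) return 0 ;;
    *) return 1 ;;
  esac
}
```

Called from .git/hooks/pre-commit, a non-zero return aborts the commit before the broken Jenkinsfile reaches version control.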
If the Jenkinsfile validates, it will show a message like this one:
Jenkinsfile successfully validated.
Or, if you forgot to use a steps block within a stage in your Jenkinsfile, the validation will flag an error like this:
Errors encountered validating Jenkinsfile:
WorkflowScript: 10: Unknown stage section "git". Starting with version 0.5, steps in a stage must be in a 'steps' block. @ line 10, column 9.
       stage('Checkout Code') {
       ^

WorkflowScript: 10: Expected one of "steps", "stages", or "parallel" for stage "Checkout Code" @ line 10, column 9.
       stage('Checkout Code') {
       ^
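For reference, the fix is to wrap the stage's steps in a steps block. A minimal declarative Jenkinsfile sketch (the repository URL is a placeholder):

```groovy
pipeline {
    agent any
    stages {
        stage('Checkout Code') {
            steps {
                // 'steps' wraps the actual build steps of the stage
                git 'https://github.com/example/repo.git'  // placeholder URL
            }
        }
    }
}
```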
Happy validation!
CD Summit and Jenkins Days 2016
This week I’m giving a talk about Continuous Security with Jenkins, Docker Bench, and Amazon Inspector at CD Summit & Jenkins Days in Amsterdam and in Berlin. CD Summit & Jenkins Days are a series of conferences in the US and in Europe focusing on Continuous Integration (CI) and Continuous Delivery (CD).
This is the abstract of my talk:
Security testing is often left out of CI/CD pipelines and perceived as an ad hoc, one-off audit performed by external security experts. However, integrating security testing into a DevOps workflow (aka DevSecOps) makes it possible to achieve security by design and to continuously assess software vulnerabilities within a CI/CD pipeline. But how does security fit in the world of cloud and microservices?
In this talk I show how to leverage tools like Jenkins, Docker Bench, and Amazon Inspector to perform security testing at the operating system and container levels in a cloud environment, and how to integrate them into a typical CI/CD workflow. I discuss how these tools can help assess the risk of security vulnerabilities during development, improve security and compliance, and lower support costs in the long term.
I also present two demos showing how to integrate Docker Bench with Jenkins and how to run Amazon Inspector from Jenkins.
The slides of my talk are available here.
Continuous Security with Jenkins and Amazon Inspector
Amazon Inspector is an automated security assessment service on Amazon Web Services (AWS). It identifies security vulnerabilities at the operating system and network levels by scanning the host against a knowledge base of security best practices and rules.
I recently integrated Amazon Inspector to run in a Jenkins job so that security testing can be automated and performed prior to deployment to production.
AWS Configuration
The first thing to do is to set up an assessment target and an assessment template in Amazon Inspector. An assessment target selects the EC2 instances to include in the security scan via their tags. Here is an example of my assessment target for the EC2 instances tagged as gadictionaries-leap-ogl-stage-v360:
The assessment template specifies the type and duration of the scan and is linked to the assessment target set up above. Here is an example of my assessment template (the ARN is masked for security reasons). I selected the Common Vulnerabilities and Exposures (CVE) rules package with a 15-minute scan (one hour is the recommended duration for reliable results).
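The same setup can be scripted with the AWS CLI instead of the console. This is only a sketch: all names and ARNs below are placeholders, and the CVE rules package ARN differs per region, so look it up with aws inspector list-rules-packages.

```shell
# Sketch of the console setup via the (classic) Amazon Inspector CLI.
# All names and ARNs below are placeholders -- adjust to your account/region.

# group the target EC2 instances by tag
aws inspector create-resource-group \
  --resource-group-tags key=Name,value=gadictionaries-leap-ogl-stage-v360

# create the assessment target from the resource group
aws inspector create-assessment-target \
  --assessment-target-name stage-v360-target \
  --resource-group-arn arn:aws:inspector:us-east-1:123456789012:resourcegroup/0-EXAMPLE

# create a 15-minute CVE assessment template for that target
aws inspector create-assessment-template \
  --assessment-target-arn arn:aws:inspector:us-east-1:123456789012:target/0-EXAMPLE \
  --assessment-template-name stage-v360-cve \
  --duration-in-seconds 900 \
  --rules-package-arns arn:aws:inspector:us-east-1:123456789012:rulespackage/0-EXAMPLE
```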
Jenkins Configuration
We now move to the Jenkins configuration in order to run the security scan via a Jenkins job instead of using the AWS console.
The first thing to do is to make sure that openssh is installed on the instance where Jenkins is running and on the host you want to check. For example, on Ubuntu you can install openssh with:
sudo apt-get install openssh-server
Then install the SSH Agent plugin in Jenkins. This will provide Jenkins with SSH credentials to automatically log in to a machine in the cloud. Add the credentials in Jenkins -> Credentials -> System -> Global credentials (unrestricted) -> Add credentials -> SSH Username with private key. This is an example of my credentials for user jenkins (private key details are obfuscated):
Then create a Jenkins job and select the SSH agent credentials for user jenkins in Build Environment:
This will allow Jenkins to ssh into the machine with the private key stored securely (make sure you grant the permission to configure Jenkins only to administrators, otherwise your private keys are not safe).
I like to parameterize my builds so that I can run Amazon Inspector on a specific EC2 instance within a given Elastic Beanstalk stack:
Then we set up build and post-build actions. The build executes a shell script invoke_aws_inspector.sh pulled from the version control system. The post-build action provides the location of the JUnit file.
The shell script invoke_aws_inspector.sh looks like this:
# check parameters expected from Jenkins job
if [[ -n "$HOSTNAME" ]] && [[ -n "$STACK" ]]; then
    # install and start AWS Inspector agent on host
    ssh -T -o StrictHostKeyChecking=no ec2-user@$HOSTNAME << 'EOF'
    curl -O https://d1wk0tztpsntt1.cloudfront.net/linux/latest/install
    sudo bash install
    sudo /etc/init.d/awsagent start
    sudo /opt/aws/awsagent/bin/awsagent status
    exit
EOF
    # run AWS Inspector from localhost
    export AWS_DEFAULT_REGION=us-east-1
    python execute_aws_inspector.py

    # stop and uninstall AWS Inspector agent on host
    ssh -T -o StrictHostKeyChecking=no ec2-user@$HOSTNAME << 'EOF'
    sudo /etc/init.d/awsagent stop && sudo yum remove -y AwsAgent
EOF
else
    echo "ERROR! Parameters HOSTNAME and STACK required from Jenkins job security_checks_aws_inspector"
    exit 1
fi
The shell script works as follows:
- line 4 allows Jenkins to ssh into a host (I’m using AWS EC2, as you can guess from the username ec2-user; replace it with your default username, but do not use root). Note that the environment variable $HOSTNAME is passed in from the parameter we set up earlier. The heredoc delimited by EOF runs a sequence of commands directly on the host, so you don’t have to reconnect for each one. The single quotes around EOF are important, don’t skip them!
- lines 5-8 install and start the Amazon Inspector agent on the host
- lines 12-13 set the AWS region and run a Python script execute_aws_inspector.py that drives Amazon Inspector (we’ll see it in a minute)
- lines 16-18 stop and remove the Amazon Inspector agent so that no trace is left on the host
- the final EOF disconnects Jenkins from the host
The Python script execute_aws_inspector.py uses the Boto3 library for interacting with AWS services. The script looks like this:
import boto3
import os, sys
import datetime, time
import xml.etree.cElementTree as etree

# initialize boto library for AWS Inspector
client = boto3.client('inspector')

# set assessment template for stack
stack = os.environ['STACK']
if stack == 'gadictionaries-leap-ogl-stage-v360':
    assessmentTemplate = 'arn:aws:inspector:us-east-1:XXXXXXXXXXXXX:target/XXXXXXXXXXXXX/template/XXXXXXXXXXXXX'
elif stack == 'gadictionaries-leap-odenoad-stage-v350':
    assessmentTemplate = 'arn:aws:inspector:us-east-1:XXXXXXXXXXXXX:target/XXXXXXXXXXXXX/template/XXXXXXXXXXXXX'
else:
    sys.exit('You must provide a supported stack name (either gadictionaries-leap-ogl-stage-v360 '
             'or gadictionaries-leap-odenoad-stage-v350)')

# start assessment run
assessment = client.start_assessment_run(
    assessmentTemplateArn = assessmentTemplate,
    assessmentRunName = datetime.datetime.now().isoformat()
)

# wait for the assessment to finish
time.sleep(1020)

# list findings
findings = client.list_findings(
    assessmentRunArns = [
        assessment['assessmentRunArn'],
    ],
    filter={
        'severities': [
            'High', 'Medium',
        ],
    },
    maxResults=100
)

# describe findings and output to JUnit
testsuites = etree.Element("testsuites")
testsuite = etree.SubElement(testsuites, "testsuite",
                             name="Common Vulnerabilities and Exposures-1.1")
for item in findings['findingArns']:
    description = client.describe_findings(
        findingArns=[
            item,
        ],
        locale='EN_US'
    )
    for finding in description['findings']:
        testcase = etree.SubElement(testsuite, "testcase",
                                    name=finding['severity'] + ' - ' + finding['title'])
        etree.SubElement(testcase, "error",
                         message=finding['description']).text = finding['recommendation']
tree = etree.ElementTree(testsuites)
tree.write("inspector-junit-report.xml")
The Python script works as follows:
- lines 10-17 read the environment variable set in the parameterized build and select the correct template (I set up two different templates for two different stacks; the ARNs are obfuscated for security reasons)
- lines 20-26 run the assessment template and wait a bit longer than 15 minutes (1020 seconds) so that the scan can finish
- lines 29-39 filter the findings with severities High and Medium
- lines 42-58 serialize the findings into a JUnit report so that they can be automatically read by Jenkins
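One possible refinement: instead of the fixed time.sleep(1020), the script could poll the assessment run's state via the Inspector API until it reaches a final state. A sketch (the polling interval and timeout are arbitrary choices, not part of the original script):

```python
import time

def wait_for_assessment(client, run_arn, poll_seconds=60, timeout_seconds=3600):
    """Poll Amazon Inspector until an assessment run reaches a final state.

    'client' is a boto3 Inspector client; this is a sketch replacing the
    fixed sleep with polling via describe_assessment_runs.
    """
    waited = 0
    while waited < timeout_seconds:
        runs = client.describe_assessment_runs(assessmentRunArns=[run_arn])
        state = runs['assessmentRuns'][0]['state']
        # final states of a classic Inspector assessment run
        if state in ('COMPLETED', 'COMPLETED_WITH_ERRORS', 'FAILED'):
            return state
        time.sleep(poll_seconds)
        waited += poll_seconds
    raise TimeoutError('assessment run did not finish within the timeout')
```

In the script above, the time.sleep(1020) line would then become wait_for_assessment(client, assessment['assessmentRunArn']).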
Finally, here is an example of Test Result Trend and JUnit test results showing security vulnerabilities on an EC2 instance running unpatched packages:
Happy security testing with Jenkins and Amazon Inspector!
Continuous Security with Jenkins and Docker Bench
Docker Bench is an open source tool for automatically validating the configuration of a host running Docker containers. The tool was written by, among others, Diogo Mónica, security lead at Docker, and performs security checks at the container level following Docker’s CIS Benchmark recommendations.
As you would expect, the easiest way to run Docker Bench is via a Docker container. Just make sure you have Docker 1.10 or later, download the Docker image:
docker pull docker/docker-bench-security
and run the Docker container as follows:
docker run -it --net host --pid host --cap-add audit_control \
  -v /var/lib:/var/lib \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v /usr/lib/systemd:/usr/lib/systemd \
  -v /etc:/etc \
  --label docker_bench_security \
  docker/docker-bench-security
This will automatically generate output, as in the animated GIF above, with an assessment of possible Docker security issues.
I recently combined Docker Bench with Jenkins in order to integrate security testing into a typical DevOps workflow on the cloud – call it DevSecOps if you like buzzwords… This requires a little bit of Jenkins configuration but it’s not too difficult to follow.
The first thing to do is to make sure that openssh is installed on the instance where Jenkins is running and on the host you want to check. For example, on Ubuntu you can install openssh with:
sudo apt-get install openssh-server
Then install the SSH Agent plugin in Jenkins. This will provide Jenkins with SSH credentials to automatically log in to a machine in the cloud. Add the credentials in Jenkins -> Credentials -> System -> Global credentials (unrestricted) -> Add credentials -> SSH Username with private key. This is an example of my credentials for user jenkins (private key details are obfuscated):
Then create a Jenkins job and select the SSH agent credentials for user jenkins in Build Environment:
This will allow Jenkins to SSH into the machine with the private key stored securely (make sure you grant the permission to configure Jenkins only to administrators, otherwise your private keys are not safe).
I like to parameterize my builds so that I can run Docker Bench on any host reachable with the private key:
Finally, select Execute shell in the build and paste this shell script (you may want to put it under version control and retrieve it from there via Jenkins):
ssh -T -o StrictHostKeyChecking=no ec2-user@$HOSTNAME << 'EOF'
sudo docker pull docker/docker-bench-security && \
sudo docker run --net host --pid host --cap-add audit_control \
  -v /var/lib:/var/lib \
  -v /var/run/docker.sock:/var/run/docker.sock \
  -v /usr/lib/systemd:/usr/lib/systemd \
  -v /etc:/etc \
  --label docker_bench_security \
  docker/docker-bench-security && \
sudo docker rm $(sudo docker ps -aq -f status=exited) && \
sudo docker rmi docker/docker-bench-security
EOF
It works like this:
- the first command allows Jenkins to ssh into a host (I’m using AWS EC2, as you can guess from the username ec2-user; replace it with your default username, but do not use root). Note that the environment variable $HOSTNAME is passed in from the parameter we set up earlier. The heredoc delimited by EOF runs a sequence of commands directly on the host, so you don’t have to reconnect for each one. The single quotes around EOF are important, don’t skip them!
- the second command pulls the Docker image for Docker Bench directly on the host
- the third command runs Docker Bench on the host
- the fourth command removes all exited containers from the host, including Docker Bench once it has finished its job
- the fifth command removes the Docker image for Docker Bench so that you don’t leave any trace on the host
- the final EOF disconnects Jenkins from the host
The Jenkins console output shows the result of running Docker Bench on a specific host. Now you have to assess the results: you may see several warnings, and some may just be false positives. For example, this warning may be acceptable for you:
[WARN] 1.5  - Keep Docker up to date
[WARN]      * Using 1.11.1, when 1.12.0 is current as of 2016-07-28
This means you are not running the latest version of Docker. This may not be an issue (unless Docker published a security release), especially if your Linux distribution hasn’t got the latest version of Docker available in its repositories.
In my case this warning was a false positive:
[WARN] 2.1  - Restrict network traffic between containers
In fact, I need several containers to communicate with each other, so that restriction does not apply to my use case.
This warning should be taken much more seriously:
[WARN] 4.1  - Create a user for the container
[WARN]      * Running as root: container-name-bbf386c0b301
This means you are running a container as root. This is insecure: if an intruder manages to get inside the container, they can run any command in it. Basically, it’s like running a Linux system as root, which is a bad security practice.
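The usual fix is to create a dedicated user in the image and switch to it. A minimal Dockerfile sketch (the base image and the user/group names are examples):

```dockerfile
FROM ubuntu:16.04

# create an unprivileged user and group (names are examples)
RUN groupadd -r app && useradd -r -g app app

# all subsequent instructions and the running container use this user
USER app
```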
Once you have assessed your warnings, you may want to filter out the false positives. For example, you can use the Post build task plugin to make the build fail if the build log output contains a warning that you assessed as a security risk. You can use a regular expression to match the pattern identified above.
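As an alternative sketch of that post-build check, a small shell function can scan the console log for the warnings you assessed as real risks (the pattern below is just an example, tune it to your own assessment):

```shell
# Fail the build when the Docker Bench log contains a warning we assessed
# as a real security risk; the pattern is an example, adjust to taste.
fail_on_risky_warnings() {
  log_file="$1"
  if grep -E -q 'WARN.*4\.1 - Create a user for the container' "$log_file"; then
    echo "Docker Bench reported a warning assessed as a security risk" >&2
    return 1
  fi
  return 0
}
```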
It would be good to get the Docker Bench output in JUnit format so that Jenkins can understand it natively but this option is currently not implemented in Docker Bench.
Happy security testing with Jenkins and Docker Bench!
DevOps Oxford Meetup
Today I’m presenting at DevOps Oxford Meetup at the Oxford Centre for Innovation in my ‘home town’ Oxford. This is a short version of the talk Continuous Integration with Jenkins, Docker and Compose I gave at DockerCon Europe 2015 but with a few updates. The slides of the talk are here and the sample code is on GitHub.
API Days Nordic 2016
This week I’m attending API Days Nordic at the University of Tampere (Finland), a technical conference on heavy industry and APIs, government-driven API platforms, and APIOps (APIs, DevOps, containers, testing, microservices, and monitoring). This is part of the API Days series of conferences, and it is the first time it has been organised in Finland.
My presentation focuses on how to deploy an API gateway with Docker. The slides are available here and the code is on GitHub. This is the summary of my presentation:
An API gateway is a single entry point for APIs in a microservices infrastructure. It provides authentication and authorization layers, routes and load balances requests to API services, and caches previous requests. Being the first entry point of the API, it is crucial to manage and provision it through code rather than a manual process. Furthermore, replicating its configuration on development and staging environments makes it possible to load test the API gateway and to anticipate issues before it is deployed to production.
I demonstrate the deployment of an API gateway using Docker. Technologies used include:
- Docker
- openresty/nginx
- 3scale API management
- AWS EC2
I discuss the benefits of using Docker and how it simplifies changes of configuration and deployment to multiple environments. Sample code and brief documentation are available on GitHub.
DockerCon Europe 2015
This week I am attending DockerCon Europe 2015 in Barcelona, where I am going to talk about Continuous Integration with Jenkins, Docker and Compose. This is a revised version of a previous blog post on Jenkins and Docker. Here are the slides in PDF.
Update: DockerCon put the full video of the presentation online. Sample code is available on my GitHub account.