A simple CD using AWS CodePipeline
Continuous Deployment (CD) is a friend in software development: there is no better way to speed up feedback from your application than having a solid Continuous Deployment process.
I am skipping continuous integration because all of these examples relate to a single repo, where the only developer allowed to commit code is me: the integration is with myself :)
In this example we will create an automation for a simple scenario: a Lambda function exposed through an API Gateway, where the Lambda function runs a Docker image.
There are, of course, infinite ways of creating and managing Lambdas and pipelines, but in this example we will use:
- CodeCommit to store the code
- ECR to store the Docker images
- CloudFormation to create and update all the infrastructure.
- CodeBuild to create the artifacts we need
Let's see it
CODE COMMIT
To easily have a repository to store the code and to have a natural integration with the AWS ecosystem, I decided to use CodeCommit.
The code is structured like in the other examples:
The HelloWorldFunction folder contains all the Lambda code, and template.yaml contains the stack definition for CloudFormation.
The advantage of using CodeCommit is that the pipeline can easily access the code without using connectors or managing credentials.
CLOUD FORMATION
CloudFormation is used to create the Lambda and the Gateway. We said that every time we push code, a new image should be hooked to the Lambda, so I created a parameter in the template which will be filled with the latest image we build. We could always refer to the "latest" image, but for troubleshooting this is not the best approach: having versions tagged with a specific tag makes it easy to verify what you committed and what changed.
To manage the parameter I added the Parameters section with ImageUriParameter. A default is set (the latest version):
Parameters:
  ImageUriParameter:
    Type: String
    Default: 058970626207.dkr.ecr.us-east-1.amazonaws.com/ecr-rametta:helloworldfunction-latest
And the parameter is referenced in the function's ImageUri:
HelloWorldFunction:
  Type: AWS::Serverless::Function
  Properties:
    PackageType: Image
    FunctionName: HelloWorldFunctionSimple
    ImageUri: !Ref ImageUriParameter
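The template is not shown in full here, but with SAM the Gateway side can be declared as an Api event directly on the function, and SAM creates the implicit API Gateway for it. This is only a minimal sketch (the path and method are my assumptions, not taken from the original template) and it would sit under the function's Properties:
    # hypothetical Api event: SAM creates an implicit API Gateway for it
    # (path and method are illustrative, not taken from the original template)
    Events:
      HelloWorldApi:
        Type: Api
        Properties:
          Path: /hello
          Method: get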
This means that if I want to deploy the stack passing a new value for the ImageUri, I need to execute something like this:
aws cloudformation deploy --stack-name xxxxx --template-file template.yaml --capabilities CAPABILITY_IAM --no-disable-rollback --parameter-overrides ImageUriParameter=yyyyy
where xxxxx is the name of the stack and yyyyy is the URI of the image I want to use. We need to pass the capabilities because the stack creates and uses implicit IAM roles; the change set is executed right away by default and rollback stays enabled (--no-disable-rollback).
The stack was created under the name simple-deploy-with-image-uri.
ECR
The ECR repository was created at an early stage to store all the images I need to use. It is a managed service and integrates with Docker.
Suppose we need to push an image using Docker: the client needs an authentication token to access ECR.
If you run on your local machine, you can simply log in to the repo with Docker from the command line.
With automation this is not possible, so we need a way to log in programmatically:
aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin xxxx
where xxxx is our ECR registry.
This command performs a docker login passing the token obtained with ecr get-login-password. It only works if the role executing the code has ECR full access or at least the ecr:GetAuthorizationToken permission.
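For reference, a minimal policy statement covering only the login step could look like the sketch below (an illustration, not the exact policy used here; ecr:GetAuthorizationToken does not support resource-level restrictions, so it needs Resource: "*"):
# sketch of an IAM policy document fragment for the docker login step only;
# pushing images additionally needs actions such as ecr:BatchCheckLayerAvailability,
# ecr:InitiateLayerUpload, ecr:UploadLayerPart, ecr:CompleteLayerUpload and ecr:PutImage
PolicyDocument:
  Version: "2012-10-17"
  Statement:
    - Effect: Allow
      Action:
        - ecr:GetAuthorizationToken
      Resource: "*"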
CODE BUILD
Now it's time to bring it all together to:
- compile the new code
- create the new image based on the commit id
- push the new image to ECR
- update the stack with the new image
To have a unique identifier for our build, we will use the commit id. It can be easily retrieved using
git rev-parse HEAD
which prints the last commit id. We will need it later in the buildspec.
The code you will find is taken from BuildSpec.yaml, which is the equivalent of what you can do with, for example, Jenkins: there are phases and commands to execute, and you can easily run all the basic AWS CLI commands (and of course Maven).
First we need to perform the Docker login into our ECR repo: we will do it in the pre_build phase:
pre_build:
  commands:
    - echo Logging in to Amazon ECR...
    - aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin xxxxx
where xxxxx is always our ECR registry.
Then we need to compile the code. Remember that CodeBuild is configured to download the source code (we set the CodeCommit repo as source), and after the download the execution moves into the folder that contains the code.
It is a simple Maven project and we need to include the dependencies, since we are working with an image:
build:
  commands:
    - mvn clean install -f HelloWorldFunction/pom.xml
    - mvn dependency:copy-dependencies -DincludeScope=compile -f HelloWorldFunction/pom.xml
Now we can build the image and tag it. As you can see in the code, we use the commit id as the tag:
    - echo Build started on `date`
    - echo Building the Docker image...
    - docker build HelloWorldFunction/. -t ecr-rametta:helloworldfunction-$(git rev-parse HEAD)
    - docker tag ecr-rametta:helloworldfunction-$(git rev-parse HEAD) xxxx:helloworldfunction-$(git rev-parse HEAD)
where xxxx is always the ECR registry.
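As a side note, CodeBuild also exposes the resolved commit id in the CODEBUILD_RESOLVED_SOURCE_VERSION environment variable, so an alternative sketch of the same tagging (not what the BuildSpec.yaml in this article does, which calls git rev-parse HEAD inline) could be:
  build:
    commands:
      # alternative sketch: reuse the commit id that CodeBuild already resolved
      - docker build HelloWorldFunction/. -t ecr-rametta:helloworldfunction-$CODEBUILD_RESOLVED_SOURCE_VERSION
      - docker tag ecr-rametta:helloworldfunction-$CODEBUILD_RESOLVED_SOURCE_VERSION xxxx:helloworldfunction-$CODEBUILD_RESOLVED_SOURCE_VERSION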
All of these commands are possible because, during the CodeBuild project creation, we specified a standard Linux image provided by AWS where everything is pre-installed and where bash commands can be executed. Of course, every interaction with other services needs to happen through the right role.
Now it is time to deploy the CloudFormation stack:
post_build:
  commands:
    - echo Build completed on `date`
    - echo Pushing the Docker image...
    - docker push 058970626207.dkr.ecr.us-east-1.amazonaws.com/ecr-rametta:helloworldfunction-$(git rev-parse HEAD)
    - aws cloudformation deploy --stack-name simple-deploy-with-image-uri --template-file template.yaml --capabilities CAPABILITY_IAM --no-disable-rollback --parameter-overrides ImageUriParameter=xxxx:helloworldfunction-$(git rev-parse HEAD)
We are passing the image URI as a parameter, so CloudFormation finds that the only change in the stack is the URI and updates the Lambda function accordingly.
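To give the complete picture, here is a sketch of how the whole BuildSpec.yaml could look once the fragments above are put together (the account id and repository name are the ones appearing in the snippets of this article; replace them with your own):
version: 0.2

phases:
  pre_build:
    commands:
      - echo Logging in to Amazon ECR...
      - aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin 058970626207.dkr.ecr.us-east-1.amazonaws.com
  build:
    commands:
      - mvn clean install -f HelloWorldFunction/pom.xml
      - mvn dependency:copy-dependencies -DincludeScope=compile -f HelloWorldFunction/pom.xml
      - echo Build started on `date`
      - echo Building the Docker image...
      - docker build HelloWorldFunction/. -t ecr-rametta:helloworldfunction-$(git rev-parse HEAD)
      - docker tag ecr-rametta:helloworldfunction-$(git rev-parse HEAD) 058970626207.dkr.ecr.us-east-1.amazonaws.com/ecr-rametta:helloworldfunction-$(git rev-parse HEAD)
  post_build:
    commands:
      - echo Build completed on `date`
      - echo Pushing the Docker image...
      - docker push 058970626207.dkr.ecr.us-east-1.amazonaws.com/ecr-rametta:helloworldfunction-$(git rev-parse HEAD)
      - aws cloudformation deploy --stack-name simple-deploy-with-image-uri --template-file template.yaml --capabilities CAPABILITY_IAM --no-disable-rollback --parameter-overrides ImageUriParameter=058970626207.dkr.ecr.us-east-1.amazonaws.com/ecr-rametta:helloworldfunction-$(git rev-parse HEAD)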
PIPELINE
You can combine everything using the pipeline. I configured it with only two stages: CodeCommit and CodeBuild; no deploy stage is needed because the deploy is in our post_build commands.
This means, of course, that the Gateway is always the same and you lose the Lambda history, because you are not publishing a new version of the Lambda (versioning is a specific feature of Lambda itself).
The pipeline is automatically triggered when something is committed into the repo, but it is useful only when you change the Java code, because that is when you can create a new image and release it.
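If you prefer to describe the pipeline itself with CloudFormation instead of clicking it together in the console, a minimal sketch of the two stages could look like this (all the resource and project names below are placeholders of mine, not taken from this article):
# sketch only: assumes an existing artifact bucket, pipeline role and CodeBuild project
SimpleDeployPipeline:
  Type: AWS::CodePipeline::Pipeline
  Properties:
    RoleArn: !GetAtt PipelineRole.Arn        # hypothetical role resource
    ArtifactStore:
      Type: S3
      Location: !Ref ArtifactBucket          # hypothetical bucket resource
    Stages:
      - Name: Source
        Actions:
          - Name: CodeCommitSource
            ActionTypeId:
              Category: Source
              Owner: AWS
              Provider: CodeCommit
              Version: "1"
            Configuration:
              RepositoryName: my-codecommit-repo   # placeholder
              BranchName: main                     # placeholder
            OutputArtifacts:
              - Name: SourceOutput
      - Name: Build
        Actions:
          - Name: CodeBuildAction
            ActionTypeId:
              Category: Build
              Owner: AWS
              Provider: CodeBuild
              Version: "1"
            Configuration:
              ProjectName: my-codebuild-project    # placeholder
            InputArtifacts:
              - Name: SourceOutput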
CONCLUSION
The hard part of this example is the configuration of the execution roles, because the build and deploy steps interact with different services, so they need the correct policies.
As I said at the start of the article, this is not the best way to deploy changes, but it is one of the ways.
It is handy if you need to experiment, because starting from this you can easily add, for example, the code to interact with SQS or DynamoDB and then the corresponding permissions in the IAM role inside the template (as you can see, there are no roles except the implicit ones, which only allow the creation of logs).
Let the code begin!