Starting with SAM on AWS
We don't know if serverless will be the future of development, but we can't wait too long to start coding this way: it's already time to experiment.
INFINITE WAYS TO BUILD
As you know, AWS always offers many ways to build something: there is the console UI, there is the command line, there are templates, and there are various tools that reach the same goal.
For serverless we have several options, but today I want to focus on the SAM framework, a nice way to create a simple application from scratch without too many headaches.
SERVERLESS: RECAP
First, a recap: serverless means that you don't need to focus on the underlying infrastructure, because AWS handles it for you. You focus on the code; AWS packages everything into an image and starts a container on a remote VM that runs your code.
The difference from other approaches is that your code just reacts to an event, so of course it needs to be triggered.
If you think about how a REST application works with Spring Boot (for example), you will find that:
- the main servlet listens on the port you specified in the properties
- when a call arrives, the Spring container invokes your code by reflection (or maybe not exactly by reflection; it depends on proxies and other features that are out of scope here)
With this in mind, in a controller you just write the code that Spring invokes. For serverless, imagine you don't have any of that: you are just writing a handler (similar to a controller) which is invoked under certain conditions.
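To make this concrete, here is a minimal sketch of such a handler in Java, roughly in the shape the SAM hello world template generates (the package and class names here are my own choice for illustration):

    package helloworld;

    import com.amazonaws.services.lambda.runtime.Context;
    import com.amazonaws.services.lambda.runtime.RequestHandler;
    import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyRequestEvent;
    import com.amazonaws.services.lambda.runtime.events.APIGatewayProxyResponseEvent;

    // No servlet, no framework: just a class with a method that Lambda
    // invokes every time the API Gateway event arrives.
    public class App implements RequestHandler<APIGatewayProxyRequestEvent, APIGatewayProxyResponseEvent> {

        @Override
        public APIGatewayProxyResponseEvent handleRequest(APIGatewayProxyRequestEvent input, Context context) {
            return new APIGatewayProxyResponseEvent()
                    .withStatusCode(200)
                    .withBody("{\"message\": \"hello world\"}");
        }
    }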
COMPONENTS OF A SERVERLESS API
One of the most common uses of serverless is, of course, an API: your code is executed in reaction to an external HTTP call (from a browser, an HTTP client, another application and so on). In order to expose it you need to:
- create a gateway that maps a URL and acts as the entry point
- create all the IAM roles the gateway needs to invoke your Lambda
- create a repository to store your code
Believe me, the first time it will be a nightmare, no matter how many guides you have already read or how much of a "nerd" you are. You will make a mistake.
SAM: A NICE TO HAVE
To simplify what I wrote above, there is a framework - SAM - which gives you the power to start experimenting easily: it downloads a template for a simple REST service (with just one endpoint) and creates exactly what you need to test it (locally) and deploy it to the cloud.
First: create the template to start.
INIT
Create an empty folder and you can start using SAM. Remember that you need - of course - the AWS and SAM CLIs installed and your credentials already configured.
So open your command prompt and type: sam init
The tool will offer you different options: select the "hello world" example, but change the default configuration to use Java as the runtime and Image as the package type.
Why? Because we want our function to run as a container: in the future there is a good chance we will need functions that ship packaged libraries, so it is important to start from that.
If you are curious (and you are, I am sure) you can navigate into the created folder: explore the pom.xml to see what is imported by default, and explore the template.yaml to understand what will be created later and how each component is related.
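To give a rough idea of what you will find there, the generated template declares a function packaged as an image and an API event that triggers it - something along these lines (simplified; resource names and defaults may differ depending on your SAM version):

    AWSTemplateFormatVersion: '2010-09-09'
    Transform: AWS::Serverless-2016-10-31

    Resources:
      HelloWorldFunction:
        Type: AWS::Serverless::Function
        Properties:
          PackageType: Image        # the function is shipped as a container image
          Events:
            HelloWorld:
              Type: Api             # an implicit API Gateway is created for this event
              Properties:
                Path: /hello
                Method: get
        Metadata:
          Dockerfile: Dockerfile
          DockerContext: ./HelloWorldFunction
          DockerTag: java11-maven-v1

    Outputs:
      HelloWorldApi:
        Description: API Gateway endpoint URL for the hello world function
        Value: !Sub "https://${ServerlessRestApi}.execute-api.${AWS::Region}.amazonaws.com/Prod/hello/"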
BUILD
With sam build the image is created locally (remember that you need Docker running). To see how the image is built you can open the Dockerfile: the configuration is very simple. As you can see, Maven is executed during the image build. As you know this is not a best practice: usually the JAR file is built first and only copied during the Docker build.
You can change whatever you want of course.
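For reference, the generated Dockerfile looks roughly like this (a simplified sketch; the exact base images and tags depend on the runtime you selected, so check the real file in your project):

    FROM public.ecr.aws/sam/build-java11:latest as build-image
    WORKDIR /task
    COPY src src/
    COPY pom.xml ./
    # Maven runs inside the image build: simple, but slower than copying a pre-built JAR
    RUN mvn -q clean install
    RUN mvn -q dependency:copy-dependencies -DincludeScope=compile

    FROM public.ecr.aws/lambda/java:11
    COPY --from=build-image /task/target/classes /var/task/
    COPY --from=build-image /task/target/dependency /var/task/lib
    CMD ["helloworld.App::handleRequest"]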
DEPLOY
With sam deploy --guided (accepting the default parameters) it creates:
- the S3 bucket with all the templates involved
- the API Gateway that exposes the HTTP endpoint
- the image repository (ECR), and it also pushes the image there
- all the required IAM roles
When the deployment is done you will see a nice table that shows all the resources created. Remember that this is a CloudFormation stack: if you navigate in your console you can find it, and all the resources and outputs printed on the command line are also nicely visible in the UI.
TEST IT ON CLOUD
After the deploy, your code is in place and your application can be invoked. You can perform a GET on the endpoint shown in the row whose Key is HelloWorldApi: the related Value is the endpoint URL.
Remember that the first invocation can give a slow response (the classic cold start): this is fine for test purposes, since we didn't configure any warm-up to get an immediate response.
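For example, with the default stage and path from the template, the call would look something like this (the placeholders come from the HelloWorldApi output value of your stack):

    curl https://<api-id>.execute-api.<region>.amazonaws.com/Prod/hello/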
CONCLUSION
You now have all the templates that are used to create the stack for your Lambda application. This is just intended to give you something to test. You will have to change these files to start creating:
- S3 trigger
- SQS trigger
- SNS trigger
- DynamoDB trigger
and any other trigger you need. Then your function should become smarter than this one, for example writing to a database with a managed connection, uploading files to S3 and so on.
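As a taste of how little that takes, switching the trigger is mostly a matter of changing the event in template.yaml. For example, an SQS trigger would look something like this (MySourceQueue is a hypothetical queue declared elsewhere in the same template):

    # under the function's Properties in template.yaml
    Events:
      QueueTrigger:
        Type: SQS
        Properties:
          Queue: !GetAtt MySourceQueue.Arn   # ARN of the queue that feeds the function
          BatchSize: 10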
If you work in an IT company, what is described here is probably already "done" by your DevOps team, who surely have a pipeline that lets you just focus on code. But for me it's important to understand how AWS manages these resources.
Have a nice..experiment!
