Logging Lambda - A nice library
Setting up logging should always be one of the first things you do in an application: when something goes wrong, the best way to understand it is to investigate the log files.
In a cloud environment this is even more important, because the code is not running on your local machine, so CloudWatch (at first sight) is the tool you will use.
LAMBDA ENVIRONMENT
Lambda comes with its own logger, which is powerful of course but has its limitations. It is embedded in the Context you receive in your handler, and you can of course use it:
public APIGatewayProxyResponseEvent handleRequest(final APIGatewayProxyRequestEvent input,
        final Context context) {
    LambdaLogger logger = context.getLogger();
    Map<String, String> headers = new HashMap<>();
    headers.put("Content-Type", "application/json");
    headers.put("X-Custom-Header", "application/json");
    APIGatewayProxyResponseEvent response = new APIGatewayProxyResponseEvent()
            .withHeaders(headers);
    try {
        logger.log("Something to log");
This works, of course, but the real limitation is that if you need to log from subsequent classes, you have to pass the context (or the logger) to their constructors.
So when you write your code you need to remember it, and testing becomes harder too.
You need to declare a mock LambdaLogger, for example, and tell it to print to the console:
@Mock
LambdaLogger lambdaLogger;
@Mock
Context context;

Mockito.when(context.getLogger()).thenReturn(lambdaLogger);
Mockito.doAnswer(invocationOnMock -> {
    System.out.println((String) invocationOnMock.getArgument(0));
    return null;
}).when(lambdaLogger).log(Mockito.anyString());
Which is of course not the best way to write code. On top of that, the other limitations are:
- There is no way to change the log level
- There is no way to write logs as JSON, which would be easy to parse with other tools
POWERTOOLS
POM.XML
<dependency>
    <groupId>software.amazon.lambda</groupId>
    <artifactId>powertools-logging</artifactId>
    <version>1.17.0</version>
</dependency>
<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-log4j2</artifactId>
    <version>1.5.1</version>
</dependency>
<dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-simple</artifactId>
    <version>1.7.30</version>
</dependency>
The first is the library itself, the second is used locally to see what is received from the context, and the third avoids an error during loading related to StaticLoggerBinder.
The library uses some annotations, so it is fundamental to use the AspectJ plugin and tell it that we want to weave the library:
<plugin>
    <groupId>org.codehaus.mojo</groupId>
    <artifactId>aspectj-maven-plugin</artifactId>
    <version>1.14.0</version>
    <configuration>
        <source>16</source>
        <target>16</target>
        <complianceLevel>${maven.compiler.target}</complianceLevel>
        <aspectLibraries>
            <aspectLibrary>
                <groupId>software.amazon.lambda</groupId>
                <artifactId>powertools-logging</artifactId>
            </aspectLibrary>
        </aspectLibraries>
    </configuration>
    <executions>
        <execution>
            <goals>
                <goal>compile</goal>
            </goals>
        </execution>
    </executions>
</plugin>
THE CONFIGURATION
Of course under the hood we have Log4j2, so we need to create a specific log4j2.xml to tell the application which configuration to use for logging:
<Configuration>
    <Appenders>
        <Console name="JsonAppender" target="SYSTEM_OUT">
            <JsonTemplateLayout eventTemplateUri="classpath:LambdaJsonLayout.json" />
        </Console>
    </Appenders>
    <Loggers>
        <Root level="DEBUG">
            <AppenderRef ref="JsonAppender"/>
        </Root>
        <Logger name="software.amazon.awssdk" level="WARN" />
        <Logger name="software.amazon.awssdk.request" level="DEBUG" />
    </Loggers>
</Configuration>
With this (recommended) configuration you set up a console appender, but the layout it uses is a JsonTemplateLayout.
You can open the JSON template to see what will be logged and which variables (most of them taken from the context) are used to build the final message. They can of course be changed, by creating your own template or using other configurations you can find on the library site.
Using the log is quite easy, because you can use a Logger created statically in each class; at the handler level you can also inject some values and log the event that arrives before the invocation.
Here is an example of configuration:
@Logging(logEvent = true, correlationIdPath = "/headers/trace-header")
public APIGatewayProxyResponseEvent handleRequest(final APIGatewayProxyRequestEvent input,
        final Context context) {
The @Logging annotation will log the whole event before calling handleRequest, because logEvent is true. To add a correlation id we can use correlationIdPath, passing a path. The path refers to what is passed as the event of the invocation, so here we rely on the fact that there is a field named headers (the headers of the request, of course) and inside it we take the value of "trace-header", assuming it is always passed.
Looking at the message (you will find it in CloudWatch) you can choose any path, but a header is usually the easiest way to implement it.
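To make the path concrete: /headers/trace-header is resolved against the incoming event. An API Gateway proxy event looks roughly like this (heavily trimmed, with hypothetical values):

```json
{
  "httpMethod": "GET",
  "path": "/hello",
  "headers": {
    "Content-Type": "application/json",
    "trace-header": "rametta4"
  },
  "body": null
}
```

The pointer first selects the headers object, then the trace-header key, so in this sketch the correlation id would be rametta4.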
To use the logger, you create it as you normally would with Log4j:
private static final Logger log = LogManager.getLogger(App.class);
And then use it whenever you want.
So the other classes will create the Logger they need. The level is of course bound to the Log4j2 configuration file, but there are other ways to override it, which you can always find on the library site.
LOG IN ACTION
Suppose we create a log line like this:
log.debug("The time of execution is {}", executionTime);
We can call the function:
curl -X GET https://xxxxxxxxxx -H "trace-header: rametta4"
And we can open CloudWatch to see it.
This is the logEvent option in action: the message is of course more verbose, but I just want to show that the header is passed.
This is the message logged.
As you can see, the correlation_id is present.
The first message of course does not contain the value of correlation_id, so if we need to include it in our CloudWatch search we can refer to it as message content (this is not the best way, but it is just for testing right now).
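A log line produced with the JSON template looks roughly like the following; the exact field set depends on the template you use, and the values here are hypothetical:

```json
{
  "timestamp": "2023-10-01T10:15:30.123Z",
  "level": "DEBUG",
  "message": "The time of execution is 42",
  "function_name": "MyFunction",
  "cold_start": false,
  "correlation_id": "rametta4"
}
```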
As you can see with the query, we can see both messages.
The real advantage is that even if we log a long String containing a JSON message, CloudWatch (and other tools) will parse the String for us. This means that we can refer to JSON fields using dot notation, or use them directly in the query (like correlation_id, which is a field of the logged JSON).
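For example, a CloudWatch Logs Insights query can filter directly on the parsed field; the field names here follow the JSON output of the template, and the value is the hypothetical one from the curl call above:

```
fields @timestamp, level, message, correlation_id
| filter correlation_id = "rametta4"
| sort @timestamp desc
```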
In a local environment this kind of output is of course not so easy to read, so you may prefer another template or a plain console layout pattern.
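As a sketch, a classic Log4j2 PatternLayout appender could replace the JSON one for local runs (the appender name and the pattern are just examples):

```xml
<Console name="PlainAppender" target="SYSTEM_OUT">
    <PatternLayout pattern="%d{HH:mm:ss.SSS} [%t] %-5level %logger{36} - %msg%n"/>
</Console>
```

Switching the Root AppenderRef to PlainAppender gives human-readable lines locally, while keeping the JSON appender for the deployed function.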
CONCLUSION
First, thanks to the developers who created and (I suppose) maintain the library. Second, starting to use it can reduce the time wasted when setting up a project.
We can of course use plain Log4j code directly, without Powertools, but I think we should give them a try (especially because they offer other tools besides logging).
Here is the site