AWS Lambda Java: Creating Deployment Package for Java 8/11 using Maven in Eclipse – Part 1

August 10, 2020 (updated October 13, 2020) | Powerlearnings

Written by Tejaswee Das, Software Engineer at Powerupcloud Technologies. Collaborator: Neenu Jose, Senior Software Engineer.

Introduction

When we talk about going serverless, AWS Lambda is definitely the service we connect with. Things were never this simple before; AWS Lambda has made life easy for developers and data engineers alike, and you will hardly find an AWS use case that does not involve Lambda somewhere. To know more about AWS Lambda and some simple S3 event use cases around it, have a look at one of my earlier posts on AWS Lambda.

This post is part of a two-part blog series. This Part 1 blog will guide you through the steps to create a Java (Java 8/Java 11) deployment package for AWS Lambda in Eclipse using Maven, and to set up S3 event triggers.

In Part 2, we will walk through using SES with S3 event triggers to Lambda.

Use Case

One of our clients had their workloads running on Azure Cloud. They had a few serverless applications written in Java 8 on Azure Functions and wanted to upgrade them from Java 8 to Java 11. Since Java 11 was not supported there at the time (Java 11 for Azure Functions has recently been released in preview), they wanted to try out other cloud services, and that is where AWS Lambda came into the picture. We did a PoC to check the feasibility of running Java 11 applications on AWS Lambda.

Deployment Steps

Step 1: Install AWS Toolkit in Eclipse

1.1 Open Eclipse → Go to Help → Install New Software

1.2 Enter https://aws.amazon.com/eclipse in the Work with field and select AWS Toolkit for Eclipse Core (Required)

1.3 Click Next and Install

Note: The toolkit requires Eclipse 4.4 (Luna) or higher.

Step 2: Create New AWS Lambda Function

You might need to restart Eclipse for the installation to take effect. Add your AWS access keys when prompted. This step is optional; you can add or remove AWS credentials/accounts at any time from the Preferences menu.

2.1 Go to File → New → Other…

2.2 Select AWS → AWS Lambda Java Project → Next

2.3 Fill in your Project Name and other details

Class Name is your Lambda handler. In Lambda terms, it is like the main function of your project. You can use any name here; we are using the default.

For our demo we are using the built-in S3 Event. There are plenty of other event types to choose from, such as DynamoDB Event, Stream Request Handler, SNS Event, Kinesis Event, Cognito Event, or even Custom if you want to build from scratch.
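If you pick a different event type, essentially only the handler's type parameters change. As a rough illustration (this is not toolkit-generated code, and the class name is made up), an SNS-triggered handler could look something like this:

package com.amazonaws.lambda.demo;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.SNSEvent;

public class SnsFunctionHandler implements RequestHandler<SNSEvent, String> {

    @Override
    public String handleRequest(SNSEvent event, Context context) {
        // Each SNS notification arrives as a record; log and return its message body.
        String message = event.getRecords().get(0).getSNS().getMessage();
        context.getLogger().log("SNS message: " + message);
        return message;
    }
}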

2.4 Click Finish

For our demonstration and test purposes, you can go with the default code. We are using us-east-1. Make sure you set the region in the client builder; you might encounter an error if it is not added.

Sample Code

package com.amazonaws.lambda.demo;

import com.amazonaws.regions.Regions;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;

public class LambdaFunctionHandler implements RequestHandler<S3Event, String> {

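	// Set the region explicitly (us-east-1 for this demo); you might hit an error if it is not set.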
	private AmazonS3 s3 = AmazonS3ClientBuilder.standard()
    		.withRegion(Regions.US_EAST_1)
    		.build();

    public LambdaFunctionHandler() {}

    // Test purpose only.
    LambdaFunctionHandler(AmazonS3 s3) {
        this.s3 = s3;
    }

    @Override
    public String handleRequest(S3Event event, Context context) {
        context.getLogger().log("Received event: " + event);

        // Get the object from the event and show its content type
        String bucket = event.getRecords().get(0).getS3().getBucket().getName();
        String key = event.getRecords().get(0).getS3().getObject().getKey();
        try {
            S3Object response = s3.getObject(new GetObjectRequest(bucket, key));
            String contentType = response.getObjectMetadata().getContentType();
            context.getLogger().log("CONTENT TYPE: " + contentType);
            return contentType;
        } catch (Exception e) {
            e.printStackTrace();
            context.getLogger().log(String.format(
                "Error getting object %s from bucket %s. Make sure they exist and"
                + " your bucket is in the same region as this function.", key, bucket));
            throw e;
        }
    }
}
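The package-private constructor above exists so that a mocked S3 client can be injected in tests. As a quick illustration, here is a minimal local unit-test sketch, separate from the LambdaFunctionHandlerTest the toolkit generates. It assumes JUnit 4 and Mockito from the generated pom, and uses Mockito deep stubs so we do not have to hand-build the nested S3 event records; the test class name and bucket/key values are made up:

package com.amazonaws.lambda.demo;

import static org.mockito.Mockito.*;

import org.junit.Assert;
import org.junit.Test;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.ObjectMetadata;
import com.amazonaws.services.s3.model.S3Object;

public class LambdaFunctionHandlerSketchTest {

    @Test
    public void returnsContentTypeFromMockedS3() {
        // Deep stubs let event.getRecords().get(0).getS3()... return nested mocks.
        S3Event event = mock(S3Event.class, RETURNS_DEEP_STUBS);
        when(event.getRecords().get(0).getS3().getBucket().getName()).thenReturn("test-bucket");
        when(event.getRecords().get(0).getS3().getObject().getKey()).thenReturn("test.jpg");

        // Mock the S3 client so no real AWS call is made.
        AmazonS3 s3 = mock(AmazonS3.class);
        S3Object s3Object = mock(S3Object.class);
        ObjectMetadata metadata = new ObjectMetadata();
        metadata.setContentType("image/jpeg");
        when(s3Object.getObjectMetadata()).thenReturn(metadata);
        when(s3.getObject(any(GetObjectRequest.class))).thenReturn(s3Object);

        // Context is only used for logging here, so a deep-stubbed mock is enough.
        Context context = mock(Context.class, RETURNS_DEEP_STUBS);

        String contentType = new LambdaFunctionHandler(s3).handleRequest(event, context);
        Assert.assertEquals("image/jpeg", contentType);
    }
}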

Step 4: Java Runtime Environment (JRE)

4.1 Go to Window → Preferences → Java → Installed JREs

Make sure a JDK/JRE matching your target runtime (Java 8 or Java 11) is installed and selected here.

Step 5: Maven Build

Now for the final Eclipse step, where we will build our deployment package.

5.1 Right click on your project in the Project Explorer → Run As → Maven Build

5.2 Edit Configuration & Launch

Enter ‘package’ in Goals. Select your JRE; everything else can be left as default.

5.3 Run

Your build should go through without any errors for Java 8, but with Java 11 you might run into a few errors. Make sure you use an updated mockito-core.

In the pom.xml of the generated project, change the version of mockito-core:

    <dependency>
      <groupId>org.mockito</groupId>
      <artifactId>mockito-core</artifactId>
      <!-- change the generated version (2.7.22) to 3.3.3 -->
      <version>3.3.3</version>
      <scope>test</scope>
    </dependency>

This version change is necessary for the Java 11 build to work.
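Depending on how the generated pom.xml configures the compiler, you may also need to raise the compiler level to 11 for the Java 11 build; this goes beyond the mockito change the post covers, but a typical maven-compiler-plugin configuration (plugin version here is illustrative) would be:

<plugin>
  <groupId>org.apache.maven.plugins</groupId>
  <artifactId>maven-compiler-plugin</artifactId>
  <version>3.8.1</version>
  <configuration>
    <!-- Compile for Java 11; older plugin versions use <source>/<target> instead of <release> -->
    <release>11</release>
  </configuration>
</plugin>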

On a successful build, you should see output similar to the following:

Sample build

[INFO] Scanning for projects...
[INFO] 
[INFO] ------< com.amazonaws.lambda:demo >----------------------
[INFO] Building demo 1.0.0
[INFO] -----------------[ jar ]---------------------------------
[INFO] 
[INFO] -- maven-resources-plugin:2.6:resources (default-resources) @ demo ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 0 resource
[INFO] 
[INFO] --- maven-compiler-plugin:3.6.0:compile (default-compile) @ demo ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ demo ---
[WARNING] Using platform encoding (Cp1252 actually) to copy filtered resources, i.e. build is platform dependent!
[INFO] Copying 1 resource
[INFO] 
[INFO] --- maven-compiler-plugin:3.6.0:testCompile (default-testCompile) @ demo ---
[INFO] Nothing to compile - all classes are up to date
[INFO] 
[INFO] --- maven-surefire-plugin:2.12.4:test (default-test) @ demo ---
[INFO] Surefire report directory: D:\Eclipse Workspace\Maven\demo-blog-s3\target\surefire-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running com.amazonaws.lambda.demo.LambdaFunctionHandlerTest
Received event: com.amazonaws.services.lambda.runtime.events.S3Event@124ac145
CONTENT TYPE: image/jpeg
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.117 sec

[INFO] 
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ demo ---
[INFO] Downloading from : https://repo.maven.apache.org/maven2/org/apache/maven/maven-archiver/2.5/maven-archiver-2.5.pom
.
.
.
.
.
[INFO] Replacing original artifact with shaded artifact.
[INFO] Replacing D:\Eclipse Workspace\Maven\demo-blog-s3\target\demo-1.0.0.jar with D:\Eclipse Workspace\Maven\demo-blog-s3\target\demo-1.0.0-shaded.jar
[INFO] Dependency-reduced POM written at: D:\Eclipse Workspace\Maven\demo-blog-s3\dependency-reduced-pom.xml
[INFO] ----------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ----------------------------------------------------------------
[INFO] Total time:  03:10 min
[INFO] Finished at: 2020-08-03T14:53:09+05:30
[INFO] ---------------------------------------------------------------

Look at the path in the “Replacing ... with ... shaded.jar” line above to locate your .jar file. The exact location depends on your workspace directory configuration.

My Eclipse workspace directory here is: D:\Eclipse Workspace\Maven\

Step 6: Create Test S3 Bucket

Create an S3 bucket, for uploading the files, from the AWS console.

Make sure this bucket is in the same region where you are planning to create the Lambda Function.

Step 7: Creating Lambda Function in AWS Console

There are a couple of ways to create Lambda functions. The easiest is through the AWS Console: choose a runtime, create the function, and get going. You get plenty of runtimes to choose from and can start writing code right in the console, which works great for interpreted languages such as Python, Node.js and Ruby.

For compiled languages like Java, Go and .NET, however, the console does not allow in-line editing, so you will need to upload a deployment package.

There are also ways to deploy Lambda functions directly from Eclipse, but we faced issues with that, so to get our task done we created a deployment package (a .jar in Java) and uploaded it to Lambda. That works great.

7.1 S3 Event Triggers & IAM

Please refer to one of my previous posts.

Follow solution steps 1 & 2 there: create the required execution role and attach the policies that grant the required permissions.

Step 8: Deploying Lambda Function

8.1 With all the setup done, you just need to upload the Maven-built jar file from Step 5.3 here. Make sure the function’s Handler is set to your handler class and method, in this case com.amazonaws.lambda.demo.LambdaFunctionHandler::handleRequest.

You can upload the file directly if it is smaller than 10 MB; for larger files, upload via Amazon S3.

Great! Your code is deployed successfully. Time to test it now.

Step 9: Testing

9.1 To test your deployment, go to S3.

9.2 Go to the bucket that you created in Step 6 and configured as a trigger in 7.1.

Upload a test file

9.3 Go back to your Lambda Function. Click on Monitoring → View logs in CloudWatch

You can see the S3 trigger event logs here. When the file was uploaded to the bucket, it triggered the Lambda function.

Conclusion

What we have demonstrated here is a very simple proof of concept, mainly to get AWS Lambda working with Java 11 for our client. In the next part of this series, we will demonstrate more things you can do with AWS Lambda using Java 8/11, such as using the AWS Java SDK to send email notifications on file upload to S3.

Hope this was informative. We had a tough time finding the right resources, so we decided to write this to help folks out there working with different Java versions and AWS Lambda.

References

https://aws.amazon.com/eclipse/
