Using an Amazon S3 trigger to invoke a Lambda function on LocalStack

Rochisha Jaiswal
4 min read · Sep 2, 2021


In this article, we will create a Lambda function and configure a trigger for Amazon Simple Storage Service (Amazon S3). The trigger invokes the function every time we add an object to our S3 bucket.

Lambda can be used to process event notifications from Amazon Simple Storage Service. Amazon S3 can send an event to a Lambda function when an object is created or deleted. It invokes the function asynchronously with an event that contains details about the object.
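The event S3 delivers to Lambda is a JSON document; the handler we write later only needs the bucket name and object key out of it. Here is a minimal Python sketch of that extraction (the event below is a trimmed-down illustration, not a complete S3 payload):

```python
from urllib.parse import unquote_plus

# A trimmed-down S3 "ObjectCreated" event, shaped like what Lambda receives
# (only the fields our handler actually reads).
event = {
    "Records": [
        {
            "eventName": "ObjectCreated:Put",
            "s3": {
                "bucket": {"name": "mybucket"},
                "object": {"key": "My+file.txt"},  # keys arrive URL-encoded
            },
        }
    ]
}

record = event["Records"][0]
src_bucket = record["s3"]["bucket"]["name"]
# Object keys in S3 events are URL-encoded, so decode before using them.
src_key = unquote_plus(record["s3"]["object"]["key"])

print(src_bucket, src_key)  # → mybucket My file.txt
```

This mirrors exactly what the Java handler does with `getBucket().getName()` and `getUrlDecodedKey()`.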

LocalStack provides an easy way to develop AWS cloud applications directly from our localhost. It spins up a testing environment on our local machine that provides largely the same functionality and APIs as the real AWS cloud environment.

Let’s Start

We must have Docker and Docker Compose installed on our system.

docker-compose.yml

version: '2.1'
services:
  localstack:
    container_name: "localstack-image"
    image: localstack/localstack-full
    network_mode: bridge
    ports:
      - "4566:4566"
      - "4571:4571"
      - "8082:8082"
    environment:
      - USE_LIGHT_IMAGE=0
      - DEBUG=1
      - PORT_WEB_UI=8082
      - LAMBDA_EXECUTOR=local
      - DOCKER_HOST=unix:///var/run/docker.sock
      - HOST_TMP_FOLDER=${TMPDIR}
      - START_WEB=1
    volumes:
      - "${TMPDIR:-/tmp/localstack}:/tmp/localstack"
      - "/var/run/docker.sock:/var/run/docker.sock"

Run the following command:

docker-compose up

We will see the LocalStack container start up successfully.

Create a bucket

Once our LocalStack container is up and running, we can open a new terminal and create an S3 bucket using the following command.

aws s3 mb s3://mybucket --region us-west-1 --endpoint-url http://localhost:4566

Create the IAM policy

The IAM policy defines the permissions for the Lambda function. Save the following document as pol.txt, and be sure to replace mybucket with the name of the source bucket that you created previously.

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "logs:PutLogEvents",
        "logs:CreateLogGroup",
        "logs:CreateLogStream"
      ],
      "Resource": "arn:aws:logs:*:*:*"
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject"
      ],
      "Resource": "arn:aws:s3:::mybucket/*"
    }
  ]
}

Run the following command:

aws iam create-policy --policy-name my-pol --policy-document file://pol.txt --endpoint-url http://localhost:4566

Create the execution role

The execution role is the IAM role that Lambda assumes when it runs the function. Its trust policy (the assume-role policy document below) allows the Lambda service to assume the role.

aws iam create-role --role-name lambda-s3-role --assume-role-policy-document '{"Version": "2012-10-17","Statement": [{ "Effect": "Allow", "Principal": {"Service": "lambda.amazonaws.com"}, "Action": "sts:AssumeRole"}]}' --endpoint-url http://localhost:4566

Attach the IAM policy to an IAM role

Attaching the IAM policy to the execution role grants the function the permissions it needs when interacting with other AWS services: here, reading objects from S3 and writing logs.

aws iam attach-role-policy --policy-arn arn:aws:iam::000000000000:policy/my-pol --role-name lambda-s3-role --endpoint-url http://localhost:4566

Create the function code

Copy the following code example into a file named Handler.java (in a Maven project this lives under src/main/java/example/, matching the package declaration).

package example;

import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;

import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.client.builder.AwsClientBuilder.EndpointConfiguration;
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.events.S3Event;
import com.amazonaws.services.s3.AmazonS3;
import com.amazonaws.services.s3.AmazonS3ClientBuilder;
import com.amazonaws.services.s3.event.S3EventNotification.S3EventNotificationRecord;
import com.amazonaws.services.s3.model.GetObjectRequest;
import com.amazonaws.services.s3.model.S3Object;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

public class Handler implements RequestHandler<S3Event, String> {

    private static final Logger logger = LoggerFactory.getLogger(Handler.class);

    @Override
    public String handleRequest(S3Event s3event, Context context) {
        logger.info("Received S3 event");

        // Each event carries one or more records; take the first one.
        S3EventNotificationRecord record = s3event.getRecords().get(0);
        String srcBucket = record.getS3().getBucket().getName();
        // Keys in S3 events are URL-encoded, so decode before using them.
        String srcKey = record.getS3().getObject().getUrlDecodedKey();

        // Point the S3 client at the LocalStack endpoint; any credentials work.
        EndpointConfiguration endpointConfiguration =
                new EndpointConfiguration("http://localhost:4566", "us-west-1");
        BasicAWSCredentials awsCreds = new BasicAWSCredentials("1234", "1234");
        AmazonS3 s3Client = AmazonS3ClientBuilder.standard()
                .withEndpointConfiguration(endpointConfiguration)
                .withCredentials(new AWSStaticCredentialsProvider(awsCreds))
                .withPathStyleAccessEnabled(true)
                .build();

        S3Object object = s3Client.getObject(new GetObjectRequest(srcBucket, srcKey));

        // Print the uploaded object's contents line by line.
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(object.getObjectContent()))) {
            String line;
            while ((line = reader.readLine()) != null) {
                System.out.println(line);
                // your business logic here
            }
        } catch (IOException e) {
            e.printStackTrace();
        }
        return "";
    }
}

Create the deployment package

The deployment package is a .zip file archive or uber (shaded) .jar containing your Lambda function code and its dependencies.
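For mvn package to produce a single deployable jar, the project's pom.xml must declare the Lambda and S3 SDK dependencies the handler imports and bundle them with the shade plugin. Here is a sketch of the relevant pieces; the group/artifact IDs are the standard AWS ones, but the version numbers are illustrative, so check Maven Central for current releases:

```xml
<dependencies>
  <!-- Lambda runtime interfaces: RequestHandler, Context -->
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-core</artifactId>
    <version>1.2.1</version>
  </dependency>
  <!-- S3Event and the other Lambda event types -->
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-lambda-java-events</artifactId>
    <version>2.2.9</version>
  </dependency>
  <!-- AmazonS3 client used to fetch the uploaded object -->
  <dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-s3</artifactId>
    <version>1.11.1034</version>
  </dependency>
  <!-- slf4j API used by the handler's logger -->
  <dependency>
    <groupId>org.slf4j</groupId>
    <artifactId>slf4j-api</artifactId>
    <version>1.7.30</version>
  </dependency>
</dependencies>

<build>
  <plugins>
    <!-- Bundle dependencies into one uber jar during "package" -->
    <plugin>
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <version>3.2.4</version>
      <executions>
        <execution>
          <phase>package</phase>
          <goals><goal>shade</goal></goals>
        </execution>
      </executions>
    </plugin>
  </plugins>
</build>
```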

Run the following command:

mvn package

Create the Lambda function

The create-function command specifies the function handler as example.Handler. The function can use the abbreviated handler format of package.Class because it implements a handler interface.

aws lambda create-function --function-name CreateFunction --zip-file fileb://s3-java-1.0-SNAPSHOT.jar --handler example.Handler --runtime java8 --timeout 10 --memory-size 1024 --role arn:aws:iam::000000000000:role/lambda-s3-role --endpoint-url http://localhost:4566

Configure Amazon S3 to publish events

Add a notification configuration to your source S3 bucket, saved as notification.json. In the notification configuration, you provide the following:
  • The event type for which you want Amazon S3 to publish events. For this tutorial, specify the s3:ObjectCreated:* event type so that Amazon S3 publishes events when objects are created.
  • The function to invoke.
{
  "LambdaFunctionConfigurations": [{
    "LambdaFunctionArn": "arn:aws:lambda:us-east-1:000000000000:function:CreateFunction",
    "Events": ["s3:ObjectCreated:*"]
  }]
}
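If you prefer to generate notification.json instead of hand-writing it, a small Python sketch (the ARN and filename match the ones used in this article):

```python
import json

# Build the structure that put-bucket-notification-configuration expects,
# then write it to notification.json for the CLI command to consume.
config = {
    "LambdaFunctionConfigurations": [
        {
            "LambdaFunctionArn": "arn:aws:lambda:us-east-1:000000000000:function:CreateFunction",
            "Events": ["s3:ObjectCreated:*"],
        }
    ]
}

with open("notification.json", "w") as f:
    json.dump(config, f, indent=2)
```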

Run the following command:

aws s3api put-bucket-notification-configuration --bucket mybucket --notification-configuration file://notification.json --endpoint-url http://localhost:4566

Test using the S3 trigger

Upload a .txt object to the source S3 bucket; the function is invoked and the content of the object is printed to the console (with LAMBDA_EXECUTOR=local, look in the LocalStack container logs).

aws s3 cp Myfile.txt s3://mybucket --endpoint-url http://localhost:4566

Congratulations!

You have now successfully configured Amazon S3 to publish events and trigger Lambda. It’s time to put it to use!

Check out another related article — https://rochisha-jaiswal70.medium.com/using-aws-lambda-with-amazon-simple-queue-service-bb0694257a2b

Dear reader, I hope this was clear and useful. If you found it interesting don’t forget to like this article and follow me to be notified about similar ones in future. See ya!
