Contents
LocalStack
https://github.com/localstack/localstack
LocalStack is a cloud service emulator that runs in a single container on your laptop or in your CI environment. With LocalStack, you can run your AWS applications or Lambdas entirely on your local machine without connecting to a remote cloud provider.
AWS CLI environment variables
- AWS_ACCESS_KEY_ID
- AWS_SECRET_ACCESS_KEY
- AWS_SESSION_TOKEN
- AWS_DEFAULT_REGION
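Both the AWS CLI and boto3 pick these variables up automatically; for LocalStack any non-empty dummy values work. A minimal boto3 sketch, assuming the default LocalStack port 4566 (the "test"/"test" values mirror the exports used later in this page):

import os
import boto3

# Dummy credentials are enough for LocalStack; boto3 reads them from the environment.
os.environ.setdefault("AWS_ACCESS_KEY_ID", "test")
os.environ.setdefault("AWS_SECRET_ACCESS_KEY", "test")
os.environ.setdefault("AWS_DEFAULT_REGION", "us-east-1")

s3 = boto3.client("s3", endpoint_url="http://localhost:4566")
print(s3.list_buckets()["Buckets"])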
LocalStack on Debian
docker run --rm -it -p 127.0.0.1:4566:4566 -p 127.0.0.1:4510-4559:4510-4559 \
  -v /var/run/docker.sock:/var/run/docker.sock --name localstack localstack/localstack

# LocalStack version: 4.13.1.dev6
sudo apt install jq
curl http://localhost:4566/_localstack/health | jq .

# https://aws.amazon.com/cli/
cd ~/Downloads
curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
unzip awscliv2.zip
sudo ./aws/install
/usr/local/bin/aws --version
# aws-cli/2.33.12 Python/3.13.11 Linux/6.1.0-42-amd64 exe/x86_64.debian.12

export AWS_ACCESS_KEY_ID="test"
export AWS_SECRET_ACCESS_KEY="test"
export AWS_DEFAULT_REGION="us-east-1"
export AWS_ENDPOINT_URL="http://localhost:4566/"
aws s3 ls
aws s3api create-bucket --bucket my-bucket
# https://docs.aws.amazon.com/cli/latest/reference/s3api/
echo "test" > test.txt
aws s3api put-object --bucket my-bucket --key dir-1/test.txt --body test.txt
aws s3api get-object --bucket my-bucket --key dir-1/test.txt test2.txt
cat test2.txt
Python Lambda and S3
run.sh
export AWS_ACCESS_KEY_ID="test"
export AWS_SECRET_ACCESS_KEY="test"
export AWS_DEFAULT_REGION="us-east-1"
export AWS_ENDPOINT_URL="http://localhost:4566/"

zip py-my-function.zip lambda_function.py
aws lambda delete-function --function-name py-my-function
aws lambda create-function --function-name py-my-function \
  --zip-file fileb://py-my-function.zip --handler lambda_function.lambda_handler \
  --runtime python3.13 --role arn:aws:iam::000000000000:role/lambda-ex \
  --timeout 30
PAYLOAD=$( echo "{ \"first_name\": \"Bob\",\"last_name\":\"Squarepants\" }" | base64 )
aws lambda invoke --function-name py-my-function \
  --payload $PAYLOAD \
  response.json
cat response.json

aws s3api get-object --bucket examplebucket --key examplebucket/response.txt r.txt
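The base64 pipe above is needed because AWS CLI v2 expects binary payloads base64-encoded by default. From boto3 the raw JSON bytes can be passed directly; a small sketch invoking the same function against the LocalStack endpoint:

import json
import boto3

lam = boto3.client("lambda", endpoint_url="http://localhost:4566",
                   aws_access_key_id="test", aws_secret_access_key="test",
                   region_name="us-east-1")

# No base64 needed here: boto3 sends the payload bytes as-is.
resp = lam.invoke(
    FunctionName="py-my-function",
    Payload=json.dumps({"first_name": "Bob", "last_name": "Squarepants"}).encode(),
)
print(json.loads(resp["Payload"].read()))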
lambda_function.py
import boto3
import os

def lambda_handler(event, context):
    message = 'Hello {} {}!'.format(event['first_name'], event['last_name'])
    session = boto3.session.Session()
    s3 = session.client(service_name='s3')

    buckets = []
    for bucket in s3.list_buckets()['Buckets']:
        buckets.append(bucket['Name'])

    s3.create_bucket(Bucket='examplebucket')

    body = {
        'message': message,
        'buckets': buckets,
        'AWS_ACCESS_KEY_ID': os.environ["AWS_ACCESS_KEY_ID"],
        'AWS_SECRET_ACCESS_KEY': os.environ["AWS_SECRET_ACCESS_KEY"],
        'AWS_DEFAULT_REGION': os.environ["AWS_DEFAULT_REGION"],
        'AWS_ENDPOINT_URL': os.environ['AWS_ENDPOINT_URL']
    }

    s3.put_object(Body=str(body), Bucket='examplebucket', Key='examplebucket/response.txt')
    return body
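The handler reads its configuration from the environment, so it can also be smoke-tested locally before zipping it, assuming the LocalStack container is running, lambda_function.py is in the current directory, and the boto3 version is recent enough to honour AWS_ENDPOINT_URL:

import os

# Same dummy values LocalStack's Lambda runtime would provide.
os.environ["AWS_ACCESS_KEY_ID"] = "test"
os.environ["AWS_SECRET_ACCESS_KEY"] = "test"
os.environ["AWS_DEFAULT_REGION"] = "us-east-1"
os.environ["AWS_ENDPOINT_URL"] = "http://localhost:4566/"

import lambda_function

# The handler never touches the context argument, so None is fine for a local call.
print(lambda_function.lambda_handler(
    {"first_name": "Bob", "last_name": "Squarepants"}, None))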
Access LocalStack from inside the Docker container
docker run -d --name localstack --rm -it -p 4566:4566 -p 4571:4571 localstack/localstack # run container
docker exec -it localstack bash # connect to container
cat /etc/os-release | grep -i pretty
# PRETTY_NAME="Debian GNU/Linux 13 (trixie)"

curl http://localhost:4566/_localstack/health
awslocal s3api list-buckets
awslocal s3api create-bucket --bucket my-bucket
echo "test" > test.txt
awslocal s3api put-object --bucket my-bucket --key dir-1/test.txt --body test.txt
awslocal s3api get-object --bucket my-bucket --key dir-1/test.txt test2.txt
cat test2.txt
apt install nano vim yajl-tools -y
# https://hub.docker.com/r/localstack/localstack
# https://github.com/localstack/localstack
node -v
# v22.22.0
python -V
# Python 3.13.11
pip3 freeze
curl http://localhost:4566/_localstack/health | json_reformat
awslocal ec2 run-instances --image-id prod-df2jln3gjtwps --count 1 --instance-type t2.micro
awslocal ec2 describe-instances --filters "Name=instance-type,Values=t2.micro" --query "Reservations[].Instances[].InstanceId"
awslocal ec2 describe-instances
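In CI it helps to wait until the services reported by /_localstack/health are actually up before running the commands above. A small polling sketch using only the standard library (the "available"/"running" status strings and the endpoint URL are assumptions based on the health output shown earlier):

import json
import time
import urllib.request

HEALTH_URL = "http://localhost:4566/_localstack/health"

def wait_for_service(name="s3", timeout=60):
    """Poll the LocalStack health endpoint until the named service is up."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            with urllib.request.urlopen(HEALTH_URL) as resp:
                services = json.load(resp).get("services", {})
            if services.get(name) in ("available", "running"):
                return True
        except OSError:
            pass  # container may still be starting
        time.sleep(2)
    return False

print(wait_for_service("s3"))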
Java 8 Lambda handler
Steps
build.sh
FUNCTION_NAME=lambda-function
aws lambda delete-function --function-name $FUNCTION_NAME
sleep 5
mvn clean install
sleep 5
aws lambda create-function --function-name $FUNCTION_NAME \
  --zip-file fileb://target/lambda-function-1.0-SNAPSHOT.jar \
  --handler com.mooo.bitarus.Handler --runtime java8 \
  --role arn:aws:iam::000000000000:role/lambda-ex --timeout 30
#awslocal lambda update-function-configuration --function-name $FUNCTION_NAME \
#  --timeout 15
sleep 15
latest_log.sh
LOG_GROUP="/aws/lambda/lambda-function"
LOG_STREAM=$(aws logs describe-log-streams \
  --log-group-name $LOG_GROUP \
  --order-by LastEventTime --descending | \
  grep logStreamName | head -1 | awk '{print $2}' | sed "s/,//g" | sed 's/\"//g')
echo $LOG_GROUP
echo $LOG_STREAM
aws logs get-log-events --log-group-name $LOG_GROUP \
  --log-stream-name "$LOG_STREAM" \
  | grep message | sed 's/"message"\://g' | sed 's/ //g'
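The same "latest log stream" lookup can be done without the grep/sed chain by letting boto3 return structured data; a sketch against the LocalStack endpoint:

import boto3

logs = boto3.client("logs", endpoint_url="http://localhost:4566",
                    aws_access_key_id="test", aws_secret_access_key="test",
                    region_name="us-east-1")

LOG_GROUP = "/aws/lambda/lambda-function"

# Newest stream first, then print its messages.
streams = logs.describe_log_streams(
    logGroupName=LOG_GROUP, orderBy="LastEventTime", descending=True)
latest = streams["logStreams"][0]["logStreamName"]

events = logs.get_log_events(logGroupName=LOG_GROUP, logStreamName=latest)
for event in events["events"]:
    print(event["message"])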
pom.xml
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.mooo.bitarus</groupId>
  <artifactId>lambda-function</artifactId>
  <packaging>jar</packaging>
  <version>1.0-SNAPSHOT</version>
  <name>lambda-function</name>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <maven.compiler.source>1.8</maven.compiler.source>
    <maven.compiler.target>1.8</maven.compiler.target>
  </properties>
  <dependencies>
    <dependency>
      <groupId>com.amazonaws</groupId>
      <artifactId>aws-lambda-java-core</artifactId>
      <version>1.3.0</version>
    </dependency>
    <dependency>
      <groupId>com.google.code.gson</groupId>
      <artifactId>gson</artifactId>
      <version>2.10.1</version>
    </dependency>
  </dependencies>
  <build>
    <plugins>
      <plugin>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>2.22.2</version>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>3.2.2</version>
        <configuration>
          <createDependencyReducedPom>false</createDependencyReducedPom>
        </configuration>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.8.1</version>
        <configuration>
          <source>1.8</source>
          <target>1.8</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
run.sh
src/main/java/com/mooo/bitarus/Handler.java
package com.mooo.bitarus;

import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;
import com.amazonaws.services.lambda.runtime.LambdaLogger;
import com.google.gson.Gson;
import com.google.gson.GsonBuilder;
import java.util.Map;
import java.util.HashMap;

public class Handler implements RequestHandler<Map<String, String>, String> {
    Gson gson = new GsonBuilder().setPrettyPrinting().create();

    @Override
    public String handleRequest(Map<String, String> event, Context context) {
        LambdaLogger logger = context.getLogger();
        System.out.println(">>> sout test");
        logger.log("Stuff logged");
        String response = "Java Lambda invocation response 20260131";
        logger.log(event.get("first_name"));
        logger.log("EVENT TYPE: " + event.getClass());
        Map<String, String> hashReturn = new HashMap<String, String>();
        hashReturn.put("response", response);
        return gson.toJson(hashReturn);
    }
}
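The handler takes a flat map of strings and returns a JSON string (gson.toJson), so the Lambda payload has to be decoded twice when invoking it from Python: once for the invocation payload, once for the Gson string inside it. A sketch, using the function name from build.sh and the default LocalStack endpoint:

import json
import boto3

lam = boto3.client("lambda", endpoint_url="http://localhost:4566",
                   aws_access_key_id="test", aws_secret_access_key="test",
                   region_name="us-east-1")

resp = lam.invoke(
    FunctionName="lambda-function",
    Payload=json.dumps({"first_name": "Bob", "last_name": "Squarepants"}).encode(),
)

# First decode gives the JSON string the handler returned, second gives the map.
payload = json.loads(resp["Payload"].read())
print(json.loads(payload))  # e.g. {'response': 'Java Lambda invocation response 20260131'}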
SPA app + API Gateway + Lambda function
To host an SPA (Single Page Application) in LocalStack that uses API Gateway, we must simulate the AWS environment in which Amazon S3 acts as a static file web server and API Gateway acts as the backend that calls Lambda functions.
LocalStack complete flow (a boto3 sketch of this wiring follows the list):
- The user opens the S3 URL in the browser.
- The browser downloads index.html and the SPA JavaScript from S3 (LocalStack).
- The SPA makes a POST call to an API Gateway endpoint (LocalStack).
- API Gateway triggers a Java Lambda (which uppercases the sent string).
- The Lambda returns JSON and the SPA updates its screen with the JSON data.
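A rough boto3 sketch of that wiring: bucket website hosting plus a REST API in front of an already-deployed Lambda. The bucket name, the "upper" path, the "dev" stage and the Lambda name "spa-upper" are placeholders, and the printed website/invoke URL formats are the ones LocalStack typically exposes, so they may vary by LocalStack version.

import boto3

ENDPOINT = "http://localhost:4566"
kwargs = dict(endpoint_url=ENDPOINT, aws_access_key_id="test",
              aws_secret_access_key="test", region_name="us-east-1")
s3 = boto3.client("s3", **kwargs)
apigw = boto3.client("apigateway", **kwargs)
lam = boto3.client("lambda", **kwargs)

# 1. S3 bucket serving the SPA as a static website (assumes index.html is in the current dir)
s3.create_bucket(Bucket="spa-bucket")
s3.put_bucket_website(Bucket="spa-bucket",
                      WebsiteConfiguration={"IndexDocument": {"Suffix": "index.html"}})
s3.put_object(Bucket="spa-bucket", Key="index.html",
              Body=open("index.html", "rb").read(), ContentType="text/html")

# 2. REST API with a POST /upper resource proxying to the Lambda
#    ("spa-upper" is a placeholder for the uppercase Lambda of this section)
lambda_arn = lam.get_function(FunctionName="spa-upper")["Configuration"]["FunctionArn"]
api = apigw.create_rest_api(name="spa-api")
resources = apigw.get_resources(restApiId=api["id"])["items"]
root_id = [r for r in resources if r["path"] == "/"][0]["id"]
res = apigw.create_resource(restApiId=api["id"], parentId=root_id, pathPart="upper")
apigw.put_method(restApiId=api["id"], resourceId=res["id"],
                 httpMethod="POST", authorizationType="NONE")
apigw.put_integration(
    restApiId=api["id"], resourceId=res["id"], httpMethod="POST",
    type="AWS_PROXY", integrationHttpMethod="POST",
    uri=f"arn:aws:apigateway:us-east-1:lambda:path/2015-03-31/functions/{lambda_arn}/invocations")
apigw.create_deployment(restApiId=api["id"], stageName="dev")
# On real AWS an extra lambda add-permission call is needed so API Gateway may invoke the function.

print("SPA: http://spa-bucket.s3-website.localhost.localstack.cloud:4566/")
print(f"API: {ENDPOINT}/restapis/{api['id']}/dev/_user_request_/upper")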
