Contents

  1. LocalStack
    1. AWS CLI installation
    2. LocalStack in Debian
    3. Check LocalStack services and health
    4. Python Lambda and S3
    5. Access LocalStack from a Docker container
    6. Java 21 Lambda handler
    7. SPA app + API Gateway + Lambda function
    8. DynamoDB
    9. Fat Zip Python Lambda

LocalStack

https://github.com/localstack/localstack

LocalStack is a cloud service emulator that runs in a single container on your laptop or in your CI environment. With LocalStack, you can run your AWS applications or Lambdas entirely on your local machine without connecting to a remote cloud provider.
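
A quick way to confirm the emulator is reachable from code is to point an AWS SDK client at the edge endpoint. Below is a minimal sketch in Python with boto3, assuming the default edge port 4566 and the dummy test credentials used throughout this page; the bucket name is only an example.

    # local_s3_check.py - minimal sketch: point boto3 at the LocalStack edge endpoint
    # Assumes LocalStack listens on http://localhost:4566 and accepts dummy credentials.
    import boto3

    s3 = boto3.client(
        "s3",
        endpoint_url="http://localhost:4566",   # LocalStack edge endpoint
        aws_access_key_id="test",
        aws_secret_access_key="test",
        region_name="us-east-1",
    )

    s3.create_bucket(Bucket="demo-bucket")       # created inside LocalStack only
    print([b["Name"] for b in s3.list_buckets()["Buckets"]])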

AWS CLI installation

    # https://aws.amazon.com/cli/
    cd ~/Downloads
    curl "https://awscli.amazonaws.com/awscli-exe-linux-x86_64.zip" -o "awscliv2.zip"
    unzip awscliv2.zip
    sudo ./aws/install
    /usr/local/bin/aws --version
    # aws-cli/2.33.12 Python/3.13.11 Linux/6.1.0-42-amd64 exe/x86_64.debian.12

    export AWS_ACCESS_KEY_ID="test"
    export AWS_SECRET_ACCESS_KEY="test"
    export AWS_DEFAULT_REGION="us-east-1"
    export AWS_ENDPOINT_URL="http://localhost:4566/"

LocalStack in Debian

    docker run --rm -it -p 127.0.0.1:4566:4566 -p 127.0.0.1:4510-4559:4510-4559 \
      -v /var/run/docker.sock:/var/run/docker.sock --name localstack localstack/localstack

    # LocalStack version: 4.13.1.dev6
    sudo apt install jq
    curl http://localhost:4566/_localstack/health | jq .

    export AWS_ACCESS_KEY_ID="test"
    export AWS_SECRET_ACCESS_KEY="test"
    export AWS_DEFAULT_REGION="us-east-1"
    export AWS_ENDPOINT_URL="http://localhost:4566/"
    aws s3 ls
    aws s3api create-bucket --bucket my-bucket
    # https://docs.aws.amazon.com/cli/latest/reference/s3api/
    echo "test" > test.txt
    aws s3api put-object --bucket my-bucket --key dir-1/test.txt --body test.txt
    aws s3api get-object --bucket my-bucket --key dir-1/test.txt test2.txt
    cat test2.txt

Check LocalStack services and health

    curl -s http://localhost:4566/_localstack/health | jq .

    {
      "services": {
        "acm": "available",
        "apigateway": "available",
        "cloudformation": "available",
        "cloudwatch": "running",
        "config": "available",
        "dynamodb": "available",
        "dynamodbstreams": "available",
        "ec2": "running",
        "es": "available",
        "events": "available",
        "firehose": "available",
        "iam": "available",
        "kinesis": "available",
        "kms": "available",
        "lambda": "running",
        "logs": "running",
        "opensearch": "available",
        "redshift": "available",
        "resource-groups": "available",
        "resourcegroupstaggingapi": "available",
        "route53": "available",
        "route53resolver": "available",
        "s3": "running",
        "s3control": "available",
        "scheduler": "available",
        "secretsmanager": "available",
        "ses": "available",
        "sns": "available",
        "sqs": "available",
        "ssm": "available",
        "stepfunctions": "available",
        "sts": "running",
        "support": "available",
        "swf": "available",
        "transcribe": "available"
      },
      "edition": "community",
      "version": "4.13.1.dev6"
    }
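
In CI it is useful to block until the services you need are up before running tests. A small sketch against the health endpoint shown above; the set of required services is only an example:

    # wait_for_localstack.py - poll the health endpoint until the required services are up
    import time
    import requests

    REQUIRED = {"s3", "lambda", "dynamodb"}          # example set of services

    def wait_for_localstack(url="http://localhost:4566/_localstack/health", timeout=60):
        deadline = time.time() + timeout
        while time.time() < deadline:
            try:
                services = requests.get(url, timeout=2).json().get("services", {})
                # a service is usable once it reports "available" or "running"
                if all(services.get(s) in ("available", "running") for s in REQUIRED):
                    return True
            except requests.RequestException:
                pass                                  # edge port not up yet
            time.sleep(2)
        raise TimeoutError("LocalStack did not become healthy in time")

    if __name__ == "__main__":
        wait_for_localstack()
        print("LocalStack is ready")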

Python Lambda and S3

Lambda image https://github.com/aws/aws-lambda-base-images/tree/python3.13

  • public.ecr.aws/lambda/python:3.13

run.sh

    export AWS_ACCESS_KEY_ID="test"
    export AWS_SECRET_ACCESS_KEY="test"
    export AWS_DEFAULT_REGION="us-east-1"
    export AWS_ENDPOINT_URL="http://localhost:4566/"

    zip py-my-function.zip lambda_function.py
    aws lambda delete-function --function-name py-my-function
    aws lambda create-function --function-name py-my-function \
      --zip-file fileb://py-my-function.zip --handler lambda_function.lambda_handler \
      --runtime python3.13 --role arn:aws:iam::000000000000:role/lambda-ex \
      --timeout 30
    PAYLOAD=$( echo "{ \"first_name\": \"Bob\",\"last_name\":\"Squarepants\" }" | base64 )
    aws lambda invoke --function-name py-my-function \
      --payload $PAYLOAD \
      response.json
    cat response.json

    aws s3api get-object --bucket examplebucket --key examplebucket/response.txt r.txt

lambda_function.py

    import boto3
    import os

    def lambda_handler(event, context):
        message = 'Hello {} {}!'.format(event['first_name'], event['last_name'])

        # boto3 inside the LocalStack Lambda container is already wired to LocalStack
        session = boto3.session.Session()
        s3 = session.client(service_name='s3')

        # list the buckets that currently exist in LocalStack
        buckets = []
        for bucket in s3.list_buckets()['Buckets']:
            buckets.append(bucket['Name'])

        s3.create_bucket(Bucket='examplebucket')

        body = {
            'message': message,
            'buckets': buckets,
            'AWS_ACCESS_KEY_ID': os.environ['AWS_ACCESS_KEY_ID'],
            'AWS_SECRET_ACCESS_KEY': os.environ['AWS_SECRET_ACCESS_KEY'],
            'AWS_DEFAULT_REGION': os.environ['AWS_DEFAULT_REGION'],
            'AWS_ENDPOINT_URL': os.environ['AWS_ENDPOINT_URL']
        }

        # store the response in the bucket and also return it to the caller
        s3.put_object(Body=str(body), Bucket='examplebucket', Key='examplebucket/response.txt')
        return body
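
The same deployment can be exercised from Python instead of the CLI. A minimal sketch, assuming the function and bucket created by run.sh above and that the AWS_* variables are exported (recent boto3 versions honour AWS_ENDPOINT_URL):

    # invoke_py_lambda.py - sketch: invoke py-my-function and read the object it wrote
    import json
    import boto3

    lam = boto3.client("lambda")   # endpoint/credentials from the AWS_* environment variables
    s3 = boto3.client("s3")

    resp = lam.invoke(
        FunctionName="py-my-function",
        Payload=json.dumps({"first_name": "Bob", "last_name": "Squarepants"}),
    )
    print(json.loads(resp["Payload"].read()))

    # the handler also stored its response body in S3
    obj = s3.get_object(Bucket="examplebucket", Key="examplebucket/response.txt")
    print(obj["Body"].read().decode())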

Access LocalStack from a Docker container

    docker run -d --name localstack --rm -it -p 4566:4566 -p 4571:4571 \
      -v /var/run/docker.sock:/var/run/docker.sock localstack/localstack # run container
    docker exec -it localstack bash # connect to the LocalStack container
    cat /etc/os-release | grep -i pretty
    # PRETTY_NAME="Debian GNU/Linux 13 (trixie)"

    curl http://localhost:4566/_localstack/health
    awslocal s3api list-buckets
    awslocal s3api create-bucket --bucket my-bucket
    echo "test" > test.txt
    awslocal s3api put-object --bucket my-bucket --key dir-1/test.txt --body test.txt
    awslocal s3api get-object --bucket my-bucket --key dir-1/test.txt test2.txt
    cat test2.txt
    apt install nano vim yajl-tools -y
    # https://hub.docker.com/r/localstack/localstack
    # https://github.com/localstack/localstack
    node -v
    # v22.22.0
    python -V
    # Python 3.13.11
    pip3 freeze
    curl http://localhost:4566/_localstack/health | json_reformat
    awslocal ec2 run-instances --image-id prod-df2jln3gjtwps --count 1 --instance-type t2.micro
    awslocal ec2 describe-instances --filters "Name=instance-type,Values=t2.micro" --query "Reservations[].Instances[].InstanceId"
    awslocal ec2 describe-instances

Java 21 Lambda handler

Lambda image https://github.com/aws/aws-lambda-base-images/tree/java21

  • public.ecr.aws/lambda/java:21

Steps

    mkdir -p ~/Documents/Java8LambdaHandler
    cd ~/Documents/Java8LambdaHandler
    mkdir -p src/main/java/com/mooo/bitarus/

build.sh

    FUNCTION_NAME=lambda-function
    aws lambda delete-function --function-name $FUNCTION_NAME
    sleep 5
    mvn clean install
    sleep 5
    aws lambda create-function --function-name $FUNCTION_NAME \
      --zip-file fileb://target/lambda-function-1.0-SNAPSHOT.jar \
      --handler com.mooo.bitarus.Handler --runtime java21 \
      --role arn:aws:iam::000000000000:role/lambda-ex --timeout 30
    #awslocal lambda update-function-configuration --function-name $FUNCTION_NAME \
    #  --timeout 15
    sleep 15
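
Instead of the trailing sleep 15, the script can wait until the function state is Active (the commented aws lambda wait function-active-v2 line in run.sh below is the CLI equivalent). A boto3 sketch, assuming the AWS_* variables point at LocalStack:

    # wait_active.py - block until lambda-function reaches the Active state
    import boto3

    lam = boto3.client("lambda")   # endpoint/credentials from the AWS_* environment variables
    lam.get_waiter("function_active_v2").wait(FunctionName="lambda-function")
    print("lambda-function is Active")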

latest_log.sh

    LOG_GROUP="/aws/lambda/lambda-function"
    LOG_STREAM=$(aws logs describe-log-streams \
      --log-group-name $LOG_GROUP \
      --order-by LastEventTime --descending | \
      grep logStreamName | head -1 | awk '{print $2}' | sed "s/,//g" | sed 's/\"//g' )
    echo $LOG_GROUP
    echo $LOG_STREAM
    aws logs get-log-events --log-group-name $LOG_GROUP \
      --log-stream-name "$LOG_STREAM" \
      | grep message | sed 's/"message"\://g' | sed 's/             //g'
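
The same lookup is less fragile through the CloudWatch Logs API than through grep/sed. A boto3 sketch, assuming the same log group name:

    # latest_log.py - print the messages of the newest log stream of the Lambda log group
    import boto3

    LOG_GROUP = "/aws/lambda/lambda-function"
    logs = boto3.client("logs")    # endpoint/credentials from the AWS_* environment variables

    streams = logs.describe_log_streams(
        logGroupName=LOG_GROUP, orderBy="LastEventTime", descending=True, limit=1
    )["logStreams"]
    if streams:
        events = logs.get_log_events(
            logGroupName=LOG_GROUP, logStreamName=streams[0]["logStreamName"]
        )["events"]
        for event in events:
            print(event["message"].rstrip())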

pom.xml

    <project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
      xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/maven-v4_0_0.xsd">
      <modelVersion>4.0.0</modelVersion>
      <groupId>com.mooo.bitarus</groupId>
      <artifactId>lambda-function</artifactId>
      <packaging>jar</packaging>
      <version>1.0-SNAPSHOT</version>
      <name>lambda-function</name>
      <properties>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.compiler.source>21</maven.compiler.source>
        <maven.compiler.target>21</maven.compiler.target>
      </properties>
      <dependencies>
        <dependency>
          <groupId>com.amazonaws</groupId>
          <artifactId>aws-lambda-java-core</artifactId>
          <version>1.3.0</version>
        </dependency>
        <dependency>
          <groupId>com.google.code.gson</groupId>
          <artifactId>gson</artifactId>
          <version>2.10.1</version>
        </dependency>
      </dependencies>
      <build>
        <plugins>
          <plugin>
            <artifactId>maven-surefire-plugin</artifactId>
            <version>2.22.2</version>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-shade-plugin</artifactId>
            <version>3.2.2</version>
            <configuration>
              <createDependencyReducedPom>false</createDependencyReducedPom>
            </configuration>
            <executions>
              <execution>
                <phase>package</phase>
                <goals>
                  <goal>shade</goal>
                </goals>
              </execution>
            </executions>
          </plugin>
          <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <version>3.8.1</version>
            <configuration>
              <source>21</source>
              <target>21</target>
            </configuration>
          </plugin>
        </plugins>
      </build>
    </project>

run.sh

    PAYLOAD=$( echo "{\"first_name\": \"Bob\",\"last_name\":\"Marley\"}" | base64 )
    #aws lambda wait function-active-v2 --function-name lambda-function
    aws lambda invoke --function-name lambda-function \
      --payload $PAYLOAD response.json
    cat response.json

Handler.java

    // src/main/java/com/mooo/bitarus/Handler.java
    package com.mooo.bitarus;

    import com.amazonaws.services.lambda.runtime.Context;
    import com.amazonaws.services.lambda.runtime.RequestHandler;
    import com.amazonaws.services.lambda.runtime.LambdaLogger;
    import com.google.gson.Gson;
    import com.google.gson.GsonBuilder;
    import java.util.Map;
    import java.util.HashMap;

    public class Handler implements RequestHandler<Map<String, String>, String> {
      Gson gson = new GsonBuilder().setPrettyPrinting().create();

      @Override
      public String handleRequest(Map<String, String> event, Context context) {
        LambdaLogger logger = context.getLogger();
        System.out.println(">>> sout test");
        logger.log("Stuff logged");
        String response = "Java Lambda invocation response 20260131";
        logger.log(event.get("first_name"));
        logger.log("EVENT TYPE: " + event.getClass());
        Map<String, String> hashReturn = new HashMap<>();
        hashReturn.put("response", response);
        return gson.toJson(hashReturn);
      }
    }

SPA app + API Gateway + Lambda function

To host a SPA (Single Page Application) in LocalStack that uses API Gateway, we must simulate the AWS environment in which Amazon S3 acts as a static file web server and API Gateway acts as the backend that invokes Lambda functions.

LocalStack complete flow:

  • The user opens the S3 website URL in the browser.
  • The browser downloads index.html and the SPA JavaScript from S3 (LocalStack).
  • The SPA makes a POST call to an API Gateway endpoint (LocalStack).
  • API Gateway triggers a Java Lambda (which uppercases the submitted string).
  • Lambda returns JSON and the SPA updates its screen with the JSON data (see the sketch after this list).
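
A rough sketch of that wiring with boto3 follows. It assumes the AWS_* variables point at LocalStack and that a function named uppercase-function already exists and returns an API Gateway proxy response ({"statusCode": ..., "body": ...}); the bucket, resource path, stage name and printed URLs are examples, and the exact LocalStack hostnames depend on its version.

    # spa_wiring.py - sketch of the S3 website + API Gateway + Lambda wiring on LocalStack
    import boto3

    REGION = "us-east-1"
    s3 = boto3.client("s3")
    apigw = boto3.client("apigateway")
    lam = boto3.client("lambda")

    # 1. S3 bucket serving the SPA as a static website
    s3.create_bucket(Bucket="spa-bucket")
    s3.put_bucket_website(
        Bucket="spa-bucket",
        WebsiteConfiguration={"IndexDocument": {"Suffix": "index.html"}},
    )
    s3.put_object(
        Bucket="spa-bucket", Key="index.html",
        Body="<html><body>SPA placeholder</body></html>", ContentType="text/html",
    )

    # 2. REST API with a POST /uppercase method proxied to the Lambda
    api_id = apigw.create_rest_api(name="spa-api")["id"]
    root_id = next(r["id"] for r in apigw.get_resources(restApiId=api_id)["items"] if r["path"] == "/")
    res_id = apigw.create_resource(restApiId=api_id, parentId=root_id, pathPart="uppercase")["id"]
    apigw.put_method(restApiId=api_id, resourceId=res_id, httpMethod="POST", authorizationType="NONE")

    fn_arn = lam.get_function(FunctionName="uppercase-function")["Configuration"]["FunctionArn"]
    apigw.put_integration(
        restApiId=api_id, resourceId=res_id, httpMethod="POST",
        type="AWS_PROXY", integrationHttpMethod="POST",
        uri=f"arn:aws:apigateway:{REGION}:lambda:path/2015-03-31/functions/{fn_arn}/invocations",
    )
    apigw.create_deployment(restApiId=api_id, stageName="dev")

    # LocalStack-style URLs (exact hostnames depend on the LocalStack version)
    print("SPA: http://spa-bucket.s3-website.localhost.localstack.cloud:4566/")
    print(f"API: http://localhost:4566/restapis/{api_id}/dev/_user_request_/uppercase")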

DynamoDB

DynamoDB is a key-value and document NoSQL database.

    # create table
    aws dynamodb create-table --table-name Contacts \
        --attribute-definitions AttributeName=Email,AttributeType=S \
        --key-schema AttributeName=Email,KeyType=HASH \
        --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5
    # list tables
    aws dynamodb list-tables

    # insert an item
    aws dynamodb put-item --table-name Contacts \
        --item '{
            "Email": {"S": "john@example.org"},
            "Name": {"S": "John Doe"},
            "Phone": {"S": "912345678"}
          }'

    # all elements in table
    aws dynamodb scan --table-name Contacts

    # table scan on the reserved word Name
    aws dynamodb scan --table-name Contacts \
        --filter-expression "contains(#n, :v)" \
        --expression-attribute-names '{"#n": "Name"}' \
        --expression-attribute-values '{":v": {"S": "Doe"}}'

    # table scan
    aws dynamodb scan --table-name Contacts \
        --filter-expression "contains(Email, :n)" \
        --expression-attribute-values '{":n": {"S": "exam"}}'

    # query using exact match and contains match
    aws dynamodb query --table-name Contacts \
        --key-condition-expression "Email = :e" \
        --filter-expression "contains(Phone, :t)" \
        --expression-attribute-values '{
            ":e": {"S": "john@example.org"},
            ":t": {"S": "912"}
        }'
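
The same table can be used from Python. A boto3 sketch, assuming the Contacts table created above and the AWS_* variables from the earlier sections; the inserted contact is an example:

    # contacts_dynamodb.py - sketch: put and query the Contacts table with boto3
    import boto3

    ddb = boto3.client("dynamodb")   # endpoint/credentials from the AWS_* environment variables

    ddb.put_item(
        TableName="Contacts",
        Item={
            "Email": {"S": "jane@example.org"},
            "Name": {"S": "Jane Doe"},
            "Phone": {"S": "934567890"},
        },
    )

    # exact match on the hash key plus a contains() filter, same as the CLI query above
    resp = ddb.query(
        TableName="Contacts",
        KeyConditionExpression="Email = :e",
        FilterExpression="contains(Phone, :t)",
        ExpressionAttributeValues={":e": {"S": "jane@example.org"}, ":t": {"S": "934"}},
    )
    for item in resp["Items"]:
        print(item["Name"]["S"], item["Phone"]["S"])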

Fat Zip Python Lambda

run-lambda.sh

    export AWS_ACCESS_KEY_ID="test"
    export AWS_SECRET_ACCESS_KEY="test"
    export AWS_DEFAULT_REGION="us-east-1"
    export AWS_ENDPOINT_URL="http://localhost:4566/"

    FUNCTION_NAME=lambda-pg-function
    aws lambda invoke --function-name $FUNCTION_NAME response.json
    echo "Lambda output"
    cat response.json

requirements.txt

    pg8000
    requests

test_lambda.py

    import unittest
    import json
    from unittest.mock import patch, MagicMock
    from lambda_function import lambda_handler

    class TestLambda(unittest.TestCase):

        @patch('pg8000.native.Connection')
        @patch('requests.get')
        def test_handler_success(self, mock_get, mock_conn):
            # Mocking the API response
            mock_get.return_value.status_code = 200

            # Mocking the DB response
            mock_instance = MagicMock()
            mock_instance.run.return_value = [["PostgreSQL 15.0"]]
            mock_conn.return_value = mock_instance

            # Execute lambda handler
            event = {}
            context = None
            response = lambda_handler(event, context)
            body = json.loads(response['body'])
            # Assertions
            self.assertEqual(body['api_status'], 200)
            self.assertIn('PostgreSQL 15.0', response['body'])

    if __name__ == '__main__':
        unittest.main()

start-servers.sh

    #!/bin/sh
    NETWORK=mynet
    echo "Creating network"
    docker network create $NETWORK
    echo "Launch localstack"
    docker run --rm -it -d -p 127.0.0.1:4566:4566 -p 127.0.0.1:4510-4559:4510-4559 \
      -v /var/run/docker.sock:/var/run/docker.sock --network $NETWORK \
      --name localstack localstack/localstack
    echo "Launch postgres"
    docker run -p 54320:5432 --rm --name postgres-server -e POSTGRES_PASSWORD=postgres \
      --network $NETWORK -d postgres:15.3-alpine
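
Before deploying, it helps to check from the host that both containers are reachable: LocalStack on the edge port and Postgres on the published port 54320. A sketch using the same pg8000/requests dependencies as the Lambda. Note that the Lambda itself connects to the container name postgres-server, which assumes the Lambda containers join the mynet network (for example via LocalStack's LAMBDA_DOCKER_NETWORK setting).

    # check_servers.py - verify LocalStack and Postgres are reachable from the host
    # Host-side ports from start-servers.sh: LocalStack on 4566, Postgres published on 54320.
    import pg8000.native
    import requests

    health = requests.get("http://localhost:4566/_localstack/health", timeout=5)
    print("LocalStack:", health.status_code)

    con = pg8000.native.Connection(
        user="postgres", password="postgres", host="localhost", port=54320, database="postgres"
    )
    try:
        print("Postgres:", con.run("SELECT version();")[0][0])
    finally:
        con.close()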

deploy-lambda.sh

    export AWS_ACCESS_KEY_ID="test"
    export AWS_SECRET_ACCESS_KEY="test"
    export AWS_DEFAULT_REGION="us-east-1"
    export AWS_ENDPOINT_URL="http://localhost:4566/"

    FUNCTION_NAME=lambda-pg-function
    aws lambda delete-function --function-name $FUNCTION_NAME
    sleep 5
    aws lambda create-function --function-name $FUNCTION_NAME \
      --zip-file fileb://lambda_deployment.zip --handler lambda_function.lambda_handler \
      --runtime python3.14 --role arn:aws:iam::000000000000:role/lambda-ex \
      --timeout 30

build.sh

    #!/bin/bash
    PACKAGE_NAME="lambda_deployment.zip"
    BUILD_DIR="dist"
    HANDLER_FILE="lambda_function.py"
    TEST_FILE="test_lambda.py"

    echo "Cleaning up old builds"
    rm -rf $BUILD_DIR
    rm -f $PACKAGE_NAME
    mkdir $BUILD_DIR
    echo "Installing dependencies in $BUILD_DIR ..."
    pip install -r requirements.txt -t $BUILD_DIR/
    echo "Running tests..."
    export PYTHONPATH=$PYTHONPATH:$(pwd)/$BUILD_DIR
    python3 -m unittest $TEST_FILE

    if [ $? -eq 0 ]; then
        echo "Tests passed! Proceeding to build..."
    else
        echo "Tests failed. Build aborted."
        exit 1
    fi

    echo "Clean up pycache"
    find $BUILD_DIR -type d -name "__pycache__" -exec rm -rf {} +
    echo "Copy source code"
    cp $HANDLER_FILE $BUILD_DIR/
    echo "Creating the ZIP $PACKAGE_NAME"
    cd $BUILD_DIR
    zip -r ../$PACKAGE_NAME .
    cd ..
    echo "Deployment package ready: $PACKAGE_NAME"

lambda_function.py

    import pg8000.native
    import requests
    import json

    def lambda_handler(event, context):
        # "postgres-server" resolves on the shared docker network (see start-servers.sh)
        con = pg8000.native.Connection(
            user="postgres", password="postgres", host="postgres-server",
            database="postgres", port=5432
        )

        try:
            rows = con.run("SELECT version();")
            # outbound HTTP call using the requests dependency bundled in the fat ZIP
            response = requests.get("https://api.github.com")

            return {
                'statusCode': 200,
                'body': json.dumps({
                    'db_version': rows[0][0],
                    'api_status': response.status_code
                })
            }
        finally:
            con.close()