
AWS

Amazon Web Services

User credentials

  • https://docs.aws.amazon.com/general/latest/gr/root-vs-iam.html

  • https://docs.aws.amazon.com/general/latest/gr/aws-access-keys-best-practices.html

  • https://docs.aws.amazon.com/IAM/latest/UserGuide/best-practices.html

Instead of sharing the credentials of the AWS account root user, create individual IAM users, granting each user only the permissions they require.

Follow the best practice of using the root user only to create your first IAM user.

There are two types of credentials:

  • Root user credentials allow full access to all resources in the AWS account.
  • IAM user credentials control access to AWS services and resources for individual users in your AWS account.
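
The first of those steps can be sketched with the AWS CLI. This is only an illustration; the user name and the attached policy below are placeholders, not taken from this page.

# Create an individual IAM user instead of sharing root credentials
aws iam create-user --user-name example-dev
# Attach only the permissions this user actually needs (here: read-only S3 access)
aws iam attach-user-policy --user-name example-dev --policy-arn arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess
# Issue programmatic credentials for that user
aws iam create-access-key --user-name example-dev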

Serverless blog web application architecture

  • https://github.com/aws-samples/lambda-refarch-webapp

  • https://s3.amazonaws.com/aws-lambda-serverless-web-refarch/RefArch_BlogApp_Serverless.png

    • Amazon Route 53 (routes to specific places based on region)
    • Amazon CloudFront (deliver static content per region hosted inside S3)

    • Amazon Simple Storage Service (S3)
    • Amazon Cognito (Authentication and authorization)
    • Amazon API Gateway (routes requests to backend logic)
    • AWS Lambda (backend business logic)
    • Amazon DynamoDB (managed NoSQL database)
    • AWS Identity and Access Management (IAM) - web service to control access to AWS resources
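
As a rough sketch, the data-layer pieces of this architecture can be provisioned with the AWS CLI; the resource names below are invented for illustration, and the same commands also work locally through the awslocal wrapper described in the next section.

# Managed NoSQL table for blog posts
aws dynamodb create-table --table-name BlogPosts --attribute-definitions AttributeName=id,AttributeType=S --key-schema AttributeName=id,KeyType=HASH --billing-mode PAY_PER_REQUEST
# Bucket for the static front end delivered through CloudFront
aws s3api create-bucket --bucket blog-static-site
# REST API that will route requests to the Lambda back end
aws apigateway create-rest-api --name blog-api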

LocalStack on Debian

  • https://github.com/localstack/localstack

  • sudo apt install python3-pip (or python-pip on older releases still using Python 2)
  • pip3 install localstack (or pip install localstack)
  • .local/bin/localstack start
  • docker run --rm -it -p 4566:4566 -p 4571:4571 localstack/localstack (alternative: run LocalStack as a Docker container instead of via pip)
  • curl http://localhost:4566/health (check that the LocalStack services are up)

  • pip3 install awscli
  • pip3 install awscli-local
  • .local/bin/awslocal kinesis list-streams
  • .local/bin/awslocal s3api list-buckets
  • Add export PATH=$PATH:/usr/sbin:~/.local/bin to ~/.bashrc so awslocal can be run without the full path.

  • docker exec -it silly_greider bash (silly_greider is the auto-generated container name; find yours with docker ps)
  • awslocal s3api list-buckets
  • awslocal s3api create-bucket --bucket my-bucket --region us-east-1
  • https://docs.aws.amazon.com/cli/latest/reference/s3api/

  • echo "test" > test.txt

  • awslocal s3api put-object --bucket my-bucket --key dir-1/test.txt --body test.txt
  • awslocal s3api get-object --bucket my-bucket --key dir-1/test.txt test2.txt
  • cat test2.txt
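
awslocal is only a thin wrapper that points the regular aws CLI at the LocalStack endpoint, so the same checks can be run with plain aws and dummy credentials, for example:

# Equivalent of awslocal s3api list-buckets
AWS_ACCESS_KEY_ID=test AWS_SECRET_ACCESS_KEY=test AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 s3api list-buckets
# List what was just uploaded under the dir-1/ prefix
AWS_ACCESS_KEY_ID=test AWS_SECRET_ACCESS_KEY=test AWS_DEFAULT_REGION=us-east-1 aws --endpoint-url=http://localhost:4566 s3api list-objects-v2 --bucket my-bucket --prefix dir-1/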

LocalStack - Lambda and S3

run.sh

# Package the handler, (re)deploy it to LocalStack, then invoke it with a test payload
zip py-my-function.zip lambda_function.py
awslocal lambda delete-function --function-name py-my-function
awslocal lambda create-function --function-name py-my-function --zip-file fileb://py-my-function.zip --handler lambda_function.lambda_handler --runtime python3.9 --role arn:aws:iam::000000000000:role/lambda-ex
awslocal lambda invoke --function-name py-my-function --payload '{ "first_name": "Bob", "last_name": "Squarepants" }' response.json
cat response.json
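
Assuming the invocation succeeded, the bucket and the object created by the handler (names match lambda_function.py below) can be inspected with:

awslocal s3api list-buckets
awslocal s3api get-object --bucket examplebucket --key examplebucket/response.txt response-from-s3.txt
cat response-from-s3.txt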

lambda_function.py

import boto3
import os

def lambda_handler(event, context):
    # Build a greeting from the invocation payload
    message = 'Hello {} {}!'.format(event['first_name'], event['last_name'])
    session = boto3.session.Session()

    # S3 client pointed at the LocalStack endpoint instead of real AWS
    s3_client = session.client(
        service_name='s3',
        aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
        endpoint_url='http://localhost:4566',
    )

    # Collect the names of the buckets that already exist
    buckets = []
    for bucket in s3_client.list_buckets()['Buckets']:
        buckets.append(bucket['Name'])

    s3_client.create_bucket(Bucket='examplebucket')

    body = {
        'message': message,
        'buckets': buckets,
        'AWS_ACCESS_KEY_ID': os.environ["AWS_ACCESS_KEY_ID"],
        'AWS_SECRET_ACCESS_KEY': os.environ["AWS_SECRET_ACCESS_KEY"]
    }

    # Store the result in the new bucket and also return it to the caller
    s3_client.put_object(Body=str(body), Bucket='examplebucket', Key='examplebucket/response.txt')
    return body

Access LocalStack from inside its Docker container

docker run -d --name localstack --rm -it -p 4566:4566 -p 4571:4571 localstack/localstack
docker exec -it localstack bash
# Everything below runs inside the localstack container
lsb_release -a
curl http://localhost:4566/health
awslocal s3api list-buckets
awslocal s3api create-bucket --bucket my-bucket
echo "test" > test.txt
awslocal s3api put-object --bucket my-bucket --key dir-1/test.txt --body test.txt
awslocal s3api get-object --bucket my-bucket --key dir-1/test.txt test2.txt
cat test2.txt
apt install nano vim yajl-tools -y
# Image and source: https://hub.docker.com/r/localstack/localstack and https://github.com/localstack/localstack
node -v     # v14.18.1
python -V   # Python 3.8.12
pip3 freeze
curl http://localhost:4566/health | json_reformat
awslocal ec2 run-instances --image-id prod-df2jln3gjtwps --count 1 --instance-type t2.micro
awslocal ec2 describe-instances --filters "Name=instance-type,Values=t2.micro" --query "Reservations[].Instances[].InstanceId"
awslocal ec2 describe-instances
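
To reach LocalStack from a second, separate container rather than from inside the LocalStack container itself, one option is a shared Docker network. This is only a sketch; the network name ls-net and the amazon/aws-cli client image are choices made here for illustration.

# Put LocalStack and a client container on the same user-defined network
docker network create ls-net
docker run -d --name localstack --network ls-net -p 4566:4566 localstack/localstack
# From the client container, the endpoint is reachable by the container name "localstack"
docker run --rm -it --network ls-net -e AWS_ACCESS_KEY_ID=test -e AWS_SECRET_ACCESS_KEY=test -e AWS_DEFAULT_REGION=us-east-1 amazon/aws-cli --endpoint-url=http://localstack:4566 s3api list-buckets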