• This repository has been archived on 25/Oct/2022
• Language: JavaScript
• License: Apache License 2.0

A sample AWS Lambda function that accepts messages from an Amazon Kinesis Stream and transfers the messages to another data transport.

aws-lambda-fanout

This function answers a need I have had multiple times, where I want to replicate data from an Amazon Kinesis Stream to another account or another region for processing, or to another environment such as development.

This AWS Lambda function can be used to propagate incoming messages from Amazon Kinesis Streams or Amazon DynamoDB Streams to other services (Amazon SNS, Amazon SQS, Amazon Elasticsearch Service, Amazon Kinesis Streams, Amazon Kinesis Firehose, AWS IoT, AWS Lambda, Amazon ElastiCache for Memcached and Redis), regions or accounts. The function also publishes metrics to Amazon CloudWatch under the Custom/FanOut namespace.
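To illustrate the Custom/FanOut namespace mentioned above, here is a minimal sketch of the kind of payload such a function could pass to CloudWatch's PutMetricData API. The helper name, metric name and dimensions are illustrative assumptions, not taken from the code:

```javascript
// Hypothetical helper building a CloudWatch PutMetricData payload.
// The 'Custom/FanOut' namespace comes from the documentation above;
// the metric name and dimensions are illustrative assumptions.
function buildFanoutMetric(sourceArn, recordCount) {
  return {
    Namespace: 'Custom/FanOut',
    MetricData: [{
      MetricName: 'RecordsProcessed', // assumed name, for illustration
      Dimensions: [{ Name: 'Source', Value: sourceArn }],
      Unit: 'Count',
      Value: recordCount
    }]
  };
}

// The resulting object would be handed to the AWS SDK, e.g.:
//   new AWS.CloudWatch().putMetricData(buildFanoutMetric(arn, 25), callback)
const payload = buildFanoutMetric(
  'arn:aws:kinesis:us-east-1:123456789012:stream/inputStream', 25);
console.log(payload.Namespace); // Custom/FanOut
```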

Architecture

This function can run in the 'public' AWS Lambda mode (which should be the default for most target types), or inside an Amazon Virtual Private Cloud (VPC) if you use Amazon ElastiCache (Redis or Memcached).

As the configuration data resides in Amazon DynamoDB, and because the function sends metrics to Amazon CloudWatch, the function must have Internet access. An AWS Lambda function running inside an Amazon VPC must therefore reside in a private subnet with a route to the Internet through a NAT (either a self-managed NAT instance or an Amazon managed NAT Gateway). This limitation exists because AWS Lambda functions in an Amazon VPC don't have a public IP address.

Glossary

Here is a list of terms used in this documentation:

  • fanout function refers to the AWS Lambda function running the provided code;
  • Source refers to an Amazon Kinesis Stream or Amazon DynamoDB Stream on which the fanout function has an event source mapping and that will generate events;
  • Source Account refers to the AWS Account in which the fanout function is executing;
  • Source Region refers to the region in which the fanout function is executing;
  • Target refers to a destination (see target types for details) to which the fanout function will send records after processing;
  • Target Account refers to the AWS Account in which a target resides;
  • Target Region refers to the region in which a target resides.

Mappings

The fanout function maps sources to targets.

Sources can currently be:

  • An Amazon Kinesis Stream (specified by ARN)
  • An Amazon DynamoDB Stream (specified by ARN)

Targets have a specific type defined, and a destination. Currently the allowed types and destination formats are the following:

  • sns for specifying an Amazon Simple Notification Service (SNS) Topic
    • ARN of the target Amazon SNS Topic
  • sqs for specifying an Amazon Simple Queue Service (SQS) Queue
    • URL of the target Amazon SQS Queue
  • es for specifying an Amazon Elasticsearch Domain
    • Composite string containing the FQDN of the target Amazon Elasticsearch Service domain endpoint, followed by # then the storage specification <doctype>/<index>
  • kinesis for specifying an Amazon Kinesis Stream
    • name of the target Amazon Kinesis Stream
  • firehose for specifying an Amazon Kinesis Firehose Delivery Stream
    • name of the target Amazon Kinesis Firehose Delivery Stream
  • iot for specifying an AWS IoT MQTT topic
    • Composite string containing the FQDN of the target AWS IoT endpoint (specific per account / region), followed by # then the MQTT topic name
  • lambda for specifying an AWS Lambda Function
    • name of the target AWS Lambda Function
  • memcached for specifying an Amazon ElastiCache Memcached Cluster
    • FQDN of the target Amazon ElastiCache Memcached Cluster endpoint
  • redis for specifying an Amazon ElastiCache Redis Replication Group
    • FQDN of the target Amazon ElastiCache Redis Replication Group primary endpoint
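The es and iot destinations above are composite strings: an endpoint, then '#', then a storage specification or topic name. A small sketch of how such a destination could be split, purely to illustrate the format (the helper function is hypothetical):

```javascript
// Split a composite destination of the form "<endpoint>#<specification>",
// e.g. "search-mydomain-abc123.us-east-1.es.amazonaws.com#logs/event" (es)
// or   "a1b2c3d4e5f6.iot.us-east-1.amazonaws.com#sensors/temperature" (iot).
// Hypothetical helper, for illustration only.
function parseCompositeDestination(destination) {
  const index = destination.indexOf('#');
  if (index === -1) {
    throw new Error('Expected "<endpoint>#<specification>" format');
  }
  return {
    endpoint: destination.substring(0, index),
    specification: destination.substring(index + 1)
  };
}

const es = parseCompositeDestination(
  'search-mydomain-abc123.us-east-1.es.amazonaws.com#logs/event');
// es.endpoint      -> the domain endpoint FQDN
// es.specification -> "logs/event" (<doctype>/<index>)
```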

Configuration entries

Each target is defined by a set of parameters, stored in an Amazon DynamoDB table. The default name of the table is derived from the function name: <function-name>Targets. Here are the properties of the items in this table:

  • sourceArn (String) [required]: the ARN of the event source (Amazon Kinesis Stream or Amazon DynamoDB Stream) (Table Hash Key)
  • id (String) [required]: the identifier of the fan-out target (Table Range Key)
  • type (String) [required]: the type of destination for the fan-out target
  • destination (String) [required]: the destination of the messages (as defined in the Targets section)
  • active (Boolean) [required]: indicates if this target is active or not
  • role (String) [optional]: for cross-account roles: you can specify the role ARN that will be assumed
  • externalId (String) [optional]: for cross-account roles: you can specify an external Id for the STS:AssumeRole call
  • region (String) [optional]: for cross-region calls, you can specify the region name
  • collapse (String) [optional]: for AWS IoT, Amazon SQS and Amazon SNS, defines how the messages should be collapsed, if at all (none, JSON, concat, concat-b64; default: JSON)
  • parallel (Boolean) [optional]: indicates if we should process sending these messages in parallel. Warning: this may break in-shard ordering for Amazon Kinesis (default true)
  • convertDDB (Boolean) [optional]: for Amazon DynamoDB Streams messages, converts the DDB objects to plain Javascript objects
  • deaggregate (Boolean) [optional]: for Amazon Kinesis Streams messages, deserializes KPL (protobuf-based) messages
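Putting the properties above together, a configuration entry for an es target might look like the following item (the attribute names are those listed above; all values are illustrative):

```javascript
// Illustrative configuration entry as stored in the <function-name>Targets
// DynamoDB table. sourceArn is the table hash key, id is the range key.
const exampleTarget = {
  sourceArn: 'arn:aws:kinesis:us-east-1:123456789012:stream/inputStream',
  id: 'target1',
  type: 'es',
  destination: 'search-mydomain-abc123.us-east-1.es.amazonaws.com#logs/event',
  active: true,
  region: 'us-west-2',  // optional: cross-region target
  collapse: 'JSON',     // default collapse mode
  parallel: true,       // may break in-shard ordering for Amazon Kinesis
  convertDDB: false,
  deaggregate: false
};
```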

For sns, sqs, es, kinesis, firehose, iot and lambda:

  • you can specify a target account and a target region for your targets.
  • you can either run the fanout function as a 'public' AWS Lambda function or in an Amazon VPC with a NAT Gateway

For memcached and redis:

  • you will need to run the fanout function as an AWS Lambda function inside an Amazon VPC;
  • the targets can only reside in the same Amazon VPC as your fanout function, and therefore in the source region and source account.
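The collapse setting described above controls how a batch of records is combined before delivery to AWS IoT, Amazon SQS or Amazon SNS. The sketch below is one interpretation of what the four mode names could mean for a batch of record payloads; the function's actual implementation may differ:

```javascript
// Rough interpretation of the collapse modes for a batch of record
// payloads (Buffers). Illustrative only.
function collapseRecords(records, mode) {
  switch (mode) {
    case 'none':       // one message per record
      return records.map((r) => r.toString('utf8'));
    case 'JSON':       // one message: a JSON array of the payloads
      return [JSON.stringify(records.map((r) => r.toString('utf8')))];
    case 'concat':     // one message: payloads concatenated
      return [Buffer.concat(records).toString('utf8')];
    case 'concat-b64': // one message: concatenated, then base64-encoded
      return [Buffer.concat(records).toString('base64')];
    default:
      throw new Error('Unknown collapse mode: ' + mode);
  }
}

const batch = [Buffer.from('a'), Buffer.from('b')];
collapseRecords(batch, 'none');       // ['a', 'b'] -> two messages
collapseRecords(batch, 'concat');     // ['ab']     -> one message
collapseRecords(batch, 'concat-b64'); // ['YWI=']   -> one message
```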

Supporting cross-account publication

You can send the records to a target account. This feature leverages the STS:AssumeRole API to allow cross-account access.

To activate this feature, you have to specify the role property in the configuration. This property will contain the ARN of the AWS IAM Role from the target account to be used when publishing the data.

If you send data to an account that you don't own, you should specify the externalId property that is used to further limit the access to sts:AssumeRole.

To activate this feature, you need to configure a policy in the AWS IAM Role used by the fanout function to include the sts:AssumeRole action on your target account. You also need to configure the trust relationship of your target account AWS IAM Role to allow sts:AssumeRole calls from the source account.

This feature is not available for redis and memcached.
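The role and externalId properties map directly onto the parameters of the sts:AssumeRole call. A minimal sketch of the parameter object the fanout function would need to build (the helper and the session-name convention are illustrative assumptions):

```javascript
// Build the parameter object for an sts:AssumeRole call from a target's
// configuration entry. RoleSessionName is an illustrative choice.
function buildAssumeRoleParams(target) {
  const params = {
    RoleArn: target.role,                  // role in the target account
    RoleSessionName: 'fanout-' + target.id
  };
  if (target.externalId) {
    params.ExternalId = target.externalId; // further restricts who may assume
  }
  return params;
}

// The resulting object would be passed to the AWS SDK, e.g.:
//   new AWS.STS().assumeRole(buildAssumeRoleParams(target), callback)
```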

Supporting cross-region publication

You can also send records to a target region, by specifying the region property in the configuration. This property will contain the name of the target region.

This feature is not available for redis and memcached.

The CLI

To simplify the deployment, configuration and management of the function, a Bash Command Line Interface (CLI) is available. The command is ./fanout and automates calls to the AWS CLI.

Available commands are:

  • ./fanout deploy: deploys the fanout function as a new AWS Lambda function, or updates the currently existing AWS Lambda function
  • ./fanout register <type>: creates a new mapping (source -> target) for the specified fanout function
  • ./fanout list: lists existing targets for the specified fanout function and source
  • ./fanout update: updates a mapping for the specified fanout function and source
  • ./fanout activate: activates a mapping for the specified fanout function
  • ./fanout deactivate: deactivates a mapping for the specified fanout function
  • ./fanout unregister: deletes a mapping for the specified fanout function
  • ./fanout destroy: destroys the fanout function
  • ./fanout hook: creates an event source mapping between a source and the fanout function
  • ./fanout unhook: removes the event source mapping between a source and the fanout function
  • ./fanout pause: disables the event source mapping for the specified source
  • ./fanout resume: re-enables the event source mapping for the specified source

Some parameters (--exec-role, --table, --source) have a short form and a long form. The short form will call the AWS CLI to search for the element and either create it or raise an error if it does not exist; the long form (the same parameters with an -arn suffix: --exec-role-arn, --table-arn, --source-arn) will just accept the parameter as-is. This speeds up execution and allows scenarios where the user running the command does not have enough rights to discover the elements, while the function does.

Optional command line parameters are:

  • --region <region>, --profile <profile>, --debug <boolean>, --endpoint-url <url>, --no-verify-ssl: passed directly to the AWS CLI calls

Deploy

The deploy command creates the fanout function and supporting resources (AWS IAM Role, Amazon DynamoDB Table). It expects the following parameters:

  • --function <fanout>: (optional, defaults to fanout) all commands expect an AWS Lambda function name to be provided.
    • This name will be used to derive the AWS IAM Role name (<function-name>Role) if it is not provided
    • This name will be used to derive the Amazon DynamoDB Table name (<function-name>Targets) if it is not provided.
    • You can specify the function by ARN and avoid detection by using --function-arn <arn:aws:lambda:us-east-1:0123456789abcdef:function:fanout>
  • --table <fanoutTargets> (optional) specify the table name to use for the function configuration. The table is created if it does not exist.
    • You can specify the table by ARN and avoid table detection and creation by using --table-arn <arn:aws:dynamodb:us-east-1:0123456789abcdef:table/fanoutTargets>
  • --subnet <subnet-01234567> (optional) specify the subnet to use for executing the AWS Lambda function in VPC mode
    • this parameter can be repeated to specify multiple subnets for the function, recommended for multi-AZ scenarios
    • this parameter can accept a comma-separated list of subnets --subnet <subnet-01234567>,subnet-<89abcdef>
    • this parameter is required at least once if the --security-group parameter is used
  • --security-group <sg-01234567> (zero or multiple times) specify the security groups to use for executing the AWS Lambda function in VPC mode
    • this parameter can be repeated to specify multiple security groups for the function
    • this parameter can accept a comma-separated list of security groups --security-group <sg-01234567>,sg-<89abcdef>
    • this parameter is required at least once if the --subnet parameter is used
  • --exec-role <fanoutRole> (optional) specify the AWS IAM Role to use for the fanout function. The role is created if it does not exist.
    • You can specify the role by ARN and avoid role detection and creation by using --exec-role-arn <arn:aws:iam::0123456789abcdef:role/fanoutRole>
  • --memory <128> (optional, default 128) specify the amount of memory (in MiB) to use for the function
  • --runtime <nodejs14.x> (optional, default nodejs14.x) specify the runtime environment for the Lambda function
  • --timeout <30> (optional, default 30) specify the timeout of the function (in seconds)

Example:

./fanout deploy --function fanout

List

The list command retrieves all the existing mappings for a specific source. It expects the following parameters:

  • --function <fanout> (optional, defaults to fanout) specify the name of the function
    • You can specify the function by ARN and avoid detection by using --function-arn <arn:aws:lambda:us-east-1:0123456789abcdef:function:fanout>
  • --source-type <kinesis|dynamodb> (required) specify the type of the source (one of Amazon Kinesis Stream or Amazon DynamoDB Stream)
  • --source <kinesisStream> (required) specify the name of the input Amazon Kinesis Stream or Amazon DynamoDB Stream
    • You can specify the source by ARN and avoid detection by using --source-arn <arn:aws:kinesis:us-east-1:0123456789abcdef:stream/inputStream>
    • When --source is used, you must specify --source-type as well
  • --table <fanoutTargets> (optional) specify the table name to use for the function configuration.
    • If not specified a default value of <function-name>Targets will be used.
    • You can specify the table by ARN and avoid table detection by using --table-arn <arn:aws:dynamodb:us-east-1:0123456789abcdef:table/fanoutTargets>

Example:

./fanout list --function fanout --source-type kinesis --source inputStream

Register

The register <type> command creates a new mapping for an existing fanout function. As the fanout function caches the configuration for performance reasons (default time of 1 minute), there may be a delay before this mapping becomes active. Note also that, unless --active true is specified, new mappings are created inactive for safety reasons. It expects the following parameters:

  • <type> (required) the type of the destination
  • --function <fanout> (optional, defaults to fanout) specify the name of the function
    • You can specify the function by ARN and avoid detection by using --function-arn <arn:aws:lambda:us-east-1:0123456789abcdef:function:fanout>
  • --source-type <kinesis|dynamodb> (required) specify the type of the source (one of Amazon Kinesis Stream or Amazon DynamoDB Stream)
  • --source <kinesisStream> (required) specify the name of the input Amazon Kinesis Stream or Amazon DynamoDB Stream
    • You can specify the source by ARN and avoid detection by using --source-arn <arn:aws:kinesis:us-east-1:0123456789abcdef:stream/inputStream>
    • When --source is used, you must specify --source-type as well
  • --id <mapping-id> (required) specify the identifier of this mapping
  • --destination <name> (optional) where name depends on the target type:
    • ./fanout register sns:
      • name of the target topic, the CLI will search for the topic in the specified region
      • ARN of the target topic, detection will be deactivated
    • ./fanout register sqs:
      • name of the target queue, the CLI will search for the queue in the specified region
      • ARN of the target queue, detection will be deactivated
    • ./fanout register es:
      • name of the target domain, the CLI will search for the endpoint in the specified region; you will need to specify the --index parameter
      • Composite string containing the FQDN of the target Amazon Elasticsearch Service domain endpoint, followed by '#' then the storage specification 'doctype/index'
    • ./fanout register kinesis:
      • name of the target stream, the CLI will search for the stream in the specified region
      • ARN of the target stream, detection will be deactivated
    • ./fanout register firehose:
      • name of the target delivery stream, the CLI will search for the delivery stream in the specified region
      • ARN of the target delivery stream, detection will be deactivated
    • ./fanout register iot:
      • name of the target topic
      • Composite string containing the FQDN of the target AWS IoT endpoint (specific per account / region), followed by '#' then the MQTT topic name
    • ./fanout register lambda:
      • name of the target function, the CLI will search for the function in the specified region
      • ARN of the target function, detection will be deactivated
    • ./fanout register memcached:
      • name of the target cluster, the CLI will search for the cluster in the specified region
      • FQDN of the target cluster endpoint, detection will be deactivated
    • ./fanout register redis:
      • name of the target replication group, the CLI will search for the replication group in the specified region
      • FQDN of the target primary endpoint, detection will be deactivated
  • --index <doctype/index> (optional, required only for es with domain name) specify, for Amazon Elasticsearch Service, the index in the domain where the data will reside
  • --destination-region <us-west-2> (optional) specify the target region for this mapping
  • --active <true|false> (optional, default false) indicates if this target is active or not
  • --destination-role-arn <arn:aws:iam::0123456789abcdef:role/targetRole> (optional) specify, for cross-account roles, the role ARN that will be assumed
  • --external-id <123456> (optional) specify, for cross-account roles, an external Id for the STS:AssumeRole call
  • --collapse <none|JSON|concat|concat-b64> (optional, default JSON) for AWS IoT, Amazon SQS and Amazon SNS, defines how the messages should be collapsed, if at all
  • --parallel <true|false> (optional, default true) indicates if we should process sending these messages in parallel
  • --convert-ddb <true|false> (optional, default false) for Amazon DynamoDB Streams messages, converts the DDB objects to plain Javascript objects
  • --deaggregate <true|false> (optional, default false) for Amazon Kinesis Streams messages, deserializes KPL (protobuf-based) messages
  • --table <fanoutTargets> (optional) specify the table name to use for the function configuration.
    • If not specified a default value of <function-name>Targets will be used.
    • You can specify the table by ARN and avoid table detection by using --table-arn <arn:aws:dynamodb:us-east-1:0123456789abcdef:table/fanoutTargets>

Example:

./fanout register lambda --function fanout --source-type kinesis --source inputStream --id target1 --destination targetFunction

Update

The update command allows you to modify some parameters of your mappings. As the fanout function caches the configuration for performance reasons (default time of 1 minute), there may be a delay in the application of the modification. It expects the following parameters:

  • --function <fanout> (optional, defaults to fanout) specify the name of the function
    • You can specify the function by ARN and avoid detection by using --function-arn <arn:aws:lambda:us-east-1:0123456789abcdef:function:fanout>
  • --source-type <kinesis|dynamodb> (required) specify the type of the source (one of Amazon Kinesis Stream or Amazon DynamoDB Stream)
  • --source <kinesisStream> (required) specify the name of the input Amazon Kinesis Stream or Amazon DynamoDB Stream
    • You can specify the source by ARN and avoid detection by using --source-arn <arn:aws:kinesis:us-east-1:0123456789abcdef:stream/inputStream>
    • When --source is used, you must specify --source-type as well
  • --id <mapping-id> (required) specify the identifier of this mapping
  • --active <true|false> (optional) indicates if this target is active or not
  • --destination-role-arn <arn:aws:iam::0123456789abcdef:role/targetRole> (optional) specify, for cross-account roles, the role ARN that will be assumed
  • --external-id <123456> (optional) specify, for cross-account roles, an external Id for the STS:AssumeRole call
  • --collapse <none|JSON|concat|concat-b64> (optional, default JSON) for AWS IoT, Amazon SQS and Amazon SNS, defines how the messages should be collapsed, if at all
  • --parallel <true|false> (optional) indicates if we should process sending these messages in parallel
  • --convert-ddb <true|false> (optional, default false) for Amazon DynamoDB Streams messages, converts the DDB objects to plain Javascript objects
  • --deaggregate <true|false> (optional, default false) for Amazon Kinesis Streams messages, deserializes KPL (protobuf-based) messages
  • --table <fanoutTargets> (optional) specify the table name to use for the function configuration.
    • If not specified a default value of <function-name>Targets will be used.
    • You can specify the table by ARN and avoid table detection by using --table-arn <arn:aws:dynamodb:us-east-1:0123456789abcdef:table/fanoutTargets>

Example:

./fanout update --function fanout --source-type kinesis --source inputStream --id target1 --parallel false

(De)activate

The activate and deactivate commands turn on or off a specific mapping. As the fanout function caches the configuration for performance reasons (default time of 1 minute), there may be a delay in the application of the modification. They expect the following parameters:

  • --function <fanout> (optional, defaults to fanout) specify the name of the function
    • You can specify the function by ARN and avoid detection by using --function-arn <arn:aws:lambda:us-east-1:0123456789abcdef:function:fanout>
  • --source-type <kinesis|dynamodb> (required) specify the type of the source (one of Amazon Kinesis Stream or Amazon DynamoDB Stream)
  • --source <kinesisStream> (required) specify the name of the input Amazon Kinesis Stream or Amazon DynamoDB Stream
    • You can specify the source by ARN and avoid detection by using --source-arn <arn:aws:kinesis:us-east-1:0123456789abcdef:stream/inputStream>
    • When --source is used, you must specify --source-type as well
  • --id <mapping-id> (required) specify the identifier of this mapping
  • --table <fanoutTargets> (optional) specify the table name to use for the function configuration.
    • If not specified a default value of <function-name>Targets will be used.
    • You can specify the table by ARN and avoid table detection by using --table-arn <arn:aws:dynamodb:us-east-1:0123456789abcdef:table/fanoutTargets>

Example:

./fanout activate --function fanout --source-type kinesis --source inputStream --id target1
./fanout deactivate --function fanout --source-type kinesis --source inputStream --id target1

Unregister

The unregister command removes an existing mapping from the configuration table. As the fanout function caches the configuration for performance reasons (default time of 1 minute), there may be a delay before the target is effectively removed. It expects the following parameters:

  • --function <fanout> (optional, defaults to fanout) specify the name of the function
    • You can specify the function by ARN and avoid detection by using --function-arn <arn:aws:lambda:us-east-1:0123456789abcdef:function:fanout>
  • --source-type <kinesis|dynamodb> (required) specify the type of the source (one of Amazon Kinesis Stream or Amazon DynamoDB Stream)
  • --source <kinesisStream> (required) specify the name of the input Amazon Kinesis Stream or Amazon DynamoDB Stream
    • You can specify the source by ARN and avoid detection by using --source-arn <arn:aws:kinesis:us-east-1:0123456789abcdef:stream/inputStream>
    • When --source is used, you must specify --source-type as well
  • --id <mapping-id> (required) specify the identifier of this mapping
  • --table <fanoutTargets> (optional) specify the table name to use for the function configuration.
    • If not specified a default value of <function-name>Targets will be used.
    • You can specify the table by ARN and avoid table detection by using --table-arn <arn:aws:dynamodb:us-east-1:0123456789abcdef:table/fanoutTargets>

Example:

./fanout unregister --function fanout --source-type kinesis --source inputStream --id target1

Destroy

The destroy command removes the fanout function and its configuration. It expects the following parameters:

  • --function <fanout> (optional, defaults to fanout) specify the name of the function

Example:

./fanout destroy --function fanout

(Un)hook

The hook and unhook commands register and unregister an event source mapping for the specified source. They expect the following parameters:

  • --function <fanout> (optional, defaults to fanout) specify the name of the function
    • You can specify the function by ARN and avoid detection by using --function-arn <arn:aws:lambda:us-east-1:0123456789abcdef:function:fanout>
  • --source-type <kinesis|dynamodb> (required) specify the type of the source (one of Amazon Kinesis Stream or Amazon DynamoDB Stream)
  • --source <kinesisStream> (required) specify the name of the input Amazon Kinesis Stream or Amazon DynamoDB Stream
    • You can specify the source by ARN and avoid detection by using --source-arn <arn:aws:kinesis:us-east-1:0123456789abcdef:stream/inputStream>
    • When --source is used, you must specify --source-type as well
  • --table <fanoutTargets> (optional) specify the table name to use for the function configuration.
    • If not specified a default value of <function-name>Targets will be used.
    • You can specify the table by ARN and avoid table detection by using --table-arn <arn:aws:dynamodb:us-east-1:0123456789abcdef:table/fanoutTargets>

Example:

./fanout hook --function fanout --source-type kinesis --source inputStream
./fanout unhook --function fanout --source-type kinesis --source inputStream

Pause|Resume

The pause and resume commands temporarily stop and restart the processing of a specific source: they respectively disable and enable the event source mapping from the source to the fanout function. They expect the following parameters:

  • --function <fanout> (optional, defaults to fanout) specify the name of the function
    • You can specify the function by ARN and avoid detection by using --function-arn <arn:aws:lambda:us-east-1:0123456789abcdef:function:fanout>
  • --source-type <kinesis|dynamodb> (required) specify the type of the source (one of Amazon Kinesis Stream or Amazon DynamoDB Stream)
  • --source <kinesisStream> (required) specify the name of the input Amazon Kinesis Stream or Amazon DynamoDB Stream
    • You can specify the source by ARN and avoid detection by using --source-arn <arn:aws:kinesis:us-east-1:0123456789abcdef:stream/inputStream>
    • When --source is used, you must specify --source-type as well
  • --table <fanoutTargets> (optional) specify the table name to use for the function configuration.
    • If not specified a default value of <function-name>Targets will be used.
    • You can specify the table by ARN and avoid table detection by using --table-arn <arn:aws:dynamodb:us-east-1:0123456789abcdef:table/fanoutTargets>

Example:

./fanout pause --function fanout --source-type kinesis --source inputStream
./fanout resume --function fanout --source-type kinesis --source inputStream

License

Copyright 2011-2013 Amazon.com, Inc. or its affiliates. All Rights Reserved.

Licensed under the Apache License, Version 2.0 (the "License"). You may not use this file except in compliance with the License. A copy of the License is located at

http://aws.amazon.com/apache2.0/

or in the "license" file accompanying this file. This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. See the License for the specific language governing permissions and limitations under the License.

More Repositories

1

aws-cdk-examples

Example projects using the AWS CDK
Python
4,121
star
2

aws-serverless-workshops

Code and walkthrough labs to set up serverless applications for Wild Rydes workshops
JavaScript
3,977
star
3

aws-workshop-for-kubernetes

AWS Workshop for Kubernetes
Shell
2,618
star
4

aws-machine-learning-university-accelerated-nlp

Machine Learning University: Accelerated Natural Language Processing Class
Jupyter Notebook
2,080
star
5

aws-serverless-airline-booking

Airline Booking is a sample web application that provides Flight Search, Flight Payment, Flight Booking and Loyalty points including end-to-end testing, GraphQL and CI/CD. This web application was the theme of Build on Serverless Season 2 on AWS Twitch running from April 24th until end of August in 2019.
Vue
1,967
star
6

ecs-refarch-cloudformation

A reference architecture for deploying containerized microservices with Amazon ECS and AWS CloudFormation (YAML)
Makefile
1,673
star
7

lambda-refarch-webapp

The Web Application reference architecture is a general-purpose, event-driven, web application back-end that uses AWS Lambda, Amazon API Gateway for its business logic. It also uses Amazon DynamoDB as its database and Amazon Cognito for user management. All static content is hosted using AWS Amplify Console.
JavaScript
1,561
star
8

serverless-patterns

Serverless patterns. Learn more at the website: https://serverlessland.com/patterns.
Python
1,544
star
9

aws-modern-application-workshop

A tutorial for developers that want to learn about how to build modern applications on top of AWS. You will build a sample website that leverages infrastructure as code, containers, serverless code functions, CI/CD, and more.
1,459
star
10

amazon-bedrock-workshop

This is a workshop designed for Amazon Bedrock a foundational model service.
Jupyter Notebook
1,419
star
11

aws-machine-learning-university-accelerated-cv

Machine Learning University: Accelerated Computer Vision Class
Jupyter Notebook
1,409
star
12

aws-glue-samples

AWS Glue code samples
Python
1,277
star
13

aws-deepracer-workshops

DeepRacer workshop content
Jupyter Notebook
1,086
star
14

aws-genai-llm-chatbot

A modular and comprehensive solution to deploy a Multi-LLM and Multi-RAG powered chatbot (Amazon Bedrock, Anthropic, HuggingFace, OpenAI, Meta, AI21, Cohere, Mistral) using AWS CDK on AWS
TypeScript
1,061
star
15

aws-refarch-wordpress

This reference architecture provides best practices and a set of YAML CloudFormation templates for deploying WordPress on AWS.
PHP
1,001
star
16

aws-machine-learning-university-accelerated-tab

Machine Learning University: Accelerated Tabular Data Class
Jupyter Notebook
955
star
17

aws-serverless-ecommerce-platform

Serverless Ecommerce Platform is a sample implementation of a serverless backend for an e-commerce website. This sample is not meant to be used as an e-commerce platform as-is, but as an inspiration on how to build event-driven serverless microservices on AWS.
Python
947
star
18

aws-big-data-blog

Java
895
star
19

machine-learning-samples

Sample applications built using AWS' Amazon Machine Learning.
Python
867
star
20

eks-workshop

AWS Workshop for Learning EKS
CSS
777
star
21

startup-kit-templates
