
Rust Runtime for AWS Lambda


This package makes it easy to run AWS Lambda Functions written in Rust. This workspace includes multiple crates:

  • lambda-runtime is a library that provides a Lambda runtime for applications written in Rust.
  • lambda-http is a library that makes it easy to write API Gateway proxy event focused Lambda functions in Rust.
  • lambda-extension is a library that makes it easy to write Lambda Runtime Extensions in Rust (see the sketch after this list).
  • lambda-events is a library with strongly-typed Lambda event structs in Rust.
  • lambda-runtime-api-client is a shared library between the lambda runtime and lambda extension libraries that includes a common API client to talk with the AWS Lambda Runtime API.
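
As a rough illustration of the lambda-extension crate, the sketch below registers a simple events-processing extension. It assumes a recent version of lambda-extension and tokio, and the exact types may differ between releases:

use lambda_extension::{service_fn, Error, LambdaEvent, NextEvent};

#[tokio::main]
async fn main() -> Result<(), Error> {
    // Register the extension with the Lambda Extensions API and start polling for events.
    lambda_extension::run(service_fn(events_processor)).await
}

async fn events_processor(event: LambdaEvent) -> Result<(), Error> {
    match event.next {
        // The execution environment is shutting down: flush buffers, close connections, etc.
        NextEvent::Shutdown(_event) => {}
        // Any other event (such as an invoke) can be handled or ignored here.
        _ => {}
    }
    Ok(())
}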

The Rust runtime client is an experimental package. It is subject to change and intended only for evaluation purposes.

Getting started

The easiest way to start writing Lambda functions with Rust is by using Cargo Lambda, a related project. Cargo Lambda is a Cargo plugin, or subcommand, that provides several commands to help you in your journey with Rust on AWS Lambda.

The preferred way to install Cargo Lambda is by using a package manager.

1- Use Homebrew on macOS:

brew tap cargo-lambda/cargo-lambda
brew install cargo-lambda

2- Use Scoop on Windows:

scoop bucket add cargo-lambda https://github.com/cargo-lambda/scoop-cargo-lambda
scoop install cargo-lambda/cargo-lambda

Or pip on any system with Python 3 installed:

pip3 install cargo-lambda

See other installation options in the Cargo Lambda documentation.

Your first function

To create your first function, run Cargo Lambda with the subcommand new. This command will generate a Rust package with the initial source code for your function:

cargo lambda new YOUR_FUNCTION_NAME

Example function

If you'd like to manually create your first function, the code below shows you a simple function that receives an event with a firstName field and returns a message to the caller.

use lambda_runtime::{service_fn, LambdaEvent, Error};
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Error> {
    let func = service_fn(func);
    lambda_runtime::run(func).await?;
    Ok(())
}

async fn func(event: LambdaEvent<Value>) -> Result<Value, Error> {
    let (event, _context) = event.into_parts();
    let first_name = event["firstName"].as_str().unwrap_or("world");

    Ok(json!({ "message": format!("Hello, {}!", first_name) }))
}

Building and deploying your Lambda functions

If you already have Cargo Lambda installed on your machine, run the following command to build your function:

cargo lambda build --release

There are other ways of building your function: manually with the AWS CLI, with AWS SAM, and with the Serverless framework.

1. Cross-compiling your Lambda functions

By default, Cargo Lambda builds your functions to run on x86_64 architectures. If you'd like to use a different architecture, use the options described below.

1.2. Build your Lambda functions

Amazon Linux 2

We recommend using Amazon Linux 2 runtimes (such as provided.al2) whenever possible for building Lambda functions in Rust. To build your Lambda functions for Amazon Linux 2 runtimes, run:

cargo lambda build --release --arm64

Amazon Linux 1

Amazon Linux 1 uses glibc version 2.17, while Rust binaries need glibc version 2.18 or later by default. However, with Cargo Lambda, you can specify a different version of glibc.

If you are building for Amazon Linux 1, or you want to support both Amazon Linux 2 and 1, run:

cargo lambda build --release --target aarch64-unknown-linux-gnu.2.17

Note Replace "aarch64" with "x86_64" if you are building for x86_64

2. Deploying the binary to AWS Lambda

For a custom runtime, AWS Lambda looks for an executable called bootstrap in the deployment package zip. Rename the generated executable to bootstrap and add it to a zip archive.

You can find the bootstrap binary for your function under the target/lambda directory.

2.1. Deploying with Cargo Lambda

Once you've built your code with one of the options described earlier, use the deploy subcommand to upload your function to AWS:

cargo lambda deploy \
  --iam-role arn:aws:iam::XXXXXXXXXXXXX:role/your_lambda_execution_role

Warning Make sure to replace the execution role with an existing role in your account!

This command will create a Lambda function with the same name as your Rust package. You can change the name of the function by adding a name argument at the end of the command:

cargo lambda deploy \
  --iam-role arn:aws:iam::XXXXXXXXXXXXX:role/your_lambda_execution_role \
  my-first-lambda-function

Note See other deployment options in the Cargo Lambda documentation.

You can test the function with the invoke subcommand:

cargo lambda invoke --remote \
  --data-ascii '{"command": "hi"}' \
  --output-format json \
  my-first-lambda-function

Note CLI commands in the examples use Linux/macOS syntax. For other shells, such as Windows CMD or PowerShell, you may need to adjust the syntax of nested quotation marks like '{"command": "hi"}'; escaping with a backslash may be necessary. See the AWS CLI Reference for more information.

2.2. Deploying with the AWS CLI

You can also use the AWS CLI to deploy your Rust functions. First, you will need to create a ZIP archive of your function. Cargo Lambda can do that for you automatically when it builds your binary if you add the --output-format zip flag:

cargo lambda build --release --arm64 --output-format zip

You can find the resulting zip file in target/lambda/YOUR_PACKAGE/bootstrap.zip. Use that file path to deploy your function with the AWS CLI:

# Change --runtime to provided.al2 if you would like to use Amazon Linux 2 (or to provided for Amazon Linux 1).
$ aws lambda create-function --function-name rustTest \
  --handler bootstrap \
  --zip-file fileb://./target/lambda/basic/bootstrap.zip \
  --runtime provided.al2023 \
  --role arn:aws:iam::XXXXXXXXXXXXX:role/your_lambda_execution_role \
  --environment Variables={RUST_BACKTRACE=1} \
  --tracing-config Mode=Active

Warning Make sure to replace the execution role with an existing role in your account!

You can now test the function using the AWS CLI or the AWS Lambda console:

$ aws lambda invoke \
  --cli-binary-format raw-in-base64-out \
  --function-name rustTest \
  --payload '{"command": "Say Hi!"}' \
  output.json
$ cat output.json  # Prints: {"msg": "Command Say Hi! executed."}

Note --cli-binary-format raw-in-base64-out is a required argument when using the AWS CLI version 2. More Information

2.3. AWS Serverless Application Model (SAM)

You can use Lambda functions built in Rust with the AWS Serverless Application Model (SAM). To do so, you will need to install the AWS SAM CLI, which will help you package and deploy your Lambda functions in your AWS account.

You will need to create a template.yaml file containing your desired infrastructure in YAML. Here is an example with a single Lambda function:

AWSTemplateFormatVersion: '2010-09-09'
Transform: AWS::Serverless-2016-10-31

Resources:
  HelloWorldFunction:
    Type: AWS::Serverless::Function
    Properties:
      MemorySize: 128
      Architectures: ["arm64"]
      Handler: bootstrap
      Runtime: provided.al2023
      Timeout: 5
      CodeUri: target/lambda/basic/

Outputs:
  FunctionName:
    Value: !Ref HelloWorldFunction
    Description: Name of the Lambda function

You can then deploy your Lambda function using the AWS SAM CLI:

sam deploy --guided

At the end, sam will output the actual Lambda function name. You can use this name to invoke your function:

# Replace HelloWorldFunction-XXXXXXXX with the function name that sam outputs.
$ aws lambda invoke \
  --cli-binary-format raw-in-base64-out \
  --function-name HelloWorldFunction-XXXXXXXX \
  --payload '{"command": "Say Hi!"}' \
  output.json
$ cat output.json  # Prints: {"msg": "Command Say Hi! executed."}

2.4. Serverless Framework

Alternatively, you can build a Rust-based Lambda function declaratively using the Serverless framework Rust plugin.

A number of getting started Serverless application templates exist to get you up and running quickly:

  • a minimal echo function to demonstrate what the smallest Rust function setup looks like
  • a minimal http function to demonstrate how to interface with API Gateway using Rust's native http crate (note this will be a git dependency until 0.2 is published)
  • a combination multi function service to demonstrate how to set up a service with multiple independent functions

Assuming your host machine has a relatively recent version of Node.js, you won't need to install any host-wide Serverless dependencies. To get started, run the following commands to create a new Rust Lambda application and install project-level dependencies.

$ npx serverless install \
  --url https://github.com/softprops/serverless-aws-rust \
  --name my-new-app \
  && cd my-new-app \
  && npm install --silent

Deploy it using the standard serverless workflow:

# build, package, and deploy service to aws lambda
$ npx serverless deploy

Invoke it using serverless framework or a configured AWS integrated trigger source:

npx serverless invoke -f hello -d '{"foo":"bar"}'

2.5. Docker

Alternatively, you can build a Rust-based Lambda function in a docker mirror of the AWS Lambda provided runtime with the Rust toolchain preinstalled.

Running the following command will start an ephemeral docker container that builds your Rust application and produces a zip file containing its binary, auto-renamed to bootstrap to meet AWS Lambda's expectations, under target/lambda_runtime/release/{your-binary-name}.zip. Typically, {your-binary-name} is just the name of your crate if you are using the default Cargo binary (i.e. main.rs):

# build and package deploy-ready artifact
$ docker run --rm \
    -v ${PWD}:/code \
    -v ${HOME}/.cargo/registry:/root/.cargo/registry \
    -v ${HOME}/.cargo/git:/root/.cargo/git \
    rustserverless/lambda-rust

With your application built and packaged, it's ready to ship to production. You can also invoke it locally to verify its behavior using the lambci/lambda:provided docker container, which is also a mirror of the AWS Lambda provided runtime, with build dependencies omitted:

# start a docker container replicating the "provided" lambda runtime
# awaiting an event to be provided via stdin
$ unzip -o \
    target/lambda/release/{your-binary-name}.zip \
    -d /tmp/lambda && \
  docker run \
    -i -e DOCKER_LAMBDA_USE_STDIN=1 \
    --rm \
    -v /tmp/lambda:/var/task \
    lambci/lambda:provided

# provide an event payload via stdin (typically a json blob)

# Ctrl-D to yield control back to your function

Local development and testing

Testing your code with unit and integration tests

AWS Lambda events are plain structures deserialized from JSON objects. If your function handler uses the standard runtime, you can use serde to deserialize your test fixtures into those structures and call your handler directly:

#[tokio::test]
async fn test_my_lambda_handler() {
  let input = serde_json::from_str("{\"command\": \"Say Hi!\"}").expect("failed to parse event");
  let context = lambda_runtime::Context::default();

  let event = lambda_runtime::LambdaEvent::new(input, context);

  my_lambda_handler(event).await.expect("failed to handle event");
}

If you're using lambda_http to receive HTTP events, you can also create lambda_http::Request structures from plain text fixtures:

#[tokio::test]
async fn test_my_lambda_handler() {
  let input = include_str!("apigw_proxy_request.json");

  let request = lambda_http::request::from_str(input)
    .expect("failed to create request");

  let response = my_lambda_handler(request).await.expect("failed to handle request");
}

Cargo Lambda

Cargo Lambda provides a local server that emulates the AWS Lambda control plane. This server works on Windows, Linux, and macOS. In the root of your Lambda project, run the following subcommand to compile your function(s) and start the server:

cargo lambda watch -a 127.0.0.1 -p 9001

Now you can use the cargo lambda invoke subcommand to send requests to your function. For example:

cargo lambda invoke <lambda-function-name> --data-ascii '{ "command": "hi" }'

Running the command on an HTTP function (Function URL, API Gateway, etc.) requires you to use the appropriate request scheme. You can find examples of these schemes here. Otherwise, you will be presented with the following error:

Error: serde_json::error::Error

  × data did not match any variant of untagged enum LambdaRequest

A simpler alternative is to cURL the endpoint based on the address and port you defined. For example:

curl -v -X POST \
  'http://127.0.0.1:9001/lambda-url/<lambda-function-name>/' \
  -H 'content-type: application/json' \
  -d '{ "command": "hi" }'

Warning Do not remove the content-type header. It is necessary to instruct the function how to deserialize the request body.

You can read more about how cargo lambda watch and cargo lambda invoke work on the project's documentation page.

Lambda Debug Proxy

Lambdas can be run and debugged locally using a special Lambda debug proxy (a non-AWS repo maintained by @rimutaka), which is a Lambda function that forwards incoming requests to one AWS SQS queue and reads responses from another queue. A local proxy running on your development computer reads the queue, calls your Lambda locally and sends back the response. This approach allows debugging of Lambda functions locally while being part of your AWS workflow. The Lambda handler code does not need to be modified between the local and AWS versions.

Tracing and Logging

The Rust Runtime for Lambda integrates with the Tracing libraries to provide tracing and logging.

By default, the runtime emits tracing events that you can collect via tracing-subscriber. It also enables a feature called tracing that exposes a default subscriber with sensible options for sending logging information to AWS CloudWatch. The following example shows how to enable the default subscriber:

use lambda_runtime::{run, service_fn, tracing, Error};

#[tokio::main]
async fn main() -> Result<(), Error> {
    tracing::init_default_subscriber();
    run(service_fn(|event| tracing::info!(?event))).await
}

The subscriber uses the RUST_LOG environment variable to determine the log level for your function. It also honors Lambda's advanced logging controls if they're configured for your function. By default, the minimum log level to emit events is INFO.
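
As a rough sketch of how the re-exported tracing macros can be used inside a handler (the handler name and the logged fields below are illustrative), structured events emitted this way reach CloudWatch through the default subscriber:

use lambda_runtime::{run, service_fn, tracing, Error, LambdaEvent};
use serde_json::{json, Value};

#[tokio::main]
async fn main() -> Result<(), Error> {
    tracing::init_default_subscriber();
    run(service_fn(handler)).await
}

async fn handler(event: LambdaEvent<Value>) -> Result<Value, Error> {
    // Structured fields become part of the log record in CloudWatch.
    tracing::info!(request_id = %event.context.request_id, "handling invocation");
    // Only emitted if RUST_LOG (or the advanced logging controls) allows DEBUG.
    tracing::debug!(payload = ?event.payload, "raw event");
    Ok(json!({ "ok": true }))
}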

AWS event objects

This project includes Lambda event struct definitions in the aws_lambda_events crate. This crate can be leveraged to provide strongly-typed Lambda event structs. You can also create your own custom event objects and their corresponding structs.
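
For instance, a handler can take one of these generated structs directly as its event type. This is a minimal sketch, assuming the aws_lambda_events and lambda_runtime crates as dependencies and an SQS trigger:

use aws_lambda_events::event::sqs::SqsEvent;
use lambda_runtime::{run, service_fn, Error, LambdaEvent};

#[tokio::main]
async fn main() -> Result<(), Error> {
    run(service_fn(handler)).await
}

// The runtime deserializes the incoming JSON payload into the strongly-typed SqsEvent.
async fn handler(event: LambdaEvent<SqsEvent>) -> Result<(), Error> {
    for record in event.payload.records {
        // The message body is optional in the generated event structs.
        println!("message body: {:?}", record.body);
    }
    Ok(())
}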

Custom event objects

To serialize and deserialize events and responses, we suggest using the serde library. To receive custom events, annotate your structure with Serde's macros:

use serde::{Serialize, Deserialize};
use serde_json::json;
use std::error::Error;

#[derive(Serialize, Deserialize)]
pub struct NewIceCreamEvent {
  pub flavors: Vec<String>,
}

#[derive(Serialize, Deserialize)]
pub struct NewIceCreamResponse {
  pub flavors_added_count: usize,
}

fn main() -> Result<(), Box<dyn Error>> {
    let flavors = json!({
      "flavors": [
        "Nocciola",
        "抹茶",
        "आम"
      ]
    });

    let event: NewIceCreamEvent = serde_json::from_value(flavors)?;
    let response = NewIceCreamResponse {
        flavors_added_count: event.flavors.len(),
    };
    serde_json::to_string(&response)?;

    Ok(())
}
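
These same structs can also be used directly as the event and response types of a Lambda handler, since lambda_runtime deserializes the request and serializes the response with serde. A minimal sketch, reusing the types above with an illustrative handler name:

use lambda_runtime::{run, service_fn, Error, LambdaEvent};
use serde::{Deserialize, Serialize};

#[derive(Deserialize)]
pub struct NewIceCreamEvent {
    pub flavors: Vec<String>,
}

#[derive(Serialize)]
pub struct NewIceCreamResponse {
    pub flavors_added_count: usize,
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    run(service_fn(handler)).await
}

// The runtime turns the incoming JSON into NewIceCreamEvent and the returned
// NewIceCreamResponse back into JSON for the caller.
async fn handler(event: LambdaEvent<NewIceCreamEvent>) -> Result<NewIceCreamResponse, Error> {
    Ok(NewIceCreamResponse {
        flavors_added_count: event.payload.flavors.len(),
    })
}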

Feature flags in lambda_http

lambda_http is a wrapper for HTTP events coming from three different services: Application Load Balancer (ALB), Amazon API Gateway (APIGW), and AWS Lambda Function URLs. Amazon API Gateway can also send events from three different kinds of endpoints: REST APIs, HTTP APIs, and WebSockets. lambda_http transforms events from all these sources into native http::Request objects, so you can incorporate Rust HTTP semantics into your Lambda functions.
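
As a rough sketch (assuming a recent version of lambda_http and tokio; the handler name is illustrative), an HTTP-focused function looks like this:

use lambda_http::{run, service_fn, Body, Error, Request, RequestExt, Response};

#[tokio::main]
async fn main() -> Result<(), Error> {
    run(service_fn(handler)).await
}

// Read an optional `name` query string parameter and return a plain-text greeting.
async fn handler(request: Request) -> Result<Response<Body>, Error> {
    let name = request
        .query_string_parameters_ref()
        .and_then(|params| params.first("name"))
        .unwrap_or("world")
        .to_string();

    let response = Response::builder()
        .status(200)
        .header("content-type", "text/plain")
        .body(Body::from(format!("Hello, {name}!")))
        .map_err(Box::new)?;

    Ok(response)
}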

By default, lambda_http compiles your function to support any of those services. This increases the compile time of your function because we have to generate code for all the sources. In reality, you'll usually put a Lambda function only behind one of those sources. You can choose which source to generate code for with feature flags.

The available feature flags for lambda_http are the following:

  • alb: support for events coming from Application Load Balancers.
  • apigw_rest: support for events coming from API Gateway REST APIs.
  • apigw_http: support for events coming from API Gateway HTTP APIs and Lambda Function URLs.
  • apigw_websockets: support for events coming from API Gateway WebSockets.

If you only want to support one of these sources, you can disable the default features and enable only the source that you care about in your package's Cargo.toml file. Replace the lambda_http dependency line with the snippet below, changing the feature that you want to enable:

[dependencies.lambda_http]
version = "0.5.3"
default-features = false
features = ["apigw_rest"]

This will make your function compile much faster.

Supported Rust Versions (MSRV)

The AWS Lambda Rust Runtime requires a minimum of Rust 1.70, and is not guaranteed to build on compiler versions earlier than that.

Security

See CONTRIBUTING for more information.

License

This project is licensed under the Apache-2.0 License.
