AWS Ops Automator

A solution for automated and scheduled execution of actions on selected AWS resources, including an updated EBS Snapshot Scheduler.

Ops Automator is a developer framework for running actions to manage AWS environments with explicit support for multiple accounts and regions.

Ops Automator's primary function is to run tasks. A task is an action with a set of parameters that runs at scheduled times or in response to events and operates on a selected set of AWS resources. Events are triggered by changes in your environment; resources are selected through the resource discovery and tagging mechanisms built into Ops Automator.

Ops Automator comes with a number of actions. These are ready to use in your AWS environment and can serve as examples or starting points for developing your own actions.

Examples of included actions are creating backups, setting capacity, cleaning up resources, and security reporting.

Ops Automator helps you develop your own operations automation tasks in a consistent way, with the framework handling all the heavy lifting.

The Ops Automator framework handles the following functionality:

  • Operations across multiple accounts and regions
  • Task audit trails
  • Logging
  • Resource selection
  • Scaling
  • AWS API retries
  • Completion handling for long running tasks
  • Concurrency handling via queue throttling

Ops Automator lets you focus on implementing the actual logic of the action. Actions are developed in Python and can easily be added to the Ops Automator solution. Ops Automator can generate CloudFormation templates for configuring tasks, based on the action metadata that is part of the deployment.

Development of actions is described in the Ops Automator Action Implementation Guide.
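
As a rough orientation before reading that guide, the sketch below shows the general shape of an action module. It reuses imports that the shipped actions use (shown later in this README), but the class contract, the execute method name, and the get_client_with_retries argument list are illustrative assumptions, not the framework's exact interface.

# Illustrative sketch only: the real action contract (metadata, properties,
# entry points) is defined in the Ops Automator Action Implementation Guide.
from actions.action_base import ActionBase
from boto_retry import get_client_with_retries  # framework retry wrapper


class MyCustomAction(ActionBase):
    """Hypothetical action that operates on a selected set of EC2 resources."""

    def execute(self):  # assumed entry point, for illustration only
        # Wrap a boto3 client with the framework's retry strategy;
        # the argument list shown here is an assumption.
        ec2 = get_client_with_retries("ec2", ["describe_instances"])
        # ... action logic on the selected resources goes here ...
        return "done"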

Documentation

Full Ops Automator documentation is available on the AWS website.

Platform Support

Ops Automator v2.2.0 and later supports AWS Lambda, Amazon ECS, and AWS Fargate as execution platforms. Choose ECSFargate = Yes in the CloudFormation template to use ECS or Fargate, or leave it set to No to use Lambda. Note that with ECS/Fargate enabled you may still choose between Lambda and containers at the task level. To implement ECS/Fargate, see the instructions later in this README.
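
If you create the stack programmatically rather than through the console, the parameter can be passed with boto3. A sketch, in which the stack name, template URL, and required capabilities are placeholders/assumptions:

# Sketch: create the Ops Automator stack with the ECS/Fargate option enabled.
# StackName and TemplateURL are placeholders; Capabilities is an assumption
# based on the solution creating named IAM resources.
import boto3

cfn = boto3.client("cloudformation")
cfn.create_stack(
    StackName="ops-automator",  # placeholder
    TemplateURL="https://mybucket.s3.amazonaws.com/ops-automator/v2.2.0/ops-automator.template",
    Parameters=[{"ParameterKey": "ECSFargate", "ParameterValue": "Yes"}],
    Capabilities=["CAPABILITY_NAMED_IAM"],
)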

Building from GitHub

Overview of the Process

Building from GitHub source will allow you to modify the solution, such as adding custom actions or upgrading to a new release. The process consists of downloading the source from GitHub, creating buckets to be used for deployment, building the solution, and uploading the artifacts needed for deployment.

You will need:

  • a Linux client with the AWS CLI and Python 3.6+ installed
  • the source code downloaded from GitHub
  • two S3 buckets (minimum): one global and one for each region where you will deploy

Download from GitHub

Clone or download the repository to a local directory on your Linux client. Note: if you intend to modify Ops Automator, you may wish to create your own fork of the GitHub repo and work from that. This allows you to check in any changes you make to your private copy of the solution.

Git Clone example:

git clone https://github.com/awslabs/aws-ops-automator.git

Download Zip example:

wget https://github.com/awslabs/aws-ops-automator/archive/master.zip

Customize to your needs

Some customers have implementations of older versions of Ops Automator that include deprecated or custom actions. In order to upgrade to the latest release you will need to bring these actions forward to the latest build. See details later in this file.

See Ops Automator documentation for more details.

Build for Distribution

AWS Solutions use two types of buckets: a bucket for global access to templates, which is accessed via HTTP, and regional buckets for access to assets within the region, such as Lambda code. You will need:

  • One global bucket that is accessed via the HTTP endpoint. AWS CloudFormation templates are stored here. Ex. "mybucket"
  • One regional bucket for each region where you plan to deploy, using the name of the global bucket as the root, suffixed with the region name. Ex. "mybucket-us-east-1"
  • Your buckets should be encrypted and disallow public access (a scripted setup is sketched below)
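
If you prefer to script the bucket setup, a minimal boto3 sketch might look like the following; the bucket name and region are placeholders:

# Sketch: create the global and regional buckets with default encryption
# and all public access blocked. Names and region are placeholders.
import boto3

GLOBAL_BUCKET = "mybucket"                      # placeholder
REGION = "us-east-1"                            # placeholder
REGIONAL_BUCKET = f"{GLOBAL_BUCKET}-{REGION}"   # e.g. mybucket-us-east-1

s3 = boto3.client("s3", region_name=REGION)

for bucket in (GLOBAL_BUCKET, REGIONAL_BUCKET):
    if REGION == "us-east-1":
        s3.create_bucket(Bucket=bucket)  # us-east-1 takes no LocationConstraint
    else:
        s3.create_bucket(Bucket=bucket,
                         CreateBucketConfiguration={"LocationConstraint": REGION})
    # Default server-side encryption (SSE-S3)
    s3.put_bucket_encryption(
        Bucket=bucket,
        ServerSideEncryptionConfiguration={"Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "AES256"}}]})
    # Disallow public access
    s3.put_public_access_block(
        Bucket=bucket,
        PublicAccessBlockConfiguration={
            "BlockPublicAcls": True, "IgnorePublicAcls": True,
            "BlockPublicPolicy": True, "RestrictPublicBuckets": True})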

From the deployment folder in your cloned repo, run build-s3-dist.sh

chmod +x build-s3-dist.sh
./build-s3-dist.sh <bucketname> ops-automator [version]

<bucketname>: name of the "global bucket" - mybucket in the example above

ops-automator: name of the solution. This is used to form the first-level prefix in the regional S3 bucket.

version: Optionally, you can override the version (from version.txt). You will want to do this when making incremental updates within the same version, as a changed version string causes CloudFormation to update the infrastructure, particularly the Lambdas, when the source code has changed. We recommend appending a build suffix to the semver version. Ex. for version 2.2.0, suffix it with ".001" and increment it for each subsequent build: 2.2.0.001, 2.2.0.002, and so on. This value is used as the second part of the prefix for artifacts in the S3 buckets. The default is the value in version.txt.

Upload the files using the deployment/upload-s3-dist.sh script. You must have the AWS CLI configured and access to the S3 buckets. The upload script automatically receives the values you provided to the build script. Run upload-s3-dist.sh once for each region where you plan to deploy the solution.

chmod +x upload-s3-dist.sh
upload-s3-dist.sh <region>

Upgrading from a 2.0.0 Release

Version 2.1.0 and later include 7 supported actions:

  • DynamoDbSetCapacity
  • Ec2CopySnapshot
  • Ec2CreateSnapshot
  • Ec2DeleteSnapshot
  • Ec2ReplaceInstance
  • Ec2ResizeInstance
  • Ec2TagCpuInstance

Many customers have older versions of Ops Automator that include custom actions. It is possible to add these actions to Ops Automator 2.1 and later. You will need a copy of the source for your current implementation. You can find a zip file containing your current deployment as follows:

  1. Open CloudFormation and locate your Ops Automator stack
  2. Open the stack and view the template (Template tab)
  3. Find the Resource definition for OpsAutomatorLambdaFunctionStandard. Note the values for S3Bucket and S3Key.
  4. Interpret these values to get the bucket name and prefix.
  5. Derive the URL: https://<bucketname>-<region>.s3.amazonaws.com/<S3Key>
  6. Get the file
  7. Extract the file to a convenient location

Example

"OpsAutomatorLambdaFunctionStandard": {
            "Type": "AWS::Lambda::Function", 
            "Properties": {
                "Code": {
                    "S3Bucket": {
                        "Fn::Join": [
                            "-", 
                            [
                                "ops-automator-deploy", 
                                {
                                    "Ref": "AWS::Region"
                                }
                            ]
                        ]
                    }, 
                    "S3Key": "ops-automator/latest/ops-automator-2.2.0.61.zip"
                }, 
                "FunctionName": {
                    "Fn::Join": [
                        "-", 
                        [
                            {
                                "Ref": "AWS::StackName"
                            }, 
                            {
                                "Fn::FindInMap": [
                                    "Settings", 
                                    "Names", 
                                    "OpsAutomatorLambdaFunction"
                                ]
                            }, 
                            "Standard"
                        ]
                    ]
                }, 

S3Bucket: ops-automator-deploy

S3Key: ops-automator/latest/ops-automator-2.2.0.61.zip

url for us-east-1: https://ops-automator-deploy-us-east-1.s3.amazonaws.com/ops-automator/latest/ops-automator-2.2.0.61.zip

Get the source:

wget https://ops-automator-deploy-us-east-1.s3.amazonaws.com/ops-automator/latest/ops-automator-2.2.0.61.zip

Create Build Area

Follow the instructions above to create a development copy of Ops Automator from GitHub. Go to the root of that copy. You should see:

(python3) [ec2-user@ip-10-0-20-184 oa-220-customer]$ ll
total 28
-rw-rw-r-- 1 ec2-user ec2-user   324 Jan  6 14:29 CHANGELOG.md
drwxrwxr-x 2 ec2-user ec2-user   122 Jan  6 14:29 deployment
-rwxrwxr-x 1 ec2-user ec2-user 10577 Jan  6 14:29 LICENSE.txt
-rwxrwxr-x 1 ec2-user ec2-user   822 Jan  6 14:29 NOTICE.txt
-rwxrwxr-x 1 ec2-user ec2-user  3837 Jan  6 14:29 README.md
drwxrwxr-x 5 ec2-user ec2-user    51 Jan  6 14:29 source
-rw-rw-r-- 1 ec2-user ec2-user     5 Jan  6 14:29 version.txt
(python3) [ec2-user@ip-10-0-20-184 oa-220-customer]$

Initial Build

To verify that all is well, do a base build from this source. You will need the base name of the global bucket you created earlier.

Ex. "mybucket" will use "mybucket" for the templates and "mybucket-us-east-1" for a deployment in us-east-1.

cd deployment
chmod +x *.sh
./build-s3-dist.sh <bucket> ops-automator

After your build completes without error, copy the output files to your S3 buckets using the upload-s3-dist.sh script to send the files to the desired region:

./upload-s3-dist.sh <region>

This will create the prefix ops-automator/<version> in both buckets, one containing the templates and the other a zip of the Lambda source code. This is your baseline, stock Ops Automator build.

Upgrading Actions

Overview

  1. Get a list of Actions to be migrated
  2. Copy the action source to source/code/actions
     • Audit for prerequisites
     • Audit for Python 3 compatibility
  3. Run build-s3-dist.sh
  4. Run upload-s3-dist.sh
  5. Update the stack using the S3 URL of the new template after all actions are imported

Get a list of Actions

Use the DynamoDB console to query the Ops Automator ConfigurationTable for unique values in the Action column (a scripted version is sketched below). For any action not in the above list, you will need to find the source code in your current release under source/code/actions. Repeat the following steps for each action.
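
If the table is large, the same query can be scripted; a boto3 sketch, in which the table name is a placeholder for your stack's ConfigurationTable:

# Sketch: collect distinct values of the Action attribute from the
# Ops Automator configuration table. The table name is a placeholder.
import boto3

TABLE_NAME = "OA-220-customer-ConfigurationTable-EXAMPLE"  # placeholder

dynamodb = boto3.client("dynamodb")
actions = set()
# "Action" is a DynamoDB reserved word, so it needs an attribute name alias
for page in dynamodb.get_paginator("scan").paginate(
        TableName=TABLE_NAME,
        ProjectionExpression="#a",
        ExpressionAttributeNames={"#a": "Action"}):
    for item in page["Items"]:
        if "Action" in item:
            actions.add(item["Action"]["S"])

print(sorted(actions))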

Update each Action

  1. Check dependencies

Locate the action file in your current deployment. For example, we'll work with DynamodbCreateBackup, which was a supported action as recently as 2.0.0.213 but was removed in a later 2.0 build.

Copy the file to source/code/actions in the new release source.

(python3) [ec2-user@ip-10-0-20-184 actions]$ ll
total 312
-rw-rw-r-- 1 ec2-user ec2-user  7911 Jan  6 15:17 action_base.py
-rw-rw-r-- 1 ec2-user ec2-user  9365 Jan  6 15:17 action_ec2_events_base.py
-rw-rw-r-- 1 ec2-user ec2-user  9688 Jan  6 16:37 dynamodb_create_backup_action.py
-rw-rw-r-- 1 ec2-user ec2-user 12694 Jan  6 15:17 dynamodb_set_capacity_action.py
-rw-rw-r-- 1 ec2-user ec2-user 49681 Jan  6 15:17 ec2_copy_snapshot_action.py
-rwxrwxr-x 1 ec2-user ec2-user 38045 Jan  6 15:17 ec2_create_snapshot_action.py
-rwxrwxr-x 1 ec2-user ec2-user 16840 Jan  6 15:17 ec2_delete_snapshot_action.py
-rwxrwxr-x 1 ec2-user ec2-user 55337 Jan  6 15:17 ec2_replace_instance_action.py
-rwxrwxr-x 1 ec2-user ec2-user 34373 Jan  6 15:17 ec2_resize_instance_action.py
-rwxrwxr-x 1 ec2-user ec2-user 15825 Jan  6 15:17 ec2_tag_cpu_instance_action.py
-rw-rw-r-- 1 ec2-user ec2-user 14559 Jan  6 15:17 __init__.py
-rwxrwxr-x 1 ec2-user ec2-user  6199 Jan  6 15:17 scheduler_config_backup_action.py
-rwxrwxr-x 1 ec2-user ec2-user  8092 Jan  6 15:17 scheduler_task_cleanup_action.py
-rwxrwxr-x 1 ec2-user ec2-user  9132 Jan  6 15:17 scheduler_task_export_action.py

Open the file in an editor and observe the imports:

import services.dynamodb_service
import tagging
from actions import *
from actions.action_base import ActionBase
from boto_retry import get_client_with_retries, get_default_retry_strategy
from helpers import safe_json

Verify that dynamodb_service.py exists in source/code/services:

(python3) [ec2-user@ip-10-0-20-184 deployment]$ cd ../services
(python3) [ec2-user@ip-10-0-20-184 services]$ ll
total 212
-rwxrwxr-x 1 ec2-user ec2-user 29299 Jan  6 15:17 aws_service.py
-rwxrwxr-x 1 ec2-user ec2-user  4882 Jan  6 15:17 cloudformation_service.py
-rwxrwxr-x 1 ec2-user ec2-user  4871 Jan  6 15:17 cloudwatchlogs_service.py
-rwxrwxr-x 1 ec2-user ec2-user  5390 Jan  6 15:17 dynamodb_service.py
-rwxrwxr-x 1 ec2-user ec2-user 12657 Jan  6 15:17 ec2_service.py
-rwxrwxr-x 1 ec2-user ec2-user  4987 Jan  6 15:17 ecs_service.py
-rwxrwxr-x 1 ec2-user ec2-user  6861 Jan  6 15:17 elasticache_service.py
-rwxrwxr-x 1 ec2-user ec2-user  5214 Jan  6 15:17 elb_service.py
-rwxrwxr-x 1 ec2-user ec2-user  5369 Jan  6 15:17 elbv2_service.py
-rwxrwxr-x 1 ec2-user ec2-user  6125 Jan  6 15:17 iam_service.py
-rwxrwxr-x 1 ec2-user ec2-user  8193 Jan  6 15:17 __init__.py
-rwxrwxr-x 1 ec2-user ec2-user  5341 Jan  6 15:17 kms_service.py
-rwxrwxr-x 1 ec2-user ec2-user  5291 Jan  6 15:17 lambda_service.py
-rwxrwxr-x 1 ec2-user ec2-user  7558 Jan  6 15:17 opsautomatortest_service.py
-rwxrwxr-x 1 ec2-user ec2-user 11413 Jan  6 15:17 rds_service.py
-rw-rw-r-- 1 ec2-user ec2-user 13363 Jan  6 15:17 route53_service.py
-rwxrwxr-x 1 ec2-user ec2-user  9749 Jan  6 15:17 s3_service.py
-rwxrwxr-x 1 ec2-user ec2-user  6725 Jan  6 15:17 servicecatalog_service.py
-rwxrwxr-x 1 ec2-user ec2-user  5769 Jan  6 15:17 storagegateway_service.py
-rwxrwxr-x 1 ec2-user ec2-user  3441 Jan  6 15:17 tagging_service.py
-rwxrwxr-x 1 ec2-user ec2-user  4086 Jan  6 15:17 time_service.py

Note that this action uses ActionBase, which is already in the actions folder (see above listing).

  2. Verify the code / compatibility

Do a quick scan to make sure there are no Python 3 compatibility issues.

TIP: use a linter. This code looks clean with regard to Python 3 issues.
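
A quick automated check, as a sketch: byte-compiling the actions package under Python 3 surfaces syntax-level Python 2 leftovers (it will not catch behavioral differences):

# Sketch: byte-compile the actions package under Python 3 to flag
# syntax-level incompatibilities such as Python 2 print statements.
import compileall
import sys

ok = compileall.compile_dir("source/code/actions", quiet=1)
sys.exit(0 if ok else 1)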

  3. Repeat for all actions to be added from the old release

Build it as a new version

Open the directory for your new version and check the current base semver version:

(python3) [ec2-user@ip-10-0-20-184 customer]$ more version.txt
2.2.0
(python3) [ec2-user@ip-10-0-20-184 customer]$ 

Append a build number. Start with 001. We will use version 2.2.0.001 for our first build. This is important as it will allow us to update the install. Do not change the semver version, as this allows AWS to match your installation back to the original.

From the deployment folder, run build-s3-dist.sh:

(python3) [ec2-user@ip-10-0-20-184 deployment]$ ./build-s3-dist.sh mybucket ops-automator v2.2.0.001

Upon successful completion, upload to S3:

(python3) [ec2-user@ip-10-0-20-184 deployment]$ ./upload-s3-dist.sh us-east-1
==========================================================================
Deploying ops-automator version v2.2.0 to bucket mybucket-us-east-1
==========================================================================
Templates: mybucket/ops-automator/v2.2.0/
Lambda code: mybucket-us-east-1/ops-automator/v2.2.0/
---
Press [Enter] key to start upload to us-east-1
upload: global-s3-assets/ops-automator-ecs-cluster.template to s3://mybucket/ops-automator/v2.2.0/ops-automator-ecs-cluster.template
upload: global-s3-assets/ops-automator.template to s3://mybucket/ops-automator/v2.2.0/ops-automator.template
upload: regional-s3-assets/cloudwatch-handler.zip to s3://mybucket-us-east-1/ops-automator/v2.2.0/cloudwatch-handler.zip
upload: regional-s3-assets/ops-automator.zip to s3://mybucket-us-east-1/ops-automator/v2.2.0/ops-automator.zip
Completed uploading distribution. You may now install from the templates in mybucket/ops-automator/v2.2.0/
(python3) [ec2-user@ip-10-0-20-184 deployment]$ 

Update the Stack or Deploy as New

We generally recommend that you deploy a new stack with the new version and then migrate your actions from old to new. You may optionally update the stack in place. We have tested upgrade-in-place from v2.0.0 to v2.2.0 successfully, following the instructions above very carefully.

To Update

Replace the template with the one from the new version.

Ex. https://mybucket.s3-us-west-2.amazonaws.com/ops-automator/v2.2.0.001/ops-automator.template.

There is no need to change any parameters.

Validate the change in Lambda

Open the Lambda console. Find all of your Lambdas by filtering on the stack name. All should show an update at the time you updated the stack. Open one of the OpsAutomator-<size> Lambdas - ex. OA-220-customer-OpsAutomator-Large. View the function code and expand the actions folder. You should see the new action, dynamodb_create_backup_action.py.
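
The same check can be scripted; a boto3 sketch, in which the stack name is a placeholder:

# Sketch: list the stack's Lambda functions with their LastModified
# timestamps to confirm the update landed. Stack name is a placeholder.
import boto3

STACK_NAME = "OA-220-customer"  # placeholder

lam = boto3.client("lambda")
for page in lam.get_paginator("list_functions").paginate():
    for fn in page["Functions"]:
        if fn["FunctionName"].startswith(STACK_NAME):
            print(fn["FunctionName"], fn["LastModified"])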

Verify that the action template was uploaded to the S3 configuration bucket:

(python3) [ec2-user@ip-10-0-20-184 deployment]$ aws s3 ls s3://oa-220-customer-configuration-1wg089n4zjpt4/TaskConfiguration/
                           PRE ScenarioTemplates/
                           PRE Scripts/
2020-01-06 17:13:52       6324 ActionsConfiguration.html
2020-01-06 17:13:46      25492 DynamodbCreateBackup.template
2020-01-06 17:13:45      33083 DynamodbSetCapacity.template
2020-01-06 17:13:50      37284 Ec2CopySnapshot.template
2020-01-06 17:13:49      34782 Ec2CreateSnapshot.template
2020-01-06 17:13:49      27938 Ec2DeleteSnapshot.template
2020-01-06 17:13:48      39649 Ec2ReplaceInstance.template
2020-01-06 17:13:46      38499 Ec2ResizeInstance.template
2020-01-06 17:13:51      26854 Ec2TagCpuInstance.template

Check the Logs

Examine both the Lambda logs and the Ops Automator logs for errors.
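
As a starting point, a sketch that scans one log group for recent errors; the log group name is a placeholder, since the actual group names depend on your stack:

# Sketch: look for ERROR entries from the last hour in a CloudWatch Logs
# group. The log group name below is a placeholder.
import time
import boto3

LOG_GROUP = "/aws/lambda/OA-220-customer-OpsAutomator-Standard"  # placeholder

logs = boto3.client("logs")
resp = logs.filter_log_events(
    logGroupName=LOG_GROUP,
    filterPattern="ERROR",
    startTime=int((time.time() - 3600) * 1000))
for event in resp["events"]:
    print(event["message"].rstrip())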

ECS/Fargate Implementation

This section describes how to set up an AWS Fargate cluster for use by the Ops Automator framework for long-running tasks. Starting with Ops Automator 2.2.0 you can deploy with AWS ECS / Fargate or add it later. With ECS/Fargate enabled you can choose which tasks run on containers and which run on Lambda - they are not mutually exclusive. ECS/Fargate may be a desirable option for customers with tasks that run longer than 15 minutes.

Setup

This assumes that you have downloaded the Ops Automator source from GitHub, built it, and deployed the solution from source in your account. If you installed from the AWS Solutions template, a simpler deployment is described in the AWS Ops Automator Implementation Guide, Appendix K.

Overview

  1. Deploy/update the AWS Ops Automator stack to use the ECS option
  2. Build and deploy the Docker container
  3. Update/deploy tasks using ECS/Fargate

Deploy Ops Automator with ECS

See the procedure above to build and deploy the solution from source. Select the ECS/Fargate option. You must do this first, as this option creates the Amazon ECR container registry needed in the last step.

ECS can deploy a cluster in an existing VPC. You will need to provide the VPC ID and subnet IDs for at least two subnets.

Fargate is automatically selected if you do not provide a VPC ID. It deploys a new VPC and public subnets for the Fargate cluster.

Build and Deploy the Docker Container

From the deployment/ecs folder where you built the solution, run the following command:

./build-and-deploy-image.sh -s <stack-name> -r <region>

This step pulls down the files required to build a Docker image based on the Amazon Linux Docker-optimized AMI. It installs the ops-automator-ecs-runner.py script on the image.

The image is then pushed to the ops-automator repository created by the Ops Automator template.

Deploy Ops Automator Actions using ECS

The ECS option is now available for actions. You can deploy additional tasks using the ECS option or modify existing tasks to use ECS. Note: if you deployed tasks prior to selecting ECS in the main AWS Ops Automator stack, you will need to update their templates from the Ops Automator S3 configuration bucket. ECS will now be an option for Resource Selection Memory and Execution Memory.


Copyright 2020 Amazon.com, Inc. or its affiliates. All Rights Reserved.

Licensed under the Apache License Version 2.0 (the "License"). You may not use this file except in compliance with the License. A copy of the License is located at

http://www.apache.org/licenses/

or in the "license" file accompanying this file. This file is distributed on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, express or implied. See the License for the specific language governing permissions and limitations under the License.

Collection of operational metrics

This solution collects anonymous operational metrics to help AWS improve the quality and features of the solution. For more information, including how to disable this capability, please see the Implementation Guide.
