
Amazon ECS and AWS Step Functions Design Patterns Starter kit

This starter kit demonstrates how to run Amazon Elastic Container Service (ECS) tasks using AWS Step Functions. We will implement the following design patterns:

  1. Running ECS tasks using AWS Lambda
  2. Running ECS tasks using Step Functions native integration

We use AWS Cloud Development Kit (CDK) to deploy application resources.




Prerequisites

  1. Docker is installed on your machine
  2. The Docker daemon is running
  3. You have AWS account credentials

Overview

ECS Task Business Logic

We run a simple piece of business logic within an ECS task: it creates a copy of an input file in an S3 bucket. We will run multiple instances of the task simultaneously.
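Conceptually, each task instance performs the equivalent of the following copy, sketched here with the AWS CLI (bucket name and object keys are illustrative placeholders):

    aws s3 cp s3://<s3BucketName>/amazon_ecs_java_starter_kit_jar/input.jar s3://<s3BucketName>/target/input.jar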


Amazon DynamoDB Tables

Each pattern requires 2 DynamoDB tables: workflow_summary and workflow_details. workflow_summary audits the status of the overall workflow execution; workflow_details audits the status of individual ECS tasks. The schema of the DynamoDB tables is described in the table below.

Table | Schema | Capacity
workflow_summary_pattern_x | Partition key = workflow_name (String), Sort key = workflow_run_id (Number) | Provisioned read capacity units = 5, Provisioned write capacity units = 5
workflow_details_pattern_y | Partition key = workflow_run_id (Number), Sort key = ecs_task_id (String) | Provisioned read capacity units = 5, Provisioned write capacity units = 5

Note: x and y represent the pattern number, either 1 or 2.
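The CDK stacks provision these tables for you; you do not need to create them by hand. For reference only, the Pattern 1 summary table could be created with an equivalent AWS CLI call like this (a sketch matching the schema above):

    aws dynamodb create-table \
      --table-name workflow_summary_pattern_1 \
      --attribute-definitions AttributeName=workflow_name,AttributeType=S AttributeName=workflow_run_id,AttributeType=N \
      --key-schema AttributeName=workflow_name,KeyType=HASH AttributeName=workflow_run_id,KeyType=RANGE \
      --provisioned-throughput ReadCapacityUnits=5,WriteCapacityUnits=5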


Workflow Specification

We create 2 Step Functions state machines to demonstrate the design patterns. Each state machine is executed with a JSON spec as input. The spec has two parts: 1) values for the ECS cluster, DynamoDB tables, subnets, security groups, S3 bucket, etc., and 2) the list of ECS tasks to run. The table below describes the spec.

JSON Attribute | Description
region | AWS region used
s3BucketName | Amazon S3 bucket used to demonstrate the ECS Task Business Logic
subnetIdLiteral | List of subnet ids separated by a separator
separator | The separator used in subnetIdLiteral
workflowName | Name of the workflow, e.g. amazon_ecs_starter_kit-pattern-1
securityGroupId | The security group id used to run ECS tasks
ddbTableNameWFSummary | Name of the DynamoDB table for workflow summary
hashKeyWFSummary | The hash key of the workflow summary table
rangeKeyWFSummary | The sort key of the workflow summary table
ddbTableNameWFDetails | Name of the DynamoDB table for workflow details
hashKeyWFDetails | The hash key of the workflow details table
rangeKeyWFDetails | The sort key of the workflow details table
clusterName | Name of the ECS cluster
containerName | Name of the container
taskDefinition | Name of the ECS task definition
taskList | Specs for one or more ECS tasks; these specs drive the business logic of each task. Each task has three attributes: 1) taskName (name of the ECS task), 2) s3BucketName (S3 bucket name), and 3) objectKey (object key)
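Putting the attributes together, a workflow spec looks roughly like the following sketch. All values are illustrative; replace them with the outputs of your own deployment (see outputs.json in the Deploy section):

    {
      "region": "us-east-2",
      "s3BucketName": "1234567890-amazon-ecs-java-starter-kit-pattern-1-bucket",
      "subnetIdLiteral": "subnet-aaaa,subnet-bbbb",
      "separator": ",",
      "workflowName": "amazon_ecs_starter_kit-pattern-1",
      "securityGroupId": "sg-0123456789abcdef0",
      "ddbTableNameWFSummary": "workflow_summary_pattern_1",
      "hashKeyWFSummary": "workflow_name",
      "rangeKeyWFSummary": "workflow_run_id",
      "ddbTableNameWFDetails": "workflow_details_pattern_1",
      "hashKeyWFDetails": "workflow_run_id",
      "rangeKeyWFDetails": "ecs_task_id",
      "clusterName": "amazon-ecs-java-starter-kit-pattern-1",
      "containerName": "amazon-ecs-java-starter-kit-pattern-1",
      "taskDefinition": "amazon-ecs-java-starter-kit-pattern-1",
      "taskList": [
        {
          "taskName": "task_1",
          "s3BucketName": "1234567890-amazon-ecs-java-starter-kit-pattern-1-bucket",
          "objectKey": "amazon_ecs_java_starter_kit_jar/amazon-ecs-java-starter-kit-tasklauncher-1.0.jar"
        }
      ]
    }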

AWS CDK Stacks

CdkApp deploys the following stacks:

Stack Name | Purpose
ECSTaskSubmissionFromLambdaPattern | This stack provisions resources needed to demonstrate Pattern 1
ECSTaskSubmissionFromStepFunctionsPattern | This stack provisions resources needed to demonstrate Pattern 2

Patterns

Running ECS tasks using AWS Lambda

As shown in the figure below, this pattern (Pattern 1) uses an AWS Lambda function to run ECS tasks. We call this Lambda function the ECS Task Launcher. It parses the workflow spec, submits ECS tasks to the ECS cluster, and invokes a second Lambda function called the ECS Task Monitor.

The ECS Task Monitor tracks the completion status of the running ECS tasks. Each time it runs, it compares the number of completed tasks with the total number of tasks submitted and updates the DynamoDB table workflow_summary.

The task executed on the ECS cluster is called the ECS Task. It takes the following actions: 1) reads input parameters, 2) inserts a record in the DynamoDB table for auditing, 3) copies the input file to a target folder, and 4) marks the status of its job as Complete in the DynamoDB table workflow_details.

(Figure: Pattern 1 architecture)


Running ECS tasks using Step Functions native integration

As shown in the figure below, this pattern (Pattern 2) uses AWS Step Functions' native integration with Amazon ECS. Unlike Pattern 1, which relies on a Lambda function, we use a Parallel state to run the ECS tasks; see the sketch after the figure. The number of tasks run depends on the size of "taskList":[] in workflow_specs_pattern_2.json. The role of the ECS Task Monitor and the way the ECS Task executes are the same as in Pattern 1.

(Figure: Pattern 2 architecture)
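Under the native integration, each branch of the Parallel state invokes Amazon ECS directly through the arn:aws:states:::ecs:runTask.sync service integration, which starts a task and waits for it to stop before the branch completes. A single branch state looks roughly like the following sketch (cluster, task definition, launch type, and network values are illustrative):

    {
      "Type": "Task",
      "Resource": "arn:aws:states:::ecs:runTask.sync",
      "Parameters": {
        "Cluster": "amazon-ecs-java-starter-kit-pattern-2",
        "TaskDefinition": "amazon-ecs-java-starter-kit-pattern-2",
        "LaunchType": "FARGATE",
        "NetworkConfiguration": {
          "AwsvpcConfiguration": {
            "Subnets": ["subnet-aaaa", "subnet-bbbb"],
            "SecurityGroups": ["sg-0123456789abcdef0"]
          }
        }
      },
      "End": true
    }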


Build

  1. Clone this repository to your machine

  2. Open your IDE, e.g. Eclipse, Spring Tool Suite (STS), or IntelliJ IDEA

  3. Import the project as a Maven project by pointing to <Path_to_cloned_repo>/amazon-ecs-and-aws-step-functions-design-patterns-starter-kit/pom.xml. This imports 4 module projects.

  4. Select the parent project amazon-ecs-java-starter-kit and build it using one of the following methods

    1. Using standalone Maven: go to the project home directory and run the command mvn -X clean install
    2. From Eclipse or STS: run a Maven build with goals -X clean install (right-click the project --> Run As --> Maven Build)
  5. Expected output 1: In your IDE, you will see the following output

    [INFO] Reactor Summary for amazon-ecs-java-starter-kit 1.0:
    [INFO] 
    [INFO] amazon-ecs-java-starter-kit ........................ SUCCESS [  0.717 s]
    [INFO] amazon-ecs-java-starter-kit-cdk .................... SUCCESS [ 14.230 s]
    [INFO] amazon-ecs-java-starter-kit-tasklauncher ........... SUCCESS [  8.418 s]
    [INFO] amazon-ecs-java-starter-kit-task ................... SUCCESS [ 21.857 s]
    [INFO] amazon-ecs-java-starter-kit-taskmonitor ............ SUCCESS [  4.587 s]
    [INFO] ------------------------------------------------------------------------
    [INFO] BUILD SUCCESS
    [INFO] ------------------------------------------------------------------------
    [INFO] Total time:  49.979 s
    [INFO] Finished at: 2020-12-21T13:03:30-06:00
  6. Expected output 2: The build process generates the following jar files in their respective target directories

    Module artifact name | Approximate size
    amazon-ecs-java-starter-kit-cdk-1.0.jar | 32 KB
    amazon-ecs-java-starter-kit-tasklauncher-1.0.jar | 21 MB
    amazon-ecs-java-starter-kit-task-1.0.jar | 19 MB
    amazon-ecs-java-starter-kit-taskmonitor-1.0.jar | 21 MB

Deploy

  1. In the terminal, go to path /<Path_to_your_cloned_rep>/amazon-ecs-and-aws-step-functions-design-patterns-starter-kit/amazon-ecs-java-starter-kit-cdk. Now, you are in the CDK module of this project.

  2. Replace 1234567890 with your AWS Account Id wherever applicable in the following steps.

  3. Set the following environment variables to your AWS account id and region

    export AWS_ACCOUNT_ID=1234567890
    export AWS_REGION=us-east-2
  4. Bootstrap CDK

    cdk bootstrap aws://${AWS_ACCOUNT_ID}/${AWS_REGION}
  5. Output 1: In the command line, you will get the following output

    (node:63268) ExperimentalWarning: The fs.promises API is experimental
    ⏳  Bootstrapping environment aws://1234567890/us-east-2...
    ✅  Environment aws://1234567890/us-east-2 bootstrapped (no changes).
  6. Output 2: In the AWS console under CloudFormation, you will see a Stack created as follows

    (Screenshot: CDKToolkit stack in the CloudFormation console)

  7. Output 3: In the AWS console under S3, you will see a bucket created with name cdktoolkit-stagingbucket-*

  8. Deploy both stacks

    cdk deploy --require-approval never --all --outputs-file outputs.json
  9. Expected output 1: The stack for amazon-ecs-java-starter-pattern-1 is created with the following resources:

    Resource Type | Resource Details
    VPC | 1 VPC to launch resources needed by the starter kit
    Subnet | 2 public subnets and 2 private subnets
    Route Table | 1 route table per public and private subnet
    Security Group | 1 security group each for the ECR, ECS, and ECS Agent endpoints
    Security Group | 1 security group each for the ECS Task Launcher and ECS Task Monitor
    VPC Endpoint | 1 VPC endpoint each for Amazon DynamoDB, Amazon S3, Amazon ECS, Amazon ECS Agent, and the Amazon ECR API
    ECS Cluster | 1 ECS cluster to run ECS tasks
    ECR Repository | 1 ECR repository to store the Docker image for the ECS Task binary
    ECS Task Definition | 1 ECS task definition for the ECS Task
    Amazon DynamoDB | 2 tables. Refer to Amazon DynamoDB Tables for more details
    Step Functions state machine | 1 state machine for orchestration
    AWS Lambda | 1 Lambda function to submit ECS tasks (ECS Task Launcher)
    AWS Lambda | 1 Lambda function to monitor the progress of ECS tasks (ECS Task Monitor)
    IAM Role | 1 IAM role each for the Step Functions state machine, ECS Task Launcher, and ECS Task Monitor; 2 IAM roles for the ECS task definition - 1) ECS Task Role and 2) ECS Task Execution Role
  10. Expected output 2: The stack for amazon-ecs-java-starter-pattern-2 is created with the following resources:

    Resource Type | Resource Details
    VPC | 1 VPC to launch resources needed by the starter kit
    Subnet | 2 public subnets and 2 private subnets
    Route Table | 1 route table per public and private subnet
    Security Group | 1 security group each for the ECR, ECS, and ECS Agent endpoints
    Security Group | 1 security group for the ECS Task Monitor
    VPC Endpoint | 1 VPC endpoint each for Amazon DynamoDB, Amazon S3, Amazon ECS, Amazon ECS Agent, and the Amazon ECR API
    ECS Cluster | 1 ECS cluster to run ECS tasks
    ECR Repository | 1 ECR repository to store the Docker image for the ECS Task binary
    ECS Task Definition | 1 ECS task definition for the ECS Task
    Amazon DynamoDB | 2 tables. Refer to Amazon DynamoDB Tables for more details
    Step Functions state machine | 1 state machine for orchestration
    AWS Lambda | 1 Lambda function to monitor the progress of ECS tasks (ECS Task Monitor)
    IAM Role | 1 IAM role each for the Step Functions state machine and ECS Task Monitor; 2 IAM roles for the ECS task definition - 1) ECS Task Role and 2) ECS Task Execution Role
  11. Expected output 3: A file outputs.json is created with a list of AWS resource values provisioned by the CDK.
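The --outputs-file flag makes the CDK write the stack outputs keyed by stack name, roughly in the following shape (a sketch; the actual output keys and values depend on your deployment):

    {
      "amazon-ecs-java-starter-pattern-1": {
        "clusterName": "...",
        "securityGroupId": "..."
      },
      "amazon-ecs-java-starter-pattern-2": {
        "clusterName": "...",
        "securityGroupId": "..."
      }
    }

You will use these values to fill in workflow_specs_pattern_1.json and workflow_specs_pattern_2.json in the Test section; for example, jq '."amazon-ecs-java-starter-pattern-1"' outputs.json prints the outputs of the first stack.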


Test

Testing ECS tasks using AWS Lambda

  1. Open the file workflow_specs_pattern_1.json in your IDE and update the JSON attributes based on the values for amazon-ecs-java-starter-pattern-1 in outputs.json.

  2. Open your terminal

  3. Go to path /<Path_to_your_cloned_rep>/amazon-ecs-and-aws-step-functions-design-patterns-starter-kit/amazon-ecs-java-starter-kit-tasklauncher/

  4. Copy the jar file to the S3 bucket using the following command

    aws s3 cp ../amazon-ecs-java-starter-kit-tasklauncher/target/amazon-ecs-java-starter-kit-tasklauncher-1.0.jar s3://${AWS_ACCOUNT_ID}-amazon-ecs-java-starter-kit-pattern-1-bucket/amazon_ecs_java_starter_kit_jar/
  5. Go to path /<Path_to_your_cloned_rep>/amazon-ecs-and-aws-step-functions-design-patterns-starter-kit/amazon-ecs-java-starter-kit-cdk

  6. Start the Step Functions execution using the following command

    aws stepfunctions start-execution --state-machine-arn "arn:aws:states:${AWS_REGION}:${AWS_ACCOUNT_ID}:stateMachine:amazon-ecs-java-starter-kit-pattern-1" --input "$(cat workflow_specs_pattern_1.json)"
  7. Expected output 1: You will get a response as follows

    {
     "executionArn": "arn:aws:states:us-east-2:1234567890:execution:amazon-ecs-java-starter-kit-pattern-1:4ea1f256-a0bb-4692-b63a-6b80edc02cb7",
     "startDate": "2020-12-21T09:54:29.385000-06:00"
    }
  8. Expected output 2: In the Step Functions console, the execution of state machine amazon-ecs-java-starter-kit-pattern-1 changes to the Running state

  9. Expected output 3: After a minute or two, the state machine execution completes successfully, as shown below.

    (Screenshot: successful state machine execution)

  10. Expected output 4: In the S3 bucket, you will see 10 additional jar files copied by the ECS task instances.

    (Screenshot: jar files copied to the S3 bucket)

  11. Expected output 5: In the DynamoDB tables, you will find new items as follows:

    Table | Number of items | Sample column values
    workflow_summary_pattern_1 | 1 | number_of_tasks = 10, completed_tasks = 10
    workflow_details_pattern_1 | 10 | status = Completed
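You can verify these counts from the command line as well. The following commands are a sketch, assuming the AWS CLI is configured for the same account and region; the execution ARN is the one returned in step 7:

    aws dynamodb scan --table-name workflow_summary_pattern_1 --select COUNT
    aws dynamodb scan --table-name workflow_details_pattern_1 --select COUNT
    aws stepfunctions describe-execution --execution-arn <executionArn_from_step_7>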

Testing ECS tasks using Step Functions native integration

  1. Open the file workflow_specs_pattern_2.json in your IDE and update the JSON attributes based on the values for amazon-ecs-java-starter-pattern-2 in outputs.json.

  2. Open your terminal

  3. Go to path /<Path_to_your_cloned_rep>/amazon-ecs-and-aws-step-functions-design-patterns-starter-kit/amazon-ecs-java-starter-kit-tasklauncher/

  4. Copy the jar file to the S3 bucket using the following command

    aws s3 cp ../amazon-ecs-java-starter-kit-tasklauncher/target/amazon-ecs-java-starter-kit-tasklauncher-1.0.jar s3://${AWS_ACCOUNT_ID}-amazon-ecs-java-starter-kit-pattern-2-bucket/amazon_ecs_java_starter_kit_jar/
  5. Go to path /<Path_to_your_cloned_rep>/amazon-ecs-and-aws-step-functions-design-patterns-starter-kit/amazon-ecs-java-starter-kit-cdk

  6. Start the Step Functions execution using the following command

    aws stepfunctions start-execution --state-machine-arn "arn:aws:states:${AWS_REGION}:${AWS_ACCOUNT_ID}:stateMachine:amazon-ecs-java-starter-kit-pattern-2" --input "$(cat workflow_specs_pattern_2.json)"
  7. Expected outputs are similar to those of Pattern 1


Cleanup

  1. Go to /<Path_to_your_cloned_rep>/amazon-ecs-and-aws-step-functions-design-patterns-starter-kit/amazon-ecs-java-starter-kit-cdk

  2. Delete the items in the DynamoDB tables

    ./delete_ddb_items.sh workflow_details_pattern_1 workflow_summary_pattern_1
    ./delete_ddb_items.sh workflow_details_pattern_2 workflow_summary_pattern_2
  3. Empty the S3 buckets

    aws s3 ls s3://${AWS_ACCOUNT_ID}-amazon-ecs-java-starter-kit-pattern-1-bucket/amazon_ecs_java_starter_kit_jar/ | grep _ | awk '{print $NF}' | while read OBJ; do aws s3 rm s3://${AWS_ACCOUNT_ID}-amazon-ecs-java-starter-kit-pattern-1-bucket/amazon_ecs_java_starter_kit_jar/$OBJ; done
    aws s3 ls s3://${AWS_ACCOUNT_ID}-amazon-ecs-java-starter-kit-pattern-2-bucket/amazon_ecs_java_starter_kit_jar/ | grep _ | awk '{print $NF}' | while read OBJ; do aws s3 rm s3://${AWS_ACCOUNT_ID}-amazon-ecs-java-starter-kit-pattern-2-bucket/amazon_ecs_java_starter_kit_jar/$OBJ; done
  4. Delete the remaining jar files from the S3 buckets

    aws s3 rm s3://${AWS_ACCOUNT_ID}-amazon-ecs-java-starter-kit-pattern-1-bucket/amazon_ecs_java_starter_kit_jar/amazon-ecs-java-starter-kit-tasklauncher-1.0.jar
    aws s3 rm s3://${AWS_ACCOUNT_ID}-amazon-ecs-java-starter-kit-pattern-2-bucket/amazon_ecs_java_starter_kit_jar/amazon-ecs-java-starter-kit-tasklauncher-1.0.jar
  5. Delete ECR Repositories

    aws ecr delete-repository --force --repository-name amazon-ecs-java-starter-kit-pattern-1
    aws ecr delete-repository --force --repository-name amazon-ecs-java-starter-kit-pattern-2
  6. Cleanup stacks

    cdk destroy --force --all
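Optionally, confirm from the CLI that the stacks were deleted (a sketch; the query simply lists recently deleted stack names):

    aws cloudformation list-stacks --stack-status-filter DELETE_COMPLETE --query "StackSummaries[].StackName"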

Contributors

  1. Sarma Palli, Senior DevOps Cloud Architect, Amazon Web Services
  2. Ravi Itha, Senior Big Data Consultant, Amazon Web Services

License Summary

This sample code is made available under the MIT-0 license. See the LICENSE file.
