Batch-Data-Import-Workflow-Using-AWS-Step-Functions-and-Amazon-Redshift

Objective: Import data from a CSV file stored in an S3 bucket into an Amazon Redshift table.

Prerequisites

  1. AWS Account: An AWS account with the necessary IAM permissions.
  2. AWS CLI: Installed and configured.
  3. Git: Installed.
  4. Terraform: Installed.
  5. Amazon Redshift Cluster: Set up with the target table (see the example after this list).
  6. Amazon S3 Bucket: To store the CSV files.
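
  For example, assuming a hypothetical three-column metrics table (adjust the table name and columns to match your CSV file), the target table can be created through the Redshift Data API with the AWS CLI; all UPPER-CASE values are placeholders:

    aws redshift-data execute-statement \
      --cluster-identifier YOUR-REDSHIFT-CLUSTER-ID \
      --database YOUR-REDSHIFT-DB \
      --db-user YOUR-DB-USER \
      --sql "CREATE TABLE YOUR-TABLE-NAME (metric_name VARCHAR(100), metric_value DOUBLE PRECISION, recorded_at TIMESTAMP);"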

Workflow Steps

  1. Setup Infrastructure

    • Clone Repository:
      git clone https://github.com/aws-samples/step-functions-workflows-collection
      cd step-functions-workflows-collection/distributed-map-csv-iterator-tf
    • Initialize Terraform:
      terraform init
    • Apply Terraform Configuration:
      terraform apply
      • Confirm with yes.
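    • List Terraform Outputs (optional): if the sample defines outputs such as the state machine ARN, they are printed after the apply and can be shown again at any time:
      terraform output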
  2. Upload CSV File to S3

    • Upload your CSV file to the designated S3 bucket.
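      • For example, with the AWS CLI (the local file name, bucket, and prefix are placeholders matching the example input used later):
        aws s3 cp metrics.csv s3://YOUR-BUCKET-NAME/PREFIX/metrics.csv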
  3. Modify State Machine

    • Update the state machine definition to include a step that loads data into Amazon Redshift. The LoadToRedshift task below calls the Redshift Data API ExecuteStatement action through the AWS SDK integration and authenticates with a database user (replace DbUser with SecretArn to authenticate with an AWS Secrets Manager secret instead). The Map state iterates over a Files array supplied in the execution input, so several CSV files can be loaded in one run.
    • Example State Machine Definition (replace the UPPER-CASE placeholders with your own values):
      {
        "Comment": "Batch import of CSV files from Amazon S3 into Amazon Redshift",
        "StartAt": "ProcessCSV",
        "States": {
          "ProcessCSV": {
            "Type": "Map",
            "ItemsPath": "$.Files",
            "ItemProcessor": {
              "ProcessorConfig": {
                "Mode": "DISTRIBUTED",
                "ExecutionType": "STANDARD"
              },
              "StartAt": "LoadToRedshift",
              "States": {
                "LoadToRedshift": {
                  "Type": "Task",
                  "Resource": "arn:aws:states:::aws-sdk:redshiftdata:executeStatement",
                  "Parameters": {
                    "ClusterIdentifier": "YOUR-REDSHIFT-CLUSTER-ID",
                    "Database": "YOUR-REDSHIFT-DB",
                    "DbUser": "YOUR-DB-USER",
                    "Sql.$": "States.Format('COPY YOUR-TABLE-NAME FROM \\'s3://{}/{}\\' IAM_ROLE \\'YOUR-REDSHIFT-IAM-ROLE-ARN\\' CSV;', $.BucketName, $.FileKey)"
                  },
                  "End": true
                }
              }
            },
            "End": true
          }
        }
      }
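    • Ensure the state machine's execution role is allowed to call redshift-data:ExecuteStatement (and redshift:GetClusterCredentials when authenticating with DbUser); otherwise the LoadToRedshift task will fail with an access-denied error.
    • Deploy the updated definition. Because this sample is managed by Terraform, the usual way is to edit the definition file in the cloned project and re-run terraform apply; alternatively, you can push a definition directly with the AWS CLI (the ARN and file name below are placeholders):
      aws stepfunctions update-state-machine \
        --state-machine-arn YOUR-STATE-MACHINE-ARN \
        --definition file://statemachine.asl.json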
  4. Trigger State Machine

    • Start Execution:
      • Use the following input (each entry in Files describes one CSV file to load):
        {
          "Files": [
            {
              "BucketName": "YOUR-BUCKET-NAME",
              "FileKey": "PREFIX/metrics.csv"
            }
          ]
        }
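      • For example, with the AWS CLI (the state machine ARN is a placeholder; use the ARN created by Terraform):
        aws stepfunctions start-execution \
          --state-machine-arn YOUR-STATE-MACHINE-ARN \
          --input '{"Files": [{"BucketName": "YOUR-BUCKET-NAME", "FileKey": "PREFIX/metrics.csv"}]}'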
  5. Verify Data in Redshift

    • Connect to your Redshift cluster and query the target table to verify that the data has been loaded (the ExecuteStatement call runs the COPY asynchronously, so allow the load a moment to finish).
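    • For example, run a quick row count through the Redshift Data API with the AWS CLI (same placeholders as above), then fetch the result using the Id returned by execute-statement:
      aws redshift-data execute-statement \
        --cluster-identifier YOUR-REDSHIFT-CLUSTER-ID \
        --database YOUR-REDSHIFT-DB \
        --db-user YOUR-DB-USER \
        --sql "SELECT COUNT(*) FROM YOUR-TABLE-NAME;"
      aws redshift-data get-statement-result --id YOUR-STATEMENT-ID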
  6. Cleanup Resources

    • Destroy Resources:
      terraform destroy
      • Confirm with yes.
