Using AWS Batch jobs to bulk copy/sync files in S3

Overview

This guide details how to use AWS Batch to perform bulk copy/sync activities on files in S3. Batch allows users to run massively scalable computing jobs on AWS by provisioning the optimal compute resources needed. The user is able to focus on configuring what the job should be doing instead of provisioning infrastructure.

Here we use Batch's managed compute environment with the Fargate provisioning model, where Batch runs containers without the user having to manage the underlying EC2 instances. We will package our application in a container image pushed to ECR, which Fargate uses to launch the environment where the job runs.

With Batch configured, the user uploads a .csv file to an S3 location whose data events are being logged in CloudTrail. EventBridge monitors these events and kicks off the Batch job once the appropriate file is uploaded. The .csv file contains a list of S3 source/destination pairs to be copied/synced in the job, as detailed below. For accessing S3 resources in different AWS accounts, be sure to look at the IAM Roles section below.

Architecture

This is an overview of the architecture described above:

[Image: awsBatchS3SyncArch]

ECR Image

The ECR image contains our application logic to sync/copy S3 files based on the CSV input. This is done in the Python script s3CopySyncScript.py. Based on the CSV input, it performs a managed transfer using the copy API if a file is given as a source/destination. If a prefix is given as source/destination, it uses the AWS CLI to perform an aws s3 sync.
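
To make that behavior concrete, here is a minimal sketch of the decision logic, assuming a comma-delimited CSV and the boto3/AWS CLI approach described above. It is not the actual s3CopySyncScript.py; the file name and helper names are illustrative only.

import csv
import subprocess
from urllib.parse import urlparse

import boto3

s3 = boto3.resource("s3")


def split_s3_uri(uri):
    """s3://bucket/key -> (bucket, key)."""
    parsed = urlparse(uri)
    return parsed.netloc, parsed.path.lstrip("/")


def process_row(source, destination, sync_delete=False):
    if source.endswith("/"):
        # Prefix pair: delegate to the AWS CLI for an `aws s3 sync`
        cmd = ["aws", "s3", "sync", source, destination]
        if sync_delete:
            cmd.append("--delete")
        subprocess.run(cmd, check=True)
    else:
        # Object pair: managed transfer using the copy API (handles multipart)
        src_bucket, src_key = split_s3_uri(source)
        dst_bucket, dst_key = split_s3_uri(destination)
        s3.meta.client.copy(
            {"Bucket": src_bucket, "Key": src_key}, dst_bucket, dst_key
        )


# Illustrative local file name; the real script receives the bucket/key as arguments.
with open("input.csv", newline="") as f:
    reader = csv.reader(f)
    next(reader)  # skip the header row when the input has one
    for src, dst in reader:
        process_row(src, dst, sync_delete=True)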

The Dockerfile builds an image based on AL2, installing the AWS CLI, Python 3, and boto3, and setting other S3 configuration for optimal transfers.

Create an ECR repository and use these commands to build and push the image as latest using the CLI.

aws ecr get-login-password --region us-east-1 | docker login --username AWS --password-stdin <<aws_account_id>>.dkr.ecr.us-east-1.amazonaws.com

docker build -t <<repository_name>> .

docker tag <<repository_name>>:latest <<aws_account_id>>.dkr.ecr.us-east-1.amazonaws.com/<<repository_name>>:latest

docker push <<aws_account_id>>.dkr.ecr.us-east-1.amazonaws.com/<<repository_name>>:latest

S3 Input Location

The .csv file will be uploaded to S3 to kick off the job. Designate an area for this. In this example, I've created a sample bucket with the prefix input and will be uploading files there. Notice also that the uploaded CSV is named with the prefix s3_batch_sync_input-*.csv. Using a naming convention like this can simplify EventBridge matching, as we'll see below.

[Image: s3InputSample]
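
If you prefer to drop the file in place from code rather than the console, a minimal boto3 upload might look like the following sketch; the local file name is a placeholder following the naming convention above.

import boto3

s3 = boto3.client("s3")

# Placeholder file name under the input/s3_batch_sync_input-*.csv convention
s3.upload_file(
    Filename="s3_batch_sync_input-my-sample.csv",
    Bucket="s3-batch-sync-article",
    Key="input/s3_batch_sync_input-my-sample.csv",
)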

CSV File Format Details

The CSV file should contain a source column and a destination column, formatted as such:

source                                            destination
s3://some-bucket/source/                          s3://some-bucket/dest/
s3://some-bucket/sourcetwo/                       s3://some-bucket/desttwo/
s3://some-bucket/sourceindiv/individualFile.txt   s3://some-bucket/dest/individualFile.txt

The first 2 rows in this example are prefixes where we want an s3 sync to occur. The last row is a specific object we want to copy directly. The bucket/AWS account does not need to be the same, as long as IAM permissions are properly applied as noted below.

Cloudtrail Monitoring

Set up a trail that will monitor the S3 location where the input file will land. In setting up the trail, you can set these options as you see fit: name, an S3 location for the logs, log encryption using KMS, and log file validation. At a minimum, make sure that S3 data events are enabled for the location where the input file lands:

[Image: cloudTrailS3DataEvents]
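
The same data event selector can also be applied to an existing trail programmatically. Here is a sketch assuming a trail named s3-batch-sync-trail (a placeholder) and the example bucket/prefix used in this guide.

import boto3

cloudtrail = boto3.client("cloudtrail")

# Assumes a trail named "s3-batch-sync-trail" already exists; scope data
# events to the input prefix so only the relevant uploads are logged.
cloudtrail.put_event_selectors(
    TrailName="s3-batch-sync-trail",
    EventSelectors=[
        {
            "ReadWriteType": "WriteOnly",
            "IncludeManagementEvents": False,
            "DataResources": [
                {
                    "Type": "AWS::S3::Object",
                    "Values": ["arn:aws:s3:::s3-batch-sync-article/input/"],
                }
            ],
        }
    ],
)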

AWS Batch

Compute Environment

This will serve as a pool that our Batch jobs can pull resources from. Create a managed environment with the provisioning model set to Fargate for this example. Set the maximum vCPUs to cap the Fargate resources that can be used concurrently. Other configuration options for an AWS Batch compute environment are detailed here. Lastly, pick the VPC and subnets your environment will be located in, and any security groups that need to be attached to instances; if you're using S3 VPC gateway endpoints, this choice is key. In our example, we're using the default VPC since we're accessing S3 through the public internet. Once complete, the environment state should be ENABLED.

[Image: batchComputeEnvironment]
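
The equivalent compute environment could also be created with boto3. A sketch, assuming placeholder names, subnets, and security group IDs:

import boto3

batch = boto3.client("batch")

# Placeholder name and VPC details; substitute your own values.
batch.create_compute_environment(
    computeEnvironmentName="s3-sync-fargate-ce",
    type="MANAGED",
    state="ENABLED",
    computeResources={
        "type": "FARGATE",
        "maxvCpus": 16,  # upper limit on concurrent Fargate resources
        "subnets": ["subnet-0123456789abcdef0"],
        "securityGroupIds": ["sg-0123456789abcdef0"],
    },
)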

Job Queue

AWS Batch jobs are submitted to a job queue, where they wait until compute environment resources are available for the job to run. You can have multiple queues with different priorities pulling from different compute environments; more details are here. For this example, create a queue and attach it to the compute environment made previously.

[Image: batchJobQueue]
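
Creating the queue with boto3 might look like the following sketch, reusing the placeholder compute environment name from above:

import boto3

batch = boto3.client("batch")

batch.create_job_queue(
    jobQueueName="s3-sync-queue",
    state="ENABLED",
    priority=1,
    computeEnvironmentOrder=[
        {"order": 1, "computeEnvironment": "s3-sync-fargate-ce"}  # placeholder name
    ],
)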

Job Definition

The Job Definition acts as a template from which to launch our individual jobs. Detailed instructions are here for additional configuration.

  • Basics

    • Enter a name for the template and pick fargate for the platform.
    • Depending on your anticipated use, set a retry strategy and timeout. For this example we set 2 job attempts and a timeout of 120 seconds.
    • We'll also put job logs in the default AWS Batch logs group in CloudWatch, but this can be customized as detailed here.
  • Python Script Usage

    • Note the usage of the script described here, then set the container properties in the next step as required.
    • Syntax: python s3CopySyncScript.py <s3_bucket> <s3_key> <header> <sync_delete>
      • header indicates whether the input csv has a header row
      • sync_delete indicates whether the --delete flag is used in case of an aws s3 sync
      • E.g.: python s3CopySyncScript.py my-s3-bucket s3_batch_sync_input-my-sample.csv True True
  • Container Properties

    • In the image box, put the URI of the ECR image that was created.
    • The Command is used as the CMD instruction to execute our container. In our case, we want to execute the python script and pass it our input file details.
      • In JSON form we enter: ["python3","s3CopySyncScript.py","Ref::s3_bucket","Ref::s3_key", "True", "True"]
        • In this example, I have a header in the input and am using the --delete flag for an aws s3 sync
    • For vCPUs and memory, we set 1 vCPU and 2 GB to be conservative for this example; set these as needed.
    • Job Role and Execution Role are detailed below.
    • We ticked the box to assign a public IP since we're accessing S3 through the public internet, and we're using Fargate platform version 1.4.0.
  • Parameters

    • In the python command above, notice the "Ref::s3_bucket" and "Ref::s3_key". These are parameters to be substituted when a job is invoked through EventBridge.
    • In this section, we could set defaults for them or other parameters. See more details here. A scripted sketch of the complete job definition follows below.

[Image: batchJobDef]
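
Pulling the settings above together, registering the job definition with boto3 might look like this sketch; the image URI and role ARNs are placeholders.

import boto3

batch = boto3.client("batch")

batch.register_job_definition(
    jobDefinitionName="s3-copy-sync-job-def",
    type="container",
    platformCapabilities=["FARGATE"],
    retryStrategy={"attempts": 2},
    timeout={"attemptDurationSeconds": 120},
    # Optional defaults; overridden by the EventBridge rule at invocation time.
    parameters={"s3_bucket": "s3-batch-sync-article", "s3_key": ""},
    containerProperties={
        "image": "111122223333.dkr.ecr.us-east-1.amazonaws.com/s3-copy-sync:latest",  # placeholder
        "command": [
            "python3", "s3CopySyncScript.py",
            "Ref::s3_bucket", "Ref::s3_key", "True", "True",
        ],
        "resourceRequirements": [
            {"type": "VCPU", "value": "1"},
            {"type": "MEMORY", "value": "2048"},
        ],
        "jobRoleArn": "arn:aws:iam::111122223333:role/s3SyncJobRole",            # placeholder
        "executionRoleArn": "arn:aws:iam::111122223333:role/s3SyncExecutionRole", # placeholder
        "networkConfiguration": {"assignPublicIp": "ENABLED"},
        "fargatePlatformConfiguration": {"platformVersion": "1.4.0"},
    },
)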

IAM Roles

Execution Role
The Execution Role is used to set up the individual ECS tasks where the Batch jobs run, and for logging. The role should have a trust relationship with ecs-tasks.amazonaws.com. In our example, the AWS managed policy AmazonECSTaskExecutionRolePolicy is attached, along with an inline policy giving it permission to create log groups if needed.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "logs:CreateLogGroup",
                "logs:CreateLogStream",
                "logs:PutLogEvents",
                "logs:DescribeLogStreams"
            ],
            "Resource": [
                "arn:aws:logs:*:*:*"
            ]
        }
    ]
}

More details about ECS task execution roles are here.

Job Role
The Job Role is an IAM role used to provide AWS API access to individual running jobs. Here we configure access to the AWS resources the job touches, in our case the files in S3. In this example we're only accessing resources in one bucket, but be sure to configure this as needed depending on your sources/destinations. Again, the role should have a trust relationship with ecs-tasks.amazonaws.com.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "VisualEditor0",
            "Effect": "Allow",
            "Action": [
                "s3:PutObject",
                "s3:GetObject",
                "s3:ListBucketMultipartUploads",
                "s3:ListBucket",
                "s3:ListMultipartUploadParts"
            ],
            "Resource": [
                "arn:aws:s3:::s3-batch-sync-article/*",
                "arn:aws:s3:::s3-batch-sync-article"
            ]
        }
    ]
}

If you're accessing S3 objects from, or syncing to, destinations in multiple accounts, cross-account S3 resource access would need to be configured as detailed here. The account where the Batch jobs run can be considered Account A, where a policy providing access to resources in AWS Account B's buckets is attached to the IAM role. In Account B, the bucket policy would be modified to allow access from Account A's IAM role. More details about these task IAM roles can be found here.

EventBridge Rule & Job Invocation

Create an EventBridge rule that will invoke the AWS Batch job.

Here, the S3 uploads are being logged in CloudTrail. An EventBridge rule will invoke the job when an appropriate upload occurs. Using the naming convention mentioned above, we can use a custom event pattern match with content filtering to only trigger on certain uploads:

{
  "source": ["aws.s3"],
  "detail-type": ["AWS API Call via CloudTrail"],
  "detail": {
    "eventSource": ["s3.amazonaws.com"],
    "eventName": ["PutObject", "CompleteMultipartUpload"],
    "requestParameters": {
      "bucketName": ["s3-batch-sync-article"],
      "key": [{
        "prefix": "input/s3_batch_sync_input-"
      }]
    }
  }
}

Here, we'll trigger the target for this rule when a file lands in the appropriate location with the required prefix.

AWS Batch Target

  • Set the target for this rule to a Batch job queue.
  • Give it the job queue and job definition set above. Provide a name for the jobs that will run.
  • Use configure input to pass details about the input file to the job. In our job, the bucket and key are required as arguments to the Python script, which we supply as job parameters.
    • Use the first input path box to extract the bucket and key from the event that triggered the EventBridge rule:
      • {"S3BucketValue":"$.detail.requestParameters.bucketName","S3KeyValue":"$.detail.requestParameters.key"}
    • The input template box lets you pass parameters or other arguments to the job that is to be invoked. Here we pass the s3_bucket and s3_key job parameters (a scripted sketch of this target configuration follows this list).
      • {"Parameters" : {"s3_bucket": <S3BucketValue>, "s3_key": <S3KeyValue>}}
    • See more details about AWS Batch Jobs as CloudWatch Events Targets here
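
For reference, the same target configuration expressed with boto3 might look like the following sketch; the rule name, ARNs, and role are placeholders.

import boto3

events = boto3.client("events")

# The input transformer maps the bucket/key from the CloudTrail event into
# the job's s3_bucket/s3_key parameters, mirroring the console steps above.
events.put_targets(
    Rule="s3-batch-sync-input-rule",  # placeholder rule name
    Targets=[
        {
            "Id": "s3-copy-sync-batch-job",
            "Arn": "arn:aws:batch:us-east-1:111122223333:job-queue/s3-sync-queue",      # placeholder
            "RoleArn": "arn:aws:iam::111122223333:role/eventbridge-batch-submit-role",  # placeholder
            "BatchParameters": {
                "JobDefinition": "s3-copy-sync-job-def",
                "JobName": "s3-copy-sync-job",
            },
            "InputTransformer": {
                "InputPathsMap": {
                    "S3BucketValue": "$.detail.requestParameters.bucketName",
                    "S3KeyValue": "$.detail.requestParameters.key",
                },
                "InputTemplate": '{"Parameters" : {"s3_bucket": <S3BucketValue>, "s3_key": <S3KeyValue>}}',
            },
        }
    ],
)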
Owner
AWS Samples