AWS, Cloud Computing

4 Mins Read

Serverless Migration of Data Using DynamoDB Import from S3 – Part 1

Overview

Before DynamoDB import from S3, bulk-importing data into a DynamoDB table typically required building a custom data pipeline. A data loader often had to be created and maintained at extra cost, and loading terabytes of data across a fleet of virtual instances could take days or weeks before the solution was fully deployed.

Introduction to DynamoDB import from S3

DynamoDB import from S3 is fully serverless and enables you to bulk import terabytes of data from Amazon S3 into a new DynamoDB table. The source data can be either a single Amazon S3 object or multiple Amazon S3 objects that share the same prefix. Every record in S3 should have a partition key, and optionally a sort key, that matches the schema of the target table. The ability to import application data staged in CSV, DynamoDB JSON, or ION format speeds up the migration of legacy applications to the AWS cloud. You can start imports using the AWS CLI, the AWS Management Console, or an AWS SDK.

You do not need to provision additional capacity when defining the new table because DynamoDB import from S3 does not consume any write capacity. To import data across AWS accounts, you must confirm that the principal requesting the import has permission to list and get objects from the source S3 bucket, and the requester must also be granted access by the S3 bucket's policies.
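As noted above, an import can be started from an AWS SDK as well as from the console. The sketch below builds the request parameters for boto3's `import_table` call; the bucket name, table name, and `employee_id` key attribute are hypothetical placeholders, and the actual API call is left commented out because it requires AWS credentials.

```python
# Sketch of starting a DynamoDB import from S3 with boto3.
# Bucket, table, and attribute names below are hypothetical.

def build_import_params(bucket, key_prefix, table_name):
    """Assemble the request parameters for DynamoDB's ImportTable API."""
    return {
        "S3BucketSource": {"S3Bucket": bucket, "S3KeyPrefix": key_prefix},
        "InputFormat": "CSV",            # or DYNAMODB_JSON / ION
        "InputCompressionType": "NONE",  # or GZIP / ZSTD
        "TableCreationParameters": {
            "TableName": table_name,
            # The partition key (required) and sort key (optional)
            # must match the records in S3.
            "AttributeDefinitions": [
                {"AttributeName": "employee_id", "AttributeType": "S"},
            ],
            "KeySchema": [
                {"AttributeName": "employee_id", "KeyType": "HASH"},
            ],
            "BillingMode": "PAY_PER_REQUEST",
        },
    }

params = build_import_params("my-import-bucket", "employees/", "Employees")
print(params["InputFormat"])

# To actually start the import (requires AWS credentials):
# import boto3
# dynamodb = boto3.client("dynamodb")
# response = dynamodb.import_table(**params)
# print(response["ImportTableDescription"]["ImportArn"])
```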

Benefits

  • Move data easily with a few clicks in the AWS Management Console
  • Supports cross-account and cross-Region imports
  • Simple and easy to use

Steps to import data from S3 to DynamoDB

Creating an S3 bucket

  1. Log in to the AWS Management Console and search for S3.
  2. Click on Create bucket. Provide a unique bucket name and select the Region.
  3. Upload .csv files to the bucket.
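The CSV uploaded in step 3 must contain a column for the target table's partition key (and the sort key, if one is used). A minimal example file, assuming a hypothetical `employee_id` partition key, could be generated like this before uploading:

```python
import csv
import io

# Build a minimal CSV whose first row is the header. The partition key
# column (employee_id, a hypothetical name) must appear in every record.
rows = [
    {"employee_id": "E001", "name": "Alice", "department": "Engineering"},
    {"employee_id": "E002", "name": "Bob", "department": "Finance"},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["employee_id", "name", "department"])
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()
print(csv_text)
```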


Note: Only the CSV, DynamoDB JSON, or ION formats are supported for importing data into DynamoDB.

Importing S3 data to DynamoDB

  1. In the search bar, search for DynamoDB and select the service. Choose Imports from S3 in the navigation pane.
  2. Click on Import from S3.


3. Provide the appropriate details as below:

  • Select the S3 bucket created in Step 1
  • Select the AWS account where your source S3 bucket is located
  • Choose the compression type that matches your source S3 data
  • Select the appropriate file format
  • Choose the CSV delimiter character that matches the data in the source file


Click on Next to navigate to the next page.

  • Provide the name of the table where you want to store the data
  • Provide a partition key that matches the data
  • Provide a sort key if one is required
  • Keep the default table settings, or select Customize settings to see additional options


Click on Next to navigate to the next page.

  • Review the options carefully before importing the data. Once the import starts, they cannot be changed.
  • Click on Import.


4. Check the status of your import on the Imports from S3 page. This page shows all import jobs from the last 90 days.
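The same status check can be done from the SDK with DynamoDB's `DescribeImport` API. The sketch below uses a trimmed-down sample of the response shape; the real call is commented out since it needs credentials and the import's ARN (elided here).

```python
def import_status(describe_response):
    """Pull the status and processed-item count out of a DescribeImport response."""
    desc = describe_response["ImportTableDescription"]
    return desc["ImportStatus"], desc.get("ProcessedItemCount", 0)

# Abridged example of what DescribeImport returns:
sample_response = {
    "ImportTableDescription": {
        "ImportStatus": "COMPLETED",
        "ProcessedItemCount": 2,
    }
}
status, count = import_status(sample_response)
print(status, count)

# Against a real import (requires AWS credentials):
# import boto3
# dynamodb = boto3.client("dynamodb")
# response = dynamodb.describe_import(ImportArn="arn:aws:dynamodb:...")
# print(import_status(response))
```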


5. To check the results of the import, navigate to Tables in the navigation pane.


Troubleshooting the errors

You may come across common errors such as syntax errors, formatting issues, and records missing the required primary key. Error details are recorded in CloudWatch Logs for later examination. Logging stops once 10,000 errors have been recorded, but the import itself continues.

  1. Go to Log groups in the navigation pane of CloudWatch.
  2. Look for the log group named /aws-dynamodb/imports. Its log streams indicate whether the import succeeded or failed, along with metadata.
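The same log streams can be read programmatically with the CloudWatch Logs API. Below is a sketch, with a small helper tested against an abridged `GetLogEvents`-style response; the real calls are commented out because they require credentials.

```python
def collect_messages(events_response):
    """Extract just the message strings from a GetLogEvents-style response."""
    return [event["message"] for event in events_response.get("events", [])]

# Abridged shape of a GetLogEvents response:
sample = {"events": [{"timestamp": 0, "message": "Import completed"}]}
messages = collect_messages(sample)
print(messages)

# Reading the real streams under /aws-dynamodb/imports (requires credentials):
# import boto3
# logs = boto3.client("logs")
# streams = logs.describe_log_streams(logGroupName="/aws-dynamodb/imports")
# for stream in streams["logStreams"]:
#     events = logs.get_log_events(
#         logGroupName="/aws-dynamodb/imports",
#         logStreamName=stream["logStreamName"],
#     )
#     for message in collect_messages(events):
#         print(message)
```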


Conclusion

DynamoDB import from S3 provides an easy way to import a huge amount of data from S3 into a new DynamoDB table. It is integrated with CloudWatch, which creates a log entry for each error. Using DynamoDB import from S3 requires no additional services to migrate data into DynamoDB, which reduces maintenance costs and speeds up the process.

About CloudThat

CloudThat is an award-winning company and the first in India to offer cloud training and consulting services worldwide. As a Microsoft Solutions Partner, AWS Advanced Tier Training Partner, and Google Cloud Platform Partner, CloudThat has empowered over 850,000 professionals through 600+ cloud certifications, winning global recognition for its training excellence, including 20 MCT Trainers in Microsoft’s Global Top 100 and an impressive 12 awards in the last 8 years. CloudThat specializes in Cloud Migration, Data Platforms, DevOps, IoT, and cutting-edge technologies like Gen AI & AI/ML. It has delivered over 500 consulting projects for 250+ organizations in 30+ countries as it continues to empower professionals and enterprises to thrive in the digital-first world.

FAQs

1. How much does it cost to import data from S3?

ANS: – The cost of running an import is based on the uncompressed size of the source data in S3 multiplied by a per-GB rate, which is $0.15 per GB in the US East Region. Items that are processed but fail to load into the table due to formatting issues in the source data are also billed as part of the import process.
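The pricing rule above is simple enough to compute directly. This rough estimator uses the $0.15/GB US East rate quoted in the answer (rates in other Regions may differ, and AWS's exact GB accounting may vary slightly from the binary-GB conversion used here):

```python
# Rough import-cost estimate from the uncompressed source size.
# $0.15/GB is the US East rate quoted above; other Regions may differ.
PRICE_PER_GB_US_EAST = 0.15

def import_cost(uncompressed_bytes, price_per_gb=PRICE_PER_GB_US_EAST):
    gb = uncompressed_bytes / (1024 ** 3)
    return gb * price_per_gb

# Example: a 10 GiB uncompressed dataset costs 10 * $0.15 = $1.50.
cost = import_cost(10 * 1024 ** 3)
print(f"${cost:.2f}")
```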

2. What are the limitations of the Import from S3 feature?

ANS: – Data cannot be imported into an existing DynamoDB table; the import always creates a new table.

WRITTEN BY Anusha

Anusha works as a Research Associate at CloudThat. She is enthusiastic about learning new technologies, and her interests are inclined towards AWS and Data Science.

