
13 Easy Steps for Syncing Data from On-Premises To AWS S3 Using DataSync


Introduction

Data synchronization is the process of keeping data consistent across two or more systems by automatically propagating changes between them.

While the massive amount of data stored in the cloud poses management challenges, it also makes the cloud the ideal platform for big data. Today's data solutions provide quick, straightforward tools that avoid repetitive manual work and keep data in sync throughout the system.


Agent creation

Use the CLI command below to acquire the most recent DataSync Amazon Machine Image (AMI) ID for the selected AWS Region.

  aws ssm get-parameter --name /aws/service/datasync/ami --region $region

Launch the agent from the Amazon EC2 launch wizard using your AMI, in the AWS account where the source file system is stored. To launch the AMI, go to the following URL: https://console.aws.amazon.com/ec2/v2/home?region=source-file-system-region#LaunchInstanceWizard:ami=ami-id

Note: Make sure the instance you launch has at least 16 GiB of RAM.

Here I am launching a t2.xlarge instance for my agent, with a public IP and the HTTP port open.
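If you prefer the CLI over the launch wizard, the same launch can be sketched roughly as follows. This is a sketch, not the post's exact commands; the key pair, security group, and subnet IDs are placeholder values you would replace with your own:

```shell
# Look up the latest DataSync AMI for the chosen Region (assumes $region is set)
DATASYNC_AMI=$(aws ssm get-parameter \
  --name /aws/service/datasync/ami \
  --region "$region" \
  --query 'Parameter.Value' --output text)

# Launch the agent instance; t2.xlarge satisfies the 16 GiB RAM requirement.
# my-key-pair, sg-..., and subnet-... are placeholders for your own resources.
aws ec2 run-instances \
  --image-id "$DATASYNC_AMI" \
  --instance-type t2.xlarge \
  --key-name my-key-pair \
  --security-group-ids sg-0123456789abcdef0 \
  --subnet-id subnet-0123456789abcdef0 \
  --region "$region"
```

The security group you attach here must allow inbound HTTP (port 80) so the agent can be activated in the next section.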

Once the agent instance is up and running, let us move on to DataSync.

Step by Step Guide for Configuring DataSync

  1. Open the DataSync service from the AWS Console, select the data transfer option, and click Get Started.
  2. Select AWS EC2 as the Hypervisor, select Public service endpoints as the Endpoint type, and provide the public IP of the agent in the agent address field.
  3. Once the activation key is retrieved successfully, give the agent a name and click the Create button. The agent will be created and its status will show as Online.
  4. Before creating a DataSync task, create an EC2 instance (t2.micro) that will act as the on-premises system.
  5. Open the RDP and NFS ports in the on-premises instance's security group.
  6. Now install an NFS server on the on-premises instance so that it can serve the local files to the cloud over NFS.
  7. Create a folder and a text file inside it. In my case, I created a folder named test containing a text file named sampletext.txt, which we want to sync. Go to the folder's properties, select the NFS Sharing tab, click Manage NFS sharing, and check Share this folder.
  8. Click Apply and then OK to save the sharing configuration for the folder.
  9. Now go to the DataSync console, create a task, and choose the configuration details as follows:
    a. Location: create a new location
    b. Location type: Network File System (NFS)
    c. Agent: select the agent created in the steps above
    d. NFS server: provide the IP address of the on-premises instance
    e. Mount path: /test in my case
    f. Destination location: Amazon S3; select the S3 bucket to which these files need to be synced
  10. Leave the remaining fields at their defaults and click Create.
  11. Select Autogenerate to create the IAM role in the provided fields.
  12. Once the previous step is completed, wait until your task's status becomes Available.
  13. Now start the task and wait for a while; this begins syncing your on-premises data to Amazon S3.
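The console steps above can also be sketched with the AWS CLI. This is a rough outline under assumptions, not the post's exact workflow: the activation key, IP address, bucket name, and IAM role ARN are placeholders you would substitute with your own values.

```shell
# Activate the agent (assumes $ACTIVATION_KEY was obtained via the agent's public IP)
AGENT_ARN=$(aws datasync create-agent \
  --agent-name my-datasync-agent \
  --activation-key "$ACTIVATION_KEY" \
  --query 'AgentArn' --output text)

# Source location: the NFS share exported by the on-premises instance
SRC_ARN=$(aws datasync create-location-nfs \
  --server-hostname "$ONPREM_IP" \
  --subdirectory /test \
  --on-prem-config AgentArns="$AGENT_ARN" \
  --query 'LocationArn' --output text)

# Destination location: the S3 bucket, with an IAM role DataSync can assume
DST_ARN=$(aws datasync create-location-s3 \
  --s3-bucket-arn arn:aws:s3:::my-datasync-bucket \
  --s3-config BucketAccessRoleArn="$BUCKET_ROLE_ARN" \
  --query 'LocationArn' --output text)

# Create the task and start a sync run
TASK_ARN=$(aws datasync create-task \
  --source-location-arn "$SRC_ARN" \
  --destination-location-arn "$DST_ARN" \
  --name onprem-to-s3 \
  --query 'TaskArn' --output text)

aws datasync start-task-execution --task-arn "$TASK_ARN"
```

Each command returns an ARN that feeds the next step, which mirrors the console flow: agent, then locations, then task, then execution.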

Conclusion

In this blog, we examined and highlighted the relevance of data synchronization. Synchronization works smoothly when only the changed data is transferred; accordingly, each synchronization run uses a marker to determine which data is most up to date.


About CloudThat

CloudThat is an award-winning company and the first in India to offer cloud training and consulting services worldwide. As a Microsoft Solutions Partner, AWS Advanced Tier Training Partner, and Google Cloud Platform Partner, CloudThat has empowered over 850,000 professionals through 600+ cloud certifications, winning global recognition for its training excellence, including 20 MCT Trainers in Microsoft's Global Top 100 and an impressive 12 awards in the last 8 years. CloudThat specializes in Cloud Migration, Data Platforms, DevOps, IoT, and cutting-edge technologies like Gen AI & AI/ML. It has delivered over 500 consulting projects for 250+ organizations in 30+ countries as it continues to empower professionals and enterprises to thrive in the digital-first world.

FAQs

1. Where can I move data to and from?

ANS: – DataSync supports the following storage location types: Network File System (NFS) shares, Server Message Block (SMB) shares, Hadoop Distributed File Systems (HDFS), self-managed object storage, Google Cloud Storage, Azure Files, AWS Snowcone, Amazon Simple Storage Service (Amazon S3), Amazon Elastic File System (Amazon EFS) file systems, Amazon FSx for Windows File Server file systems, Amazon FSx for Lustre file systems, and Amazon FSx for OpenZFS file systems.

2. Can I use AWS DataSync to copy data from other public clouds to AWS?

ANS: – Yes. Using AWS DataSync, you can copy data from Google Cloud Storage using the S3 API or Azure Files using the SMB protocol. Deploy the DataSync agent in your cloud environment or on Amazon EC2, create your source and destination locations, and then start your task to begin copying data. Learn more about using DataSync to copy data from Google Cloud Storage or Azure Files.
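As a hedged sketch of the Google Cloud Storage case: DataSync can treat a GCS bucket as a self-managed object storage location through its S3-compatible XML API, authenticated with HMAC interoperability keys. The bucket name and key variables below are placeholders, and $AGENT_ARN is assumed to reference an already-activated agent:

```shell
# Assumes the agent behind $AGENT_ARN can reach storage.googleapis.com, and that
# the HMAC key pair was created under the GCS "Interoperability" settings.
aws datasync create-location-object-storage \
  --server-hostname storage.googleapis.com \
  --bucket-name my-gcs-bucket \
  --agent-arns "$AGENT_ARN" \
  --access-key "$GCS_HMAC_ACCESS_KEY" \
  --secret-key "$GCS_HMAC_SECRET_KEY"
```

The returned location ARN can then be used as the source of a DataSync task whose destination is an S3 location, just as in the steps above.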

3. Does AWS DataSync preserve the directory structure when copying files?

ANS: – Yes. When transferring files, AWS DataSync creates the same directory structure as the source location’s structure on the destination.

WRITTEN BY Shaik Munwar Basha

