Amazon S3 (Simple Storage Service) is a serverless object storage service. S3 stores huge amounts of data for use cases such as archiving data for compliance purposes, building data lakes, and storing media files. You can upload any kind of file to S3, such as audio, video, and documents. Every file is uploaded as an object in an S3 bucket, and an object consists of the file data plus its metadata. S3 can be accessed from the Management Console and programmatically through the AWS SDKs and the CLI. Although you can store an unlimited number of objects in an S3 bucket, there are a few limits on individual objects and uploads:
- The maximum size of a single object in S3 is 5 TB.
- The maximum file size you can upload through the Management Console is 160 GB. To upload a file larger than 160 GB, you need to upload it programmatically through the CLI, the SDKs, or the REST API.
With a single PUT operation, you can upload an object of up to 5 GB through the SDKs, the CLI, or the REST API.
If you want to upload an object up to 5 TB in size, the solution is multipart upload. Multipart upload improves upload performance because the parts are uploaded in parallel and in any order. If any part fails to upload, you can retransmit that part independently of the others. Multipart upload is recommended once your object is larger than 100 MB, and you can use it for objects from 5 MB up to 5 TB in size. Multipart upload has the following advantages:
- Improved upload throughput.
- Fast recovery from network issues.
- The ability to pause and resume an upload at any time.
Now let's see how to perform a multipart upload using the CLI.
A Step-by-Step Guide to Performing a Multipart Upload
Step 1: Install the AWS CLI. After installation, verify it by running aws --version; the output looks similar to the following.
aws-cli/2.11.1 Python/3.11.2 Windows/10 exe/AMD64 prompt/off
Step 2: Configure the IAM credentials of the user who will perform the multipart upload.
In the IAM console, create access keys for the user, then run aws configure and fill in the following prompts:
AWS Access Key ID [None]:
AWS Secret Access Key [None]:
Default region name [us-east-1]: us-east-1
Default output format [json]:
For the region, specify the region in which your S3 bucket was created.
To perform a multipart upload, you must first split the object into multiple parts. On Linux, you can use the split command to split the file.
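As a minimal sketch of this step, the snippet below splits a file into 5 MB parts. The dummy file created with dd stands in for your real object, and the file names are illustrative:

```shell
# Create a ~12 MiB dummy file standing in for your real object (demo.mp4)
dd if=/dev/zero of=demo.mp4 bs=1M count=12

# Split it into 5 MiB parts: demo.part-aa, demo.part-ab, demo.part-ac
split -b 5M demo.mp4 demo.part-

# 12 MiB split into 5 MiB parts gives 3 parts (5 + 5 + 2 MiB)
ls -lh demo.part-*
```

Each part except the last must be at least 5 MB, so choose the part size accordingly for your file.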
Now initiate the multipart upload. To do that, use the low-level s3api create-multipart-upload command.
aws s3api create-multipart-upload --bucket <bucket-name> --key <key>
aws s3api create-multipart-upload --bucket demoica --key 'demo.mp4'
Enter the name of the bucket you want to upload the object to; whatever key you specify here becomes the object's name in the bucket.
The Output looks as shown below.
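A representative response is shown here; the values are illustrative, and your bucket, key, and especially the UploadId will differ:

```json
{
    "Bucket": "demoica",
    "Key": "demo.mp4",
    "UploadId": "EXAMPLEJZ6e0YupT2h66iePQCc9IEbYbDUy4RTpMeoSMLPRp8Z5o1u8feSRonpvnWsKKG35tI2LB9VDPiCgTy"
}
```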
Note: You get the upload ID once you initiate the multipart upload. Save that upload ID, as it is required in all the further steps.
In this step, upload the parts of the multipart upload. To upload a part, specify a part number, which can be any number between 1 and 10,000. Use the upload-part low-level s3api command.
aws s3api upload-part --bucket <bucket-name> --key <key> --part-number <number> --body <part-file> --upload-id <upload-id>
aws s3api upload-part --bucket demoica --key 'demo.mp4' --part-number 1 --body demo.zip.001 --upload-id <upload-id>
Specify the bucket name, key, and part number; the body should contain the local part file to be uploaded; and enter the same upload ID generated during initiation. The minimum part size is 5 MB (except for the last part). The output of this operation looks as shown below:
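The response is small; it looks something like this (the hash value is illustrative):

```json
{
    "ETag": "\"d8c2eafd90c266e19ab9dcacc479f8af\""
}
```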
This command returns the entity tag (ETag), which is a hash of the uploaded part. Save the ETag in a text file, as you will need it in the final step.
Follow the same process for all the remaining parts of the object: change the part number and specify the corresponding local part file in the body. Save the ETag value of every part.
aws s3api upload-part --bucket demoica --key 'demo.mp4' --part-number 2 --body demo.zip.002 --upload-id <upload-id>
aws s3api upload-part --bucket demoica --key 'demo.mp4' --part-number 3 --body demo.zip.003 --upload-id <upload-id>
Now list all the uploaded parts using the list-parts command.
aws s3api list-parts --bucket demoica --key 'demo.mp4' --upload-id <upload-id>
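An abridged sample of the list-parts output is shown below; the real response also includes details such as each part's last-modified time and the upload's initiator and owner, and the ETag hashes and sizes here are illustrative:

```json
{
    "Parts": [
        { "PartNumber": 1, "ETag": "\"d8c2eafd90c266e19ab9dcacc479f8af\"", "Size": 5242880 },
        { "PartNumber": 2, "ETag": "\"5b0b2ac9a8c35cb653e83ec6a34c0b2f\"", "Size": 5242880 },
        { "PartNumber": 3, "ETag": "\"3f2ab6b3ba21f7bfe0a3ab16e1e0c0de\"", "Size": 2097152 }
    ],
    "StorageClass": "STANDARD"
}
```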
In this step, create a JSON file that contains the part number and ETag value of every part, as shown below. Save the file on your local system.
"ETag": "Enter your Etag Value for part 1",
"ETag": " Enter your Etag Value for part 4",
Now use the complete-multipart-upload command to complete the multipart upload. Point --multipart-upload at your JSON file, set the bucket name and key, and specify the upload ID, as shown below.
aws s3api complete-multipart-upload --multipart-upload file://fileparts.json --bucket demoica --key 'demo.mp4' --upload-id RqoiA52tcQfwbcOokKE33UH4JP42tlqGiSLsRcvDxfRuyfVsLcDuGwZ_UQYixURQMLMSCDBf1gnwfr0sse_mH8Jm1RuAxW8QRAp4NAs6QhLUQQ3UvKoiRdSbsgKN1lg1
Your multipart upload is now complete. Go to the console and check the bucket; you will see the object there.
S3 is a serverless object storage service widely used for data storage, and multipart upload is the recommended way to upload larger objects. Once you initiate a multipart upload, Amazon S3 keeps all the parts of the object until you either complete or abort the upload. You are billed throughout that time for the storage, network bandwidth, and requests associated with the multipart upload and its parts. We therefore recommend configuring a lifecycle rule so that incomplete multipart uploads are deleted after a specified number of days; this is what the AbortIncompleteMultipartUpload action is for.
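As a sketch, a lifecycle rule that aborts incomplete multipart uploads after 7 days might look like this; the rule ID, the 7-day window, and the bucket name are assumptions, so adjust them for your setup:

```shell
# Lifecycle configuration: abort incomplete multipart uploads after 7 days
# (rule ID and window are example values)
cat > lifecycle.json <<'EOF'
{
    "Rules": [
        {
            "ID": "abort-incomplete-multipart-uploads",
            "Status": "Enabled",
            "Filter": {},
            "AbortIncompleteMultipartUpload": { "DaysAfterInitiation": 7 }
        }
    ]
}
EOF

# Sanity-check the JSON
python3 -m json.tool lifecycle.json > /dev/null && echo "lifecycle.json is valid JSON"

# Apply it to the bucket (requires credentials; bucket name is an example):
# aws s3api put-bucket-lifecycle-configuration --bucket demoica \
#   --lifecycle-configuration file://lifecycle.json
```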
CloudThat is an official AWS (Amazon Web Services) Advanced Consulting Partner and Training Partner and a Microsoft Solutions Partner, helping people develop knowledge of the cloud and helping businesses aim for higher goals using best-in-industry cloud computing practices and expertise. We are on a mission to build a robust cloud computing ecosystem by disseminating knowledge on technological intricacies within the cloud space. Our blogs, webinars, case studies, and white papers enable all the stakeholders in the cloud computing sphere.
WRITTEN BY Deepa Dharanendra Saibannavar