AWS, Cloud Computing

5 Mins Read

A Guide to Building Multiple Amazon S3 Buckets Using Terraform


Amazon Simple Storage Service (Amazon S3) is one of the most popular AWS services.

When you deploy a workload on AWS, there is a good chance it will use Amazon S3 buckets, directly or indirectly.

Automating your infrastructure as much as possible makes sense from a DevOps standpoint. In this blog, let’s walk through creating Amazon S3 buckets using Terraform, one of the most popular IaC tools.


Prerequisites

  • An active AWS account
  • Terraform installed on your system
  • Basic Terraform knowledge and Terraform credentials configured
  • An IDE or simple editor such as Visual Studio Code


Steps to Create Multiple Amazon S3 Buckets using Terraform

Step 1: Provider Statement

Before Terraform can create an Amazon S3 bucket, you must create a Terraform configuration file and specify its contents.

Create a project directory and a Terraform configuration file (for example, main.tf) inside it.

When you configure the AWS CLI, it creates a named profile on your machine, which Terraform can use for authentication.

This is how it looks:
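A minimal sketch of a provider configuration; the region and profile values are placeholders for your own settings:

```hcl
# Sketch of a provider block. The region and profile values below are
# placeholders -- substitute the ones from your own AWS CLI setup.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region  = "us-east-1"
  profile = "default"
}
```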


Note that we explicitly declared the AWS provider. This step is essential; you cannot work with AWS without it.

Step 2: Initialize the Project Directory

So far you have only declared the provider. Next, instruct Terraform to download the provider-specific code and plugins required to interact with AWS.

If you run the terraform plan or terraform apply commands without first running terraform init, Terraform reports an error stating that the required provider plugins are not installed and suggests running terraform init.

Let’s do that now.


Running terraform init installs the AWS provider plugins, after which you are ready to create your first resource with Terraform.

Step 3: Write a Terraform Configuration File to Create Multiple Amazon S3 Buckets

Now that we are set up, let’s write an example Terraform configuration file that creates multiple Amazon S3 buckets. As mentioned, we can do this with either the count or the for_each meta-argument. Let’s examine each in turn.

  1. Using count to create multiple Amazon S3 buckets in Terraform

In Terraform, count is a meta-argument you can use on a resource or module. It accepts a whole number and creates that many instances of the resource or module.

Let’s break this down a little. Without count, a resource block creates exactly one resource.
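For example, a single-bucket resource might look like this (the bucket name is a placeholder):

```hcl
# One resource block creates exactly one bucket.
# "my-demo-bucket-001" is a placeholder -- S3 names must be globally unique.
resource "aws_s3_bucket" "demo" {
  bucket = "my-demo-bucket-001"
}
```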


Then you can create 5 resources by specifying count = 5:
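As a sketch (the bucket name is a placeholder), the naive version would be:

```hcl
# count = 5 asks Terraform for five instances of this resource,
# but every instance would get the same bucket name -- which, as
# explained below, S3 rejects.
resource "aws_s3_bucket" "demo" {
  count  = 5
  bucket = "my-demo-bucket"
}
```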


The issue is that Amazon S3 bucket names must be globally unique. Once a bucket has been created, neither you nor anyone else can create another one with the same name.

If you attempt this, the first bucket is created and all subsequent ones fail with the following error:

BucketAlreadyOwnedByYou: Your prior request to create the named bucket was successful, and as a result, you are the owner.

The fix is to put all the bucket names you want to create into a list-type variable.

For instance,
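A sketch of driving count from a list variable; the bucket names below are placeholders, since S3 bucket names must be globally unique:

```hcl
# Drive count from a list variable: one bucket per list element.
variable "bucket_list" {
  type    = list(string)
  default = ["demo-bucket-alpha", "demo-bucket-beta", "demo-bucket-gamma"]
}

resource "aws_s3_bucket" "demo" {
  count  = length(var.bucket_list)
  bucket = var.bucket_list[count.index]
}
```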


As you can see from the example above, the count argument takes the length of the list as its value, and the bucket name is set using the familiar var.bucket_list[count.index] (i.e., list[index]) syntax for accessing list elements.

This makes creating a large number of Amazon S3 buckets with Terraform much cleaner and more efficient.

  2. Using for_each to create multiple Amazon S3 buckets in Terraform

You can use the for_each meta-argument on a resource or module. Like count, it creates multiple instances of a resource, but it is driven by a set or map rather than a number.

This is how the for_each version of the configuration creates several buckets in Terraform.
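A sketch of the for_each approach; the bucket names are placeholders:

```hcl
# for_each over a set built from a list variable: one bucket per element.
variable "bucket_list" {
  type    = list(string)
  default = ["demo-bucket-alpha", "demo-bucket-beta", "demo-bucket-gamma"]
}

resource "aws_s3_bucket" "demo" {
  for_each = toset(var.bucket_list)
  bucket   = each.key
}
```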



We have declared a variable called bucket_list containing all the bucket names we wish to create.

After that, we declared for_each = toset(var.bucket_list).

We used the built-in toset function to convert our list into a set, because for_each accepts only a map or a set of strings.

To use the bucket name from the variable, we set bucket = each.key. For a set, each.key and each.value are identical, so you can use either.

Complete Code:

Your final code will differ slightly from ours depending on which option you choose. Between count and for_each, for_each is generally the preferred method, so that is what I will use.
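Putting it all together, a complete sketch might look like this; the region, profile, and bucket names are all placeholders for your own values:

```hcl
# Complete sketch: provider configuration plus for_each-based buckets.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}

provider "aws" {
  region  = "us-east-1"
  profile = "default"
}

variable "bucket_list" {
  type    = list(string)
  default = ["demo-bucket-alpha", "demo-bucket-beta", "demo-bucket-gamma"]
}

resource "aws_s3_bucket" "demo" {
  for_each = toset(var.bucket_list)
  bucket   = each.key
}
```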


Step 4: Deploy the Configuration with Terraform to Create Multiple Buckets

Our configuration is ready; now let’s deploy it.

In the location where your Terraform configuration file is located, launch a terminal.

Run terraform init to initialize the directory with the relevant AWS plugins.

Then run terraform apply.

Once the apply completes, our resources are created; next, we validate them.


Step 5: Verify the Buckets

With Terraform, you were able to create several Amazon S3 buckets successfully.

Three buckets are created, as the terraform apply output shows.

You can also verify the same in the AWS console if you’d like.


Step 6: Clean Up

Finally, if you created these resources only for this exercise, you can tidy up by destroying them:

terraform destroy

Hit Enter after typing “yes”

Your resources are destroyed as soon as you press Enter.


In this blog, we learned how to use Terraform to create several Amazon S3 buckets. We covered two techniques (count and for_each) that efficiently create multiple resources without duplicating your code.

This makes our code cleaner and easier to maintain.


About CloudThat

CloudThat is an official AWS (Amazon Web Services) Advanced Consulting Partner and Training Partner and a Microsoft Gold Partner, helping people develop cloud knowledge and helping their businesses aim for higher goals using best-in-industry cloud computing practices and expertise. We are on a mission to build a robust cloud computing ecosystem by disseminating knowledge on technological intricacies within the cloud space. Our blogs, webinars, case studies, and white papers enable all the stakeholders in the cloud computing sphere.

Drop a query if you have any questions regarding AWS S3 or Terraform, and I will get back to you quickly.

To get started, go through our Consultancy page and Managed Services Package, CloudThat’s offerings.


FAQs

1. What are the rules for naming buckets in Terraform?

ANS: – If the bucket name is omitted, Terraform assigns a random, unique name. Bucket names must be lowercase and no longer than 63 characters; the AWS documentation has the comprehensive list of bucket naming rules. The optional bucket_prefix argument (which forces a new resource) creates a unique bucket name beginning with the specified prefix.

2. When should we use Amazon S3 with Terraform?

ANS: – You can manually configure multiple Amazon S3 buckets for an application, but doing so takes a while, and you would have to repeat it the next time you need AWS cloud storage. Instead, you can speed up the process with a Terraform template. The Amazon S3 bucket settings are contained in the template, which can deploy numerous Amazon S3 buckets at once in a matter of seconds without repeating the manual process. Creating the Terraform files is all that is necessary to deploy the Amazon S3 buckets.


Huda is working as a Front-end Developer at CloudThat Technologies. She is experienced in building and maintaining responsive websites and is keen on learning about new and emerging technologies. In addition to her technical skills, she is a highly motivated and dedicated professional, committed to delivering high-quality work.



