TechDirectArchive

Hands-on IT, Cloud, Security & DevOps Insights

Create an S3 Bucket with Terraform

Posted on 14/06/2023 (updated 01/07/2023) by Imoh Etuk

In this post, I will show you how to create an S3 bucket with Terraform. Amazon Simple Storage Service (Amazon S3) is an object storage service offering industry-leading scalability, data availability, security, and performance. It is used by AWS customers of all sizes to store data such as analytics data, log files, static website files, application data, pictures, and videos, as well as for backup and retrieval of data. You may also want to see how to install packages on EC2 using Terraform. Terraform is an Infrastructure as Code (IaC) tool by HashiCorp that lets you create resources on AWS, Azure, GCP, Kubernetes, and other providers.

It’s common for a large company to set up many S3 buckets for testing, development, and other purposes. Terraform is a great tool which you can use on all major operating systems; see how to Install Terraform on Linux. Terraform takes the burden of manually creating an S3 bucket off your shoulders, which matters most when you need to create several buckets at once.

There are other IaC tools out there. The AWS-owned IaC tool is CloudFormation, while Microsoft’s IaC offering for the Azure cloud platform is Azure Resource Manager (ARM) templates. Terraform, however, stands out as a cloud-agnostic IaC tool that works across many providers. Learn about IaC on Azure: An introduction to Infrastructure as Code (IaC), and how to create an App Service resource with Azure CLI.

Why Use Terraform to Create S3 Buckets?

You can manually configure many S3 buckets if you need to deploy them for an application, but doing so takes a while, and you’d have to do it again the next time you needed AWS cloud storage.

Instead, you can speed up the process by using a Terraform template. The S3 bucket settings are contained in a Terraform template, which can be used to deploy numerous S3 buckets at once in a matter of seconds without having to go through the same process manually. Creating the Terraform files is all that is necessary to deploy the S3 buckets.
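To illustrate the point about deploying numerous buckets at once, here is a minimal sketch using Terraform’s for_each meta-argument. The variable name and bucket names below are my own examples, not part of the files created later in this post:

```hcl
# A sketch (not from the original post): one resource block that declares
# several buckets at once. The bucket names below are hypothetical and must
# be globally unique in your own run.
variable "bucket_names" {
  type    = set(string)
  default = ["demo-logs-bucket-tf", "demo-assets-bucket-tf", "demo-backups-bucket-tf"]
}

resource "aws_s3_bucket" "multi" {
  for_each = var.bucket_names
  bucket   = each.value

  tags = {
    Environment = "Dev"
  }
}
```

With this pattern, adding another bucket is a one-line change to the variable default rather than a whole new resource block.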

Creating an S3 Bucket with Terraform

To create an S3 bucket with Terraform, you need to install Terraform first (see the installation link above). Here, I will create two files, provider.tf and main.tf. The provider.tf file will hold the following:

terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "5.3.0"
    }
  }
}

provider "aws" {
  # Hard-coded credentials are shown here for simplicity only; in practice,
  # use environment variables, a shared credentials file, or an IAM role.
  access_key = "my-access-key"
  secret_key = "my-secret-access-key"
  region     = "us-east-1"
}

The main.tf file holds the configuration for the bucket I want to create. The file looks like this:

resource "aws_s3_bucket" "demo" {
  bucket = "mydemo-bucket-tf"

  tags = {
    Name        = "My demo-bucket"
    Environment = "Dev"
  }
}
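As an optional extra (my own sketch, not part of the original main.tf), an output block appended to main.tf would print the new bucket’s ARN once terraform apply completes:

```hcl
# Hypothetical addition to main.tf: surface the bucket's ARN after apply.
output "bucket_arn" {
  value = aws_s3_bucket.demo.arn
}
```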

The provider.tf file simply specifies the provider as aws, the source as hashicorp/aws, and the version. It is advisable to always use the latest provider version from the official Terraform AWS provider page.

[Screenshot: AWS Terraform Provider page]

Note: you can find all the provider resources for creating and managing S3 buckets with Terraform by searching for “S3” in the search bar on the provider’s documentation page.

[Screenshot: S3 bucket resources on the provider page]

Running the Terraform Files to Create the S3 Bucket

To run these files, the first Terraform command to run in the terminal is terraform init. This initializes a working directory containing Terraform configuration files, and it is the first command that should be run after writing a new Terraform configuration or cloning an existing one from version control.

[Screenshot: running the terraform init command]

Next, run the terraform plan command. It lets you preview the actions Terraform would take to modify your infrastructure, or save a speculative plan which you can apply later.

[Screenshot: running the terraform plan command]

Lastly, run the terraform apply command to create the S3 bucket. terraform apply is the command that actually carries out the planned changes to each resource using the relevant infrastructure provider’s API. It asks for confirmation before making any changes, unless it is explicitly told to skip approval.
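Putting the three steps together, the workflow from the directory containing provider.tf and main.tf looks roughly like this (the -auto-approve flag is Terraform’s option to skip the confirmation prompt):

```shell
terraform init                   # download the AWS provider declared in provider.tf
terraform plan                   # preview the resources Terraform would create
terraform apply                  # create the S3 bucket; asks for confirmation
# terraform apply -auto-approve  # alternative: apply without the confirmation prompt
```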

[Screenshot: running the terraform apply command]

Please see How to grant public access to S3 Bucket using Policy, and “Amazon Relational Database Service: How to install Amazon RDS using Terraform”.

Enabling S3 Bucket Versioning using Terraform

In the last step of this post, let’s add the aws_s3_bucket_versioning resource to the existing code to enable S3 versioning for the bucket we have just created. To do this, search the provider’s documentation page for AWS S3 bucket versioning.

[Screenshot: searching for S3 versioning on the provider page]

Next, copy the resource block as highlighted in the screenshot above and add it to your Terraform code. Remember to make some changes to correspond with the details of your S3 bucket.
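Adapted to the aws_s3_bucket.demo resource defined earlier, the versioning block from the provider documentation looks like this:

```hcl
# Enable versioning on the bucket created in main.tf.
resource "aws_s3_bucket_versioning" "demo" {
  bucket = aws_s3_bucket.demo.id

  versioning_configuration {
    status = "Enabled"
  }
}
```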

Run the terraform plan command to preview your infrastructure, and then run the terraform apply command to enable versioning for your S3 bucket.

[Screenshot: enabling S3 bucket versioning]

Verify the bucket has been created by signing into the AWS Console.

[Screenshot: S3 bucket created using Terraform, shown in the AWS Console]

Verify also that versioning has been enabled by clicking Amazon S3 > Buckets > mydemo-bucket-tf > Properties.

[Screenshot: versioning not yet enabled]
[Screenshot: versioning now enabled]

In summary, Terraform is a powerful open-source, cloud-agnostic Infrastructure as Code (IaC) tool for provisioning resources in the cloud. It helps minimize the errors that come with provisioning infrastructure manually.


Thank you for reading this post. Kindly share it with others.

Categories: AWS/Azure/OpenShift, Configuration Management Tool | Tags: AWS, S3, S3 Bucket

Copyright © 2025 TechDirectArchive