
Step-by-Step Guide to Importing Existing AWS Resources into Terraform



Manually managing AWS resources can get messy fast—one small change here, another click there, and suddenly you’re not sure what’s deployed or how. That’s where Terraform comes in. As Infrastructure as Code (IaC), it gives you version control, repeatability, and way less clicking through the console.

But what if you already have AWS resources live and running? Rebuilding them from scratch in Terraform sounds painful—and risky. That’s where terraform import helps. It lets you bring existing infrastructure under Terraform’s control without tearing anything down.

Whether you’re migrating from manual setups, starting to roll out IaC, or just want a cleaner way to manage your AWS environment, this guide will walk you through importing an existing resource—like an S3 bucket—into Terraform. You’ll get clear steps, real code examples, and a few bonus tips to avoid the usual headaches.


Why Terraform Import Matters

Let’s say you’ve got an S3 bucket that’s been running for months—maybe it was created through the AWS console or a one-off script. Now you’re ready to manage it with Terraform so you can version control it, track changes, and avoid the manual config drift that happens over time.

That’s where terraform import comes in. It connects existing AWS resources to your Terraform state file without needing to tear anything down or rebuild from scratch. From that point on, you can manage the resource declaratively—like it was written in Terraform from day one.


Step-by-Step Breakdown

Let’s walk through how to bring an existing AWS resource under Terraform’s control. We’ll use an S3 bucket as the example, but this same process works for pretty much any AWS resource—EC2, IAM, VPCs, etc.

1. Understand What terraform import Actually Does

Before you jump in, it's important to get what terraform import really does—and what it doesn’t.

✅ What it does:
It connects a real AWS resource (like an existing S3 bucket) to your Terraform state file, so Terraform can start tracking it.

❌ What it doesn’t do:
It won’t magically create .tf files for you. You still need to write the resource block yourself—even if it’s empty to start.

Think of it like claiming a resource: Terraform starts managing it, but you still have to tell it what it's managing. No config block, no control.

2. Write the Resource Block First

Start by defining the resource in your Terraform configuration. It can be empty for now—just make sure to give it a name that’ll match the import. For an S3 bucket:

# main.tf
resource "aws_s3_bucket" "my_bucket" {
  # Leave this empty - You'll circle back to this later
}

This block lets Terraform know you plan to manage a resource named my_bucket. The aws_s3_bucket part defines the type of resource (in this case, an S3 bucket), and my_bucket is the name you'll reference within your Terraform configuration.

3. Run terraform init

Before importing anything, make sure your Terraform environment is initialized. This sets up the AWS provider and configures your backend (like S3 or Terraform Cloud) if you're using remote state storage.

terraform init

If you haven’t configured your AWS provider yet, make sure to add this to a provider.tf file:

provider "aws" {
  region = "us-east-1"  # Adjust to your working region
}

After adding the provider block, run terraform init again. You should see output confirming that the AWS provider was successfully installed.
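Optionally, you can also pin the provider version with a required_providers block, so everyone on the team resolves the same provider during terraform init. A minimal sketch (the version constraint is illustrative):

```hcl
# versions.tf - pin the AWS provider so imports behave consistently
# across machines. Adjust the version constraint to your needs.
terraform {
  required_providers {
    aws = {
      source  = "hashicorp/aws"
      version = "~> 5.0"
    }
  }
}
```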

4. Run the Import Command

Now for the fun part—importing the real AWS resource into your Terraform state. If you have an S3 bucket named my-existing-bucket-name, run the following command:

terraform import aws_s3_bucket.my_bucket my-existing-bucket-name
  • aws_s3_bucket.my_bucket: This matches the resource type and name defined in your .tf file.

  • my-existing-bucket-name: This is the actual name of the S3 bucket in your AWS account that you want to bring under Terraform management.

If the import is successful, Terraform will update your terraform.tfstate file with the bucket’s metadata. You’ll see a confirmation message like:

aws_s3_bucket.my_bucket: Import prepared!

Import successful!
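If you're on Terraform 1.5 or newer, there's also a declarative alternative to the CLI command: an import block in your configuration. A minimal sketch, reusing the names from the example above:

```hcl
# import.tf - declarative import (Terraform 1.5+). Instead of running
# terraform import, you run terraform plan / apply and Terraform
# performs the import as part of the plan.
import {
  to = aws_s3_bucket.my_bucket
  id = "my-existing-bucket-name"
}
```

With import blocks, running terraform plan -generate-config-out=generated.tf can even draft the resource configuration for you, which you can then review and move into main.tf.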

5. Run terraform state show

The import doesn’t populate your .tf file—it only updates the state. To see what Terraform found, run:

terraform state show aws_s3_bucket.my_bucket

This outputs all the bucket’s properties as recorded in Terraform’s state:

id                  = "my-existing-bucket-name"
bucket              = "my-existing-bucket-name"
acl                 = "private"
versioning {
  enabled = false
}

Use the output to map out your resource block. Now update main.tf to match it:

resource "aws_s3_bucket" "my_bucket" {
  bucket = "my-existing-bucket-name"
  acl    = "private"
}

Terraform now fully manages the bucket. To confirm it, run terraform plan. If your configuration matches the real resource, the plan will report no changes. (Note: with version 4 and later of the AWS provider, settings like acl and versioning are managed through separate resources such as aws_s3_bucket_acl and aws_s3_bucket_versioning, so the attributes you see may differ from this example.)
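When the configuration and the imported state agree, the end of the plan output should look something like this (exact wording varies by Terraform version):

```
No changes. Your infrastructure matches the configuration.
```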


⚠️ Common Pitfalls to Avoid

Even experienced engineers run into these when using terraform import:

  • Skipping terraform init
    Without initialization, the import command will fail with a provider error.

  • Mismatched Resource Name
    If Terraform doesn’t find a matching block like aws_s3_bucket.my_bucket in your code, it won’t know where to import the resource.

  • Expecting Code Generation
    terraform import updates the state, not your .tf files. Be sure to run terraform state show afterward and manually update your config.

  • Incorrect Resource Identifiers
    For complex resources like IAM roles or WAFs, make sure you use the exact ID or ARN format AWS expects—sometimes it’s just the name, other times it needs the full ARN.
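As a rough guide, a few common identifier formats look like this (the resource names and IDs here are placeholders):

```shell
# S3 buckets import by bucket name
terraform import aws_s3_bucket.my_bucket my-existing-bucket-name

# IAM roles import by role name, not ARN
terraform import aws_iam_role.app_role my-app-role

# Security groups import by their sg- ID
terraform import aws_security_group.web_sg sg-0123456789abcdef0
```

When in doubt, check the "Import" section at the bottom of each resource's page in the AWS provider documentation.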


Bonus Tips

  • Bulk Imports with Terraformer
    Importing a single resource manually is fine—but if you're dealing with dozens, check out Terraformer. It auto-generates both .tf files and state from your existing AWS setup.

  • Use Workspaces for Safe Testing
    Create a separate terraform workspace when testing imports. It keeps your main state clean and reduces risk while you experiment.

  • Script Your Imports
    For repetitive imports like EC2 instances or security groups, use a simple Bash loop to automate it. Just make sure a matching resource block (e.g., aws_instance.ec2_i-1234567891) already exists for each ID. Example:

instances=("i-1234567891" "i-234567821")

for instance_id in "${instances[@]}"; do
  terraform import "aws_instance.ec2_${instance_id}" "$instance_id"
done
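If you'd rather review the commands before anything touches your state, a small generator script can print them first. A sketch (the instance IDs are placeholders):

```shell
# Print the terraform import commands for a list of instance IDs so you
# can review them before running (or piping to sh). IDs are placeholders.
instances="i-1234567891 i-234567821"

for instance_id in $instances; do
  printf 'terraform import "aws_instance.ec2_%s" "%s"\n' \
    "$instance_id" "$instance_id"
done
```

Redirect the output to a file, eyeball it, and run it once you're happy with every line.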
    

Wrapping Up

terraform import is one of the most underrated tools in the Terraform workflow. When combined with terraform state show, it gives you a safe path to bring existing AWS resources under Infrastructure as Code—without having to rebuild anything.

Now you’ve got the steps, the tools, and the confidence to start managing your cloud infrastructure the right way.

Hit a wall importing a resource? Drop it below and let’s troubleshoot it together.
