# Terraform backend: options and setup

## What's the issue?
Without a remote backend, Terraform keeps state in a local file (`terraform.tfstate`). That causes:
- No sharing: Only one machine has the latest state; CI/CD and teammates do not.
- No locking: Two runs can apply at the same time and corrupt state.
- Easy loss: Deleting the file or losing the machine loses state.
For any shared or automated use (e.g. CD), use a remote backend.
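For a solo setup where committing the backend configuration is acceptable, the same settings can live in the repository instead of being passed as CLI flags. A minimal sketch, assuming the example bucket and key used throughout this doc:

```hcl
# backend.tf — sketch only; bucket, key, and region must match your environment.
terraform {
  backend "s3" {
    bucket = "ohpen-terraform-state"             # state bucket (globally unique)
    key    = "ohpen-data-lake/terraform.tfstate" # path of this stack's state object
    region = "eu-west-1"
  }
}
```

Note that the CD section below deliberately avoids a committed `backend.tf` and passes these values at `init` time instead.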
## Options
| Option | Use case | Locking |
|---|---|---|
| Local (default) | Solo, disposable envs | None |
| S3 | Teams, CI/CD; state in one place | None (concurrent apply possible) |
| S3 + DynamoDB | Same as S3; safe concurrent runs | Yes (DynamoDB table used as lock) |
| S3 native locking | Newer Terraform; check Terraform docs | Yes (when supported) |
S3 is used as the remote backend. Locking is optional:
- S3 only: Simple. Create one bucket (e.g. `ohpen-terraform-state`) and enable versioning so you can recover from bad applies. Two simultaneous applies can conflict; avoid this by running CD from a single branch with a gate.
- S3 + DynamoDB: Add a DynamoDB table for state locking so only one `terraform apply` runs at a time. Historically this was the standard; DynamoDB-based locking is deprecated in favor of S3-native locking in newer Terraform versions, so it is not required.
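If your Terraform version supports S3-native locking (introduced around Terraform 1.10; verify against the docs for your version), it can be enabled with a single backend option and no DynamoDB table. A sketch, reusing the example bucket name:

```sh
# Sketch: S3-native state locking via a lock file in the state bucket.
# Requires a Terraform version that supports use_lockfile; no DynamoDB needed.
terraform init \
  -backend-config="bucket=ohpen-terraform-state" \
  -backend-config="key=ohpen-data-lake/terraform.tfstate" \
  -backend-config="region=eu-west-1" \
  -backend-config="use_lockfile=true" \
  -reconfigure
```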
## How it is used in CD
The CD workflow passes backend config flags so the pipeline does not rely on a committed `backend.tf`:

```sh
terraform init \
  -backend-config="bucket=ohpen-terraform-state" \
  -backend-config="key=ohpen-data-lake/terraform.tfstate" \
  -backend-config="region=eu-west-1" \
  -reconfigure
```
Use GitHub variables or secrets (e.g. `TF_STATE_BUCKET`, `AWS_REGION`) so you can change the bucket or region without editing the workflow.
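Wired into a workflow, that init step might look like the following; `TF_STATE_BUCKET` and `AWS_REGION` are assumed to be defined as repository variables, and the step name is illustrative:

```yaml
# Sketch of a GitHub Actions step; variable names are assumptions.
- name: Terraform init
  run: |
    terraform init \
      -backend-config="bucket=${{ vars.TF_STATE_BUCKET }}" \
      -backend-config="key=ohpen-data-lake/terraform.tfstate" \
      -backend-config="region=${{ vars.AWS_REGION }}" \
      -reconfigure
```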
## One-time setup
1. Create the state bucket (if it does not exist):
   - Name: e.g. `ohpen-terraform-state` (must be globally unique).
   - Enable versioning (recommended for rollback).
   - Restrict access (e.g. IAM + bucket policy) so only the CD role and operators can read/write.
2. Optional – DynamoDB for locking:
   - Create a table, e.g. `ohpen-terraform-lock`, with primary key `LockID` (string).
   - In the backend config, add `-backend-config="dynamodb_table=ohpen-terraform-lock"`.
   - The CD role needs `dynamodb:GetItem`, `dynamodb:PutItem`, and `dynamodb:DeleteItem` on that table.
3. First apply: Run `terraform init` (with the backend config above), then `terraform plan` and `terraform apply` once so the backend state exists. After that, CD can run plan/apply normally.
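The bucket and optional lock table above can be provisioned once with the AWS CLI. A sketch assuming the example names and `eu-west-1`, run with credentials allowed to create these resources:

```sh
# One-time provisioning sketch; names and region are the examples from this doc.
aws s3api create-bucket \
  --bucket ohpen-terraform-state \
  --region eu-west-1 \
  --create-bucket-configuration LocationConstraint=eu-west-1

# Versioning lets you roll back to a previous state file after a bad apply.
aws s3api put-bucket-versioning \
  --bucket ohpen-terraform-state \
  --versioning-configuration Status=Enabled

# Optional: DynamoDB lock table with the LockID string key Terraform expects.
aws dynamodb create-table \
  --table-name ohpen-terraform-lock \
  --attribute-definitions AttributeName=LockID,AttributeType=S \
  --key-schema AttributeName=LockID,KeyType=HASH \
  --billing-mode PAY_PER_REQUEST \
  --region eu-west-1
```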