A Step by Step Guide to Reduce Your AWS S3 Bill

A Large AWS S3 Bill

AWS S3 costs are pretty low. But when you are getting a $500 bill for storage every month, it is worth spending a few hours to bring that down.

Plan to do these steps over a few days rather than all at once. Since the AWS Cost & Usage Report is not updated in real time, it can take up to 24 hours before you see the impact of your changes.

Which Bucket Is That?

Tag everything!

The first step is finding out what is contributing to your bill. S3 costs have quite a few dimensions (API Operation, Usage Type, Tags and so on), but the easiest thing is to categorize your cost by bucket.

If you have a static list of buckets, this is pretty easy - add a name tag to every bucket.

- Go to the S3 Management Console
- Go to each bucket
- Switch to the Properties tab
- Scroll to the Tags section
- Edit and Add a Tag with Key name and Value as your bucket name

You can also do the same thing using the AWS CLI and jq. The following script outputs AWS CLI commands that you can run to tag buckets which are missing a name tag.

# script provided as-is without any implied warranty of fitness for a particular purpose
TAG_FOR_NAME="name" # the tag key to look for
# loop through all buckets
aws s3api list-buckets | jq -r .Buckets[].Name |
while read -r bucketname; do
  # fetch existing tags; untagged buckets return an error, so fall back to an empty TagSet
  tagset=$(aws s3api get-bucket-tagging --bucket "$bucketname" 2>/dev/null || echo '{"TagSet":[]}')
  # check if it has the name tag (Warning: this script doesn't check the value)
  if [[ -z $(echo "$tagset" | jq --arg TAGFORNAME "$TAG_FOR_NAME" '. | select(try .TagSet[].Key == $TAGFORNAME)') ]]; then
    # add the name tag to the existing tags
    tags=$(echo "$tagset" | jq --arg BUCKETNAME "$bucketname" --arg TAGFORNAME "$TAG_FOR_NAME" '.TagSet += [{ "Key": $TAGFORNAME, "Value": $BUCKETNAME }]' | jq -r '.TagSet | map("{Key=" + .Key + ",Value=" + .Value + "}") | join(",")')
    # output S3 put-bucket-tagging command
    echo "aws s3api put-bucket-tagging --bucket $bucketname --tagging \"TagSet=[$tags]\""
  fi
done

Wait for the AWS Cost & Usage Report to update (this will happen within 24 hours) and check out the S3 cost by bucket.

- Go to AWS Cost Management
- Navigate to Cost Explorer
- Filter by S3 service
- Group the chart by Tag > Name
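If you prefer the CLI, Cost Explorer's get-cost-and-usage API can produce the same breakdown. Here is a sketch that only builds and echoes the command so you can review it before running it - the dates are placeholders for your own billing period:

```shell
# Sketch: build a Cost Explorer query for one month's S3 cost grouped
# by the "name" tag. The command is only echoed for review -- the
# dates are placeholders.
START="2024-01-01"
END="2024-02-01"
TAG_FOR_NAME="name"

CMD="aws ce get-cost-and-usage \
 --time-period Start=$START,End=$END \
 --granularity MONTHLY \
 --metrics UnblendedCost \
 --filter '{\"Dimensions\":{\"Key\":\"SERVICE\",\"Values\":[\"Amazon Simple Storage Service\"]}}' \
 --group-by Type=TAG,Key=$TAG_FOR_NAME"

echo "$CMD"
```

Piping the real command's output through jq (for example `.ResultsByTime[].Groups[]`) gives you cost per tag value without opening the console.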

You will now have a fair idea of which bucket is costing you the most.

If you are creating and deleting buckets operationally, you will need to categorize your buckets into logical groups (for example, based on which application needs them) using tags.

How Old is That Anyway?

In Cost Explorer, filter down to each bucket that tops your cost list and group by Usage Type. In most cases it will be TimedStorage that’s costing you.

If the bucket contents are not being used regularly, consider deleting them or moving them to Glacier Flexible Retrieval or Glacier Deep Archive.

- Select the folder or objects
- Expand Actions and choose Edit storage class
- Scroll to the Storage class section
- Pick a storage class lower down the list based on the required redundancy (check the Availability Zone column) and acceptable retrieval times

Most storage classes have a minimum storage duration - this means that you will be charged for that duration even if you switch it to another class within that time. This is on top of the charges for the new storage class - you will be effectively paying double. So make sure you pick the appropriate one on the first try.
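To see why picking wrong is expensive, here is the arithmetic sketched in shell. The per-GB prices are assumptions (approximate us-east-1 list prices) - check the current S3 pricing page for your region:

```shell
# Sketch of the minimum-duration math. Prices are assumed approximate
# us-east-1 list prices per GB-month -- check the S3 pricing page.
GB=1000
GLACIER_GB_MONTH=0.0036       # assumed Glacier Flexible Retrieval price
DEEP_ARCHIVE_GB_MONTH=0.00099 # assumed Glacier Deep Archive price

# Transitioning 1000 GB out of Glacier after only 30 days still bills
# the remaining 60 days of the 90-day minimum (roughly 2 months' worth):
early_fee=$(awk -v gb="$GB" -v g="$GLACIER_GB_MONTH" \
  'BEGIN { printf "%.2f", gb * g * 2 }')

# ...on top of what the data now costs in Deep Archive for those 2 months:
deep_archive=$(awk -v gb="$GB" -v d="$DEEP_ARCHIVE_GB_MONTH" \
  'BEGIN { printf "%.2f", gb * d * 2 }')

echo "Early-transition charge: \$$early_fee"
echo "Deep Archive storage:    \$$deep_archive"
```

The early-transition charge dwarfs the new class's storage cost here, which is why getting the class right on the first try matters.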

That’s Going To Get Old

Remove old objects

If you are using S3 buckets to store data that you know is going to be stale after some time, setting up Lifecycle rules automates the whole deletion / switching storage class step.

- Switch to the Management tab
- Click Create a lifecycle rule
- If you want to limit the rule to a folder, you can do that by specifying a prefix
- Pick an appropriate Lifecycle rule action
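The same rule can be created from the CLI with put-bucket-lifecycle-configuration. A sketch that writes the rule JSON and only echoes the command for review - the bucket name, prefix, and day counts are placeholders:

```shell
# Sketch: a lifecycle rule that moves objects under backups/ to Glacier
# Flexible Retrieval after 30 days and deletes them after a year.
# Bucket name, prefix, and day counts are placeholders.
BUCKET="my-backup-bucket"

cat > /tmp/lifecycle.json <<'EOF'
{
  "Rules": [
    {
      "ID": "archive-then-expire-backups",
      "Status": "Enabled",
      "Filter": { "Prefix": "backups/" },
      "Transitions": [ { "Days": 30, "StorageClass": "GLACIER" } ],
      "Expiration": { "Days": 365 }
    }
  ]
}
EOF

# echoed rather than run, so you can review it first
echo "aws s3api put-bucket-lifecycle-configuration --bucket $BUCKET --lifecycle-configuration file:///tmp/lifecycle.json"
```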

Keep in mind that minimum storage durations still apply, especially if you set up rules to switch contents to glacier archive and then glacier deep archive shortly thereafter. Adjust your rule timelines so that your storage cost savings cover your transitioning cost.

Lifecycle rules are a good fit for code archives and backups. Keep in mind that unlike the Standard storage class, you’ll have to pay per GB when retrieving - so you don’t want to immediately deep archive your code artifacts if you’re going to be retrieving them for next week’s deployment.

Speaking of Time

Versions take up storage

When a bucket is versioned, older versions of objects are kept around and incur storage costs. To see how much storage older versions of an object are costing you, get the bucket-level storage size with and without versions.

- Navigate to the path containing the object
- Toggle the List versions option on or off
- Select everything and choose Calculate total size from the Actions menu
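For a scripted version of the same check, the output of list-object-versions can be totalled with jq (the same tool used earlier). A sketch - the bucket and prefix in the usage comment are placeholders:

```shell
# Sketch: total the bytes held by noncurrent versions under a prefix.
noncurrent_bytes() {
  # reads s3api list-object-versions JSON on stdin and sums the sizes
  # of every version that is no longer the latest
  jq '[.Versions[]? | select(.IsLatest == false) | .Size] | add // 0'
}

# Example against a real bucket (bucket and prefix are placeholders):
# aws s3api list-object-versions --bucket my-bucket --prefix logs/ | noncurrent_bytes
```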

Set up lifecycle rules to delete older versions which will not be required.
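A rule like that can be expressed as lifecycle JSON using NoncurrentVersionExpiration. A sketch that only echoes the command for review - the bucket name and day count are placeholders:

```shell
# Sketch: expire noncurrent versions 30 days after they stop being
# current. Bucket name and day count are placeholders.
BUCKET="my-versioned-bucket"

cat > /tmp/noncurrent.json <<'EOF'
{
  "Rules": [
    {
      "ID": "expire-old-versions",
      "Status": "Enabled",
      "Filter": {},
      "NoncurrentVersionExpiration": { "NoncurrentDays": 30 }
    }
  ]
}
EOF

# echoed rather than run, so you can review it first
echo "aws s3api put-bucket-lifecycle-configuration --bucket $BUCKET --lifecycle-configuration file:///tmp/noncurrent.json"
```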

You can also use the same method to identify objects which were accidentally put in the wrong path and set up lifecycle rules to delete the ghost version.

You can also suspend versioning at the bucket level if you won’t be needing it at all. Keep in mind that suspending versioning does not delete the older versions that already exist, so pair it with a lifecycle rule that cleans them up.

Request Costs

Failed API retries cost you

If you see costs coming from the Requests usage type, you can usually tell from the bucket what is writing to it and whether the cost is justified.

Keep in mind that failed attempts also cost you. So if you see charges but no objects in the bucket, the requests probably failed due to insufficient permissions. Failures usually trigger retries, so you can end up paying more for the failures than you would have if the request had succeeded the first time around (even counting the extra storage cost).
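If you have server access logging enabled on the bucket, the logs will tell you who is failing. A minimal sketch that counts denied requests - the log location in the usage comment is a placeholder, and this is just a plain string match on the S3 error code:

```shell
# Sketch: count denied requests in downloaded S3 server access logs.
count_denied() {
  # reads access log lines on stdin, prints how many mention AccessDenied
  grep -c "AccessDenied" || true
}

# Example against downloaded log files (path is a placeholder):
# cat /tmp/s3-access-logs/* | count_denied
```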


AWS S3 is pretty cheap, but a poorly managed S3 service can add to your cost.

Analyzing your S3 cost once in a while can pay off, even if S3 is not the biggest part of your AWS bill.
