Amazon Simple Storage Service (S3) is one of the most widely deployed AWS services, next to Amazon EC2. It serves a wide range of use cases such as static HTML websites, blogs, personal media storage, and enterprise backup. From a cost perspective, S3 is typically one of the larger line items on an AWS bill. For every enterprise looking to optimize AWS costs, analyzing S3 usage and formulating an effective cost management strategy for it is important. More so, understanding the data lifecycle of the hosted applications is the key step towards implementing a good S3 cost management strategy.

Making the most of AWS S3:

With AWS, you pay for the services you use and the storage you actually consume. If S3 is a significant component of your AWS bill, then implementing S3 cost management best practices is the way forward.

For example, unlike provisioned block storage, S3 requires no capacity reservation: a business that expects to store 100 GB but has actually stored only 10 GB of files is charged for the 10 GB, not the anticipated 100 GB. However, there are various other factors that affect S3 cost, of which many are unaware. Because of this pay-for-what-you-store model, many AWS administrators tend to overlook S3 from a cost management perspective.
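As a rough illustration, at the S3 Standard rate of about $0.023 per GB-month (the us-east-1 price at the time of writing; check the current AWS price list), those 10 GB cost roughly 10 × $0.023 ≈ $0.23 per month, regardless of how much capacity was anticipated.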

To this end, we’ve collated a few basic checks to get S3 cost management right as your S3 usage grows (each check is illustrated with a short code sketch after the list):

  1. EC2 instances and S3 buckets should be in the same AWS region, because there is a cost for data transferred out of a bucket’s region.
  2. The key naming schema should be chosen so that object keys spread files across multiple partitions of the S3 index. If key prefixes are distributed evenly, read and write operations are spread out and fewer of them need to be retried, keeping the per-request cost overhead of S3 reads and writes in check.
  3. Only temporary access credentials should be embedded in an application’s code that uses S3. If long-lived access keys are exposed to a third party, your S3 resources can be misused, which can prove very costly if the credentials are compromised.
  4. Monitor the actual usage of S3 periodically. Doing so brings any misuse of provisioned S3 resources to light and helps curtail both data compromise and runaway spend.
  5. Files (objects) are the key entities stored in S3. Remove all objects that are no longer relevant from your buckets, including temporary files that can be recreated through computation. In particular, clean up the parts left behind by incomplete multipart uploads periodically; they are billed even though no complete object exists.
  6. When using versioning for an S3 bucket, enable the “Lifecycle” feature to delete old versions. Here’s why and how: with Lifecycle Management, you can define time-based rules that trigger ‘Transition’ and ‘Expiration’ (deletion of objects). Expiration rules give you the ability to delete objects, or versions of objects, that are older than a particular age. This keeps objects available for rollback after an accidental or planned delete while limiting your storage costs by removing them once they exceed your preferred rollback window.
  7. Where possible, compress data before sending it to S3, because S3 charges for the number of bytes you store.
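To illustrate item 1, here is a minimal boto3 sketch that checks whether a bucket lives in the same region as the client running the code. The bucket name my-app-assets is a hypothetical placeholder.

```python
import boto3

BUCKET = "my-app-assets"  # hypothetical bucket name; replace with your own

s3 = boto3.client("s3")

# get_bucket_location returns None as the LocationConstraint for us-east-1.
bucket_region = s3.get_bucket_location(Bucket=BUCKET)["LocationConstraint"] or "us-east-1"

# The region this code runs in, e.g. on an EC2 instance.
client_region = boto3.session.Session().region_name

if bucket_region != client_region:
    print(f"Bucket is in {bucket_region} but client is in {client_region}: "
          "cross-region data transfer charges will apply.")
```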
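For item 2, here is a sketch of the classic hash-prefix naming scheme: prepending a few hash characters spreads keys evenly across S3’s index partitions instead of clustering them under one date- or name-based prefix. The helper distributed_key is our own illustration, not an AWS API.

```python
import hashlib

def distributed_key(filename: str) -> str:
    # A short hash prefix spreads keys evenly across S3 index partitions.
    prefix = hashlib.md5(filename.encode()).hexdigest()[:4]
    return f"{prefix}/{filename}"

print(distributed_key("2017/03/01/daily-news.html"))
# e.g. "3f2a/2017/03/01/daily-news.html" (the prefix varies with the name)
```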
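For item 3, one way to avoid hardcoding long-lived keys is to request short-lived credentials from AWS STS, sketched below with a 15-minute lifetime; in production you would more likely attach an IAM role to the instance and let the SDK pick up credentials automatically.

```python
import boto3

sts = boto3.client("sts")

# Request credentials that expire in 15 minutes (the minimum allowed)
# instead of embedding permanent access keys in application code.
creds = sts.get_session_token(DurationSeconds=900)["Credentials"]

s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```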
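For item 4, S3 publishes a daily BucketSizeBytes metric to CloudWatch, so a periodic usage check can be scripted; again, my-app-assets is a placeholder.

```python
import boto3
from datetime import datetime, timedelta

cloudwatch = boto3.client("cloudwatch")

# S3 reports bucket size to CloudWatch once per day, per storage type.
resp = cloudwatch.get_metric_statistics(
    Namespace="AWS/S3",
    MetricName="BucketSizeBytes",
    Dimensions=[
        {"Name": "BucketName", "Value": "my-app-assets"},    # hypothetical bucket
        {"Name": "StorageType", "Value": "StandardStorage"},
    ],
    StartTime=datetime.utcnow() - timedelta(days=2),
    EndTime=datetime.utcnow(),
    Period=86400,           # one datapoint per day
    Statistics=["Average"],
)

for point in sorted(resp["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], round(point["Average"] / 1024**3, 2), "GB")
```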
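Items 5 and 6 can both be automated with a single lifecycle configuration: one rule aborts incomplete multipart uploads after a week and expires noncurrent object versions after a 30-day rollback window. Both periods are illustrative; tune them to your needs.

```python
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="my-app-assets",             # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "cleanup-rule",
                "Filter": {"Prefix": ""},   # empty prefix = whole bucket
                "Status": "Enabled",
                # Delete parts left behind by uploads that never completed.
                "AbortIncompleteMultipartUpload": {"DaysAfterInitiation": 7},
                # Expire old versions once they exceed the rollback window.
                "NoncurrentVersionExpiration": {"NoncurrentDays": 30},
            }
        ]
    },
)
```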
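And for item 7, a sketch of compressing data client-side before upload; report.csv is a hypothetical local file.

```python
import gzip
import boto3

s3 = boto3.client("s3")

with open("report.csv", "rb") as f:     # hypothetical local file
    raw = f.read()
compressed = gzip.compress(raw)

s3.put_object(
    Bucket="my-app-assets",             # hypothetical bucket
    Key="reports/report.csv.gz",
    Body=compressed,
    ContentEncoding="gzip",             # lets HTTP clients decompress transparently
)

print(f"Stored {len(compressed)} bytes instead of {len(raw)}")
```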

Ultimately, all data stored in S3 goes through lifecycle stages: creation, active usage, and then infrequent usage. Take content creation on a news website: the daily news, along with its images, can be stored in S3. Current news items are accessed the most and therefore must be quickly accessible to readers. At the end of the week, the older daily content can be moved to S3 Reduced Redundancy Storage (RRS), which is cheaper but offers lower redundancy. At the end of the month, it can be moved to the Standard-Infrequent Access storage class. At the end of the quarter, this content can be moved to the low-cost, rarely accessed archival tier, Amazon Glacier.
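A lifecycle configuration along those lines might look like the sketch below (the bucket name and prefix are hypothetical). Note that lifecycle rules cannot transition objects into RRS, so the week-one RRS step would be handled at write time; the rules here cover the Standard-IA and Glacier stages.

```python
import boto3

s3 = boto3.client("s3")

# Transition schedule mirroring the news-site example above.
s3.put_bucket_lifecycle_configuration(
    Bucket="daily-news-content",            # hypothetical bucket
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "news-lifecycle",
                "Filter": {"Prefix": "articles/"},   # hypothetical prefix
                "Status": "Enabled",
                "Transitions": [
                    # After a month, move to Standard-Infrequent Access.
                    {"Days": 30, "StorageClass": "STANDARD_IA"},
                    # After a quarter, archive to Glacier.
                    {"Days": 90, "StorageClass": "GLACIER"},
                ],
            }
        ]
    },
)
```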

This data lifecycle applies across domains, including e-commerce and enterprise computing. Hence, leverage data’s inherent lifecycle for S3 cost optimization.

You can also take advantage of Amazon S3 Reduced Redundancy Storage (RRS) as a cheaper alternative to the Standard storage class for data that can be easily reproduced.

To Conclude:

Once you follow all the above hacks, start observing the bills. And don’t forget the other key best practices too: use RRS wherever you can, keep your buckets organized, archive when appropriate, speed up your data processing with well-distributed object key names, use S3 if you are hosting a static website, architect around data transfer costs, and use consolidated billing.

Finally, AWS provides a simple configuration mechanism to specify data lifecycle rules and the transitioning of objects across storage classes. So do take the data lifecycle into account when it comes to S3 cost management.

If you are finding it difficult to save on AWS S3 costs, then explore the intelligent Botmetric AWS Cloud Management Platform with a 14-day free trial. It can help you manage your AWS storage resources and keep them at optimal pricing levels at all times. For other interesting news on cloud, follow us on Twitter, Facebook, and LinkedIn.
