4 public cloud cost optimization steps

It was recently reported that 80% of all IT budgets will be committed to the cloud, with demand for public cloud growing 36%.

But as businesses eagerly take their first steps toward public cloud adoption, their long-term cost optimization strategy can be easily overlooked, overshadowed by the fact that platforms like Azure, AWS, and Oracle make the initial deployment so easy and cost-effective, requiring little to no hardware purchases. The truth is that businesses need a long-term strategy to manage their public cloud infrastructure properly: data continues to grow at increasing rates, and newer analytic use cases with greater compute needs keep emerging, creating new scalability challenges around cost, performance, and compliance.

How can businesses address these challenges early in the public cloud lifecycle, so they can focus more on analyzing their data and less on optimizing it? Here are a few steps toward establishing a long-lasting foundation for your public cloud implementation:

1. Establish a single data lake for all your public cloud data

A data lake enables businesses to store structured, semi-structured, and unstructured data of all types in a central repository. Unlike a traditional data warehouse, a data lake stores data 'as-is' and applies the schema on read. This reduces costly ETL operations, saving significant time and effort, and it gives you greater flexibility in how data is put to use for analytics and machine learning, without duplicating it. Additionally, because data lakes can be built on Apache Hadoop, they can be hosted on the low-cost bulk storage offered by public cloud providers like AWS, Azure, and Oracle.
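
To make schema-on-read concrete, here is a minimal sketch using PySpark over object storage. The bucket name, path layout, and the choice of Spark are illustrative assumptions, not a prescription:

```python
# A minimal schema-on-read sketch, assuming PySpark and an S3-hosted
# data lake. The bucket and paths below are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("data-lake-demo").getOrCreate()

# Raw JSON events were landed 'as-is', with no upfront ETL or schema design.
# Spark infers the schema at read time: schema-on-read.
events = spark.read.json("s3a://my-data-lake/raw/clickstream/*.json")

# The same raw files can now serve analytics and ML without duplication.
events.groupBy("event_type").count().show()
```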

Establishing a data lake is key to your public cloud adoption and its optimization, as it provides the necessary framework for all your data and advanced data science projects, allowing you to start improving public cloud cost optimization immediately.

2. Decouple compute and storage

One of the biggest advantages of modern public cloud computing is the ability to decouple compute from storage: the two can exist independently and scale independently. Combine that with per-minute billing for compute and low-cost object storage, and IT can optimize cloud utilization to save significantly and realize ROI faster. Because compute and storage needs differ for every application and use case, IT needs to plan deployments accordingly.
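
As a rough illustration of why decoupling pays off, here is a back-of-the-envelope comparison. Every price and hour count below is an assumption for illustration; actual rates vary by vendor, region, and instance type:

```python
# A sketch comparing an always-on cluster (compute and storage coupled,
# so the cluster can never be shut down) with a decoupled model where
# data sits in object storage and compute runs only when needed.
# All prices are illustrative assumptions, not vendor quotes.
STORAGE_GB = 10_000                  # data kept in low-cost object storage
STORAGE_PRICE_GB_MONTH = 0.023       # assumed object-storage price, $/GB-month
COMPUTE_PRICE_HOUR = 2.50            # assumed analytics-cluster price, $/hour

always_on_hours = 24 * 30            # coupled: cluster holds the data, so it stays up
actual_use_hours = 6 * 22            # decoupled: 6 hours/day, 22 workdays

always_on = always_on_hours * COMPUTE_PRICE_HOUR
decoupled = (actual_use_hours * COMPUTE_PRICE_HOUR
             + STORAGE_GB * STORAGE_PRICE_GB_MONTH)

print(f"Always-on cluster: ${always_on:,.2f}/month")
print(f"Decoupled model:   ${decoupled:,.2f}/month")
```

Under these assumptions the decoupled model costs less than a third of the always-on cluster, because storage persists cheaply while compute bills only when it runs.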

3. Leverage reserved instances for public cloud cost optimization

Most public cloud vendors give you the option to reserve compute instances, i.e. commit to using a compute instance of a certain type for a certain period. Reserving an instance can save you up to 50% over a regular on-demand instance. A key point to remember is that you typically cannot drop or upgrade a reserved instance until the end of the commitment period. So plan your reservations not around the headline savings, but around the current and future compute needs of your application.
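
To see why utilization, not the discount, should drive the decision, here is a simple break-even sketch. The hourly rate is an assumption for illustration; the 50% discount is the 'up to' figure mentioned above:

```python
# Break-even point for a reserved instance versus on-demand.
# The on-demand rate is an illustrative assumption.
ON_DEMAND_HOURLY = 0.40      # assumed on-demand price, $/hour
RI_DISCOUNT = 0.50           # the "up to 50%" reserved-instance discount
HOURS_PER_MONTH = 730

# A reservation bills for every hour of the term, used or not.
ri_monthly = ON_DEMAND_HOURLY * (1 - RI_DISCOUNT) * HOURS_PER_MONTH

break_even_hours = ri_monthly / ON_DEMAND_HOURLY
print(f"Reserved cost: ${ri_monthly:,.2f}/month")
print(f"Break-even: {break_even_hours:.0f} hours/month "
      f"({break_even_hours / HOURS_PER_MONTH:.0%} utilization)")
```

At a 50% discount the reservation breaks even at 50% utilization, so an instance you run around the clock is a strong candidate, while a bursty development box is not.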

4. Monitor utilization for public cloud cost optimization

IT is evolving toward a more self-service model that gives distributed DevOps teams greater autonomy. The public cloud fits perfectly into this game plan, as it lets teams spin up new clusters and provision infrastructure in real time, without going through traditional IT processes. While this brings greater efficiency and faster ROI for the business, left unchecked it can also lead to a number of ghost systems that are no longer actively used, resulting in bloated public cloud bills. To avoid this, IT teams should train users on best practices and enforce clear guiding principles and policies. IT can also use specialized software (like CloudCheckr and AWS Trusted Advisor) to gain visibility into cloud utilization across the enterprise and surface utilization-based public cloud cost optimization and security recommendations.
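
As a starting point for spotting ghost systems on AWS, here is a minimal sweep using boto3 (assumed installed and configured with read-only credentials). The 5% CPU threshold and seven-day window are assumptions to tune, and low CPU alone is only a heuristic:

```python
# Flag running EC2 instances whose average CPU over the past week is
# below a threshold -- candidates for shutdown or right-sizing.
from datetime import datetime, timedelta, timezone
import boto3

CPU_THRESHOLD = 5.0  # percent; an assumed cutoff, tune to your workloads

ec2 = boto3.client("ec2")
cw = boto3.client("cloudwatch")
now = datetime.now(timezone.utc)

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

for res in reservations:
    for inst in res["Instances"]:
        datapoints = cw.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": inst["InstanceId"]}],
            StartTime=now - timedelta(days=7),
            EndTime=now,
            Period=86400,            # one datapoint per day
            Statistics=["Average"],
        )["Datapoints"]
        if datapoints:
            avg = sum(d["Average"] for d in datapoints) / len(datapoints)
            if avg < CPU_THRESHOLD:
                print(f"{inst['InstanceId']}: avg CPU {avg:.1f}% -- possible ghost")
```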

Conclusion

Over the past few years, public cloud vendors have been focusing on flexibility, price reductions, and greater cost transparency. Expect this to remain the norm, leading to continued savings for enterprises. However, it is important for enterprises to lay a proper foundation for internal public cloud utilization in order to realize those savings. The steps in this post will help you establish a strong foundation for your public cloud optimization strategy.

Learn more about the Solix Common Data Platform, a uniform data collection, retention management, and bulk data storage solution for structured and unstructured data.