Google today announced new storage tiers for Google Cloud Storage, part of the Google Cloud Platform.
Customers can now choose to have Google automatically store their data in multiple geographically dispersed data centers with the new Multi-Regional Storage tier, which costs 2.6 cents per gigabyte per month. They can also opt for the new low-cost Coldline Storage tier for data they access less than once a year, at 0.7 cents per gigabyte per month.
The Nearline cold storage tier, which costs 1 cent per gigabyte and is best suited for data you access less than once a month, remains available. So does the “regular” single-region Google Cloud Storage option — but Google is cutting its cost by 23 percent. Starting November 1, it will cost 2 cents per gigabyte, per month. (In case you’re wondering, Google has moved regular Google Cloud Storage buckets onto the new Regional storage tier.)
Prices for application programming interface (API) calls against Regional and Multi-Regional Storage are also going down. Starting November 1, Class A API calls will cost half a cent per 1,000 operations (a 50 percent price cut), while Class B API calls will cost 0.4 cents per 10,000 operations (a 60 percent price cut).
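To put the storage prices in perspective, here is a quick back-of-the-envelope comparison using the per-gigabyte figures quoted above. The helper function and the 1 TB example volume are our own illustration, not anything from Google's API:

```python
# Monthly storage cost comparison using the per-gigabyte prices quoted
# in the article (US cents per GB per month). Tier names are Google's;
# the rest is illustrative.

PRICES_CENTS_PER_GB = {
    "Multi-Regional": 2.6,
    "Regional": 2.0,   # new price effective November 1
    "Nearline": 1.0,
    "Coldline": 0.7,
}

def monthly_cost_dollars(gigabytes: float, cents_per_gb: float) -> float:
    """Return the monthly storage cost in dollars for a given data volume."""
    return round(gigabytes * cents_per_gb / 100, 2)

# Example: storing 1 TB (1,000 GB) in each tier
for tier, price in PRICES_CENTS_PER_GB.items():
    print(f"{tier}: ${monthly_cost_dollars(1000, price):.2f}/month")
```

At 1 TB, the spread runs from $26 per month on Multi-Regional down to $7 per month on Coldline, which is the gap the tiering is meant to exploit.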
Perhaps the most fascinating part of this storage shakeup is that Google is making it possible for customers to have Google automatically move data from one tier to another. Here’s how Google Cloud Storage product manager Kirill Tropin explains it in a blog post:
Any Google Cloud Storage bucket can now hold data in different storage classes, and the lifecycle policy feature can automatically transition objects in-place to the appropriate colder storage class based on the age of the objects.
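As a sketch of what such a lifecycle policy can look like, a bucket's lifecycle configuration is expressed as JSON. The age thresholds below are illustrative assumptions, not values from Google's post:

```json
{
  "lifecycle": {
    "rule": [
      {
        "action": {"type": "SetStorageClass", "storageClass": "NEARLINE"},
        "condition": {"age": 30}
      },
      {
        "action": {"type": "SetStorageClass", "storageClass": "COLDLINE"},
        "condition": {"age": 365}
      }
    ]
  }
}
```

Saved as `lifecycle.json`, a policy like this can be applied to a bucket with `gsutil lifecycle set lifecycle.json gs://your-bucket` (bucket name hypothetical), after which objects transition in place to colder classes as they age.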
This is the sort of thing that larger companies care about — using the most economical type of IT resources while balancing the needs of users and following regulatory requirements. This launch provides a sense of the type of cloud customer that Google is going after. And it’s understandable, because bigger public clouds like Amazon Web Services (AWS) and Microsoft Azure have also been taking on more enterprise workloads.
The move comes a week after AWS announced a major partnership with virtualization software company VMware that will enable VMware customers to use the software they’re familiar with to deploy and manage applications on AWS. But the head of the Google Cloud is now Diane Greene, cofounder and former chief executive of VMware, whose tools are widely used in enterprises.
With each month, it becomes a little clearer that with Greene at the helm of its cloud business, Google is serious about gaining share in public cloud, particularly from big companies. And for that to happen, you’ve got to have a rock-solid storage offering. Today, Google is a couple of steps closer to that.
Documentation for the new storage tiers is here.