Enterprises are looking favorably at the public cloud. It’s no wonder, given Amazon’s immense success in helping savvy web companies build their infrastructure natively on the cloud, as well as the relative ease with which highly visible enterprise developers, frustrated by slow on-premise provisioning, have onboarded and scaled their public cloud footprint.
While exhibiting at AWS re:Invent, we saw both sides of the story. On one side, Amazon has been making great efforts to actively court the enterprise to gradually onboard more usage and adopt additional AWS services.
On the other, enterprise-level CIOs and CTOs were carefully weighing the pros and cons of which environments to offload, and when. We listened to many enterprise C-level execs and identified five main challenges that prevent them from moving a substantial amount of their workloads to the cloud:
- New investments need to be made to adopt off-premise resources
With large investments already made in on-premise infrastructure or data centers, the cost of moving variable workloads to public clouds, both in terms of human expertise and management tools, is a major financial and management concern. As a result, small batches and test environments are moved to the public cloud slowly and carefully measured.
- Security and data sensitivity concerns
Storing sensitive and proprietary data in an external environment carries risks. Despite the successful use cases that Amazon provides, moving sensitive data and business-critical environments to the public cloud can require authorization and nightmare levels of bureaucratic red tape. Ironically, implementing and following AWS best practices can potentially create an environment that is more secure, easier to manage, and better backed up than one hosted on-premise.
- Visibility and tracking of cloud resources
Quickly scaling up a cloud footprint can be difficult to manage and even harder to track, especially when changes are made over time and simply forgotten. Widely available tools that assist in operations management, as well as pure costing and reliability measurement tools, can improve visibility over the cloud resources purchased and help ensure they’re used effectively, in line with the business’s goals.
- Fixed vs. variable costs
AWS best supports the execution of variable workloads, yet it can be economical for certain consistent workloads as well. It can be difficult to estimate which consistent and variable workloads are good candidates to move from on-premise or private clouds to the public cloud. The good news is that AWS offers enterprise-friendly services, such as AWS Reserved Instances, which allow consistent usage within AWS to be offloaded to longer-term contracts in return for lower pricing — up to 71% off — compared to on-demand EC2 pricing. Fine-tuning the mix of variable capacity and fixed capacity can be achieved within AWS.
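The trade-off above can be sketched with a simple cost model: a reserved instance charges an upfront fee plus a discounted hourly rate for the full term whether or not the instance runs, so it only beats on-demand pricing above a certain utilization level. A minimal sketch, using entirely hypothetical placeholder prices (not actual AWS rates):

```python
# Hypothetical illustration of the Reserved Instance trade-off.
# All prices below are made-up placeholders, not real AWS rates.

HOURS_PER_YEAR = 8760

def effective_hourly(upfront, hourly, term_years, utilization=1.0):
    """Effective cost per *used* hour of a reserved instance.

    The upfront fee and the committed hourly rate are paid for the
    whole term regardless of use, then spread over the hours the
    instance actually runs.
    """
    total_hours = HOURS_PER_YEAR * term_years
    used_hours = total_hours * utilization
    return (upfront + hourly * total_hours) / used_hours

on_demand = 0.10      # hypothetical on-demand rate, $/hour
ri_upfront = 1000.0   # hypothetical 3-year upfront fee, $
ri_hourly = 0.025     # hypothetical discounted hourly rate, $/hour
term = 3              # years

# Savings at full utilization (a steady, consistent workload).
eff = effective_hourly(ri_upfront, ri_hourly, term)
savings = 1 - eff / on_demand

# Utilization below which on-demand is actually cheaper.
total_hours = HOURS_PER_YEAR * term
break_even = (ri_upfront + ri_hourly * total_hours) / (on_demand * total_hours)

print(f"effective RI rate at full use: ${eff:.4f}/hr")
print(f"savings vs. on-demand: {savings:.0%}")
print(f"break-even utilization: {break_even:.0%}")
```

With these placeholder numbers, the reservation pays off only for workloads that run well over half the time, which is the kind of estimate that determines whether a consistent workload is a good candidate for a longer-term contract.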
- Lifecycle automation and knowledge management
Carrying knowledge gained from launching public cloud servers across the different lifecycle stages, such as PoC, test, staging, and production, requires help. It is important to capture the gains from the public cloud’s elastic nature, make sure it remains a financially efficient infrastructure while staying flexible, and ensure that it’s equipped for needs that change over time, which may require replication and automation.
Despite these challenges, a new cadre of analytics tools now helps enterprises onboard and benefit from the competitive pricing, elastic nature, and world-class infrastructure of public cloud providers like AWS at a fraction of the cost of traditional IT tools, making it that much easier for enterprises to adopt the public cloud.
Cameron Peron is VP Marketing and Chief Customer Advocate at Newvem, a web-based cloud usage analytics service that enables CIOs, CTOs, IT managers, developers, and operators to capture and improve the effectiveness of their public cloud operations and ensure their cloud infrastructure is in sync with business performance, so that it is financially efficient and contributes to the company’s profits. Follow him at @cameronperon.