As cloud adoption continues to accelerate, cloud leaders are defining ever more sophisticated app architectures that leverage both public and private clouds.


Earlier this year, a RightScale survey of more than 600 companies found that 68 percent of organizations have a cloud strategy that includes more than one cloud provider, with 53 percent choosing hybrid cloud (combining public and private clouds) and 15 percent choosing multiple public clouds.

These organizations are leveraging cloud management solutions to provide a level of abstraction from individual cloud capabilities and enable “workload liberation” — the ability to pick and choose the cloud infrastructure that is best for each application at any point of time.

Match up to the right cloud using app requirements

Combining a cloud management solution with a multi-cloud portfolio allows organizations to preserve choice while balancing technical, business, and financial priorities. Here are a few examples of how cloud users match the right cloud infrastructure to each application's requirements:

Load variability

Applications with highly variable or relatively stable loads may be better suited for particular cloud providers. Companies like Zynga are leveraging the public cloud to handle the hard-to-predict spikes when they roll out new social games, while using the private cloud for more mature games with predictable loads.
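As a rough sketch of that decision, a load's variability can be quantified with its coefficient of variation (standard deviation over mean); the heuristic and threshold below are purely illustrative, not a rule any provider prescribes:

```python
from statistics import mean, stdev

def pick_cloud(hourly_requests, cv_threshold=0.5):
    """Illustrative heuristic: a highly variable load favors an
    elastic public cloud, while a steady load favors reserved
    private capacity."""
    cv = stdev(hourly_requests) / mean(hourly_requests)
    return "public" if cv > cv_threshold else "private"

print(pick_cloud([10, 900, 50, 1200, 30, 700]))  # spiky launch-day load
print(pick_cloud([100, 110, 95, 105, 98, 102]))  # mature, steady load
```

In practice the decision would also weigh cost, data gravity, and compliance, but the spiky-versus-steady split is the core of it.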

Application requirements

Different applications may have different business requirements that influence the choice of public or private clouds. For example, CloudCare, a SaaS solution for practice management in doctor’s offices, runs web and application tiers in the cloud, while maintaining HIPAA-governed patient data in an in-house data center.


Geographic distribution

Organizations with international reach are looking to leverage clouds, both public and private, across distributed geographies to match user load and optimize performance. For example, websites that provide online access to large data sources will typically run a master database in one region and replicated slaves in other geographies.
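A minimal sketch of that routing pattern, using hypothetical endpoint names: writes always go to the single master, while reads are served from the replica nearest the user, falling back to the master for regions without one:

```python
# Hypothetical endpoints for a master/replica topology spread
# across regions; these names are illustrative, not a real deployment.
DATABASES = {
    "master": "db-master.us-east.example.com",
    "replicas": {
        "eu-west": "db-replica.eu-west.example.com",
        "ap-southeast": "db-replica.ap-southeast.example.com",
    },
}

def endpoint_for(operation, user_region):
    """Route writes to the single master; route reads to the replica
    in the user's region, falling back to the master."""
    if operation == "write":
        return DATABASES["master"]
    return DATABASES["replicas"].get(user_region, DATABASES["master"])
```

A European user's reads would hit `db-replica.eu-west.example.com`, while every write, regardless of origin, lands on the master in us-east.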

Architecting for high availability

Companies looking to eliminate single points of failure are building disaster recovery architectures that leverage multiple clouds. For example, one company that leverages the cloud for heavy-duty processing replicates its data to a second private cloud in case of an outage. Other cloud users take advantage of multiple regions, with a DR deployment that leverages a “warm” replicated slave database along with non-operational standby servers.
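That warm-standby policy reduces to a simple priority check. In the sketch below, the deployment names are invented and the boolean health flags stand in for real probes (HTTP checks, replication lag, and so on):

```python
def choose_active(deployments):
    """Minimal failover decision: return the first healthy deployment
    in priority order, mirroring a warm-standby DR policy; fail
    loudly if nothing is reachable."""
    for name, healthy in deployments:
        if healthy:
            return name
    raise RuntimeError("no healthy deployment available")

# Primary in a public cloud, warm standby in a private cloud.
print(choose_active([("primary-public-cloud", True),
                     ("standby-private-cloud", True)]))
```

Real failover systems add hysteresis and quorum to avoid flapping between clouds on a transient blip, but the priority-ordered health check is the core decision.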

Factors to consider for a multi-cloud architecture

In most of these scenarios, companies realize that they need to create and leverage a portfolio of public and private clouds. Whatever the motivation for creating a multi-cloud architecture, companies should consider several factors:

Minding the functionality gap

When considering a multi-cloud architecture, it is important to understand that each cloud provider, whether public or private, will provide different features and services. Each cloud may provide different options for storage, load balancing, network, and application services. Even when cloud providers have similar services, the APIs and the behavior of those services will vary.

When implementing a multi-cloud architecture, a cloud management solution can bridge this functionality gap and enable consistent interfaces, processes, and automation for your organization.
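The sketch below shows the shape of that abstraction, assuming hypothetical adapters rather than any vendor's actual API: application automation codes against one interface, and per-cloud adapters absorb the API differences.

```python
from abc import ABC, abstractmethod

class CloudAdapter(ABC):
    """One interface for all clouds; each provider gets an adapter."""
    @abstractmethod
    def launch_server(self, size):
        ...

class PublicCloudAdapter(CloudAdapter):
    def launch_server(self, size):
        # A real adapter would call the public provider's API here.
        return f"public:{size}"

class PrivateCloudAdapter(CloudAdapter):
    def launch_server(self, size):
        # A real adapter would call the private cloud's API here.
        return f"private:{size}"

def launch(adapter: CloudAdapter, size="small"):
    """Automation depends only on the interface, so a workload can
    move between clouds without changing this code."""
    return adapter.launch_server(size)
```

Swapping `PublicCloudAdapter` for `PrivateCloudAdapter` moves the workload without touching the calling automation, which is the "workload liberation" described earlier.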

Securely connecting clouds

When working with multiple clouds, organizations will need to make connections between those clouds and between their own data centers and any public clouds.

In many cases, these connections may traverse the public internet, requiring companies to use integration strategies that also meet their security needs. Companies can leverage VPN solutions like OpenVPN, VNS3 from CohesiveFT, or CloudOptimizer from CloudOpt to create this secure connection over the public internet.
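As a simple illustration, OpenVPN's static-key, point-to-point mode can link a server in one cloud to a peer in another; every hostname, address, and path below is a placeholder:

```
# Site A side of an illustrative point-to-point tunnel.
# Site B mirrors this config with the ifconfig addresses
# reversed (and can omit "remote" if it simply listens).
dev tun
proto udp
remote vpn.site-b.example.com 1194
ifconfig 10.8.0.1 10.8.0.2
secret /etc/openvpn/static.key
keepalive 10 60
```

Static keys are the simplest setup for a single pair of endpoints; connecting many servers across clouds typically calls for a certificate-based configuration instead.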

There are also options to create these connections between clouds using private networks. Cloud providers, such as SoftLayer and Google Compute Engine, enable you to create an architecture that spans data centers in different geographies where all traffic between your servers is on a private network.

For securely sharing data across clouds, you can leverage SSL database replication services that are part of many databases, including MySQL, PostgreSQL, and SQL Server.
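In MySQL, for instance, a replica can be pointed at a master and required to replicate over SSL; the hostnames, account name, and certificate path here are placeholders, and the replication user is assumed to already exist with a password:

```sql
-- On the master: restrict the replication account to SSL connections.
GRANT REPLICATION SLAVE ON *.* TO 'repl'@'%' REQUIRE SSL;

-- On the slave: connect to the master with SSL enabled.
CHANGE MASTER TO
  MASTER_HOST = 'db-master.us-east.example.com',
  MASTER_USER = 'repl',
  MASTER_SSL = 1,
  MASTER_SSL_CA = '/etc/mysql/certs/ca.pem';
START SLAVE;
```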

Overcoming latency

Latency, the delay that network traffic incurs when traveling between distant geographies, is an important consideration for all of your cloud-based apps. A common approach to reduce latency is to deploy your applications in clouds that are geographically close to your end users.

When serving multiple geographies, multiple independent instances of the app might be acceptable. For example, a scalable website that has separate user bases located in Asia, North America, and Europe could have separate instances that serve those local regions. In that case, you might choose different clouds for each geography.

However, in some cases, a single instance of an application or a shared database needs to serve geographically distributed users. For example, an online gaming or social site that allows users in different geographies to interact may require a single shared database.

There are several technical approaches that can be used in these situations. One technique is to leverage WAN optimization technologies that aim to optimize the amount of data being sent over the network, thereby reducing latency of any transactions. A second technique is to deploy separate instances of the application in clouds located in different geographies, while connecting the databases through replication to provide a unified experience.

Controlling costs

One last thing: it's important to consider cost implications as you design your multi-cloud solution.

The first step is to understand the cost components associated with each particular public or private cloud option. You may also need to work with your financial team to accurately model the full costs of any internal clouds you may be developing. For example, in multi-cloud scenarios, bandwidth requirements can be a critical piece of the cost. In some cases, “data ingress” to a public cloud may be free, while “data egress” may incur higher charges.

Once you understand the cost components, you can factor them in when choosing the right cloud for a particular application. For example, an application with a highly variable load, or one that will run for only a short period of time, may have lower costs when deployed in a public cloud, while an app with steadier usage may have lower costs in a private cloud.
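A back-of-the-envelope comparison makes the trade-off concrete. All rates below are invented placeholders, not any provider's actual pricing:

```python
# Assumed rates (placeholders for illustration only).
PUBLIC_RATE_PER_HOUR = 0.10      # pay-as-you-go public instance
PRIVATE_COST_PER_MONTH = 120.0   # amortized fixed private capacity
EGRESS_RATE_PER_GB = 0.12        # public-cloud data egress

def monthly_cost_public(hours_used, egress_gb):
    """Usage-based cost: compute hours plus data egress."""
    return hours_used * PUBLIC_RATE_PER_HOUR + egress_gb * EGRESS_RATE_PER_GB

def cheaper_cloud(hours_used, egress_gb):
    """Bursty or short-lived workloads tend to win on pay-as-you-go;
    steady 24x7 workloads tend to win on fixed private capacity."""
    public = monthly_cost_public(hours_used, egress_gb)
    return "public" if public < PRIVATE_COST_PER_MONTH else "private"

print(cheaper_cloud(hours_used=200, egress_gb=50))   # part-time workload
print(cheaper_cloud(hours_used=720, egress_gb=400))  # steady 24x7 workload
```

Note how egress charges can tip the balance: a modest compute bill plus heavy cross-cloud data transfer can make the public option more expensive than it first appears.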

Keep in mind that cloud providers are frequently changing their cost models and often lowering the cost of infrastructure services, so the cost profile of each cloud may change over time. A cloud management solution will give you the flexibility to choose the best cloud for your needs and to retain the ability to move clouds as needed.

Wrap up

The value of multi-cloud for a wide variety of use cases cannot be disputed. Companies of all sizes, from large enterprises to small startups, are seeing the value of taking a multiple cloud approach to manage their infrastructure.

After spending the last three years at RightScale helping companies develop and execute their cloud strategies, I’m seeing a growing adoption of a multi-cloud approach by industry leaders, and I expect that momentum will only continue as organizations take advantage of these architectures to ensure success with their cloud-based applications.

Brian Adler is a Senior Professional Services Architect at RightScale, bringing years of experience working with multi-cloud environments. He advises customers on complex application architectures and cloud implementations, working directly with them to dynamically configure cloud resources across multiple cloud infrastructures. Prior to RightScale, Brian held systems engineering, architecture, hardware, and software positions in defense industry applications, and was a software architect for Openwave.

Clouds photo via Nicholas A. Tonelli/Flickr