In 2011, Col. Timothy Hill, director of the Futures Directorate within the Army Intelligence and Security Command, urged industry to take a more open-standards approach to cloud computing. "Interoperability between clouds, as well as the portability of files from one cloud to another, has been a sticking point in general adoption of cloud computing," Hill said during a panel at the AFCEA International 2011 Joint Warfighting Conference. Hill's view has been echoed by many in the cloud computing community, who believe that the absence of interoperability has become a barrier to adoption. This posting reports on recent research exploring the role of standards in cloud computing and offers recommendations for future standardization efforts.
Avoiding Vendor Lock-In
Since the inception of the cloud, organizations have been transferring data and workloads to pooled, configurable computing resources. These resources include networks, servers, storage, applications, and services. One concern voiced by many organizations that use cloud-based services is vendor lock-in, which stems from the inability to move resources from one cloud provider to another.
Users want the freedom to move between cloud providers for many reasons. For example, a relationship with a vendor may not be working, service-level agreements may not be met, other providers may offer better prices, or a provider may go out of business. In an environment without common standards, there is little or no freedom to move between vendors.
The cloud computing community has already developed numerous standards (some argue too many) through various forums, standards bodies, and nonprofit organizations, including OpenStack, the Standards Acceleration to Jumpstart Adoption of Cloud Computing, and The Open Group Cloud Computing Work Group, to name a few. One issue explored in my research is whether we should create new standards or leverage existing ones.
Some standardization efforts focus on codifying parts of a cloud-computing solution, such as workloads, authentication, and data access. Other standards focus on unifying disparate efforts to work together on a solution. In addition, because of Amazon's large market share in this space, its interfaces have emerged as de facto standards.
The technical report describing my research, The Role of Standards in Cloud Computing Interoperability, explains that the extent to which standards can enable interoperability depends on several factors. Key factors include the type of service model that a cloud provider uses and the level of interoperability that an organization expects. Note that the cloud community typically uses the term interoperability to refer to portability, i.e., the ability to move a system from one platform to another.
Initially, my research identified four typical cloud-computing interoperability use cases that are supported by standards: user authentication, workload migration, data migration, and workload management.
We found that workload migration and data migration can benefit the most from standardization. For example, standardization of VM (virtual machine) image file formats would allow organizations to move workloads from one provider to another or from private to public clouds. Standardized APIs for cloud storage would do the same for data.
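As a sketch of what such standardization enables, the following Python snippet maps vendor-specific VM image metadata into a single common descriptor, in the spirit of a standardized image format. All field names and vendor formats here are hypothetical, invented for illustration; they are not real provider APIs.

```python
# Hypothetical transformer: vendor-specific VM image metadata -> common descriptor.
# Vendor formats and field names are illustrative only.

COMMON_FIELDS = {"name", "cpu_count", "memory_mb", "disk_format"}

def from_vendor_a(meta: dict) -> dict:
    """Vendor A reports memory in GiB and uses its own key names."""
    return {
        "name": meta["image_name"],
        "cpu_count": meta["vcpus"],
        "memory_mb": meta["ram_gib"] * 1024,
        "disk_format": meta["format"],
    }

def from_vendor_b(meta: dict) -> dict:
    """Vendor B already uses MB but nests sizing under 'resources'."""
    return {
        "name": meta["id"],
        "cpu_count": meta["resources"]["cores"],
        "memory_mb": meta["resources"]["memory_mb"],
        "disk_format": meta["disk"]["type"],
    }

a = from_vendor_a({"image_name": "web-01", "vcpus": 2,
                   "ram_gib": 4, "format": "vmdk"})
b = from_vendor_b({"id": "web-01",
                   "resources": {"cores": 2, "memory_mb": 4096},
                   "disk": {"type": "qcow2"}})

# Both descriptors now share one schema, so tooling written against the
# common format can move a workload between providers.
assert set(a) == set(b) == COMMON_FIELDS
```

Once every provider can emit and consume the common descriptor, moving a workload reduces to exporting, transforming, and re-importing, rather than rebuilding the image by hand.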
Provider Service Models
In examining the issue of standardization through the provider lens, we looked at the three main service models: infrastructure as a service (IaaS), platform as a service (PaaS), and software as a service (SaaS).
Organizations select PaaS and SaaS specifically for these value-added features, and they end up in a commitment similar to what one experiences when purchasing software. Expecting PaaS and SaaS providers to standardize these features would be equivalent to asking an enterprise resource-planning software vendor to standardize all of its features; it's not going to happen because it's not in the vendor's best interest.
One challenge among standardization organizations is determining which areas of cloud computing to standardize first. In 2005, researchers from the European Union defined three generations of service-oriented systems. The development of cloud-based systems over time is analogous to the way that service-oriented systems have evolved through this classification.
Reaching this third generation of cloud-based systems will most likely be the focus of future research. This work will require cloud consumers, cloud providers, and software vendor groups to work together to define standardized, self-descriptive, machine-readable representations of
For now, standardization efforts should focus on the basic use cases of user authentication, workload migration, data migration, and workload management. Those efforts can then be used as a starting point for the more dynamic use cases of the future.
Even if vendor lock-in is mitigated, it is important for organizations to know that any migration effort comes at a cost, whether the migration is between cloud providers or between local servers, databases, or applications. Cloud standardization efforts should therefore focus on finding common representations of user identity, workload (virtual-machine images), cloud-storage APIs, and cloud management APIs. Vendors influence many standards committees, and it is unrealistic to assume that each of these elements will have a single standard.
Agreement on a small number of standards, however, can reduce migration effort by enabling the creation of transformers, importers, exporters, or abstract APIs. Such an effort could enable the dynamic third generation of cloud-based systems.
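To illustrate the abstract-API approach, the following sketch defines a minimal cloud-storage interface with two in-memory provider adapters. The interface and provider classes are hypothetical stand-ins, not real vendor SDKs; the point is that client code (here, a `migrate` helper) is written once against the abstraction, so swapping providers requires no client changes.

```python
# Sketch of an abstract cloud-storage API mitigating vendor lock-in.
# All class and method names are hypothetical, for illustration only.
from abc import ABC, abstractmethod

class CloudStorage(ABC):
    """Abstract API that every provider adapter implements."""
    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...
    @abstractmethod
    def get(self, key: str) -> bytes: ...

class ProviderX(CloudStorage):
    """Adapter for one (imaginary) vendor's object store."""
    def __init__(self):
        self._blobs = {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

class ProviderY(CloudStorage):
    """Adapter for a different (imaginary) vendor."""
    def __init__(self):
        self._objects = {}
    def put(self, key, data):
        self._objects[key] = data
    def get(self, key):
        return self._objects[key]

def migrate(src: CloudStorage, dst: CloudStorage, keys):
    """Data migration written once against the abstract API."""
    for k in keys:
        dst.put(k, src.get(k))

x, y = ProviderX(), ProviderY()
x.put("report.csv", b"a,b\n1,2\n")
migrate(x, y, ["report.csv"])
assert y.get("report.csv") == b"a,b\n1,2\n"
```

Real-world libraries take this same adapter approach over actual provider SDKs; the abstraction layer is what turns a provider switch from a rewrite into a configuration change.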
What are your thoughts on cloud computing interoperability? Do we need new standards? Or can we live with the ones we have? Will we ever get to the third generation? Is it necessary?
The technical report describing this research, The Role of Standards in Cloud Computing Interoperability, may be downloaded at
To read all of Grace Lewis' posts on her research in cloud computing, please visit