
The Problem with Cloud-Computing Standardization

This article first appeared in Computer magazine and is brought to you by InfoQ & IEEE Computer Society.


Cloud computing has become an increasingly popular approach in recent years, with seemingly nothing but ongoing growth in its future. However, some industry observers say that this rapid growth has both caused and been threatened by the failure of comprehensive cloud-computing standards to gain traction, despite the many groups working on them.

They say the lack of standards could make cloud computing trickier to use. It could also restrict implementation by limiting interoperability among cloud platforms and causing inconsistency in areas such as security. For example, without standardization, a customer trying to switch from a private to a public cloud could not do so as seamlessly as switching browsers or e-mail systems. In addition, users would have no way of knowing the basic capabilities they could expect from any cloud service.

“Interoperability between offerings and the portability of services from one provider to another is very important to the customer to maximize the expected [return on investment] from cloud computing,” explained IBM vice president for software standards Angel Luis Diaz. Moreover, interoperability would keep users from being locked into a single cloud provider.

A lack of security standards - addressing issues such as data privacy and encryption - is also hurting wider cloud-computing adoption, said Nirlay Kundu, senior manager at Wipro Consulting Services.

With potentially sensitive information stored off-site and available only over the Internet, security is a critical concern, explained Vishy Narayan, principal architect with the System Integration Practice at Infosys Technologies, a vendor of consulting, technology, engineering, and outsourcing services.

According to Lynda Stadtmueller, program director for cloud computing at market research firm Frost & Sullivan’s Stratecast practice, an effective lack of standardization makes it difficult for buyers to compare and evaluate cloud offerings.

Of course, cloud computing is relatively young, so the lack of standardization is not altogether surprising - standardization usually comes with more mature technologies. And some experts say the market’s immaturity makes it too difficult for any one organization to mandate standards.

There may be challenges to cloud-computing standardization along the way, and overcoming them could determine just how bright cloud computing’s future will be, said Winston Bumpus, director of standards architecture at VMware, a virtualization and cloud infrastructure vendor. He is also president of the Distributed Management Task Force (DMTF), an industry-based standards consortium.

Cloud Standardization

At its most basic, cloud computing is simply the delivery of applications; security and other services; storage and other infrastructure; and platforms, such as those for software development, to users over the Internet or via a private cloud.

Cloud computing appeals to many organizations because it minimizes the amount of hardware and software that users must own, maintain, and upgrade. In essence, users pay only for the computing capability they need.

Standardization issues

Many of today’s in-progress standards are based in part on the US National Institute of Standards and Technology’s Special Publication 800-145, a document called “The NIST Definition of Cloud Computing (Draft).”

True interoperability requires translation of specific application and service functionality from one cloud to another, and this won’t happen without standardization, said Michael Crandell, CEO and founder of cloud-computing vendor RightScale. For example, there currently is no standardized way to seamlessly translate security requirements and policies across cloud offerings.

A key standardization issue involves virtualization, which plays a critical role in most cloud-computing approaches, said Willem van Biljon, cofounder and vice president of products for Nimbula, a cloud-infrastructure vendor.

Virtualization’s flexibility lets cloud providers optimize workloads among their hardware resources. It also enables users to, for example, connect to storage without having to know server names and addresses, as they would in a traditional network.

In virtualization, hypervisors manage a host server’s processing and other resources so that it can run multiple virtual machines (VMs), using different operating systems and other platforms. Each cloud platform has its own type of hypervisor, noted Crandell.
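To make the hypervisor’s role concrete, here is a minimal sketch using the libvirt Python bindings - one widely used management layer that abstracts over hypervisors such as KVM and Xen, though it is a library convention rather than one of the standards discussed in this article. The qemu:///system URI and the presence of the bindings are assumptions for illustration.

```python
# A minimal sketch, assuming the libvirt Python bindings are installed
# and a local QEMU/KVM hypervisor is running: list each VM the
# hypervisor manages, along with the per-VM resources described above.
import libvirt

conn = libvirt.open("qemu:///system")  # hypervisor connection (assumed URI)
try:
    for dom in conn.listAllDomains():
        # info() returns [state, maxMemKiB, memKiB, vCPUs, cpuTimeNs]
        state, max_mem_kib, _mem, vcpus, _cpu_time = dom.info()
        print(f"{dom.name()}: state={state}, vCPUs={vcpus}, "
              f"max mem={max_mem_kib // 1024} MiB")
finally:
    conn.close()
```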

Cloud systems utilizing different hypervisors won’t interoperate, in part because they don’t use the same data formats, noted Nimbula’s Van Biljon. Cloud platforms also won’t interoperate because their VMs don’t interact in a standard way with different network and storage architectures, APIs, network connections, databases, and other elements, Crandell explained.

VM translation is important for preserving security policy, network policy, and identity across clouds, said Van Biljon. Without standardization, moving a workload from one cloud platform to another requires creating a new VM on the second platform and then reinstalling the application, which can take considerable time and effort.
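The portability gap described above is easy to see in code. Everything in the sketch below is hypothetical - both provider clients and all method names are invented - but it illustrates why, without a standard, every workload move needs a hand-written, per-provider adapter rather than one portable call.

```python
# Hypothetical sketch of the problem: two clouds expose the same
# concept (launch a VM) through incompatible, provider-specific APIs,
# so portable tooling needs one adapter per provider. All class and
# method names here are invented for illustration.

class CloudAClient:
    def run_instance(self, image_id: str, size: str) -> str:
        return f"cloud-a-vm-{image_id}-{size}"

class CloudBClient:
    def deploy_server(self, template: str, cpu: int, ram_gb: int) -> str:
        return f"cloud-b-vm-{template}-{cpu}x{ram_gb}"

def launch_portably(provider: str, image: str) -> str:
    """One adapter branch per provider -- the code a standard would remove."""
    if provider == "cloud-a":
        return CloudAClient().run_instance(image, size="medium")
    elif provider == "cloud-b":
        # Same intent, but the size must be re-expressed as a CPU/RAM pair.
        return CloudBClient().deploy_server(image, cpu=2, ram_gb=4)
    raise ValueError(f"no adapter written for {provider!r}")

print(launch_portably("cloud-a", "ubuntu-10.04"))
print(launch_portably("cloud-b", "ubuntu-10.04"))
```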

DMTF

The DMTF’s Open Virtualization Format (OVF), which debuted last year, is a first step toward hypervisor, and thus cloud-computing, interoperability.

OVF provides a way to move virtual machines from one hosted platform to another, noted Dave Link, founder and CEO of ScienceLogic, an IT-infrastructure-management product vendor. OVF standardizes use of a container that stores metadata and virtual machines and enables the migration of VMs between clouds. It also defines certain aspects of the VM and the application that runs on it, including size, CPU and networking requirements, memory, and storage.
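OVF descriptors are XML documents. The sketch below parses a deliberately simplified, hypothetical descriptor to pull out the kind of VM metadata just described; a real OVF envelope uses namespaced schemas and carries far more detail, so treat this only as an illustration of the idea.

```python
# Minimal sketch: reading hardware requirements out of a simplified,
# hypothetical OVF-style descriptor. A real OVF envelope carries much
# more (namespaces, disk references, an optional manifest), so this
# illustrates the concept, not the full DMTF schema.
import xml.etree.ElementTree as ET

DESCRIPTOR = """
<Envelope>
  <VirtualSystem id="web-server">
    <VirtualHardwareSection>
      <Item resourceType="cpu" quantity="2"/>
      <Item resourceType="memory" quantity="4096"/>
      <Item resourceType="disk" quantity="20480"/>
    </VirtualHardwareSection>
  </VirtualSystem>
</Envelope>
"""

root = ET.fromstring(DESCRIPTOR)
for system in root.iter("VirtualSystem"):
    print(f"virtual system: {system.get('id')}")
    for item in system.iter("Item"):
        print(f"  {item.get('resourceType')}: {item.get('quantity')}")
```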

However, users must manually handle details necessary for interoperability, such as application-component interoperability.

The DMTF is still working on enabling a VM to run on multiple platforms and on defining how an application operates in the cloud, to perform functions such as load balancing and session handling.

ANSI recognizes OVF as a standard, and it is also under consideration by the ISO, noted the DMTF’s Bumpus.

The DMTF’s Open Cloud Standards Incubator subgroup is working to improve cloud interoperability via approaches such as open cloud-resource-management standards, cloud-service-portability specifications, and security mechanisms.

IEEE

IEEE Working Groups P2301 and P2302 are developing comprehensive standards that will address migration, management, and interoperability among cloud-computing platforms.

P2301, Draft Guide for Cloud Portability and Interoperability Profiles, will serve as a metastandard, providing profiles of existing and in-progress cloud-computing standards from multiple organizations in critical areas such as cloud-based applications, portability, management, interoperability interfaces, file formats, and operation conventions, said Steve Diamond, chair of the IEEE Cloud Computing Initiative and managing director of the Picosoft technology-business consultancy.

The purpose is to avoid having multiple standards address the same issues while having no standards address others, explained David Bernstein, chair of the P2301 and P2302 Working Groups and managing director of the Cloud Strategy Partners consultancy.

P2302, Draft Standard for Intercloud Interoperability and Federation, defines the topology, protocols, functionality, and governance required for cloud-to-cloud interoperability and data exchange.

Diamond said the two working groups haven’t yet established their roadmaps and probably won’t finish all of their work until 2012.

Open Grid Forum

The OGF - a community of grid-computing users, developers, and vendors - is developing the Open Cloud Computing Interface (OCCI). OCCI specifies multiple protocols and APIs for various cloud-computing management tasks, including deployment, automatic scaling, and network monitoring.

The APIs will incorporate aspects of other cloud-computing-related APIs, such as those used in GoGrid and Amazon’s Elastic Compute Cloud. These APIs will enable interfacing and interaction among different infrastructure-as-a-service (IaaS) platforms.
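As a rough illustration, OCCI’s HTTP text rendering describes a resource through Category and attribute headers. The sketch below assembles - but does not send - such a request for a new compute resource; the endpoint URL is hypothetical, and the header layout reflects our reading of the OCCI rendering rather than a guaranteed-final specification.

```python
# Sketch of an OCCI-style "create compute resource" request using the
# HTTP text rendering (Category plus X-OCCI-Attribute headers). The
# endpoint is hypothetical and the request is built but never sent,
# so the sketch runs without a live OCCI server.
import urllib.request

headers = {
    "Category": ('compute; '
                 'scheme="http://schemas.ogf.org/occi/infrastructure#"; '
                 'class="kind"'),
    "X-OCCI-Attribute": "occi.compute.cores=2, occi.compute.memory=4.0",
    "Content-Type": "text/occi",
}
req = urllib.request.Request(
    "http://cloud.example.com/compute/",  # hypothetical endpoint
    method="POST",
    headers=headers,
)

print(req.get_method(), req.full_url)
for name, value in headers.items():
    print(f"{name}: {value}")
# urllib.request.urlopen(req) would submit it to a real OCCI endpoint.
```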

Organization for the Advancement of Structured Information Standards

OASIS has two technical committees working on cloud-centric issues.

The IDCloud Technical Committee is trying to resolve security issues regarding identity management in cloud computing, to make sure people using cloud resources are who they say they are. The group is also developing guidelines for vulnerability mitigation, considered important because of cloud computing’s open, distributed architecture.

The Symptoms Automation Framework Technical Committee is working on ways to make sure cloud-computing providers understand consumer requirements - such as capacity and quality of service (QoS) - when designing and providing services.

Storage Networking Industry Association

The SNIA’s Cloud Data Management Interface (CDMI) standardizes cloud storage in three key areas.

CDMI’s client-to-cloud-storage standard addresses the way a user interfaces with cloud-based storage resources. The cloud-data-management standard deals with issues such as QoS and encryption. The cloud-to-cloud-interactions standard focuses on the way stored data can be moved among clouds.
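CDMI’s client-to-cloud interface is RESTful: a client creates a data object by PUTting CDMI JSON with CDMI-specific content types and a version header. The sketch below assembles such a request; the storage URL is hypothetical, and the payload is far more minimal than the full specification allows.

```python
# Sketch of a CDMI client-to-cloud call: creating a data object with a
# PUT of CDMI JSON. The endpoint is hypothetical, and the request is
# assembled but not sent, so the sketch runs without a CDMI server.
import json
import urllib.request

payload = json.dumps({
    "mimetype": "text/plain",
    "value": "hello, cloud storage",
}).encode("utf-8")

req = urllib.request.Request(
    "http://storage.example.com/container/hello.txt",  # hypothetical URL
    data=payload,
    method="PUT",
    headers={
        "Content-Type": "application/cdmi-object",
        "Accept": "application/cdmi-object",
        "X-CDMI-Specification-Version": "1.0",
    },
)

print(req.get_method(), req.full_url)
print(req.data.decode("utf-8"))
# urllib.request.urlopen(req) would issue the call against a real server.
```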

CDMI is now an SNIA architecture standard. SNIA also plans to submit it for ANSI and ISO standardization.

Standardization Challenges

The sheer number of standardization efforts, led by both vendors and standards bodies, is muddying the waters, making it difficult for vendors and users to determine what is and isn’t going to emerge. Moreover, there are multiple standardization efforts in some cloud-computing areas and none in others.

According to IBM’s Diaz, even though the standardization efforts are important, they aren’t doing enough. He said they need more customer input to make sure users’ needs are met. A group of companies and other organizations recently formed the Cloud Standards Customer Council to supply such input.

An important issue is getting all vendors on the same page in terms of what a standard will cover, noted Dave Linthicum, founder and chief technology officer of cloud-computing consultancy Blue Mountain. Most vendors, he explained, have their own agendas, making the standards-adoption process difficult, frustrating, and so time-consuming that some organizations give up.

Another challenge is the market’s youth, said James Staten, vice president and principal analyst at Forrester Research. Because cloud computing is not yet established, the technological landscape could change substantially, making a new standard obsolete.

The technology and marketplace will need to mature and stabilize a bit before standardization requirements become apparent, added Steve Crawford, vice president of marketing and business development at cloud-services provider Jamcracker.

Many industry watchers say that cloud-computing standards development will occur but that timing is a key factor.

The standardization process will crystallize during the next five years with the emergence of a few core specifications forming a baseline that all cloud vendors will need to support, predicted IBM’s Diaz. Soon after, he added, domain- and industry-specific standards will appear. Stratecast’s Stadtmueller estimated that standards addressing security and interoperability will emerge within five years, once providers, governments, and compliance organizations agree on what constitutes a secure cloud environment.

On the other hand, said James Thomason, chief architect at IaaS software vendor Gale Technologies, cloud-computing standards will take “an excruciatingly long time to emerge.” He predicted that vendors will try to standardize their own implementations at first to gain a competitive advantage, initially creating ineffective specifications. Thus, he said, standardization will occur only when the market can no longer tolerate a lack of interoperability.

Jamcracker’s Crawford agreed, stating that standardization will be driven by consensus among large enterprise customers who insist on interoperability or by vendors recognizing that standards development is needed to drive further adoption.

There will be hiccups along the way, he said, but they will accentuate the need for standards, particularly to address interoperability.

About the Author

Sixto Ortiz Jr. is a freelance technology writer based in Amarillo, Texas. Contact him at sortiz1965@gmail.com.

Computer, the flagship publication of the IEEE Computer Society, publishes highly acclaimed peer-reviewed articles written for and by professionals representing the full spectrum of computing technology, from hardware to software and from current research to new applications. Providing more technical substance than trade magazines and more practical ideas than research journals, Computer delivers useful information that is applicable to everyday work environments.
