Architectural Requirements Of The Hybrid Cloud
Cloud computing continues to gain momentum as a description of service offerings based on a virtualized data center infrastructure and provided over the Internet on an as-needed basis. Public clouds, such as Amazon EC2, first brought attention to this model, followed by private clouds built within an organization, as exemplified by IBM's Blue Cloud initiative. Both public and private clouds have been found to have advantages for the enterprise, but most analysts now agree that the real power of the cloud concept lies in a marriage between the two -- the hybrid cloud.
A hybrid cloud provides services using a mixture of private and public clouds that have been integrated to optimize service delivery. The promise of the hybrid cloud is to combine the local data control of the private cloud with the economies, scalability, and on-demand access of the public cloud. The hybrid cloud remains somewhat loosely defined because it occupies the middle ground between two ends of a continuum: services provided strictly over the Internet and services provided through the data center or on the desktop. Today, almost every enterprise could be said to have an IT infrastructure containing elements of both extremes. Meshing them into a common structure is what becomes interesting and opens a range of new possibilities for handling local and cloud-based data, but it also introduces a range of complexities in data transfer and integration.
In its most mature form, the hybrid cloud is a private cloud linked to one or more external cloud services, centrally managed, provisioned as a single unit, and circumscribed by a secure network (see Figure 1). Each cloud will have a similar infrastructure and will be based on standards permitting interoperability, making it possible to optimize processing and data location according to such issues as load-balancing requirements, regulatory and security concerns, efficiency of operation, and data-transfer necessities. Each cloud will be used for different purposes, depending on available services and costs, and movement between clouds will be simple and relatively painless.
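The placement logic described above -- weighing load balancing, regulatory and security concerns, efficiency, and data-transfer needs -- can be sketched as a simple scoring function. This is an illustrative sketch only; the criteria names, weights, and scores below are assumptions, not part of any standard or product.

```python
# Illustrative sketch: choosing a cloud for a workload by scoring
# placement criteria. All criteria names, weights, and scores are
# hypothetical assumptions for demonstration.

CRITERIA_WEIGHTS = {
    "cost": 0.30,
    "data_locality": 0.30,  # favors the cloud already holding the data
    "compliance": 0.25,     # regulatory and security constraints
    "capacity": 0.15,       # available headroom for load balancing
}

def place_workload(scores_by_cloud):
    """Pick the cloud with the highest weighted score.

    scores_by_cloud maps a cloud name to per-criterion scores in [0, 1].
    """
    def weighted(scores):
        return sum(CRITERIA_WEIGHTS[c] * scores.get(c, 0.0)
                   for c in CRITERIA_WEIGHTS)
    return max(scores_by_cloud, key=lambda cloud: weighted(scores_by_cloud[cloud]))

# Example: a workload bound by data-residency rules lands in the private cloud.
choice = place_workload({
    "private": {"cost": 0.4, "data_locality": 0.9, "compliance": 1.0, "capacity": 0.5},
    "public":  {"cost": 0.9, "data_locality": 0.2, "compliance": 0.3, "capacity": 0.9},
})
```

In practice such policies are encoded in orchestration tooling rather than ad hoc scripts, but the shape of the decision -- weighted criteria evaluated per workload, per cloud -- is the same.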
Figure 1 -- The hybrid cloud: integrating multiple clouds in a secure network.
This vision of the hybrid cloud is, at present, a projection. Currently, interoperability is somewhat limited at various points, including at the virtualization hypervisor level; data transfer also remains problematic, as does integration between applications in separate clouds. But these problems are being worked on, and market demand will help ensure integration.
IMPLEMENTING CLOUD-BASED SERVICES
In fitting public cloud services into existing IT infrastructure, several important issues need to be considered:
- Level of integration required with existing infrastructure, including any need to share data with applications, to access data from local storage, or to store data locally
- Available infrastructure services in the cloud, such as data protection, cloud-based storage, and complementary services
- Requirements for data transfer to and from the cloud and resultant networking needs
- Compatibility of cloud services with existing programs and data storage, including any requirement for data migration or conversion
- Licensing issues that might affect the use of cloud-based services or limit ability to integrate these services with existing infrastructure
- Compliance issues in ensuring that the provided services meet needs for transparency, auditability, and security that are relevant to your firm's operations
- Vendor operations, including business practices, stability, service-oriented architecture (SOA), security, as well as backup and disaster recovery
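The checklist above can be treated as structured review data so that open items are flagged before a cloud service is approved. A minimal sketch follows; the field names mirror the bullets in the text, and everything else (the review record, the failing item) is a hypothetical example.

```python
# Illustrative sketch: recording the adoption checklist as data and
# flagging open items before a cloud service is approved. Field names
# mirror the considerations listed in the text; the sample review
# record is hypothetical.

REVIEW_ITEMS = [
    "integration_with_existing_infrastructure",
    "available_infrastructure_services",
    "data_transfer_and_networking",
    "compatibility_and_data_migration",
    "licensing",
    "compliance_and_auditability",
    "vendor_operations_and_dr",
]

def open_items(review):
    """Return checklist items not yet marked satisfied."""
    return [item for item in REVIEW_ITEMS if not review.get(item, False)]

review = {item: True for item in REVIEW_ITEMS}
review["licensing"] = False  # e.g., per-core licenses may not transfer to the cloud
pending = open_items(review)
```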
Requirements will vary depending on the services needed. Where the object is simply to source virtual machines for testing purposes, for example, there are likely to be relatively few concerns, and the emphasis can be on simplicity. However, when the object is to provision thousands of workstations and the basic desktop services the firm will run on them, it is clearly important to make sure that service guarantees are in place and that the solution is capable of meeting requirements.
Convergence in a number of areas within the data center environment has created a new vision of infrastructure, bringing together several technologies based on virtualization to create a seamless, secure, and scalable platform for service delivery that meets the needs of complex IT environments. This "smart infrastructure" is the basis of both public and private clouds. This architecture is efficient and agile -- but must also meet a range of new requirements. A smart infrastructure is capable of adapting and adjusting to a wide range of new conditions, backed by centralized management and automation.
Virtualization improves efficiency and provides a high degree of flexibility. Centralized network and service management across physical and virtual environments provides the capability to balance workloads and ensure that resources are available to handle jobs. Automation of management tasks and provisioning provides an efficient infrastructure that does not require constant supervision. Centralized management and control of security also become an imperative because of the need to oversee a more complex environment with potentially greater exposure to risk.
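The automation described above typically takes the form of threshold-driven control loops: watch utilization, and provision or release capacity without operator intervention. The following is a minimal sketch of that idea; the thresholds and instance limits are assumptions, and a real system would invoke a cloud or orchestration API rather than return a number.

```python
# Illustrative sketch: a threshold-driven provisioning decision of the
# kind automated management implies. Thresholds and limits are
# hypothetical; a real controller would call a provisioning API.

def rebalance(utilization, instances, high=0.80, low=0.30,
              min_instances=1, max_instances=10):
    """Decide how many instances to run given average utilization (0-1)."""
    if utilization > high and instances < max_instances:
        return instances + 1   # scale out under load
    if utilization < low and instances > min_instances:
        return instances - 1   # scale in when idle
    return instances           # steady state: no change
```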
With cloud computing, virtualization is extended using large numbers of processors within a controlled environment that eases deployment and provides for efficient centralized management. Integration of multiple clouds is the next evolutionary step, as services and facilities available from public clouds are linked to internal data center clouds. The hybrid cloud is the final step in that integration.
TOOLS FOR THE HYBRID CLOUD
For adequate operation of a hybrid cloud, smart infrastructure is essential. Without the agility and centralized management provided by a highly virtualized and centrally orchestrated environment, the complexity of managing a range of integrated and flexible environments will quickly get out of hand. Orchestrating the multiple environments of the hybrid cloud is now supported both by several tools enabling central management and by emerging standards. A key emerging technology is the open source Eucalyptus project, which enables the creation of a private cloud that is, from the outset, interoperable with Amazon's EC2 cloud.
Eucalyptus, which stands for Elastic Utility Computing Architecture for Linking Your Programs to Useful Systems, is not only compatible with Amazon Web Services (AWS), but it is designed to bridge between public and private clouds to enable hybrid cloud infrastructures. Eucalyptus is beginning to have a significant impact on the definition of the hybrid cloud, as major vendors pick it up and incorporate it into their offerings. It is now provided as a component for several popular Linux distributions.
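One practical consequence of this compatibility is that client code written against the EC2 Query API can target either cloud simply by swapping the endpoint URL. The sketch below shows the idea in isolation, without any HTTP client or request signing; the private endpoint URL is hypothetical, and the API version string is merely one of the EC2 API versions of this era.

```python
# Illustrative sketch: the same EC2-compatible request can be aimed at
# the public cloud or a private Eucalyptus cloud by swapping endpoints.
# The private endpoint URL is hypothetical; real requests also require
# authentication/signing, which is omitted here.

ENDPOINTS = {
    "public":  "https://ec2.us-east-1.amazonaws.com",
    "private": "https://cloud.example.internal:8773/services/Eucalyptus",  # hypothetical
}

def describe_instances_url(cloud, version="2009-11-30"):
    """Build an EC2 Query API URL for a DescribeInstances call."""
    return f"{ENDPOINTS[cloud]}/?Action=DescribeInstances&Version={version}"
```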
Another open source tool that is likely to be significant, but is not as far developed, is Deltacloud, sponsored by Linux vendor Red Hat. Deltacloud is designed to be more open than Eucalyptus, and it is not tied to the Amazon interface.
Other tools have been developed to aid in solving the problems of bridging public and private clouds and meeting such needs as data transfer, management, and security. One significant product is VPN-Cubed, which is used across clouds, as a cloud connector, or within a cloud, offering a VPN "overlay network" to isolate multiple cloud networks. Other tools include Zimory, which enables rapid data transfer between public and private clouds; Elastra's enterprise cloud management software; and Egnyte's Local Cloud, which blends a hosted online solution with one that is on premises.
Major vendors, such as IBM, are also looking at integration issues, mainly by incorporating public cloud access into private cloud solutions. IBM has a new initiative with Juniper Networks to demonstrate how hybrid public/private clouds can interoperate seamlessly and securely, based on its own cloud architecture, orchestration, and management solutions.
The level of integration between the hybrid cloud and public clouds can range from basic system-to-system networking up to integration at the data level. Data integration is a complex issue, particularly where it involves bringing together software as a service (SaaS) applications from the cloud with internally supported applications. Depending on the applications and level of integration required, extensive changes in data architecture may be required. Data location and the need to transfer data depend on a range of issues that are specific to the organization, including regulations regarding data location, security concerns favoring local storage, network bandwidth availability, and application type. Some applications require extensive data from back-office operations; others may work with information that is largely stored in external clouds.
If most data is already on premises, then data integration is likely to work best if it occurs on premises. If the bulk of data is external, then a hosted or on-demand approach to integration may work better. Numerous options for cloud-based storage and data warehousing are available as a service. There are also consultancies and applications that focus specifically on integration -- some for internal use, some offered on an SaaS basis, and some available as part of a complete consultation and integration package.
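The placement rule stated above -- run integration where most of the data already lives -- reduces to a simple heuristic. A minimal sketch, with hypothetical byte counts as input:

```python
# Illustrative sketch of the heuristic in the text: run data integration
# where the bulk of the data already resides. Inputs are hypothetical.

def integration_location(on_premises_bytes, external_bytes):
    """Suggest where integration should run, per the text's heuristic."""
    if on_premises_bytes >= external_bytes:
        return "on_premises"
    return "hosted"

where = integration_location(10_000_000, 2_000_000)
```

In reality this decision also weighs bandwidth, transfer cost, and regulatory constraints, as the preceding discussion notes; data volume is only the starting point.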
Standards are an important part of the integration picture, but they are only beginning to emerge. The Open Virtualization Format (OVF) published by the Distributed Management Task Force (DMTF), for example, defines a portable packaging format that allows virtual machines to be imported and exported between hypervisors. Thanks to widespread use and the efforts of the Eucalyptus project to create an open source implementation, the AWS APIs are also becoming de facto standards.
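To make OVF concrete, the sketch below builds the bare skeleton of an OVF envelope with the standard library. This is not a complete or valid descriptor -- a real one also carries References, DiskSection, NetworkSection, and full virtual hardware sections -- but it shows the XML shape the standard defines.

```python
# Illustrative sketch: the bare skeleton of an OVF 1.0 envelope, built
# with the standard library. A real descriptor also needs References,
# DiskSection, NetworkSection, and hardware sections; this is only a
# minimal demonstration of the format's shape.

import xml.etree.ElementTree as ET

OVF_NS = "http://schemas.dmtf.org/ovf/envelope/1"

def minimal_ovf(vm_name):
    """Return a minimal (incomplete) OVF envelope for one virtual system."""
    ET.register_namespace("ovf", OVF_NS)
    envelope = ET.Element(f"{{{OVF_NS}}}Envelope")
    vsystem = ET.SubElement(envelope, f"{{{OVF_NS}}}VirtualSystem",
                            {f"{{{OVF_NS}}}id": vm_name})
    info = ET.SubElement(vsystem, f"{{{OVF_NS}}}Info")
    info.text = "A virtual machine"
    return ET.tostring(envelope, encoding="unicode")

xml_text = minimal_ovf("demo-vm")
```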
Integration at this level is likely to remain problematic long after the system-to-system issues are resolved. Each case is unique. Although standards continue to evolve, and concerns are slowly eroding, data integration will always require special attention.
The hybrid cloud is often viewed as the ultimate cloud outcome for the enterprise. While this appears to be a matter of simple inevitability, that view masks a wide range of significant issues that need to be considered and resolved. A hybrid cloud provides numerous advantages in agility, efficiency, and cost. But integration of all components is at an early stage, and there are likely to be numerous hybrid cloud strategies going forward; there is no single answer. With some form of hybrid cloud likely to be inevitable, however, there is an added impetus to implement a virtualization-based smart infrastructure in the data center. The advantages of a mixed public/private cloud environment can be realized only to the extent that some level of interoperability can be achieved between internal and external systems. Integration on both the hardware and software levels is now moving solidly into the clouds.
ABOUT THE AUTHOR
Brian J. Dooley is a Consultant with Cutter Consortium, and is an analyst and journalist with more than 20 years' experience in analyzing and writing about IT trends. He has written six books, numerous user manuals, hundreds of reports, and more than 2,000 magazine features. Mr. Dooley is the founder and past President of the New Zealand chapter of the Society for Technical Communication. He initiated and is on the board of the Graduate Certificate in Technical Communication program at Christchurch Institute of Technology, and he is on the editorial advisory board for Faulkner Technical Reports. Mr. Dooley currently resides in New Zealand, where he maintains a Web site at http://bjdooley.com.
This article was reprinted with permission from the Cutter Consortium.