Of all the hype cycles I have witnessed since Y2K -- including ERP, CRM, Supply Chain, BI, ECM, XML, Web services, SOA, MDM and others that fell flat on their faces -- none has arisen with the rapidity and immediate reality of cloud computing.
As an evolutionary shift in infrastructure hosting and management, cloud computing supports contemporary trends that include software as a service (SaaS), application hosting, outsourcing and virtualization, which we'll get to. Because things are progressing rather rapidly, forgive me for laying a little quick groundwork; for as many people as I meet who are versed on the topic, I find many more in our industry who are unaware.
Computing clouds are publicly available and virtually limitless grids of managed servers and/or storage that can be called on as needed, scaled up and down as needed and paid for as used. Yes, we've heard about this before. IBM launched its first On Demand utility computing center back in 2003 as a signature project of the Palmisano regime, and the topic has been written about ever since. Some of us also remember the Web server farms of Exodus Communications or Global Crossing and how they were consumed in the dotcom bubble.
What has changed over time is largely about shifting business priorities, but also about the huge performance improvements and lower cost of CPUs, the endless horizontal scalability of shared grid computing and a newly competitive service market that is just beginning to commoditize infrastructure management.
What really upped the stakes were the huge clouds built by consumer-facing companies (including Google, Yahoo, eBay and Amazon) to run their own infrastructure. Once these guys got good at network computing (we're talking really good), they came to realize they might resell their grids to you with a simple proposition: Why spend months and millions of dollars building and owning a data center when you can have the equivalent of a Fortune 500 analytic data store up and running in a matter of hours at a cost of thousands per month?
It's compelling stuff, even if we are at the very front end of a nascent trend. I recently interviewed Vertica Systems CEO Ralph Breslauer for an upcoming issue of DM Review and will speak to one of his customers soon. Vertica sells analytic columnar databases to mainstream companies, has its share of blue-chip clients and delivers its product in three ways. You can install it on your own hardware; you can buy an appliance that runs the software; or you can pay as you go and scale up or down on a monthly basis on Amazon's EC2 (Elastic Compute Cloud) and S3 (Simple Storage Service).
Jeff Barr, Senior Web Services Evangelist at Amazon, said in a seminar that Amazon took its cue from outside developers who reported that 70 percent of the effort involved in putting an application online had nothing to do with coding. "It was all this low-level muck," he said, referring to the data center, bandwidth, power, cooling, operations, monitoring and staffing that comes with in-house hardware.
This is not about selling books or music. Amazon has e-commerce services for business, but more to the point, it knows network computing like most IT departments never will. Amazon's managed Web services are pretty much turnkey and come complete with service-level agreements, queueing, load balancing and failover across thousands or hundreds of thousands of common boxes. Need another 10 terabytes for your data warehouse? Don't panic: just add a node on the cloud and you're back in business within minutes.
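The elasticity in that last example reduces to simple arithmetic: capacity grows in node-sized increments rather than by forklift upgrade. A toy sketch, with an invented per-node storage figure:

```python
import math

def nodes_to_add(extra_tb, tb_per_node=2):
    """How many grid nodes cover an extra storage need.
    The 2 TB-per-node figure is invented for illustration."""
    return math.ceil(extra_tb / tb_per_node)

# The 10-terabyte scenario from the paragraph above.
print(nodes_to_add(10))  # 5
```

The point is less the math than the unit of change: a node is something a cloud provider can provision in minutes, where a rack of owned hardware takes a procurement cycle.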
Being first to market with clients on the cloud (as Breslauer claims Vertica is) offers advantages from the ISV's point of view. "We can get a proof of concept going in a half hour rather than having to wait for IT to procure hardware and set things up," Breslauer says. Some Vertica clients already see the cloud as a lower-risk proposition, since they're skipping the upfront capital expenditure and paying month to month. For companies wanting to run a three-month marketing project, or for those who deal with seasonal spikes (taxes, audits, school systems, Christmas), there is no long-term agreement or leftover hardware to repurpose.
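The capex-versus-subscription trade-off behind that argument can be sketched with a few lines of arithmetic. All figures below are hypothetical, chosen only to show the shape of the comparison for a short-lived project:

```python
# Illustrative cost comparison: buying hardware upfront vs. renting
# equivalent capacity month to month. All dollar figures are invented.

def owned_cost(capex, monthly_ops, months):
    """Total cost of owning: upfront purchase plus ongoing operations."""
    return capex + monthly_ops * months

def cloud_cost(monthly_fee, months):
    """Total cost of pay-as-you-go: fee times months used, nothing upfront."""
    return monthly_fee * months

# A three-month marketing project, as in the seasonal-spike example.
months = 3
owned = owned_cost(capex=250_000, monthly_ops=10_000, months=months)
cloud = cloud_cost(monthly_fee=15_000, months=months)

print(f"Owned: ${owned:,}")  # Owned: $280,000
print(f"Cloud: ${cloud:,}")  # Cloud: $45,000
```

The crossover point obviously depends on real prices and project length; the sketch only shows why a short engagement with no leftover hardware favors the utility model.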
What Goes Up
We mentioned in our last column all the excitement at the Forrester IT Summit over the growth in outsourcing of application infrastructure and a new mix of on-premise and hosted corporate applications. While some see the cloud as anything that is outsourced to a Web-based infrastructure, there are also those who insist that, by its nature, the cloud must be a multi-tenant grid in order to be a true utility to users. I also found that some software/service providers consider themselves part of the cloud, though we're tending to see the cloud as simply the low-level infrastructure that enables their services, or the services that an end-user company buys on its own.
After the conference, I followed up with Forrester analyst Kyle McNabb, who kindly laid out what Forrester Research sees -- for now -- as the three alternatives to on-premise software available to enterprises (the condensed bullets that follow are from Forrester).
* SaaS: In its most rigid definition, SaaS is built from the ground up to be multi-tenant at all layers of the stack: database, server, and application. All users run the same code, with customizations and configurations stored as metadata parameters. SaaS is sold on a subscription or term license basis that includes upgrades, maintenance, and usually some level of support.
* ASP (Application Service Provider): This term typically refers to applications built in client-server, single-tenant architecture that are being exposed to customers as an on-demand service, often by a third party rather than the provider that created the software. Like SaaS, firms typically use this software through a Web client on a subscription basis. Upgrades follow a similar schedule to traditional licensed applications.
* Application outsourcing: Application outsourcing is when firms buy licensed software and pay someone else to manage it. Some application providers offer this service directly, but firms can also hire service providers like Accenture or Infosys for application outsourcing. In this model, the customer has control over customizing the source code and deciding when to upgrade to a newer version. Depending on the application, firms can access the application through a Web browser or a desktop client.
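The key phrase in Forrester's SaaS bullet is "customizations and configurations stored as metadata parameters": every tenant runs the same code, and only data differs. A minimal sketch of that idea, with tenant names and settings invented for illustration:

```python
# Multi-tenant SaaS configuration sketch: one shared code path,
# per-tenant customization held as metadata only. Tenants and
# settings here are hypothetical.

SHARED_DEFAULTS = {"currency": "USD", "rows_per_page": 25}

# Customizations live as stored parameters, never as forked code.
TENANT_METADATA = {
    "acme":   {"currency": "EUR"},
    "globex": {"rows_per_page": 100},
}

def settings_for(tenant):
    """Resolve a tenant's effective settings: shared defaults
    overridden by that tenant's stored metadata."""
    merged = dict(SHARED_DEFAULTS)
    merged.update(TENANT_METADATA.get(tenant, {}))
    return merged

print(settings_for("acme"))    # {'currency': 'EUR', 'rows_per_page': 25}
print(settings_for("globex"))  # {'currency': 'USD', 'rows_per_page': 100}
```

This is what lets a SaaS vendor upgrade every customer at once: there is exactly one codebase to patch, and the ASP model's per-customer instances are precisely what it avoids.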
McNabb told me that most of his clients don't make much of a distinction between SaaS and ASP, though technology vendors certainly do. Those with more SaaS-oriented models claim they're able to offer more cost-effective solutions. However, in the world of collaboration and content management, I think it's premature to figure out which option offers more cost savings to an enterprise.
McNabb says concerns over security, intellectual property protection, and performance have many clients questioning whether they can trust a cloud-based SaaS model now. For the time being, the relative control and near-term cost effectiveness make ASP options look attractive to bigger enterprises. As Cisco Systems CEO John Chambers noted in his Forrester keynote, "There is no such thing as a secure data center, just degrees of security." The question businesses need to resolve is whether the client or provider is best suited to manage such issues.
Finally, the idea of server virtualization, which shares the resources of a computer over multiple environments, throws another curve into the mix. I asked VMware's CTO Steve Herrod what he thought virtualization meant to the cloud.
"We see virtualization as having a really good chance at bridging the gap between software running on premise and off premise," Herrod said. "You might have two sites with a virtual server on each side that coordinates activities. You could see that in a security umbrella placed over a certain class of workloads as companies start to leverage hosted resources. From the provider end, if you can offer high-end services [through virtualization] such as better security and back-up or disaster recovery, the hosting party can provide a lot more value than basic computing resources."
Breslauer has a similar take on the mix-and-match approach. "Within enterprises you're going to have this virtualized environment, and the cloud is really no more than one big public environment with tons and tons of servers," he says. "My view is that big enterprises are going to have their own internal cloud and use external clouds for different things, and the boundaries will begin to blur." The next deployment option planned for Vertica is an internal virtualized environment, due by the end of the summer.
In summary, the deployment of computing clouds means that something is likely to change forever, though the newest and best standards of infrastructure have always receded inexorably to the status of a dependable dial tone. It's what executives always demand, in their cars or their computing infrastructure: an investment and a responsive throttle.
There are many questions still to be answered, and companies will not rush to cloud computing with their eyes shut. Some will never accept the idea of offloading critical data infrastructure unless and until the practice becomes ubiquitous. Massive parallel processing on an endless stream of common boxes does seem destined to become another new standard, not a revolution, but just as powerful and durable as electronic data interchange became 25 years ago. In the utility construct, it's the same thing as knowing that your bread fits into your toaster and that your toaster plugs into any socket on the wall. As it goes with all standards, you'll know the cloud is working once you've forgotten it's there. In the meantime, it might be wise to remember that the cloud appears to be arriving fairly soon.