April 28, 2011 – While virtualization is seeing widespread adoption, enterprises have yet to buy into added downtime protection across all applications, according to a new report from consultancy Aberdeen.

The report, “Reduce the Cost of Downtime Protection: Tier Your Virtualized Applications,” surveyed 135 organizations of varying sizes across industry verticals on reducing unscheduled downtime for virtualized applications. Although added virtualization protection can be expensive, it can limit unscheduled downtime to minutes per year and spare the headaches of lost application data, according to the survey. Respondents reported four main approaches to protection, or the absence thereof: high availability hypervisor support, clustering or high availability software, fault-tolerant servers, or no extra protection at all.

In tiering applications by importance and cost, 73 percent of enterprises have chosen some form of protection for virtualized mission-critical applications, and 65 percent have opted for increased downtime protection for databases and email. At the bottom of the protection spectrum, enterprises reported no extra virtualization protection for 60 percent of application testing and development, and reduced backup commitments for department-specific applications (42 percent) and Web applications (41 percent).

This comes at a time when, overall, 49 percent of servers have been virtualized, a figure expected to rise to 66 percent at the completion of current projects, according to Aberdeen.

Report author Richard Csaplar, senior research analyst for virtualization and storage, says virtualization’s growth, innovation and falling costs should lead enterprises to investigate and deploy it across a wider range of applications. However, he says about 30 percent of enterprises have not explored tiering across all applications, while another 20 percent are getting more out of their virtualization efforts through that type of review and appropriate adoption.

“This really changes the way you need to think about your infrastructure. You need to think about it being more dynamic, with more options for moving applications around, for locating them at the right place at the right time. All of these things are becoming the norm, rather than the exception,” says Csaplar.

To find the right amount of virtualized protection, the survey recommends an inventory of business applications, with emphasis on the impact of each one’s unavailability, and an investigation of the high availability features available from vendors and within current deployments. With an enterprise-appropriate tiering plan for applications in place, Csaplar says it’s also then easier to tier data for storage, so that work becomes “twice blessed.”
