Virtualization is the decoupling of a physical computer from the operating system that tells it how to operate, the applications that run on it and the data stored on it. The mass conversion of physical servers, data storage devices and desktop computers into virtual machines is prompting a sea change in how data centers on Wall Street are configured, and in the speed with which new capacity can be brought to bear or new employees put to work.
But "The challenge financial services firms have is that they want to take zero risk" with key applications for buying and selling stocks or analyzing risks, said Gary Chen, research manager for enterprise virtualization software at IDC, a Framingham, Mass. technology research firm. "They want that mission-critical application to have whatever resources it needs, to operate effectively, with no constraints."
This means that mission-critical operations, so far, have not driven the adoption of virtualization. Instead, the biggest driver to date has been cost savings--chiefly, the ability to save space, power, capital and operating expense on maintaining servers in a data center. The operations of scores of physical servers, each loaded with a single operating system and a single key application, are consolidated onto far fewer servers, each able to run multiple images of operating systems and their designated applications.
In effect, virtualization can mean replacing 100 servers in a data center with as few as 10. Natixis Capital Markets, the New York-based investment banking unit of Paris-based Natixis, for instance, managed to put eight or more virtual servers on each of its IBM blade servers, using Virtual Infrastructure software from VMware, the best-known name in virtualization of servers based on Intel-type microprocessors.
Where it once had 240 physical servers, each dedicated to specific applications and operating systems, it now has just 70. The five-year-old initiative allowed the company to put off buying a new generation of hardware, saving $2.1 million in deferred hardware purchases and more than $200,000 a year in operating expenses.
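The arithmetic behind those numbers is straightforward. Here is a short sketch using only the figures cited above; the five-year total is a rough extrapolation that treats the annual saving as constant:

```python
# Back-of-the-envelope arithmetic using the Natixis figures above.
physical_before = 240        # dedicated physical servers before the project
physical_after = 70          # physical hosts after consolidation
ratio = physical_before / physical_after
print(f"Fleet-wide consolidation ratio: {ratio:.1f} to 1")
# -> about 3.4 to 1 across the whole estate, even though individual
#    IBM blades carry eight or more virtual servers each

deferred_hardware = 2_100_000   # hardware purchase put off, per the article
annual_opex = 200_000           # stated annual operating savings
# Treating the annual saving as constant over five years is an assumption.
print(f"Five-year savings: ${deferred_hardware + 5 * annual_opex:,}")
```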

Business Agility


Following server consolidation, the next step is to begin converting more computing resources into virtual equivalents that can be assigned on demand to particular uses. This includes not just processing power, but also memory, data storage and bandwidth, a.k.a. networking.
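To make the pooling idea concrete, here is a minimal sketch of on-demand assignment from an aggregated pool. The class, method names and capacity figures are invented for illustration; no vendor's control panel works exactly this way:

```python
# Toy model of a virtualized resource pool: capacity is aggregated,
# then carved out on demand. All names and numbers are illustrative.
from dataclasses import dataclass

@dataclass
class Pool:
    cpu_cores: int
    ram_gb: int
    storage_tb: int

    def allocate(self, cores: int, ram: int, storage: int) -> bool:
        """Reserve resources for a new virtual server, if available."""
        if cores <= self.cpu_cores and ram <= self.ram_gb and storage <= self.storage_tb:
            self.cpu_cores -= cores
            self.ram_gb -= ram
            self.storage_tb -= storage
            return True
        return False  # pool exhausted: buy hardware, or burst elsewhere

pool = Pool(cpu_cores=512, ram_gb=4096, storage_tb=200)
if pool.allocate(cores=8, ram=32, storage=1):
    print("New virtual server provisioned in minutes, not weeks")
```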
The ability to aggregate resources and immediately reassign them from control panels in the data center helps a company grow its business. "Our clients tell us they can deploy virtual servers 30 times faster than physical," said Thomas Bittman, an analyst with Stamford, Conn.-based research firm Gartner, in a blog posting on the stages of virtualization. "They also tell us that customer demand roughly doubles when they deliver faster."
On Wall Street, the impact can be dramatic. Consider last year's collapse of Bear Stearns and the elimination of Lehman Brothers--"virtually" overnight. That set up a big market opportunity for companies with the "virtual capacity" to take on new business just as quickly.
When a financial firm suddenly dies, its customers go elsewhere. Companies that can, quite literally, turn up new capacity overnight win; those that can't lose. One big-name firm that was ready when the credit crunch hit was Zurich-based Credit Suisse, which had begun to virtualize its data center operations around the world four years ago.

Hurdles for Full Virtualization


Full virtualization means making everything in the data network exist as digitally defined products or resources, including the processing power and software that traders or other employees draw on from their desks. In operation, it's far easier, cheaper and faster to create a master copy of the code a user needs and duplicate it on a server in a data center than to walk down a hall, or fly to a remote office, to install it on a physical desktop computer.
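Here is a minimal sketch of that master-copy workflow, with hypothetical paths and user names. A real deployment would use copy-on-write clones on shared storage rather than whole-file copies:

```python
# Illustrative sketch of the "golden image" idea: one vetted master
# copy is cloned per user instead of installing software desk by desk.
# Paths, file contents and user names are hypothetical stand-ins.
import shutil
import tempfile
from pathlib import Path

root = Path(tempfile.mkdtemp())              # stand-in for shared storage
golden = root / "desktop-golden.img"
golden.write_bytes(b"master desktop image")  # stand-in for the real image

def provision_desktop(user: str) -> Path:
    """Clone the golden image for a new user; no desk-side install needed."""
    clone = root / f"{user}.img"
    shutil.copyfile(golden, clone)
    return clone

for trader in ("trader01", "trader02", "trader03"):
    print("provisioned", provision_desktop(trader))
```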
But desktop virtualization projects are only in their earliest stages on Wall Street. The most bullish adopter to surface in the past two years is Merrill Lynch. At this time last year, the company planned to have about 10 percent of its 63,000 desktop computers converted into virtual equivalents by the end of 2008, and as much as 50 percent by the end of 2013.
However, after September 15, 2008, when Bank of America announced it would buy Merrill Lynch, the project slowed. The combined company has not commented on where the project stands.
Early this year, though, Bank of America named Andy Brown as its senior vice president and head of strategy, architecture and optimization. That move was seen by The 451 Group, an enterprise technology research firm in New York, as a signal that the project would pick up speed again, since Brown championed virtualization at Merrill. But instead, Bank of America has indicated that Brown's time is largely occupied by overseeing the merging of the technical infrastructures of Bank of America and Merrill.
"We are focused on integrating the systems of Bank of America and Merrill Lynch to bring the best service to our clients, support for our teammates and return for our shareholders," said Christopher Feeney, corporate communications executive for enterprise technology and delivery at Bank of America. "We are still in the planning process and have no decisions to announce at this time."
In the meantime, the biggest hurdles for full virtualization at securities trading firms are security and privacy. Financial services firms worry about what Chen calls "I/O stream security"--the safe passage of data in and out of an information processing system. A hypervisor, the signature piece of virtualization code, is installed directly on a piece of hardware and decouples operating systems and applications from it. The concern is whether the hypervisor separates streams of data properly on a given server.
Also of concern are streams of data that might not get where they are supposed to go because a reassignment falters. Or, in a worst-case scenario, streams that get picked off by an unauthorized user on what are essentially shared and reused communications channels.
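The isolation requirement can be sketched in miniature. The toy model below is entirely illustrative (its class, channel names and checks are invented, not drawn from any real hypervisor), but it shows the bookkeeping that must never fail on a shared, reused channel:

```python
# Toy illustration of the I/O-separation property a hypervisor must
# guarantee. This is a teaching sketch, not real hypervisor code.
class StreamRouter:
    def __init__(self):
        self._owner_of = {}  # channel id -> virtual machine id

    def open_channel(self, channel: str, vm: str) -> None:
        """Record which virtual machine currently owns a channel."""
        self._owner_of[channel] = vm

    def deliver(self, channel: str, vm: str, data: bytes) -> None:
        # The critical check: data on a shared, reused channel must
        # only ever reach the VM that currently owns that channel.
        if self._owner_of.get(channel) != vm:
            raise PermissionError(f"{vm} may not read channel {channel}")
        print(f"{len(data)} bytes delivered to {vm}")

router = StreamRouter()
router.open_channel("nic0/queue3", vm="trading-vm")
router.deliver("nic0/queue3", vm="trading-vm", data=b"order flow")
# router.deliver("nic0/queue3", vm="other-vm", data=b"...")  # would raise
```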

Into the Cloud


Beyond full virtualization is the ability to move operations "into the cloud." The idea, said Kevin Vogl, vice president of virtualization for cloud provider Champion Solutions Group in Boca Raton, Fla., is for a financial services firm to own and operate only enough computing resources to handle the requirements of an average day. For peak needs, when trading volumes surge or a large number of new customers suddenly sign on, the firm would put "golden images" of virtual servers identical to its own on machines physically located at and operated by a provider such as Champion. "Instead of your server farm it's our server farm," said Champion's chief executive, Chris Pyle.
Users would make a reservation for a certain amount of processing power, memory, data storage and network capacity. Then, when there's a burst in activity, pieces would launch automatically and the user would be charged for the exact amounts of capacity used. Champion plans to introduce such a virtual computing capacity service in the next 60 days. Customers will pick from pictures of servers, desktops, cables and other devices to reserve the "racks and rows" of capacity they might need. But they will be looking only at their virtual equivalents. In effect, they will be picking digital images onscreen of the digital images that later will be placed on servers hosted by the service provider.
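The reserve-and-burst model lends itself to a short sketch. Everything below, from the capacity units to the metered rate, is an assumption for illustration and does not describe Champion's actual service:

```python
# Illustrative reserve-and-burst logic: run on owned capacity for an
# average day, launch reserved provider capacity only during surges.
# Capacity units and the per-hour rate are invented placeholders.
OWNED_CAPACITY = 100       # units the firm owns for an average day
RESERVED_BURST = 50        # units reserved at the outside provider
RATE_PER_UNIT_HOUR = 0.40  # hypothetical metered price

def capacity_plan(demand: int) -> tuple[int, float]:
    """Return (burst units to launch, hourly cost of the burst)."""
    burst = min(max(demand - OWNED_CAPACITY, 0), RESERVED_BURST)
    return burst, burst * RATE_PER_UNIT_HOUR

for demand in (80, 100, 130):
    burst, cost = capacity_plan(demand)
    print(f"demand={demand}: launch {burst} burst units, ${cost:.2f}/hour")
```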

The Four Phases of Virtualization

1. Server Consolidation

  • "Decouple" the logical functions of a computer from its physical location
  • End practice of tying one server to one application
  • Run multiple applications and operating systems on the same hardware
  • Run eight or more "virtual" servers on one physical machine 


2. Business Agility

  • Increase ability to reassign computing resources as needed
  • Convert memory, data storage and bandwidth into virtual equivalents
  • Reassign resources, on the fly, without downtime
  • Build in the capability to take on new customers or execute new products, overnight


3. 'Full Virtualization'

  • All elements of the data network become digitally defined
  • Key: Convert desktop computers into virtual workstations
  • Automate the provisioning of new services based on rules
  • Automate disaster recovery

4. Beyond the Data Center

  • Calibrate virtual operations to meet average daily needs
  • Reserve capacity for peak needs and surges at outside data service provider
  • Use same master images, wherever applications are repeated
  • Closely control replication, to avoid 'virtual sprawl'
