Virtualization of computing resources is no longer just about saving costs by consolidating servers, applications and bandwidth into on-demand “virtual” equivalents, said Colleen Healy, general manager of financial services for Microsoft, at TradeTech.

The “exciting” part now is portability, she said during the panel “Tapping Into the Minds of Today’s CTOs/CIOs As They Adapt to Evolving, Innovative Financial Markets” at the trading technology conference, which concluded Wednesday at the Marriott Marquis.

Virtualization itself “really is a couple-decade-old concept,” dating from the days of mainframe computing, in which operating systems and applications are separated from the devices that use them.

That separation now occurs on banks of blade servers, where master images of operating systems and applications are kept. Copies are then made on demand to support individual users or projects, for only as long as, and only where, they are needed.
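
To make the model concrete, here is a minimal Python sketch of the master-image-and-clone pattern described above. The class and method names are illustrative inventions, not any vendor’s actual API.

```python
# Illustrative sketch only: an in-memory model of the master-image/clone
# pattern. All names here are hypothetical, not a real virtualization API.
import uuid

class MasterImage:
    """A golden OS-plus-application image kept on the blade farm."""
    def __init__(self, name: str):
        self.name = name

class VirtualMachine:
    """A disposable copy of a master image, alive only while needed."""
    def __init__(self, image: MasterImage, owner: str):
        self.vm_id = uuid.uuid4().hex[:8]
        self.image = image
        self.owner = owner
        self.running = True

    def destroy(self) -> None:
        # Wiping the clone leaves the master image untouched.
        self.running = False

# One master image can back any number of short-lived per-user copies.
desk_image = MasterImage("trading-desk-stack")
vm = VirtualMachine(desk_image, owner="analyst-42")
print(f"VM {vm.vm_id} cloned from {vm.image.name} for {vm.owner}")
vm.destroy()  # reclaimed as soon as the user or project is done
```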

This gives portability of operating systems and applications “regardless of where you are,” said Healy, enabling “a lot of productivity and innovation,” as well as access anywhere, at any time, from any device.

There are still basic cost-saving drivers at work, though. She cited RiskMetrics Group’s use of virtual resources in the computing “cloud” to handle peak or overload risk evaluation work.
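
Healy did not detail RiskMetrics’ implementation, but the burst-at-peak pattern she described might look roughly like this sketch, which uses Amazon’s boto3 SDK as a modern stand-in; the capacity figures and machine image ID are assumptions made for illustration.

```python
# Hedged sketch of "burst to the cloud at peak": rent capacity only for the
# overflow beyond what the in-house grid can absorb. Numbers are invented.
import boto3

IN_HOUSE_CAPACITY = 500    # risk jobs per hour the internal grid handles (assumed)
JOBS_PER_CLOUD_NODE = 50   # throughput of one rented node (assumed)

def burst_if_needed(pending_jobs: int) -> list[str]:
    """Launch cloud nodes for the overflow; return their instance IDs."""
    overflow = pending_jobs - IN_HOUSE_CAPACITY
    if overflow <= 0:
        return []  # steady-state load stays inside the data center
    nodes = -(-overflow // JOBS_PER_CLOUD_NODE)  # ceiling division
    ec2 = boto3.client("ec2")
    resp = ec2.run_instances(
        ImageId="ami-riskgrid",  # hypothetical machine image
        InstanceType="c5.xlarge",
        MinCount=nodes,
        MaxCount=nodes,
    )
    return [inst["InstanceId"] for inst in resp["Instances"]]

# When the peak evaluation run finishes, the rented nodes are released:
#   ec2.terminate_instances(InstanceIds=ids)
```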

Peter Kelso, managing director and chief information officer for Deutsche Bank Investment Management Americas, said virtualization is especially useful for developing new applications or for project work.

A development server can be configured for one week, turned off for the weekend, relaunched on Monday and kept in operation only for as many days as needed. Then it gets wiped away.
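
A rough sketch of that weekday-only lifecycle in Python; the start_vm, stop_vm and wipe_vm hooks below are hypothetical placeholders for whatever virtualization API a shop actually runs.

```python
# Simulates the lifecycle Kelso described: run weekdays, go dark on weekends,
# wipe the machine once the project is done. The hooks are hypothetical stubs.
from datetime import date, timedelta

def start_vm(name: str) -> None: print(f"{name}: launched")
def stop_vm(name: str) -> None: print(f"{name}: shut down for the weekend")
def wipe_vm(name: str) -> None: print(f"{name}: wiped; capacity returned to pool")

def run_dev_server(name: str, first_day: date, days_needed: int) -> None:
    day, days_worked = first_day, 0
    while days_worked < days_needed:
        if day.weekday() == 0 or day == first_day:  # (re)launch each Monday
            start_vm(name)
        if day.weekday() < 5:                       # Monday..Friday are workdays
            days_worked += 1
            if day.weekday() == 4 and days_worked < days_needed:
                stop_vm(name)                       # off for the weekend
        day += timedelta(days=1)
    wipe_vm(name)                                   # nothing outlives the project

run_dev_server("dev-riskmodel-01", date(2009, 4, 20), days_needed=8)
```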

And it only makes sense to tap the “tremendous capacity out there” in the cloud, from sources such as Amazon and Google, he said. They built their huge data centers for other purposes, but their excess capacity is cheap to use.

“It’s ridiculous not to take advantage of it,” he said.

But virtual capacity in the cloud is not as “mature” and proven as virtual capacity inside a company’s own data centers and networks, said Hunter Smith, chief technology officer of Acadian Asset Management.

Techniques for creating virtual machines and capacity inside the enterprise have been honed for a decade, he said. Not so in the cloud.

Be careful, too, about what kinds of work you try to put into virtual capacity in the cloud, said Marc Rieffel, chief technology officer of WorldQuant, an investment management firm.

In high-frequency trading, a firm needs to control its own peak capacity. Low capacity utilization is a price you have to pay to get speed when it is needed, he said.
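
A back-of-envelope illustration of that tradeoff: buying enough capacity for the worst burst leaves the hardware mostly idle on an average day. The message rates below are invented for illustration, not figures from the article.

```python
# Owning peak capacity for speed means accepting low average utilization.
peak_msgs_per_sec = 200_000  # load at a volatile market open (assumed)
avg_msgs_per_sec = 15_000    # typical intraday load (assumed)

utilization = avg_msgs_per_sec / peak_msgs_per_sec
print(f"Average utilization: {utilization:.1%}")  # ~7.5%: mostly idle, but the
# headroom is what delivers low latency exactly when it matters
```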

And overnight batch processing of trade information, if it’s steady, doesn’t need to be moved to the cloud, either, he said.
