As the global financial crisis unfolded, many investment firms dealing in complex asset-backed securities lacked the computing capacity to calculate the risks they were taking, the managing director and chief technology architect of Bank of America Merrill Lynch said Monday.
"I would argue that regardless of the subprime disaster from a policy perspective, most of Wall Street did not have the compute power in place to calculate their real risk on this stuff,'' said Jeffrey M. Birnbaum, speaking at a late morning session of the 2010 High Performance Computing Conference at the Roosevelt Hotel. "Because it is just too enormous.''
Exotic instruments known as collateralized debt obligations squared, for instance, were extremely complex. Where "simple" collateralized debt obligations were backed by a pool of bonds, loans and other credit instruments, a CDO-squared would be backed by slices of CDOs, known as tranches. CDO-squared products allowed banks to resell the credit risk they took in acquiring CDOs.
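The nesting described above is what drives the computational explosion. A minimal sketch of the structure (the class names and fields here are illustrative, not an industry data model) shows why: pricing a CDO-squared means modeling every underlying CDO's collateral pool as well.

```python
from dataclasses import dataclass, field

@dataclass
class Tranche:
    """A slice of a CDO's cash flows with a given seniority."""
    seniority: str          # e.g. "senior", "mezzanine", "equity"
    notional: float

@dataclass
class CDO:
    """A 'simple' CDO: backed directly by credit instruments."""
    collateral: list                         # bonds, loans, other credit
    tranches: list = field(default_factory=list)

@dataclass
class CDOSquared:
    """A CDO-squared: backed by tranches of other CDOs."""
    collateral: list                         # Tranche objects drawn from CDOs

# A bank resells the credit risk of a CDO tranche into a CDO-squared.
cdo = CDO(collateral=["bond A", "loan B"],
          tranches=[Tranche("senior", 70.0), Tranche("equity", 30.0)])
squared = CDOSquared(collateral=[cdo.tranches[1]])
```

Risk measures on `squared` depend transitively on every instrument inside `cdo.collateral`, so each added layer multiplies the state that must be simulated.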
"CDO squared? The computational power you need to actually get a reasonable answer is phenomenal,'' said Birnbaum. "Five years from now, it may be within your power to be able to run models that tell you reasonably what your risk measures really are on some of that stuff."
But not in 2008.
The "sea change" Birnbaum sees sweeping computation on Wall Street is the advance of low-cost commodity processors with multiple cores on a single chip, combined with parallel programming techniques that coordinate the work of hundreds or thousands of them in smart ways.
The changeover from sequential, or linear, programming is requiring a massive change in the mindset of programmers, he noted. Older programmers in particular are used to writing programs in which one instruction follows another, not ones in which tasks are dealt out to different processors at different points and the results are pulled back together again.
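The two mindsets can be illustrated with a minimal fan-out/gather sketch in Python (the talk named no language; the revaluation function here is a hypothetical stand-in for a per-position risk calculation):

```python
from concurrent.futures import ThreadPoolExecutor

def revalue(position):
    """Stand-in for a per-position risk calculation (hypothetical 1% shock)."""
    return position * 1.01

positions = [100.0, 250.0, 75.0]

# Sequential mindset: one instruction follows another.
serial = [revalue(p) for p in positions]

# Parallel mindset: deal tasks out to workers, then gather the results.
with ThreadPoolExecutor(max_workers=4) as pool:
    parallel = list(pool.map(revalue, positions))

assert serial == parallel   # same answers, different execution model
```

The hard part Birnbaum alludes to is not the mechanics of the pool but deciding which work can safely run in parallel and how partial results recombine.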
Programmers will be dealing with "hundreds and hundreds of cores and threads.''
"Five years from now you will be confronted as a programmer with an abundance of parallel threads. What are you going to do with them?'' he asked his audience, largely Wall Street programmers.
"If you have no clue, if you have no thought process around how to do this, you're not going to be able to write high-performance systems,'' he said. "That's it."
"The lifeblood of Wall Street will depend on our ability to compute. To do ever more computation,'' he said.
And that will mean getting more and more skilled in parallelism: knowing which components of a program need to be written specifically for parallel execution, which language or languages make the most sense in each case, and how to use techniques like "fancy blob stores" that replicate information on the trades and instruments being analyzed, in order to deliver the level of computing that will be required.
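Birnbaum did not define "fancy blob store"; one plausible reading is a store that mirrors trade and instrument data out to every worker, so parallel tasks read local copies instead of contending for one shared source. A toy in-memory sketch under that assumption:

```python
class BlobStore:
    """Hypothetical replicated blob store: writes are mirrored to replicas."""

    def __init__(self):
        self._blobs = {}
        self._replicas = []          # other stores that mirror this one

    def add_replica(self, store):
        self._replicas.append(store)

    def put(self, key, blob):
        self._blobs[key] = blob
        for replica in self._replicas:
            replica._blobs[key] = blob   # copy the write to each replica

    def get(self, key):
        return self._blobs[key]         # served from the local copy

primary = BlobStore()
workers = [BlobStore() for _ in range(3)]
for w in workers:
    primary.add_replica(w)

primary.put("trade:42", {"instrument": "CDO", "notional": 1_000_000})
# Every worker now reads the trade locally, with no shared bottleneck.
assert all(w.get("trade:42")["notional"] == 1_000_000 for w in workers)
```

A real system would add consistency and failure handling; the sketch only shows the replication idea.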
"You have to have a higher level of understanding of what's going on" in parallel programming than in serial programming, he said.
Bank of America Merrill Lynch was formed by a merger in late 2008, in the wake of the credit crisis.
This article can also be found at SecuritiesTechnologyMonitor.com.