April 3, 2012 – Firms need to do things "faster, smarter and dirtier," according to hardware designer Andy Bechtolsheim, co-founder of Arista Networks and, earlier, Sun Microsystems.

Maybe so, but a smarter, faster future lies in taming the big data challenge, according to Larry Tabb, founder and chief executive of the consultancy Tabb Group.

Those were the primary messages delivered by Tabb and Bechtolsheim during the second keynote session of the 9th annual High Performance Computing Linux for Wall Street conference, held Monday at the Roosevelt Hotel in New York.

The demands of data have led to extreme advances in chip making, Tabb said. Citing Moore's Law, the observation that the number of transistors that can be placed inexpensively on an integrated circuit doubles roughly every two years, Tabb said a single chip could contain nearly a trillion transistors 20 years from now.
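As a rough back-of-the-envelope check of that projection (assuming a 2012 baseline of about 2 billion transistors per high-end chip, a figure not given in the article), doubling every two years does land in trillion-transistor territory after 20 years:

```python
# Back-of-the-envelope Moore's Law projection.
# Assumption (not from the article): ~2 billion transistors
# per high-end chip in 2012.
baseline_2012 = 2e9
years = 20
doublings = years / 2                 # doubling every two years
projected = baseline_2012 * 2 ** doublings
print(f"{projected:.1e} transistors")  # about 2e12, i.e. roughly two trillion
```

The exact figure depends entirely on the assumed 2012 baseline; the point is that ten doublings multiply any starting count by about a thousand.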

What can you do with a trillion transistors? That was the question Tabb put to the audience.

One key, he said, was using such massive computing technologies to handle the immense amounts of data now needed by financial services firms, including the huge and accelerating volumes of messaging and information related to ultra high frequency trading. Tabb said that U.S. securities message volumes have grown 30 percent every six months over the past decade.
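That growth rate compounds dramatically: 30 percent every six months means 20 compounding periods over a decade, implying message volumes nearly 190 times their starting level. A quick sketch of the arithmetic:

```python
# Compound growth of U.S. securities message volumes,
# per Tabb's figure: +30% every six months for ten years.
growth_per_half_year = 0.30
periods = 10 * 2                     # two six-month periods per year
multiplier = (1 + growth_per_half_year) ** periods
print(f"{multiplier:.0f}x")          # roughly 190x over the decade
```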

Regulators are also driving much of the immense demand for trading and investing data, demanding greater accountability and greater access to all information. Tabb expected this pressure to continue for another seven years. Electronic trading, he said, is already almost fully adopted in the U.S.

Addressing the needs of big data will, of course, require faster switches and much more advanced memory systems, such as flash. But traders and investors will have to do better than that, according to Arista founder Bechtolsheim.

Bechtolsheim said that as competition increases, firms will need to embrace three sets of strategies.

  • Faster. They will, of course, have to make their systems faster and cut down on latency. However, Bechtolsheim warned that a lot of work has already been done in this area by many firms, and the potential for any one of them to gain a vastly superior advantage here is "pretty limited."
  • Smarter. Firms will have to continue developing more structured data and finding smarter ways to analyze it and drill down into the analytics, such as smarter methods for predicting overvaluations, undervaluations and the like. Again, much work has already been done in this area.
  • Dirtier. Firms will have to start working more with unstructured, or "dirty," data, such as information from tweets, the Web and other nontraditional sources.

Bechtolsheim said that co-locating trading equipment with exchanges' matching engines in the same data center, along with infrastructure hubs in different locales, will continue to be vital in the near future. When that becomes too expensive, he said, firms will need to consider ways of bringing the infrastructure to the data.
This story originally appeared at Securities Technology Monitor.
