Now that President Obama has signed the Dodd-Frank Wall Street Reform and Consumer Protection Act into law, the dust is just beginning to settle.

The last wave of regulation, highlighted by the Sarbanes–Oxley Act of 2002, concentrated on process. This new regulation is all about data.

The Dodd-Frank bill gives birth to an oversight body that will have the power to collect the information it believes is needed to determine overall risk in the nation’s financial system. At the Securities Industry and Financial Markets Association Technology Expo in June, there were discussions about potentially needing to track all trades across the system, not just those that were executed. This is an almost unfathomable data deluge. Other industries, however, have found that this problem can be turned into an opportunity, provided they change their view of what is and is not needed.

Many companies saw data volumes increase by an order of magnitude when prior regulations passed. This time, it could be two orders of magnitude.

The data storage and network designs conceived just 20 years ago simply will not cut it.

New designs will focus on removing components that are unnecessary because their work can be handled better elsewhere. One example is broker-less messaging for streaming market data.
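
To make "broker-less" concrete, here is a minimal sketch in Python using plain UDP multicast. It is not 29West's Ultra Messaging API, and the multicast group and port are invented for illustration; the point is simply that publishers send market data straight to subscribers with no dedicated messaging server in the path.

```python
# Broker-less delivery sketch: publishers write to a multicast group and
# subscribers join it directly -- no intermediary messaging server.
# The group address and port below are hypothetical.
import socket
import struct

GROUP, PORT = "239.1.1.1", 5007  # assumed multicast group and port

def publish(quote: str) -> None:
    """Send a market data update straight onto the multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_MULTICAST_TTL, 1)
    sock.sendto(quote.encode(), (GROUP, PORT))

def subscribe() -> None:
    """Join the multicast group and print updates as they arrive."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", PORT))
    membership = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, membership)
    while True:
        data, _ = sock.recvfrom(1024)
        print(data.decode())
```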

This week, for instance, Direct Edge formally launched its two trading platforms, EDGA and EDGX, as full-fledged stock exchanges. Making it possible was its ability to toss out dedicated messaging servers and use low-cost, high-powered, industry-standard servers running software geared specifically to message handling. By moving to Ultra Messaging technology from 29West, a subsidiary of Informatica, Direct Edge was able to improve performance by 500 percent while cutting infrastructure costs in half. That’s rethinking data infrastructure.

Rethinking the problem has taken messaging technology beyond front-office streaming market data, so that a single technology can handle any mode of messaging: streaming, persistence and queuing. A high-performance, universal messaging infrastructure gives an information technology organization the opportunity to rethink what is required to run operations efficiently and accurately. Infrastructure can be dramatically reduced. That will matter if IT is to help the business remain agile in a world of regulation increasingly focused on tracking and evaluating data.
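
As a rough illustration of what "one technology for any mode of messaging" means, the sketch below puts streaming (fan-out to every subscriber), persistence (a replayable log) and queuing (each message handled by exactly one worker) behind a single interface. It is not Ultra Messaging's actual API; the class and method names are assumptions made for the example.

```python
# One messaging fabric, three delivery modes -- an illustrative sketch only.
import itertools
from typing import Callable, Iterator, List

Handler = Callable[[bytes], None]

class Bus:
    def __init__(self) -> None:
        self._subscribers: List[Handler] = []  # streaming: every subscriber gets a copy
        self._log: List[bytes] = []            # persistence: durable, replayable history
        self._workers: List[Handler] = []      # queuing: competing consumers
        self._next_worker: Iterator[Handler] = iter(())

    def subscribe(self, handler: Handler) -> None:
        self._subscribers.append(handler)

    def add_worker(self, handler: Handler) -> None:
        self._workers.append(handler)
        self._next_worker = itertools.cycle(self._workers)

    def publish(self, payload: bytes, mode: str = "streaming") -> None:
        if mode == "persistent":
            self._log.append(payload)          # keep a copy for later replay
        if mode in ("streaming", "persistent"):
            for handler in self._subscribers:  # fan out to all subscribers
                handler(payload)
        elif mode == "queued" and self._workers:
            next(self._next_worker)(payload)   # exactly one worker handles it

    def replay(self) -> Iterator[bytes]:
        """Let a late joiner recover everything published persistently."""
        return iter(self._log)
```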

Rethinking data infrastructure goes beyond recognizing that data is in motion. As data settles in systems, it still needs to be synchronized with other systems.

Flexibility means you can support integration at any latency and in the mode most appropriate for the task at hand.

Some systems will need to handle data in real time. Others can work with periodic or batch updates. A modern data integration infrastructure means you can quickly and easily understand your data, as well as the reasoning behind previous changes and integration points. If this isn’t in place, seemingly straightforward regulatory requests can become a painful exercise of trying to figure out which data is accurate, where it resides and how it got there.
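
One hedged sketch of what that looks like in practice: record lineage alongside each synchronized value, so that the questions a regulator will ask (which number is accurate, where it lives and how it got there) can be answered from the data itself. The Record and LineageEntry structures and the sync_record helper below are hypothetical names invented for the example, not any particular product's API.

```python
# Carry lineage with the data: every integration step appends who touched it,
# what transformation ran, and when. Field and function names are illustrative.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List

@dataclass
class LineageEntry:
    source_system: str    # e.g. "order-management"
    transformation: str   # e.g. "FX conversion to USD"
    loaded_at: datetime   # when this step ran

@dataclass
class Record:
    key: str
    value: dict
    lineage: List[LineageEntry] = field(default_factory=list)

def sync_record(record: Record, source: str, transformation: str, new_value: dict) -> Record:
    """Apply one integration step and append it to the record's lineage trail."""
    record.value = new_value
    record.lineage.append(LineageEntry(source, transformation, datetime.now(timezone.utc)))
    return record

# Usage: a batch update arriving from a hypothetical settlement feed.
trade = Record(key="T-1001", value={"notional": 5_000_000, "currency": "EUR"})
trade = sync_record(trade, "settlement-feed", "FX conversion to USD",
                    {"notional": 5_450_000, "currency": "USD"})
```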

Controlling the data explosion is the first step but ultimately businesses should be finding ways to harness it for new value. Mine your data exhaust.

Your data infrastructure, after all, is going to be called on to generate a lot more of it.

