Federal Reserve Governor Daniel Tarullo last week brought the need for data standards and centralized data management to the attention of the U.S. Senate during his testimony before the Subcommittee on Security and International Trade and Finance.

Tarullo proposed the establishment of a new centralized system of data collection and monitoring to encourage greater standardization of reference and entity data.

His goal isn’t exactly altruistic: the Fed wants to take on the role of systemic risk overseer in the U.S., if not globally. And it needs data to analyze to do it.

Tarullo isn’t the only one advocating such a repository. So is the European Central Bank, which has already come up with a blueprint of sorts. And a group of data management experts and academics is calling for the creation of a National Institute of Finance to build such a repository as well. The NIF’s repository, however, would also include transactional and position data: information on the deals that financial firms execute.

But firms within the securities industry are known for safeguarding data as a prized possession. Can they come together on how to open up?

That is the multimillion, if not multibillion, dollar question that must now be answered if regulators are to get a better grip on systemic risk and investors a better handle on just what it is they are buying.

Starting is the hard part. Somebody has to take the first step: creating data standards, meaning the identification codes and descriptions of the data stored in the repository. Then there are the costs of building the repository and obtaining regulatory approvals.

There are already a multitude of identification codes, not only for instruments, symbols and counterparty data but for the myriad attributes that collectively comprise the identity of a financial transaction: CUSIPs, Sedols, ISINs, RICs and BICs, to name a few. Data vendors even differ among themselves on how financial instruments and counterparties should be identified.
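To see why this proliferation of codes matters, consider a sketch of how one instrument ends up with several identities. The codes and the cross-reference layer below are invented for illustration; they are not any vendor's actual scheme or part of any proposed CCDM design.

```python
# Illustrative only: all identifier values below are made up, and the
# cross-reference layer is a hypothetical sketch, not a real product.

# The same security can be known by a different code in each system.
security_aliases = {
    "cusip": "123456AB7",     # hypothetical CUSIP (U.S./Canada convention)
    "isin":  "US123456AB70",  # hypothetical ISIN (international convention)
    "sedol": "B1XYZ23",       # hypothetical SEDOL (U.K. convention)
    "ric":   "XYZ.N",         # hypothetical Reuters-style code
}

def build_xref(internal_id, aliases):
    """Map every known external code to a single internal identifier."""
    return {code: internal_id for code in aliases.values()}

xref = build_xref("INSTR-0001", security_aliases)

# Any code resolves to the same instrument -- the kind of reconciliation
# every firm now performs separately in its own data silo.
assert xref["US123456AB70"] == xref["123456AB7"] == "INSTR-0001"
```

A shared repository would, in effect, maintain one such cross-reference for the whole industry instead of one per firm.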

Financial Intergroup Advisors, a New York consultancy specializing in risk management issues, says it is in talks with financial institutions and vendors to create such a global data repository for the financial services industry. The proposed Central Counterparty for Data Management would be a central access point for data describing the financial instruments that firms trade, the companies that issue them and the companies that trade them.

The data utility would be owned by the financial services industry and could save global firms about $40 billion annually by eliminating repetitive functions. It would consolidate the same data that firms now retrieve from disparate sources, then cleanse and store in multiple data silos. All the financial firms would have to do is access the same data.

And what about the data standards?

Allan Grody, president of Financial Intergroup, says that financial institutions would create the common standards themselves. And they wouldn’t have much choice if the G-20 Financial Stability Board, the organization charged with creating the framework to mitigate systemic risk, compelled them to do so.

But Stuart Plane, vice president of Cadis, a London-based data management software firm, is skeptical. He says financial firms can’t even agree on internal standards, so how are they supposed to work together collectively?

Plane believes that if regulators required issuers to use the XBRL protocol to tag all their data, not just financial reporting information, there would be no need for any kind of centralized data repository. “Having a centralized repository isn’t practical since all firms have different data requirements and manage their data differently,” he says.

But Grody insists that using XBRL alone won’t cut it. “It’s a tagging language but doesn’t provide standardization for how the information should be presented,” says Grody. It’s like a bar code without the universal product code embedded in it: the computer can find it, but that doesn’t tell you the data is correct or unique.
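Grody’s point can be illustrated with a small sketch. The element names and values below are invented; the point is that two firms can both tag the same fact, yet without a shared taxonomy the tags never line up.

```python
# Illustrative only: these records and tag names are invented to show
# why tagging alone, without a common standard, does not reconcile data.

# Two firms describe the same bond issue with self-chosen tag names.
firm_a = {"IssuerName": "Acme Corp",        "Cpn": "5.25"}
firm_b = {"issuer":     "ACME CORPORATION", "CouponRate": "5.25"}

# A computer can "find" every tagged field in both records...
all_tags = set(firm_a) | set(firm_b)

# ...but nothing matches across the two records without an agreed
# taxonomy mapping IssuerName to issuer and Cpn to CouponRate.
shared_tags = set(firm_a) & set(firm_b)
assert len(all_tags) == 4
assert shared_tags == set()
```

That missing agreed taxonomy is exactly the standardization layer the repository’s backers say must be built, whether or not XBRL is the tagging syntax.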

Okay, now the standards have been developed. Where’s the seed money? Grody says that one of the large financial institutions in the CCDM working group could provide its technology and facilities to seed the CCDM. The wheel wouldn’t have to be reinvented.

Grody says that Financial Intergroup has already lined up some of the world's largest financial institutions to work on the CCDM. Financial Intergroup has also found private equity firms willing to fund the project, if necessary. But so far none of the dozen likely suspects contacted by Securities Industry News on Monday was willing to disclose that it is backing the CCDM. Nor would they come out in support of the concept. If this is such a great idea, why the secrecy?

Grody says that confidentiality agreements preclude his revealing the names of the CCDM's financial backers and supporters. But he insists that the idea of a central data repository is the same solution advocated by the Group of Thirty, the influential Washington, D.C. think tank. And the concept is just as good, if not better, than the idea of a universal bar code, now the de facto standard in the retail industry.

What’s next?

Grody’s partner Richard Tinerven says that the CCDM doesn’t need any legislation to get started. It just needs the “political will” of the large financial institutions to share non-strategic reference data, he says. And to guarantee to each other that the data is accurate when they execute and settle trades.

However, that means firms would have to put up some capital in a "guarantee fund" of sorts. In the event a transaction fails to settle because of poor data received from the CCDM (an unlikely occurrence, according to Grody), all of the owners of the CCDM would be liable. It's a way of mutualizing risk, a scenario that plays out among clearinghouses when a trade doesn't settle due to counterparty failure.
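The mutualization mechanics can be sketched in a few lines. The member names, contribution amounts and pro-rata allocation rule below are hypothetical assumptions for illustration; the article does not specify how a CCDM guarantee fund would actually apportion losses.

```python
# Hypothetical sketch: a settlement loss mutualized pro rata across
# guarantee-fund contributors, as clearinghouses commonly do today.
# All names and dollar figures are invented.

# Each owner's contribution to the guarantee fund, in $ millions.
contributions = {"Bank A": 40.0, "Bank B": 35.0, "Bank C": 25.0}

def allocate_loss(loss, contributions):
    """Split a loss across members in proportion to their contributions."""
    total = sum(contributions.values())
    return {member: loss * c / total for member, c in contributions.items()}

# A $10 million settlement failure attributed to bad repository data.
shares = allocate_loss(10.0, contributions)
# Bank A absorbs 4.0, Bank B 3.5, Bank C 2.5 (in $ millions).
```

The design choice is the same one clearinghouses make: spreading the loss keeps any single failure survivable, at the price of every member standing behind the data.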

The CCDM would have to apply to the Securities and Exchange Commission, or another regulator, for an exemption from the need to register as a clearing organization. Such an exemption has already been granted to Omgeo, a post-trade communications software vendor.

But whose “political will” will come first?

Will regulators come forward to mandate the creation of a data repository and come up with the standards for it or will financial firms do it on their own?

Neither side is known for quick action. Hopefully, one of the two will come up with an answer, and soon. The securities industry doesn’t have much time left to wait.

This article can also be found at SecuritiesIndustry.com.
