Incomplete, inaccurate or obsolete data is the primary reason that financial institutions are losing up to $120 million a year through operational risk. That is the conclusion of a recent survey by the Risk Waters Group and SAS, a market leader in business intelligence.

The survey, the largest yet conducted on operational risk management, included interviews with 400 risk managers at 300 financial institutions. It examined the losses suffered through operational risk and what companies in the financial sector are doing to reduce them.

Respondents at 28 percent of the companies named the difficulty of collecting the volume of data required to accurately identify and manage operational risk as the biggest obstacle to preventing losses. Another 33 percent cited poor data quality as the main stumbling block.

"Quality of data has been a major issue for businesses for many years, and this survey shows that the problem has by no means gone away," said Peyman Mestchian, head of risk at SAS U.K. "The focus on operational risk management has increased with regulations such as Sarbanes-Oxley and Basel II, but once again, companies are coming up against the problem of finding and interpreting the data that they need."

The issue of data quantity and quality has led to a rise in the implementation of internal loss databases and self-assessment tools. Nearly 50 percent of the companies surveyed have implemented such a database, and 45 percent have established a self-assessment tool. However, more than 90 percent of respondents admit that they have yet to invest in the modeling and analysis tools required to make sense of the data once it has been collated.

"While the new regulations that are coming into effect have forced companies to implement the systems needed to collect and store data, few have tackled the problem of what to do with it once they have it," continued Mestchian. "Data has little value unless it can be turned into worthwhile information and, as yet, many companies seem unwilling or unable to take this step."

Further results of the study indicated that a perceived lack of functionality in existing software packages, combined with fears about the high cost of modeling and analysis packages, was the main obstacle to implementation. Sixty-eight percent of respondents had built at least one operational risk system in-house because it was seen as cost-effective and could be designed to meet the specific needs of the business.

"In-house operational risk systems have done a reasonable job up until now, but only a minority of those software systems are both scalable and flexible enough to cope with what is now being asked of them. Trying to update outmoded systems is proving costly, time- consuming and often ineffective," said Mestchian. "The real challenge here is not putting a couple of databases together. Any good software vendor can do that. The trick is in having an effective methodology for combining qualitative and quantitative data and linking external data with internal data. SAS has invested in this area and has developed powerful techniques to address these problems."

For a summary of survey results, please visit http://www.sas.com/news/preleases/090803/news1_addition.html.
