In honor of the 15th anniversary of DM Review magazine, the editorial staff interviewed several thought leaders with long histories at DM Review, asking them to share their knowledge and insight about the technology industry in general and DM Review in particular. Today, we take a look at where we've been and where we're going.

The first issue of the precursor to DM Review, Data Base Management, debuted in February 1991 and focused on major database management systems, database design, quality assurance and relational technologies. Rob Mattison, an original DM Review columnist, says data warehousing (DW) and business intelligence (BI) were unique phenomena in IT history. "In the early days, you had the giant computer companies - IBM, Burroughs and Xerox." Each dominated its own segment - business, government and office - and all the operating systems were proprietary. "Then came Apple, Windows and UNIX. The smaller hardware got more powerful and the revolution began."

Next came the relational database craze - IBM DB2, Oracle, Sybase and Informix - that would displace the older proprietary IMS, IDMS, Datacom-DB and others. "Then came client/server and distributed computing," Mattison recalls. "Only after all of those historical confluences did data warehousing make any sense."

Mattison believes people today don't appreciate the context of the data discussion that was going on at the time. "The 'theories' of data management were very bizarre. Rule number one was, 'No data should ever be stored more than once.' Eventually, people realized that data redundancy could be a good thing, and so data warehousing was born."


As the industry evolved, the magazine name changed to Data Management Review, and the focus moved from filling the information needs of large-system users to providing quality information about database technologies on all platforms. The PC stirred up quite a revolution within the enterprise, especially in the end-user community. Sid Adelman, a long-time DM Review contributor and "Ask the Experts" moderator, reminds us that it was the lower cost and performance capabilities of hardware technology that made so many applications affordable and practical. "An increase in the number of users, more complex queries and multiple terabyte databases at a reasonable cost spurred the advent of desktop technologies," he says.

Mattison agrees that the key drivers of the industry have always been more powerful CPUs, more memory and cheaper disks. It is the speed and capacity of these systems that have allowed enterprises to actually execute strategies that have been around for a long time. "People were making use of data mining and statistical analysis tools even in the earliest days. These systems, however, required massive CPUs and storage - some systems even required 64MB of RAM and 200MB of hard disk! In those days, you needed an entire room to hold the computer to handle that kind of capacity," says Mattison.

From basic data management, DM Review's focus expanded to discuss the requirements for developing an integrated corporate information infrastructure. Bill Inmon has been talking and writing about the subject since 1981. He says much of the momentum for the DW industry came out of frustration at the end-user analytical level. "Early applications housed processes and data that were designed to help the clerical community. The analytical community had no such favors; yet, there was a real need for analytical processing. Many times it was said, 'I know the data is in my corporation somewhere, if I could just find it.'" Inmon watched as data warehousing slowly freed data for usage across many boundaries. For the first time, "the analytical user had real domain with the data warehouse," he says.

Key Developments

For the past 15 years, DM Review has covered major trends important to the IT and business community. Inmon considers the main business drivers of this time period to be the ability of database management system (DBMS) software to handle large volumes of data, the increased sophistication of analytic software, the increase of storage volumes with a corresponding decrease in unit costs, and the inclusion of data warehouse and analytical processing within the enterprise resource planning community.

Larry English, a DM Review contributor since 1995, sees a similar picture. "Key developments include the technologies of multidimensional data management, which allow us to see patterns based upon questions or dimensions we know we need to analyze; data mining, which allows analysis; and data visualization, which provides the ability to see the significance of the information in ways that make the truth jump out at us."
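To make the multidimensional idea concrete, here is a minimal sketch - our illustration, not English's - using Python and pandas on a hypothetical sales table. The same flat facts are pivoted along the region and quarter dimensions, so the kind of pattern English describes becomes visible at a glance.

    import pandas as pd

    # Hypothetical sales facts: each row is one transaction, tagged with the
    # dimensions (region, product, quarter) we already know we need to analyze.
    sales = pd.DataFrame({
        "region":  ["East", "East", "West", "West", "West", "East"],
        "product": ["A", "B", "A", "A", "B", "A"],
        "quarter": ["Q1", "Q1", "Q1", "Q2", "Q2", "Q2"],
        "revenue": [100, 150, 90, 120, 200, 110],
    })

    # Pivoting the flat fact table into a region-by-quarter view is the essence
    # of multidimensional analysis: the same numbers, sliced along the
    # dimensions the business cares about.
    cube = sales.pivot_table(index="region", columns="quarter",
                             values="revenue", aggfunc="sum")
    print(cube)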

Limited by computing power, the early data miners were statisticians with Ph.D.s and mathematical expertise. Says Mattison, "It took months to develop a single model and program it." BI has come a long way since then. "Today you can buy a data mining tool that performs calculations that were unimaginable even 10 years ago."

Joyce Norris-Montanari, long-time DM Review editorial adviser, states that all corporate information will need to come together in the future to be accessible by a great number of users. While most companies started with a simple reporting environment, she believes business needs have evolved to require more.

Lou Agosta, a columnist since 1995, says, "Over the long run, Moore's Law, [which states] that processing power doubles roughly every 18 months, has made a huge difference. This is a watershed process, if not a specific event, that continues to make a difference. A tipping point was reached around the year 2001 such that ROLAP is nearly as powerful as proprietary OLAP, even if it is still not as widespread."
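As a back-of-the-envelope illustration of the arithmetic behind that observation (our numbers, not Agosta's), a doubling every 18 months compounds to roughly a thousandfold gain over the magazine's 15-year lifetime:

    # Rough Moore's Law arithmetic: processing power doubles every 18 months.
    months = 15 * 12          # the magazine's 15-year span, in months
    doublings = months / 18   # 10 doublings
    growth = 2 ** doublings   # about 1,024x
    print(f"{doublings:.0f} doublings -> roughly {growth:,.0f}x the processing power")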

As powerful as computers have become, they still cannot perform really big joins on the fly without regard for the performance cost of data movement. Service-oriented architectures (SOAs) are now seen as the next evolutionary step in integration, providing a more generic and lightweight coupling of data and application resources. Agosta says a tension is created because SOAs enable information as a service but can show up as the exact opposite of data warehousing - especially if the latter is a big data store. "The race between growing volumes of complex data, computing power and the bandwidth to manage it is expected to continue. It will not be dull."

Top Challenges

Processing power and capacity are not by themselves an answer to changing business needs and other dynamics. The alignment of systems with operational processes can leverage technology but requires considerable business intervention. The top challenge today, Agosta believes, is to use business intelligence systems to optimize operational (transactional) practices that already exist. "Most companies are not taking advantage of the breakthroughs in processing power and software to design businesses and business processes that are as agile and responsive as the demands coming at companies."

The paradigm shift of the Internet has abolished the boundaries of BI in time and place, thus changing the competitive landscape. "The key outside issue affecting the BI industry is that the world is flat," Agosta says. "There is a level playing field between America's Midwest and Bangalore, India, or Shanghai, China. An abundance of fiber-optic bandwidth along with the Internet, open source, outsourcing, offshoring and information infusion into business processes mean that the innovations on which improved productivity depends will require ever-expanding communities of collaboration, communication and cooperation." More tactically, a key challenge of data warehousing is to get the data out in a timely way, Agosta believes. "Going forward, the advantage will be to those enterprises that are able to perform sustained real-time updates of the data warehouse, providing access to low-latency data in a timely way."

Inmon says that the top challenges today include coping with an ever-mounting volume of data, integrating unstructured data into the data warehouse, managing the lifecycle of data within the data warehouse, tightly coupling business and technical metadata into the data warehouse, and managing the ever-mounting costs of the data warehouse.

According to English, the top issues continue to be the proliferation of disparately defined databases (including silo-designed data warehouses and data marts) and the failure to address information quality as a proactive process-improvement initiative.

Some things never change. In 1994, H. Thaine Lyman of Deloitte and Touche addressed the need for the IT and business communities to communicate effectively: "It is critical that you translate techno-speak into business-speak." We are still talking about bridging the gap between business and IT. History has shown us that a main business driver for BI/DW is the need for information. DM Review strives to present new and better ways to tackle persistent challenges for improved results. May looking back on 15 years of excellence inspire us for years to come.

