The 10-year journey of DM Review has followed the evolution of data management from mainframe databases to the Corporate Information Factory, where data is distributed to all decision-makers within the enterprise. We have witnessed change at Internet speed and have replaced green bar reports with advanced business intelligence products. We have seen the Internet move from a static Web-page environment to a highly dynamic environment supported by databases that can also be accessed by wireless devices. We have seen businesses change from being product-driven to customer-driven. Those of you who have been in the industry for less than 10 years have experienced more technological change in this single decade than your counterparts experienced in the previous 40 years. This article reflects on the changing technology landscape as covered in the pages of DM Review. I hope you'll enjoy this journey through the technological progress we've made since the debut of DM Review in February 1991.

In 1991, enterprises were focused on the impact of client/server computing on mainframe systems. Selecting the right DBMS to support client/server, and the debate between network and relational database models, were of paramount importance. CASE, expert systems, SAA and AD/Cycle were in vogue, as was process modeling, and we were developing the repository-based data dictionary. Most DBAs were focused on the wholesale movement of applications and data from the mainframe environment to client/server. The role of the DBA also expanded: DBAs became leaders of data management teams responsible for providing accurate, reliable and shareable data. Disaster recovery and restores were hot issues for client/server databases. In reality, all of the major database management issues that we had solved in the mainframe environment were being reintroduced in the client/server environment.

In 1992, downsizing mainframe applications into distributed client/server applications to improve efficiency was key for most organizations. Enterprises started to focus on implementing a corporate data repository that would become a critical component of their information infrastructure. Information engineering, a systems development methodology, allowed organizations to identify a complete set of data requirements for designing and implementing more effective databases and systems. Bill Inmon introduced the data warehouse to our readers by differentiating traditional data processing (OLTP) from informational processing, which required the management of data over a longer time period. The information warehouse contained all of the information required by the knowledge worker for making decisions. Inmon also discussed meta data technology and how it would advance the state of query and reporting tools. Data quality also surfaced as an issue, with enterprises working to ensure the accuracy, completeness, integrity and timeliness of their data resources. By gaining a thorough knowledge of corporate data through data profiling and mapping, many innovative enterprises achieved faster project completion, improved data quality, reduced risk and a rapid return on their investment.

In 1993, object-oriented technology made its debut, promising to be the next silver bullet in information technology. Object-oriented technology later became embedded in client/server systems, making it easier to distribute data and applications across the enterprise. The business mantra for corporations was rightsizing, and client/server computing promised a distributed data processing environment that would increase profits. The graphical user interface (GUI) was the year's hottest technology; at one trade show that year, I counted more than two dozen GUI vendors. The GUI was considered the glue that held client/server systems together. UNIX started to become a "heavy hitter" in the world of data processing as client/server continued to dominate new application initiatives. Enterprise modeling was becoming strategic for enterprises that wanted to gain the full benefits of the information age. Reverse engineering of legacy systems was also popular because it capitalized on the existing investment and the value of development already completed. Other technology advances in 1993 included neural networks, multimedia, visual programming, imaging, EDI, video teleconferencing and pen-based computing.

In 1994, the major corporate focus was downsizing. Developing an IT infrastructure for efficiently managing dynamic and complex configurations of heterogeneous applications, platforms and users would determine not only which companies would be the most successful, but which would survive. Corporations also started to focus on decision support and how the data warehouse could help them make better decisions by integrating the tremendous number of data stores that existed within their enterprises. As data warehouses grew in size, parallel processing provided the means to reduce processing time and deliver information to decision-makers in a timely manner. It was also about this time that DBMS vendors recognized that decision support was different from OLTP and started implementing real OLAP capabilities in their databases. UNIX databases, GUIs, powerful servers, repositories, BPR, CASE and RAD tools were helping to drive client/server and object technology into every department of the enterprise. Client/server middleware and gateway technology provided the connections between applications and databases and launched a whole new group of products to facilitate distributed processing. Data replication was being explored to reduce the load on networks, which were becoming major bottlenecks as the number of users increased exponentially with the explosion of client/server systems. Database-oriented middleware played a significant role in enterprise application integration (EAI), allowing a large number of enabling technologies to process source or target system information. The Personal Computer Memory Card International Association (PCMCIA) continued to establish standards for integrated circuit cards and promote interchangeability among mobile computers. These standards helped lay the foundation for the wireless devices of the future.

In 1995, data warehousing and the emergence of the World Wide Web (WWW) were two of the most dominant themes. The WWW would have a profound effect on communication and global business, provided that a bulletproof infrastructure could be developed to support it. The establishment of a sound data warehousing infrastructure became a high priority for most enterprises due to the tremendous successes of data warehousing implementations at Wal-Mart and Fidelity; both became leaders in their respective industries as early adopters of the technology. Once a scalable data warehouse framework was in place, an enterprise could organize and utilize data more effectively, institute cross-functional systems and align IT and corporate goals, enabling it to compete in the changing business environment.

In addition, new classes of products for query and reporting and multidimensional analysis started to make significant inroads into larger enterprises. Although query and reporting tools were prevalent in client/server operational areas, they were now being utilized within a business intelligence framework to provide meaningful information for decision making. Iterative development of the data warehouse started to gain acceptance because large-scale efforts took years to complete, though the key premise remained that an enterprise data model must be completed prior to development. The enterprise data model was built incrementally by subject area, as was the data warehouse, and each iteration provided benefits that helped fund the next. This required a combined effort from those executing the business vision and IT management, which was responsible for developing the information infrastructure. The organizations that succeeded in the information age built their business intelligence strategies on top of a fully integrated data warehousing foundation.

One of the major benefits of data warehousing, and one driving significant ROI, was data mining. Many enterprises were utilizing data mining to reveal trends, explain known outcomes, predict future outcomes and identify factors that could lead to a desired outcome. These companies gained the ability to make better business decisions, increase revenue and improve competitive advantage. Data mining provided insights that were not readily apparent in large volumes of data by utilizing intelligent agents, many of which grew out of artificial intelligence (AI) discoveries of the mid-eighties. Savvy corporations used data mining to develop marketing strategies, target mailings, adjust inventories, minimize risks and eliminate wasteful spending based on an analysis of their data. As data visualization software evolved, it made the results of data mining more understandable to end users, eliminating the need for a statistician. Data visualization capitalized on graphical elements to make patterns and trends easy to see, reducing dependence on statistical analysis and allowing marketers to better understand and respond to customer needs. Early data visualization technology set the stage for visual data mining technology that would dramatically improve information exploration and retrieval in data-intensive applications. Visual data mining would eventually move to the Web, where its impact would be enormous.

In 1996, corporate intranets were being developed to facilitate the exchange of information and knowledge throughout the enterprise. The use of electronic media for commerce, e-mail, marketing and research had become a way of life; it shortened communication time, provided access to a wider variety of information, reached globally to remote users and reduced the overall cost of communications. Scalability was also becoming a critical issue in supporting expanding data capacities, growing numbers of users and the complexity of in-depth data analysis.

The process of turning data into knowledge and knowledge into action for business gain became known as business intelligence. It was an end-user activity facilitated by various analytical and collaborative tools and applications as well as a data warehousing infrastructure, and it encompassed all types of data, including hierarchical, relational, text, spatial, audio and video. Knowledge management included the extraction, documentation and cataloging of unwritten business rules and contextual information based on experiences that individuals "stored" in their minds. Business intelligence focused on how organizations accessed and used data to achieve organizational objectives. Business intelligence tools included ad hoc query and reporting tools, managed query environments, online analytical processing (OLAP), data mining and data visualization.

Access to information, regardless of location, was a must in the data-driven business world, and data replication made it possible for remote locations to access up-to-date information at any time. As business intelligence continued to evolve, the integration of query, reporting and OLAP into a single product became critical for the end user, and the complete business intelligence suite started to become a reality. The introduction of commercially available natural language query products that converted plain English into SQL queries offered the promise of eliminating the need for end users to understand SQL syntax. The introduction of the exploration warehouse provided a safe haven for exploratory and data-intensive ad hoc processing such as Web farming. Web farming, the systematic refining of Web-based information resources for business intelligence, promised to enhance the coverage and richness of the data warehouse, transitioning it to an intelligence center or knowledge base for the enterprise.

The next major wave in data warehousing was the creation of multidimensional and relational data marts. Data marts focused on individual business areas rather than on the entire enterprise, and they predominantly utilized the star schema popularized by Ralph Kimball.
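
For readers who have not worked with one, a star schema places a central fact table of business measures alongside the descriptive dimension tables that give those measures context. Below is a minimal sketch in Python using the standard library's sqlite3 module; the table and column names (fact_sales, dim_customer and so on) are hypothetical illustrations, not a design from this article.

```python
# Minimal star schema sketch: one fact table surrounded by dimensions.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
-- Dimension tables describe the who, what and when of the business.
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, name TEXT, region TEXT);
CREATE TABLE dim_product  (product_key  INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date     (date_key     INTEGER PRIMARY KEY, year INTEGER, month INTEGER);

-- The central fact table holds numeric measures keyed to each dimension.
CREATE TABLE fact_sales (
    customer_key INTEGER REFERENCES dim_customer(customer_key),
    product_key  INTEGER REFERENCES dim_product(product_key),
    date_key     INTEGER REFERENCES dim_date(date_key),
    units_sold   INTEGER,
    revenue      REAL
);

INSERT INTO dim_customer VALUES (1, 'Acme Corp', 'East');
INSERT INTO dim_product  VALUES (1, 'Widget', 'Hardware');
INSERT INTO dim_date     VALUES (19960101, 1996, 1);
INSERT INTO fact_sales   VALUES (1, 1, 19960101, 100, 4995.0);
""")

# A typical data mart query: join the fact table to its dimensions and
# aggregate a measure by the business attributes the analyst cares about.
cur.execute("""
    SELECT d.year, p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.year, p.category
""")
print(cur.fetchall())  # [(1996, 'Hardware', 4995.0)]
```

The design deliberately trades normalization for query simplicity, which is one reason it suited the business-area data marts described above.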

In 1997, the data warehouse became the cornerstone of an integrated knowledge environment that provided a higher level of information sharing across an organization, enabling faster and better decision making. Claudia Imhoff and Bill Inmon introduced the Corporate Information Factory (CIF), which focuses on the creation of an information architecture responsive to all information needs of the enterprise. The CIF is a logical architecture developed to deliver business intelligence and business management capabilities driven by data from business operations, and it has proven to be a stable and enduring technical architecture for any size of enterprise desiring to build strategic and tactical decision support systems. An integral part of the CIF is the operational data store (ODS), which came into existence as a bridge between the day-to-day operations of the business and the data warehouse. An ODS is a subject-oriented, integrated, detailed, current, volatile collection of data used to support tactical decisions.

Electronic commerce manifested itself in the form of a new business model in which companies with similar interests and markets entered into liaisons for mutual benefit. These tightly knit groups of companies created joint business processes and conducted them electronically over the Internet via an extranet. The extranet was the first foray into what we now view as trading exchanges or markets.

As data warehouse sizes continued to explode, so did the need for storage; in fact, storage needs continued to double each year. The development of Fibre Channel technology helped in moving images and large data volumes over networks. Fibre Channel offered a robust and flexible way to build high-performance systems that could be interconnected with networks and storage systems. It helped alleviate the bandwidth problem, and its incorporation into parallel technologies was instrumental in alleviating many of the bottlenecks associated with storage and networks.

In 1998, a major paradigm shift occurred as most companies began to focus on customers rather than products. The development of a convergent marketing methodology helped companies recognize and keep valuable customers, allocate resources to increase customer value and manage the attrition of non-profitable customers. The customer-focused data warehouse was instrumental in enabling businesses to tailor their marketing programs and campaigns to achieve one-to-one marketing. Enterprise performance management and the balanced scorecard were the first major attempts to streamline the executive suite, replacing the executive information systems of the seventies and eighties. The balanced scorecard translates mission and strategy into objectives and measures, organized into four perspectives: financial, customer, internal business process, and learning and growth. The scorecard provides a framework and a common language to communicate mission and strategy; it uses measurement to inform employees about the drivers of current and future success. By articulating the outcomes the organization desired and the drivers of those outcomes, senior executives hoped to channel the energies, abilities and specific knowledge of people throughout the organization toward achieving its long-term goals.

In 1999, the market shifted its focus to analytic applications and business intelligence solutions as expenditures for Y2K and ERP-related software started to decline. Analytic applications automate planning, forecasting and predictive modeling activities. Customer relationship management (CRM) became a major focus for most enterprises, and analytical applications were developed around a customer-centric data warehouse. CRM became a corporate-wide practice focused on improving a company's ability to promote customer loyalty. Packaged analytic applications for needs such as fraud detection, delivery chain management and customer churn were being purchased and implemented by most enterprises. Fraud detection identifies anomalies in many areas, including telecommunications, taxes, credit card usage, healthcare and insider trading. A fraud detection system identifies the characteristics of individuals who have committed fraud, uses those characteristics to build a model and launches the model against a large database to identify those likely to commit fraud. Next came the introduction of enterprise information portals, which provided business users with a single interface to business information to assist in the decision-making process. Enterprise portals opened the door to previously inaccessible business information.
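
The fraud-detection workflow just described (learn the characteristics of confirmed cases, build a model, then score the full database) can be sketched in a few lines. The sketch below is an illustrative assumption using present-day Python and scikit-learn with synthetic data, not one of the packaged products of the era.

```python
# Illustrative fraud-scoring sketch: fit a model on labeled history,
# then score a large set of records to flag likely fraud.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

# Synthetic history: each row holds a record's characteristics (think
# amount, frequency, account age); y marks cases confirmed as fraud.
X_known = rng.normal(size=(1_000, 3))
y_known = (X_known[:, 0] + X_known[:, 1] > 1.5).astype(int)  # stand-in labels

model = LogisticRegression().fit(X_known, y_known)

# "Launch" the model against the large database: score every record and
# surface the highest-risk ones for investigation.
X_all = rng.normal(size=(100_000, 3))
risk = model.predict_proba(X_all)[:, 1]
suspects = np.argsort(risk)[-100:]  # indices of the 100 riskiest records
```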

In 2000, the focus for most enterprises shifted from Y2K to e-commerce, e-business, business-to-business Web strategies and bandwidth expansion. E-commerce involves transactional exchanges between buyers and sellers on the Internet. These sales, conducted at Internet speed, enabled products to be sold to a wider audience, whether business to business (B2B), business to consumer (B2C) or consumer to consumer (C2C). E-business is defined by Gartner as "any net-enabled business activity that transforms internal and external relationships to create value and exploit market opportunities driven by new rules of the connected economy." E-services also started to replace traditional in-house applications, with application service providers (ASPs) hosting the application software and technical infrastructure, much like time-sharing in the seventies. Trading exchanges and supply chain management allowed enterprises to reduce expenses and increase profitability, and a commerce chain (integrated supply and service) has the potential to become the optimal competitive unit in the e-economy. The enterprise knowledge portal combined the benefits of enterprise information portals and knowledge management by integrating access to expertise and embedding application functionality. The shift in focus from internal processes to customer centricity across the entire enterprise became mission-critical; customer data integration is essential for a unified view of the customer and for successful business. Storage strategies that include SAN, NAS and near-line storage started to meet the requirements for the access and storage of data. XML gained wide acceptance as the new infrastructure for exchanging data between internal applications and between organizations over the Internet. XML provides an easy-to-use, common data vehicle so trading partners can quickly build electronic business relationships and easily integrate transactions with their internal production systems; XML is to this decade what EDI was to past decades. To truly drive customer profitability, Web companies will need to look beyond current personalization tools to new scalable systems that allow prompt analysis of massive amounts of clickstream data. All companies that collect and use customer data must understand that careful consideration of privacy issues will lead to increased customer satisfaction.
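
As a small illustration of XML as that common data vehicle, the sketch below parses a hypothetical purchase order using Python's standard library; the order/item structure is an invented example, not a standard of the period.

```python
# Parse a trading partner's XML order without the pre-negotiated
# fixed-width layout that EDI typically required.
import xml.etree.ElementTree as ET

incoming = """
<order id="1001" partner="ACME">
  <item sku="A-100" qty="12" price="9.95"/>
  <item sku="B-205" qty="3" price="24.50"/>
</order>
"""

root = ET.fromstring(incoming)
total = sum(int(i.get("qty")) * float(i.get("price"))
            for i in root.findall("item"))
print(root.get("partner"), round(total, 2))  # ACME 192.9
```

Because the structure is self-describing, either partner can extend the document with new elements or attributes without breaking the other's parser, one reason XML displaced EDI-style fixed formats for many exchanges.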

As we focus on the next 10 years, IT must help identify and create new business opportunities, extend current business models to the Web and help improve the quality of customer service. Data management professionals will be required to be driven by business requirements, not software packages; to portray problems, solutions, issues and opportunities in business terms, not technical terms; and to provide cost/benefit analysis and end-user impacts for all projects. Remember, technology is a tool to support the business, not the end result. IT must make it easy for the CIO to sell technical ideas to the business owners. Business analysts involved in the day-to-day activities of rapidly restructuring markets must be able to reset the sequence of process flows without requiring programming intervention by IT. Analysts must put themselves in the customers' shoes: if a decision or action makes it more difficult for customers to transact business, the analyst should rework the solution. In this era of e-everything, how you acquire, structure, access and utilize your data will be a key factor in your ability to be competitive and increase profits. In a declining economy, it is more critical than ever for enterprises to use data warehousing to determine the right course of action. Will e-business be able to transcend the development of customer relationships where trust and respect were built by human interaction? If anything provides a roadblock to the vision of global capitalism, it will be the traditional communications issues and problems that are prevalent in most corporations today. The journey is just beginning. The technology developed over the last 10 years pales in comparison to the accomplishments we will witness in the future. Watch the pages of DM Review for even more exciting developments in the years to come.
