Don’t believe rumors that the mainframe, the historic powerhouse for corporate computing and data warehousing, is too old to compete in today’s big, fast and pervasive business intelligence (BI) world. Criticisms that this workhorse is too costly, too complex and incapable of handling comprehensive BI systems are nonsense. 

At age 40, the mainframe may be old, but it’s not dead. In fact, maturing BI environments, with their emphasis on sophisticated analytic capabilities, now require scale, security and performance - all well-documented advantages of the mainframe platform. Combining sophisticated BI, statistical and data mining capabilities with the mainframe’s proven platform may give businesses just what they need to reach a new level of competitive competence.


For the many companies that must improve their competitive stance through advanced analytics, this technology is appealing. In our first installment (see the November issue of DM Review), we began debunking the top 10 myths surrounding the death of “old technology” mainframes in the BI environment. We countered claims about the mainframe’s high cost, predictive analytics shortcomings and administrative complexities with facts.


The truth is that the mainframe’s total cost is lower than that of other environments, given the cost reductions achieved by centralizing data. It is designed as a balanced system optimized for a mixed workload, making it capable of handling predictive analytics applications alongside other work. And it minimizes administration costs through extensive self-optimizing capabilities.


Myth IV


Mainframe BI solutions are more complex than server-based solutions. In reality, the complexity of BI solutions is driven mostly by their scope and magnitude. The workload is often very intensive, but with knowledge of the workload characteristics, any skilled mainframe administrator can plan the capacity needed to handle current and future requirements. The ease of mainframe administration will be further addressed under Myth 7 in the next article in this series (January 2009 DM Review).


That said, there are BI offerings that simplify users’ ability to access and leverage analytic capabilities across an organization, just as they would any other data source, providing targeted interfaces that align with individual skill levels to deliver intelligence in the right business context. Others have made it easier to increase the capacity of the BI environment as data volumes grow - simplifying one of the more difficult and complex aspects of that environment. On the mainframe, data capacity can be increased without expanding to additional servers to accommodate data growth.


Another source of BI complexity is the constantly changing environment - hardware and software components are continually upgraded to newer versions. These updates can cause significant technological headaches as administrators try to ensure the compatibility of the various components. Within the mainframe environment, system and software upgrades are traditionally backward compatible. Not only is this a significant factor in reducing the costs of moving from one release to another, it also significantly reduces the complexity of such a move.


Another complexity killer is having the operational and BI environments reside on the same platform. As noted in Myth 1, overall processing costs are reduced when data is centralized on the mainframe. Moreover, in an existing mainframe environment, many of the initial costs for deploying a new solution have already been paid. Using a central server such as the mainframe generates additional savings through reduced network costs, lower reserve processing power and storage requirements, and consolidated security and disaster recovery approaches. Reduced complexity and cost, shared processes, tools and procedures, and streamlined compliance and security are all benefits of this co-residence. The reduced administrative effort in this environment lessens both complexity and overall cost.


Myth V


Mainframe solutions are too inflexible to handle BI applications or respond quickly to changing requirements. Inflexibility, when it exists, usually does not stem from the computing platform. The mainframe environment provides extremely high availability and has proven stable for mission-critical applications. Its data integration capabilities are based on well-thought-out industry data models created for verticals such as banking and retail. These models give implementers a productivity boost and help ensure that most, if not all, of the attributes necessary to support BI have been included and documented. Having a thorough, well-documented set of models to implement from certainly reduces the need to constantly change the ultimate database schema.


Popular hub-and-spoke architectures like the Corporate Information Factory can be fully deployed on a mainframe environment. The data warehouse hub itself can take full advantage of the mainframe database management system and its optimization prowess. An agile staff with responsive processes can deploy data marts quickly due to their co-resident status with the warehouse. BI implementers can structure them so that the organization can quickly adapt them to meet changing business needs. The mainframe also offers sophisticated virtualization capabilities that:

  • Share processor memory, input/output (I/O) and network components among multiple operating environments. These capabilities allow administrators to quickly isolate workloads, share resources among workloads and enable communication for workloads internally with an in-memory TCP/IP network.
  • Simplify IT infrastructure by supplying management tools for operation, maintenance and accounting.
  • Create, provision, deploy and manage virtual servers for optimum performance and flexibility.

In addition to these capabilities, the mainframe offers industry-leading development and data acquisition technologies that meet requirements for data security and availability, encryption, auditability of all data acquisition processes and timeliness of data loads. These ensure that changes to existing BI capabilities, or the implementation of new ones, can be handled expeditiously.


Myth VI


Releasing a BI solution into production in a mainframe environment is excessively laborious. The mainframe platform has all the necessary tools, techniques and technologies to create a sophisticated and easily sustainable BI environment. Most of the difficulties in releasing BI applications into production come from the lack of a rollout plan with processes that ensure all the moving parts fit seamlessly. Production release processes need to reflect the importance and complexity of the application.


Many of the steps in the release process, such as testing and certifying the application, are entirely independent of the platform. The robust mainframe environment supports rapid migration from development to test to production environments. Once these processes are established, their complexity reflects the rigor required to ensure a problem-free release of the BI application into production - not the technological platform being used. Other production support mechanisms discussed in the previous myths article include substantial support for performance, continuous availability, architecture and ease of use.


In the January issue of DM Review we will address myths regarding the technology currency of the mainframe, its minimal integration support, the perceived lack of mainframe resources and its performance capabilities.
