Even after 40 years, the mainframe isn't too old to deliver in the rapid-fire world of modern BI. Rumors to the contrary are among the myths we have unmasked as misinformed attempts to minimize the mainframe's inherent suitability for the task. In the light of day, few would sneer at cost-reducing centralization of data, ease of administration, industry-leading development and acquisition technologies, and a unique ability to manage mixed workloads with nimble reliability.
Companies looking for a platform for BI should consider the computing world's trusted ally, the old reliable mainframe. Home to a full 70 percent of the world's transactional data, the mainframe has a proven history of competitive costs, reliability, security and capability.
In earlier discussions, we looked at a number of misconceptions about the mainframe that suffused the IT world (November, December 2008, DM Review). These misconceptions led to an assumption that BI belongs on alternate, separate servers - an assumption that took root when BI solutions were delivered to departmental, nonproduction analysts who could work around the limited availability and data currency issues that separate environments fostered. Today, with BI integrated into the operational processes of the business, those restrictions are unacceptable and, as we rethink this trend, we must also rethink the value of the mainframe.
In this installment, let's put to rest the remaining mainframe myths, illustrating that corporate computing's workhorse is fully capable of keeping pace with changing BI requirements.
The mainframe is old and does not offer modern capabilities. Calling today's mainframe old technology implies that not much has changed in the mainframe since the introduction of distributed servers. Nothing could be further from the truth. Heavy investments in mainframe research and development have resulted in many new features. True, the mainframe had to catch up to newer server platforms in terms of Java and open programming, but investments in these areas over the last 15 years have enabled the mainframe to provide an open programming environment that fully supports Java and C++.
BI providers have worked to supply innovative offerings that take advantage of the mainframe platform's enhancements, ensuring synergy between software and hardware. The mainframe supports modern BI components, including a true real-time or low latency operational data store (ODS), high-speed data management technologies and high performance in mixed workloads.
The mainframe actually has been a trendsetter, providing innovations that have since been adopted in other environments. For example, partitioning was first developed on mainframes in the 1980s, and predecessors of workload management were first offered in the 1970s; these are relatively recent additions on competing platforms. Mainframes have remained competitive as packaged solutions (operating system and middleware) and have come into the BI market with a range of warehouse offerings. These offerings automate installation, migration and maintenance tasks, making them easier to perform and less costly overall.
As BI workloads integrate into business systems, a key consideration is the ability to effectively control the various workloads a business needs to run. The mainframe's capabilities provide for dynamic BI mixed workloads, BI and transactional workloads and a combination of data warehouse and data mart workloads on the same system. This kind of facility addresses varying levels of query concurrency, resource adjustments to exploit available capacity and work priorities, protection against "killer" queries, maintenance of consistent response times regardless of workload and dynamic resource allocation to react to loads. This is particularly important in a BI environment due to a wide range of query run times, types of queries and varying user sophistication.
Having a single centralized copy of data accessible to all users makes sense, but it can be a nightmare if you cannot control and differentiate between the critical work and less important queries in the system. Simple queries for operational support must execute quickly; queries for strategic analysis can tolerate a slower response, and complex analytical queries and data mining work can afford a slightly longer time to results. These fine points in managing workloads differentiate the mainframe from other server choices. Today, these differences are no longer just critical in OLTP environments; they are mandatory in BI environments as well.
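The differentiation described above - fast operational lookups ahead of strategic analysis, with mining work last - can be illustrated with a small sketch. This is a hypothetical Python illustration of the concept only; an actual mainframe workload manager is configured through installation-defined policies, not application code, and the service-class names here are invented for the example.

```python
import heapq

# Hypothetical service classes, lowest number = highest priority.
# Real workload managers define these in policy, not in code.
PRIORITY = {"operational": 0, "strategic": 1, "mining": 2}

class QueryDispatcher:
    """Toy dispatcher: always runs the most important pending query."""

    def __init__(self):
        self._heap = []
        self._seq = 0  # tie-breaker preserves arrival order within a class

    def submit(self, service_class, query):
        heapq.heappush(self._heap, (PRIORITY[service_class], self._seq, query))
        self._seq += 1

    def next_query(self):
        """Return the highest-priority pending query, or None if idle."""
        if not self._heap:
            return None
        return heapq.heappop(self._heap)[2]

d = QueryDispatcher()
d.submit("mining", "SELECT ... /* market-basket scan */")
d.submit("operational", "SELECT balance FROM accounts WHERE id = 42")
d.submit("strategic", "SELECT ... /* quarterly trend */")
print(d.next_query())  # the operational lookup runs first
```

Even though the long-running mining scan arrived first, the operational lookup is dispatched ahead of it - the behavior that keeps a "killer" query from monopolizing the system.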
The mainframe has an operations model that addresses security, resilience, workload and capacity management, storage management and business process integration. This model has proven to be nearly infallible. From a security perspective, the mainframe's OS has the highest government security rating. In today's business environment, this must be a key factor when considering servers to host your business's data, your most critical asset.
There is minimal data integration support for the mainframe. Advances in mainframe hardware and software technologies have optimized the OS platform for enterprise data integration and warehousing. The mainframe environment for BI leverages popular capabilities such as parallel processing without requiring any design changes, while simultaneously supporting batch and real-time operations. Both are important to the evolving BI requirements, particularly when integrating BI into operational processes.
The latencies associated with data integration and transfer also can greatly impact an enterprise's ability to use its BI to make better decisions. Much enterprise data is still hosted on mainframes. When raw data resides there, the ETL process is streamlined because the data does not need to be migrated to another platform. A single system designed to manage data integration, cleanup and transformation means the overall process is easier and less costly to manage. With the advent of operational BI and the prominent role of the operational data store, keeping all or most data resident on the mainframe means that a real-time ODS - the goal of many implementers of this technology - becomes feasible.
Few mainframe resources know BI. With an open programming model, many BI skills available in server environments are completely transferable to the mainframe environment with minimal additional training. For a company with an existing mainframe, the IT staff already has appropriate application programming and maintenance skills. (To ensure expertise into the future, it's essential to maintain a pool of mainframe-savvy IT professionals.) Educators at more than 300 schools are investigating or actively teaching mainframe technologies today, and the number of participants is growing continuously. Different skills are needed for work performed on the mainframe. For example:
- COBOL programmers are in demand. There may be a shortage of COBOL programmers within the U.S., but there is an ample supply in the offshore market. Even so, the shortage of COBOL skills is a concern for companies that must maintain COBOL-based systems. It is not a major issue for BI, however, because COBOL is rarely used as the programming language for new BI applications. Mainframes can use Linux and Java to further reduce any dependence on COBOL programmers, particularly in BI.
- There is a shortage of systems administrators, managers and operators, but it's not restricted to the mainframe environment. Job posting sites show more unfilled positions of this kind for the UNIX and Windows world than for the mainframe, and here again, the mainframe is adapting to the market. Mainframe management is being simplified and given a Windows-oriented, easy-to-use graphical user interface to enable lesser-skilled people to manage mainframes and to appeal to people who are accustomed to Windows.
The mainframe can't perform as well as newer solutions. Query performance is influenced by many factors, including the other work executing on the system, the disk environment, the database design and indexes. Mainframes have a demonstrated ability to support BI workloads. Too often, platform decisions are based on the execution time of a single query on a server. That is an unrealistic measurement, because a BI workload does not process queries serially; processing a mix of simple and complex queries is the typical requirement. A server's ability to handle a workload of mixed queries is a more appropriate and realistic way to evaluate a BI system. It is also an extremely challenging evaluation, because the complexity of the BI workload makes realistic tests difficult to run. To manage this complexity, the platform requires a workload management function, and the mainframe's is one of the most robust available.
Mainframes manage workloads quite efficiently by prioritizing queries according to their importance and performance requirements. For example, long-running scans may run at a lower priority while shorter queries get their results quickly without waiting for the longer-running ones to finish, so no single query monopolizes processing or memory resources. The mainframe can also differentiate between users, giving some higher priority than others.
New online utilities to simplify and speed common database management tasks are useful to BI administrators. These include the ability to set workload goals so that the mainframe will manage processing, memory and input/output resources available to the operating system image to achieve those goals. An administrator might set a goal to complete 80 percent of all queries within five minutes or less. If the goal is not achieved, the system generates a report to the administrator indicating recommended actions to meet the goal.
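A goal such as "complete 80 percent of all queries within five minutes" reduces to a simple percentile check against observed run times. The sketch below is a conceptual Python illustration of that arithmetic, not the mainframe's actual goal-management interface; the function name and sample timings are invented for the example.

```python
def goal_attained(elapsed_seconds, target_seconds=300, target_pct=80.0):
    """Did at least target_pct of queries finish within target_seconds?

    elapsed_seconds: list of observed query run times, in seconds.
    Returns (attained, actual_pct).
    """
    if not elapsed_seconds:
        return True, 100.0  # nothing ran, so the goal is trivially met
    within = sum(1 for t in elapsed_seconds if t <= target_seconds)
    pct = 100.0 * within / len(elapsed_seconds)
    return pct >= target_pct, pct

# Hypothetical run times for ten queries, in seconds.
times = [12, 45, 90, 310, 150, 600, 30, 240, 75, 200]
ok, pct = goal_attained(times)
print(ok, pct)  # 8 of 10 finished within 5 minutes -> True 80.0
```

When a check like this fails, the system described above would report the shortfall to the administrator along with recommended corrective actions.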
Finally, consider the mainframe's handling of unplanned outages and the very few circumstances requiring a reboot. Because of their legacy in the operational environment, these machines are extremely fault tolerant; many applications operate for years without outages. That level of availability is unique to the mainframe.
Should you consider the mainframe for your BI platform? Now that we've dispelled the myths surrounding this workhorse's ability to do business in the 21st century, your answer should be a resounding yes. Significant resources have gone into ensuring that the mainframe and its BI capabilities are modern, sophisticated and have the same capabilities found in alternative platforms.
Given that data volumes and workload processes are growing at amazing rates, it is remarkable that mainframe data center staffing levels have not changed significantly despite these increases. It is testament to mainframe strategy and philosophy that this environment continues to be easy to install, configure, service and also to be quite capable of handling BI analytics.
The mainframe platform has demonstrable reliability, scalability, security and excellent mixed workload performance capabilities needed to support traditional strategic and tactical as well as near real-time operational BI. Clearly, the mainframe isn't dead. Rather, in many ways it has been reborn. Decision-makers seeking BI solutions would be wise to investigate innovative analytics and BI applications for the mainframe platform.