Mark Twain once wrote that a report of his death was an exaggeration. The same is true of claims that the mainframe is a near-death technology in the mission-critical world of today's business intelligence (BI) applications. Conventional wisdom says the mainframe - the "powerhouse" of corporate computing - is simply too costly, too complex and incapable of supporting a comprehensive BI system. Not so.

A critical look at the true cost, complexity and capability of the mainframe - home to 70 percent of the world's critical transactional data - reveals a competitive BI platform. Countless innovations make the mainframe a reliable, cost-effective, large-scale platform capable of satisfying core applications and BI needs. Far from being dead, the mainframe is a valuable asset to a company wanting to implement BI and analytic applications.

Yet myths persist about the mainframe's BI capability. The intent of this three-part series is to debunk the top 10 mainframe myths and illustrate why current BI solutions can be successfully deployed on a mainframe. After all, when choosing where to place BI applications, companies should weigh all their options and consider application requirements two or three years down the road. If they make decisions based on misconceptions about the relative cost, complexity and capability of mainframe computing environments, they might overlook the solution that best fits their needs.

BI has come a long way from the early days of simple reports and back-office statisticians. Today's BI is in the boardroom and spread throughout the organization, as every part of the corporation demands decision support. This new BI has ramifications not only for BI architectures, but also for the technologies used to support these environments. There is tremendous pressure on the BI environment as terabytes of data become accessible, response times approach those of operational systems and globally dispersed users - both sophisticated and novice - produce everything from simple queries to complex models.

Beginning in the 1990s, distributed servers were used increasingly throughout the enterprise to handle departmental and other workloads. Distributed systems were particularly attractive for BI workloads as the server architectures matured and the software environment developed, including improved operating systems, database management systems and data access and delivery tools. In time, server technology evolved across all platforms. As distributed systems continue to emulate more of the mainframe's unique functions, such as partitioning capabilities, virtualization technologies and workload management controls, the mainframe also has matured, supporting more of the software vendors and offerings that drive the BI market today.

Why does the reliable mainframe have such a bad reputation for BI capability? Like Twain, it has been misrepresented.

Myth I

Mainframe total cost of ownership (TCO) is too high. Deploying any new environment can be expensive, regardless of platform choice. Hardware and software expenses are only part of a solution's overall cost. When using price as a criterion for selecting a BI platform, a true comparison requires considering TCO. For any application environment, total cost has many components, including labor, hardware, software and electricity - and the most expensive component of virtually any solution is the staff required to support the system.

In a distributed server environment, costs go up linearly with additional workload. Adding capacity means adding servers. Each additional server increases the human resources needed to manage and maintain the environment. In the mixed workload mainframe environment, initial hardware costs are higher, but the per-unit cost of incremental capacity decreases as the total workload grows. With the mainframe, incremental capacity can often be added without increased staffing to manage and maintain the environment. In an existing mainframe environment, many of the initial costs for deploying a new solution already have been paid. This makes incremental costs associated with adding BI capabilities much lower than those for a new environment. Creating a data warehouse from data that may already be housed on the mainframe and adding a user tool gives users immediate access to valuable BI capabilities.
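
To see why the cost curves diverge, consider the toy model below. The figures are invented purely for illustration - they are not vendor pricing - but they capture the pattern just described: distributed costs grow linearly because each capacity step adds a server plus the staff to manage it, while the mainframe pairs a higher entry cost with incremental capacity whose per-unit cost declines as the total workload grows.

```python
# Toy TCO model (illustrative numbers only, not vendor pricing).

def distributed_tco(units: int, server_cost: float = 50_000,
                    admin_cost_per_server: float = 30_000) -> float:
    """Each unit of capacity adds a server and its share of admin labor,
    so cost scales linearly with workload."""
    return units * (server_cost + admin_cost_per_server)

def mainframe_tco(units: int, base_cost: float = 400_000,
                  first_unit_cost: float = 40_000,
                  decay: float = 0.85) -> float:
    """Fixed platform and staffing cost plus incremental capacity that
    gets cheaper (modeled as a geometric decline) as workload grows."""
    incremental = sum(first_unit_cost * decay**i for i in range(units))
    return base_cost + incremental

for units in (5, 10, 20, 40):
    print(f"{units:>3} capacity units: "
          f"distributed ${distributed_tco(units):>12,.0f} vs. "
          f"mainframe ${mainframe_tco(units):>12,.0f}")
```

Under these made-up parameters, the distributed approach is cheaper at small scale and the mainframe overtakes it as the workload grows - exactly the crossover the TCO argument turns on.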

Further, by offering dedicated specialty processors, mainframe vendors have recognized the need to target capacity at specific workloads. These processors provide a high-speed engine that reduces overall processing costs when data is centralized on the mainframe. The economy of this approach helps break down the walls between transactional data stores and BI, enterprise resource planning (ERP) and customer relationship management (CRM) applications. It also minimizes the need to maintain duplicate copies of data across a pool of discrete systems while providing high levels of security for critical corporate data. By reducing the need for multiple databases and consolidating applications onto the mainframe, the platform's inherent strengths are leveraged to manage the concurrent sharing of data by batch, online transaction processing (OLTP) and online analytical processing (OLAP) applications.

A sound IT architecture involves separate environments for development, quality assurance and production workloads. In a distributed environment, typically this is accomplished with separate servers, each requiring a redundant copy of the operating system, database management system and application software, data and utilities. There is cost associated with each copy, and copies must be synchronized. Often, production problems are encountered because of differences between the production environment and the development or quality assurance environments.

The physical movement of data is another costly challenge. With approximately 70 percent of the world's critical transactional data already residing on the mainframe, moving that data off the centralized server carries substantial costs compared with leveraging the single copy in place. Using a central server, such as a mainframe, eliminates transporting data to another platform, thus reducing audit and control complexities and, more importantly, security costs. Eliminating an extra server for housing the data warehouse further reduces costs by avoiding duplication of the operating system, database management system and other software. For companies with significant workloads, the mainframe clearly offers an alternative with a lower cost of ownership than a distributed environment.

Myth II

Predictive analytic applications are not available on the mainframe. IDC defines advanced analytics as software that includes data mining and statistical software (previously called technical data analysis). It uses technologies such as neural networks, rule induction and clustering, among others, to discover relationships in data and make predictions that are hidden, not apparent, or too complex to be extracted using query, reporting and multidimensional analysis software. This market also includes technical, econometric and other mathematics-specific software that provides libraries of statistical algorithms and tests for analyzing data.1
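
To make this concrete, the short sketch below applies one of the techniques IDC names - clustering - to synthetic data. The data, the scikit-learn library and the segment interpretation are all illustrative choices for this example, not a claim about any particular mainframe analytics product.

```python
# A minimal clustering sketch: k-means discovers customer segments that
# would not fall out of simple query or reporting output.
# Synthetic data; scikit-learn is one of many possible libraries.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Pretend these are (annual_spend, visits_per_month) for 300 customers
# drawn from three latent segments.
customers = np.vstack([
    rng.normal(loc=(200, 2), scale=(40, 0.5), size=(100, 2)),
    rng.normal(loc=(800, 6), scale=(90, 1.0), size=(100, 2)),
    rng.normal(loc=(350, 12), scale=(60, 1.5), size=(100, 2)),
])

model = KMeans(n_clusters=3, n_init=10, random_state=0).fit(customers)
for label, center in enumerate(model.cluster_centers_):
    size = int(np.sum(model.labels_ == label))
    print(f"segment {label}: {size} customers, "
          f"avg spend ${center[0]:,.0f}, {center[1]:.1f} visits/month")
```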

BI workloads have developed over time, requiring significant processing power, memory and input/output (I/O) bandwidth. The mainframe is unique in the market because it is designed as a balanced system optimized for mixed workloads; that is, its cache size and structure, internal bandwidth, and I/O structure and bandwidth are more generously provisioned relative to CPU speed than on other platforms. As a result, its relative capacity advantage is greater for mixed workloads than for single, dedicated applications.

The mainframe has decades of proven processing performance and undisputed reliability. It also supports a number of databases with features specifically relevant to BI activities. BI software offerings also leverage the latest technologies through an intelligent client or Java client, which makes the server platform immaterial to the end user. The mainframe remains the workhorse while business users interface through their preferred tools and techniques.
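
The sketch below illustrates that platform transparency. It queries a warehouse through a standard ODBC driver from Python; the DSN, credentials, table and column names are hypothetical placeholders. Nothing in the client code reveals - or cares - whether the database behind the DSN runs on a mainframe or a distributed server, and a Java or intelligent-client front end would be equally platform-blind.

```python
# Illustrative only: the client speaks a standard interface (ODBC here),
# so the server platform behind the DSN is immaterial to this code.
# "BI_WAREHOUSE", the credentials and the schema are hypothetical.
import pyodbc

conn = pyodbc.connect("DSN=BI_WAREHOUSE;UID=report_user;PWD=secret")
cursor = conn.cursor()
cursor.execute(
    "SELECT region, SUM(revenue) AS total_revenue "
    "FROM sales_fact WHERE fiscal_year = ? "
    "GROUP BY region ORDER BY total_revenue DESC",
    2007,
)
for region, total in cursor.fetchall():
    print(f"{region}: {total:,.2f}")
conn.close()
```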

Myth III

Mainframe administration is more costly and complex. In fact, the goal in the mainframe environment is to minimize administration costs through extensive self-optimizing capabilities. Mainframe administration activities already exist for the key operational applications running in that environment: proven support is in place for disaster recovery, security, regulatory compliance, performance management and application management. The incremental addition of a BI workload does not substantially add to the existing administration workload. This alone substantially reduces the overall cost of a mainframe BI environment compared to separately managed, disparate BI environments.

The integration of BI solutions into the operational workflow increases the visibility of planned and unplanned outages. There are two dimensions to this issue. First is the impact of a failure on work disruption. The mainframe's reliability is widely recognized; it continues to set the bar against which other environments are compared. Second is recovery, which, thanks to the mainframe's reliability and growth path, is not needed as often as in other environments. Recovery entails more than getting the platform and application systems up and running: the database also must be restored, tested to ensure that it is not corrupted and synchronized to reflect business activities that occurred while it was out of service.

The mainframe's innovative hardware features and operating system recovery capabilities have strengthened over its history. Downtime, in the rare instances that it occurs, is often reduced to seconds, and component failures are often invisible to users.

Anticipating future workload is very difficult. In the BI environment, this difficulty is compounded by the unpredictability of growth - which brings demands for business capabilities that were not anticipated during development - and by an increasing tendency to move from a strategic BI environment to an operational one.

In the distributed server environment, companies may need to install entirely new systems, either when a system is configured to its maximum or when server technology has advanced to the point that the original system can no longer be upgraded. At that point, distributed server replacement can be very complex, and the applications, whether a mission-critical OLTP system or a BI environment, suffer some downtime during conversion. Thorough testing also is critical to minimize the risk of failure.

By contrast, mainframe capacity increases can be much simpler: vendor upgrade services can activate additional capacity that already resides on the initially installed system.

The mainframe has BI muscle with some advantages over other platforms. Still not convinced? Next month, we'll topple myths regarding the mainframe's lack of data integration support and BI-savvy resources as well as its supposed weaknesses compared with newer solutions.

Reference:

  1. Dan Vesset and Brian McDonough. "Competitive Analysis." IDC, June 2007.
