For years we heard “IT is the business.” Today that refrain is quickly changing to “data is the business.” By now, we all know the value that big data can bring to the enterprise. In the 1990s, data mining and BI efforts were used mostly for after-the-fact reporting and problem solving. Today, the emphasis is on big data as a means of predicting and influencing the future, and the organizations that put big data to work will be more agile and competitive.

Big data analytics is one reason mainframe systems – fittingly nicknamed “Big Iron” – are making a resurgence. Since their inception in 1964, mainframe computers have been used by corporate and governmental organizations for critical applications, bulk data processing, ERP and transaction processing. Mainframe systems are unparalleled in their reliability, scalability and security, and mainframe data is a treasure trove of information and strategic insights.

Together these attributes make the mainframe an ideal platform for conducting data analytics. But there are several obstacles for organizations looking to leverage their mainframe assets for big data initiatives.

Mobile, Big Data Are Driving the Mainframe’s Longevity

In spite of some industry prognostications, the mainframe has a very real place in the future of enterprise computing. According to a recent CIO survey, 88 percent of respondents assert that the mainframe will be a key business asset over the next decade. In addition, 81 percent said that the mainframe is now running more new and different workloads than it was five years ago. What’s driving this?

The Mobile Explosion: According to Rubin Research, in 2004 the average mobile user executed one mobile transaction per day. Today, that number is 37, and by 2025 it will increase five-fold. Many organizations are struggling to keep up. The good news for existing mainframe users is that their system of record delivers more than enough horsepower. According to IBM research, the recently announced IBM z13 mainframe is capable of handling 100 times the volume of Cyber Monday’s transactions in a single day. Recent studies have also found that in many instances, mainframes are more cost-effective and generate more business revenue per infrastructure dollar than commodity servers, because the decline in commodity server prices has not kept pace with the growth in mobile computing workloads.

Data Analytics: The mainframe is ideally suited for today’s compute-intensive data analytics applications. Further data from IBM shows that the volume of mainframe transactions on any given day dwarfs the number of Google searches, Twitter tweets, YouTube views and Facebook likes combined. Beyond sheer computing muscle, the mainframe offers another important big data advantage: the ability to get as close to real-time transaction data as possible.

By conducting analytics so close to real-time transaction data, organizations are in a stronger position to unlock real-time, actionable insights. Consider, for example, a mobile commerce transaction being conducted on the mainframe. Real-time analytics can enable a retailer to accurately predict other products that may interest the consumer, right as the transaction is being processed. The retailer can then target the customer “in-the-moment” (with a follow-up email, for instance) when they’re most apt to buy.
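To make the retail example concrete, here is a toy Python sketch of one common recommendation approach – co-purchase counting – scored as a transaction is processed. The basket data, product names and function names are invented for illustration; this is not a description of any particular retailer’s or vendor’s system.

```python
from collections import Counter, defaultdict

# Toy historical baskets (invented for illustration): each inner list is one
# past purchase, given as a list of product IDs.
past_baskets = [
    ["phone", "case", "charger"],
    ["phone", "case"],
    ["phone", "charger", "earbuds"],
    ["laptop", "mouse"],
]

# Count how often each product was bought alongside every other product.
co_counts = defaultdict(Counter)
for basket in past_baskets:
    for item in basket:
        for other in basket:
            if other != item:
                co_counts[item][other] += 1

def recommend(current_item, k=2):
    """Return the k products most often co-purchased with current_item."""
    return [product for product, _ in co_counts[current_item].most_common(k)]

# As a "phone" transaction is processed, surface likely add-ons in the moment.
print(recommend("phone"))  # ['case', 'charger']
```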

Additionally, keeping resident transaction data on the mainframe for analysis reduces the risk and time involved in migrating data. That’s not to say that mainframe data should exist in a silo and never be moved. The maximum big data benefits result when organizations harness and analyze all their data – both within and outside the enterprise. So mainframe data may need to be analyzed in parallel with other data types, in a data lake model.

Challenges Remain

The mainframe undoubtedly has great potential in the big data arena, but there are significant challenges that need to be addressed in order to tap the full cost and performance advantages.

Cost: Monthly license charges (MLCs) can account for up to 30 percent of mainframe-related costs. MLCs are typically determined by the highest rolling four-hour average (R4HA) of mainframe CPU utilization across all applications on each logical partition. Mainframe costs can be controlled and even reduced if workloads are spread out in a way that minimizes collective peaks, which drive the billable average up.
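To make the R4HA mechanics concrete, here is a minimal Python sketch that computes the peak rolling four-hour average from utilization samples. It assumes five-minute, SMF-style MSU readings for a single LPAR; the sample values and function name are invented for illustration and do not reflect any vendor’s tooling.

```python
def peak_r4ha(msu_samples, interval_min=5):
    """Peak rolling four-hour average (R4HA) over MSU utilization samples.

    msu_samples: MSU readings taken every interval_min minutes for one LPAR.
    """
    window = (4 * 60) // interval_min          # samples per 4-hour window (48 at 5 min)
    if len(msu_samples) < window:
        raise ValueError("need at least four hours of samples")
    running = sum(msu_samples[:window])        # total of the first window
    peak = running
    for i in range(window, len(msu_samples)):  # slide the window one sample at a time
        running += msu_samples[i] - msu_samples[i - window]
        peak = max(peak, running)
    return peak / window

# Invented example: a two-hour 400 MSU burst in an otherwise quiet 100 MSU day.
quiet = [100] * 48                   # four hours at 100 MSU
spike = [100] * 24 + [400] * 24      # four hours ending in a two-hour burst
print(peak_r4ha(quiet + spike + quiet))  # 250.0 - the burst is averaged, but still billable
```

Even though the burst lasts only two hours, it lifts the billable four-hour average from 100 to 250 MSU, which is exactly why smoothing short peaks matters.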

Accessing the Data: As mentioned above, mainframe transaction data should be easily accessible for inclusion in larger big data initiatives. The problem is that the traditional mainframe environment is an arcane “green screen.” Newer generations of IT professionals are unfamiliar with it, which slows their ability to access and manipulate mainframe data.

Exacerbating these challenges is the fact that mainframe experts, largely Baby Boomers, are retiring in droves. Non-mainframe experts need guidance for working with the system, but fewer and fewer people will be around to ask. According to the above-mentioned survey, even as CIOs express optimism about their mainframes, a large majority (70 percent) are concerned about transferring mainframe knowledge, and 39 percent currently have no plans in place for addressing the looming mainframe skills shortage.

Mainframe ISVs Rise to the Challenge

IBM accomplished a huge feat with its new z13, calling it “the most sophisticated and powerful computer IBM has ever built.” But hardware is only half the equation; the software for managing mainframe systems must keep pace as well.

Today, mainframe ISVs are teaming up to offer integrated solutions enabling financially intelligent mainframe workload management. These solutions help mainframe users identify opportunities to move workloads around, minimizing collective peaks and lowering MLCs through proactive application tuning.
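The sketch below illustrates the peak-smoothing idea under the same invented assumptions: given an hourly online utilization profile and a deferrable batch job, a brute-force search finds the start hour that minimizes the combined peak. Real MLC tuning works on the R4HA rather than raw hourly peaks, and the profiles here are hypothetical.

```python
HOURS = 24

# Invented hourly MSU profiles: online work peaks during business hours.
online = [300 if 8 <= h < 18 else 120 for h in range(HOURS)]
batch = [150, 150, 150]  # a three-hour deferrable job consuming 150 MSU/hour

def combined_peak(start_hour):
    """Peak hourly MSU if the batch job starts at start_hour (wrapping past midnight)."""
    total = online[:]
    for offset, load in enumerate(batch):
        total[(start_hour + offset) % HOURS] += load
    return max(total)

best = min(range(HOURS), key=combined_peak)
worst = max(range(HOURS), key=combined_peak)
print(f"batch at {worst:02d}:00 -> peak {combined_peak(worst)} MSU")  # stacked on the online peak
print(f"batch at {best:02d}:00 -> peak {combined_peak(best)} MSU")   # shifted into the trough
```

In this toy profile, moving the batch run out of business hours cuts the combined peak from 450 to 300 MSU – the same billable capacity the online work already requires, so the batch job effectively runs for free.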

When it comes to creating a familiar “look and feel,” these ISVs are also modernizing the mainframe environment with Eclipse-based interfaces, replicating modern functions like copy and paste, and offering visualization tools for enterprise programs and data. This removes much of the complexity surrounding the environment and makes it easier for non-mainframe experts to access and manipulate mainframe applications.

Conclusion

IBM has billed the z13 as “the system that will tame big data,” and successful big data initiatives promise greater agility. But agility is incompatible with excessive IT costs and poor worker productivity. For mainframes to achieve their full potential as a platform for big data – and, more generally, a platform for the future – the twin issues of cost and data accessibility must be addressed. Fortunately, today’s mainframe delivers more agility on every front: lower costs, more productive IT workers, and user-friendly access to mainframe management tools that are continually refreshed in line with evolving user needs.

Dennis O'Flynn is VP of product development at Compuware.
