The IT environment within organizations continues to change rapidly, and old rules no longer apply. Timelines are shrinking, projects are multiplying, and tasks are becoming more complex. Plentiful and affordable IT resources have led to extremely high business expectations. At the same time, IT departments have not grown enough to keep pace with those expectations.
For these reasons, organizations are struggling with ways to manage the vast amounts of incoming data in order to derive meaningful insight from it all -- insight that drives sound business strategies.
The challenges in doing so are clear: gaining visibility into data sources and dependencies, unraveling data complexities, and detecting and diagnosing problems. It all adds up to making the vast amounts of available data more agile -- at a time when hyper-growth in production clashes with the hard fact that time-to-market is an essential element of modern business success.
Old rules made new for managing change.
There was a time when IT change was easily managed through larger staffs, bigger budgets, more training, and newer products and technologies. Harold Leavitt, the famous management theorist, believed at that time that these strategies were sufficient. His “Leavitt Diamond” was a blueprint for organizational change specifying Structure, Tasks, Technology, and People as key factors. But he also strongly stated that it was the interrelationships between these factors that were most critical. If all factors are not able to move simultaneously, organizational change is doomed to failure.
Leavitt’s thinking can certainly be applied to today’s IT realm. While some current change factors may be unique to IT, there is no doubt that a comprehensive approach is essential. Essential, too, is an approach that focuses on the interrelationships of factors (today, those factors are represented by data) to successfully manage the pace of change.
Focusing on inter-relationships.
Today, IT must think in terms of the interrelationships of data and data sources, and how to make those relationships more agile than ever. Infrastructure is no longer siloed; hardware is both physical and virtual, and data must move fluidly within it all. Big Data strives to make sense of all of this movement, by accessing hundreds, even thousands of sources for even the simplest of functions.
In order for this all-important data agility to take place, the modern IT organization must also be highly agile. Workload automation, as a core tactical component, can help organizations achieve this goal in the most immediate and cost-effective terms.
Modern automation improves orchestration.
Workload automation solutions, of course, have been around for years. What makes modern solutions so well suited to increasing data agility and reducing the cost of constant change is how far they have advanced: they are now sophisticated platforms that can comprehensively manage and coordinate virtually every aspect of IT, including the all-important interrelationships of data.
Why the interest in workload automation? Automation improvements have been shown to cut the time to build and deploy processing tasks in half. In an era when custom scripting, manual updates and outdated change management protocols expose organizations to business risk, the pre-set, templated and tested job steps in intelligent automation systems can not only reduce coding time, but also provide IT with reliable, proven logic.
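The idea of replacing ad hoc scripts with pre-tested, reusable job steps can be sketched in Python. This is a minimal illustration, not any vendor's API; the `JobStep` class, its retry behavior, and the step names are all hypothetical:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class JobStep:
    """A pre-tested, reusable unit of work with built-in retry logic."""
    name: str
    action: Callable[[], None]
    retries: int = 2  # retry count baked into the template, not rewritten per script

    def run(self) -> bool:
        # Attempt the action, retrying on failure up to self.retries times.
        for _attempt in range(self.retries + 1):
            try:
                self.action()
                return True
            except Exception:
                continue
        return False

# Assembling a workflow from proven steps instead of ad hoc scripting:
log = []
steps = [
    JobStep("extract", lambda: log.append("extracted")),
    JobStep("transform", lambda: log.append("transformed")),
    JobStep("load", lambda: log.append("loaded")),
]
for step in steps:
    if not step.run():
        break  # halt the workflow on an unrecoverable step failure
```

The point of the sketch is that the error handling and sequencing live in tested, shared logic, so each new workflow reuses proven job steps rather than fresh custom code.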
Automation can also help organizations improve data insight with intelligent analytics and reporting designed to easily identify, monitor, and manage workflows and systems for faster, more reliable performance -- thereby getting the most out of data within agile environments. The improved visibility and reduced time for troubleshooting and issue resolution are invaluable in significantly increasing data’s -- and the IT department’s -- usefulness, focus, and ability to perform nimbly toward business goals.
Intelligent automation: key to data agility.
The trick to using automation for data agility is adopting a comprehensive, intelligent approach that crosses boundaries and makes fast inter-relationships possible.
Today, the average organization has anywhere from three to eight separate job schedulers or automation tools. Most of these come from one or more of three categories: applications; infrastructure, virtualization and grid platforms; and operating systems.
While it’s not necessary to abandon these schedulers altogether, one advantage of a comprehensive, cross-platform workload automation solution is that it would allow IT to develop unified job strategies and master the execution of mission-critical jobs. It would also provide a single point for overseeing and tracking the multiple applications, operating systems, and mix of physical and virtual resources involved. That would facilitate the inter-relationships between systems and help data become an agile tool for delivering real insight.
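One way to picture a unified job strategy is as a single dependency graph spanning jobs that today live in separate schedulers. This is a minimal sketch using Python's standard-library topological sorter; the job names and their platform assignments are hypothetical:

```python
from graphlib import TopologicalSorter

# Hypothetical jobs from different scheduler categories, expressed in one
# graph. Each key maps a job to the set of jobs it must run after.
dependencies = {
    "erp_export":     set(),                    # application job
    "vm_snapshot":    set(),                    # virtualization/infrastructure job
    "etl_load":       {"erp_export"},           # OS-level batch job, needs the export
    "nightly_report": {"etl_load", "vm_snapshot"},
}

# A valid execution order that respects every cross-platform dependency.
order = list(TopologicalSorter(dependencies).static_order())
```

With all dependencies visible in one place, a single orchestrator can sequence mission-critical jobs across platforms instead of coordinating three to eight tools by hand.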
The true value of intelligent automation is its ability to see and manage the inter-relationships between the hundreds of nodes and processes in IT. The hard truth is that it’s no longer possible for today’s overextended IT staffs to define, create, and execute the functions of IT without automation.
The list of tasks, with all their complexities and demands, is simply too long. Intelligent workload automation platforms can provide true IT and data agility, enabling IT to do more with less.
(About the author: Benjamin Rosenberg is president at Advanced Systems Concepts, Inc.)