The past was a blissfully simpler time, when the full digital data supply chain could be plotted from left to right with little deviation from a straight line. This was when electronic operational data was predominantly the structured kind.
Unstructured information stayed on paper (where it belonged), and automated flows were facilitated by A2A and B2B integration models.
Of course, over time, entropy plays its part. The once-linear work of integrating process workflows grew more dynamic and thus more complicated, and business infrastructures followed suit. Managed File Transfer (MFT) systems developed organically, point solution by point solution, leaving companies struggling to cope with emergent integration demands placed on the shaky foundation of legacy technologies.
This is why the current pace of innovation is forcing some companies into a guessing game around two questions:
What type of scenario will generate the next truly game-changing shift in the market?
What is the right iteration of integration technologies that can handle that shift?
Still, innovative enterprises are doing what innovators have always done: building a vision of what is yet to come. And the best guess, thus far, is to work toward an integration framework that addresses the escalating scope of integration demands, from systematic processes to citizen empowerment.
Dealing with the Frequency of Change
To get ahead of the curve, some organizations are attempting to plan further out by betting on only certain trends. That can prove a risky roll of the dice, however: a gamble that current investments can handle future information models even as today's models rapidly increase in complexity.
In fact, development is accelerating, and the frequency with which integration scenarios change has become ever more challenging for most organizations. Data is harder to manage not only because of bigger volumes, greater variety, and rising demands for velocity, but also because infrastructures now extend beyond the ground floor.
In addition to foundational integration middleware layers, the cloud is certainly a component of most modern data integration models. However, due to some combination of security, compliance, operational, and cost concerns, most businesses are avoiding an all-or-nothing approach to cloud adoption.
Instead of adding separate technologies with a "Let's make this work, whatever the cost" mentality, enterprises are attempting to intelligently connect the dots by opting for exploratory or partial migration. A unified, single-platform approach to integration reconciles the separation of cloud and on-premise infrastructural components. And by thinking about solving integration problems with solutions rather than point products, businesses are approaching the issue with a next-generation mindset.
Everything and the Kitchen Sink
Beyond the cloud is the everything-and-the-kitchen-sink world of Big Data. Despite the hype, realizing downstream value from Big Data remains a next-generation use case for many companies. Even so, the methodologies for operationalizing Big Data are widely accepted. Big Data patterns, for instance, are typically schema-on-read (favoring ELT over ETL), so organizations are looking for data integration tools that minimize informational degradation along the way.
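The schema-on-read idea can be sketched in a few lines: raw records are loaded untouched (the "EL" in ELT), and structure is imposed only when the data is read for analysis (the "T"). This is a minimal, illustrative sketch; the in-memory `raw_store` list and the function names are assumptions standing in for a distributed store such as HDFS or an object store.

```python
import json

# Schema-on-read sketch: raw records are stored as-is; a schema is
# applied only at read time, so nothing is lost or rejected at load.
raw_store = []  # illustrative stand-in for a distributed Big Data store

def ingest(record_line: str) -> None:
    """Load raw data with no upfront validation or transformation."""
    raw_store.append(record_line)

def read_with_schema(schema: dict) -> list:
    """Apply a schema at read time: keep only the requested fields and
    coerce their types; unparseable records are skipped here, not at load."""
    rows = []
    for line in raw_store:
        try:
            obj = json.loads(line)
        except json.JSONDecodeError:
            continue  # malformed records survive ingestion; filtered on read
        rows.append({f: (cast(obj[f]) if f in obj else None)
                     for f, cast in schema.items()})
    return rows

# Ingest heterogeneous records exactly as they arrive.
ingest('{"id": "1", "amount": "19.99", "extra": true}')
ingest('{"id": "2", "amount": "5"}')
ingest('not json at all')

# Different consumers can read the same raw data through different schemas.
orders = read_with_schema({"id": int, "amount": float})
print(orders)  # [{'id': 1, 'amount': 19.99}, {'id': 2, 'amount': 5.0}]
```

Because no transformation happens at load time, the raw data keeps 100 percent of its original fidelity, and a new schema can be applied later without re-ingesting anything.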
Therefore, the integration platform must include the high-throughput transport, transformation, and ingestion capabilities necessary to handle Big Data volumes, while optimizing productivity on the human side through process automation and preserving the full fidelity of the data streamed into distributed Big Data storage systems.
Adapt or Die
It may be an understatement to say that many of the disparate enterprise systems assembled to handle today's comparatively simple lattice of connectivity simply won't stand up to the integration tasks that lie ahead. After all, anything that stays constant for long, refusing to move ahead, forms a standing wave in the whitewater of constant and swirling innovation. Richard Branson put it nicely: "A company that stands still will soon be forgotten."
Hence the need to start looking for that next-generation integration solution sooner rather than later.
With the rise of cloud architectures, mobile applications, and the kind of collaboration that drives creative business processes, progressive enterprises are looking to do more than span the bedrock of traditional and operational integration layers. Eventually, Big Data must be strategically folded into the integration sphere.
(About the author: John Thielens is the chief technology officer and data scientist at Cleo, a maker of enterprise data integration, managed file transfer and Big Data Gateway solutions. He can be reached at firstname.lastname@example.org).