© 2019 SourceMedia. All rights reserved.

Everything done in enterprise information management should drive ROI

The ability to "sell" enterprise-scope information management is intrinsically connected to its value proposition in clear business terms. Data and technology people can be very bad at articulating this, focusing too much on how their method, product or service works instead of what business problems it can solve.

In the end, it is all about money... whether the cost of adoption is worth the ongoing value it delivers to the business.

The goal here will always be to have the minimum amount of "stuff" doing the maximum amount of "value-added things" at the "least cost." This has been a compelling argument for the big data and AI crowd in recent years, but the expense of these solutions in infrastructure and specialized skills, together with poor implementation, has in many ways tainted the message of how to achieve return on investment in the EIM and data insights marketplace... the perception in the business is that sorting data out is expensive and needs huge justification.

This creates a very challenging environment for enterprise information management innovators committed to the less-is-more approach to business value... such innovators need to get better at making their case stand out to business leaders... or the money munching will continue unabated and businesses will have no choice but to spend tens of millions of dollars on questionable results.

Yet, a few of us out there are challenging this status quo, making the argument for a more business foundational approach based on solid business ROI... slowly and steadily gaining support and followers in EIM circles, preparing not to be the "next big thing," but instead the foundation that makes the next big thing hype cycle... well... a thing of the past.

So how do we translate putting data on a diet into pounds and pence over a five-year period, so to speak? Well, this article will endeavor to do just that, based on a business with 50 operational data silos, 15 BI databases of varying complexity with accompanying data feeds, and 200 operational-focused integrations... the business turns over £500M a year and has an operating cost of £450M.

The goal is to justify adopting and executing one of these foundational approaches, at a TCO (including software, people, skills and everything else) of £2m per year, against the value it creates or releases from the existing data estate... so here we go.

Part 1 - The Cost of Excess

Storing and moving data costs money, and copies are less valuable than the original. In a large enterprise, this adds up to a significant cost. According to many studies, even the use of cloud makes very little difference to this.

An empty single database instance of fairly modest size is likely to cost in the region of £100K to provision and manage over a five-year period, before you even think about what you actually use it for and put data into it. You can expect each integration, on average, to have a similar five-year TCO.

So the headline cost of just storing and moving operational data for this example business is considerable, in the region of £5m per year, every year....or £100k per year per operational silo.

When we get into the big data and insight space, the increased need for specialised resources, software and infrastructure, data preparation and more complex transformations means annual TCO costs can be between 5 and 20 times higher. So let's assume that, out of the total of 15, 3 are big, 5 are moderate and 7 are small. This leads to approximately an additional £12m per year.

So just having and managing a data estate of this size will cost in the region of £17m per year, or £85m over five years. This cost covers only running the estate and any dedicated resources needed to manage the day-to-day running of its specific needs and technologies... included because those resources are data management roles bound to a specific data technology and would not otherwise be needed.
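To make the running-cost arithmetic explicit, here is a back-of-the-envelope sketch. The unit costs are the article's own illustrative estimates, not benchmarks, and the BI figure is taken as stated rather than derived:

```python
# Part 1: annual cost of simply running the example data estate.
# All unit costs are the article's illustrative estimates (GBP).

SILOS = 50                 # operational data silos
INTEGRATIONS = 200         # operational-focused integrations
ITEM_TCO_5YR = 100_000     # ~£100K five-year TCO per database or integration

operational_per_year = (SILOS + INTEGRATIONS) * ITEM_TCO_5YR / 5   # £5.0M
bi_per_year = 12_000_000   # article's estimate for the 15 BI databases

estate_per_year = operational_per_year + bi_per_year   # £17.0M
estate_5yr = estate_per_year * 5                       # £85.0M
saving_5yr = estate_5yr * 0.12                         # minimum 12% reduction

print(f"Estate: £{estate_per_year/1e6:.0f}M/yr, £{estate_5yr/1e6:.0f}M over 5 years")
print(f"12% reduction releases £{saving_5yr/1e6:.1f}M over 5 years")
```

The 12 percent reduction on the £85M five-year estate cost is where the £10.2M of Part 1 savings comes from.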

We've not even got to the impact of transformational change yet (we will cover this in a later section of this article).

Now, as you would expect the case for having all of this to have been justified at some point in terms of the business value it will generate or release, any reduction in the size or operating cost of this data estate is money straight on the bottom line. You would also expect any new addition to be at least cost neutral against the value it creates to justify its existence.

The foundational approach can justify and deliver its entire ROI in this area alone, specifically based on achieving the minimum of a 12 percent reduction on data estate operating costs via:

  • Reducing the need for integrations and databases that exist only to support cross-functional data consolidation, or to work around a lack of interoperability, rather than to deliver direct business operational value or insight.
  • Reducing the number of specialised resources and skills needed to advise on, manage and support technology-driven data methods, models and solutions that provide questionable business value.

Part 2 - The Cost of Confusion

An enterprise data landscape is complex and rife with conflicting viewpoints and tribal interpretations. A business operates better when all its people are able to use the business language they know and work with every day, while still communicating effectively with everybody else who might have differing perspectives.

You can try to solve any confusion in business communication by prescribing a single agreed meaning and use across all your data, or in other words, by forcing everybody to agree on and adopt the same business language. This has been attempted many times and has mostly fallen foul of the mortal enemies of such approaches... choice and consensus.

Another approach is to understand and support the tribal nature of business language through linguistic, semantic and structural interoperability methods, thus preserving all viewpoints simultaneously... which is where I sit.

Either approach, though, should have the goal of reducing confusion and misunderstanding across the business as a whole... because misunderstanding leads to bad assumptions, and they in turn lead to wasted effort and poor business choices... and poor choices cost money.

Putting a cost to this is an interesting challenge. Poor choices are very likely to have a negative effect on the business, possibly even directly reducing customer revenue. They manifest themselves in the data space through an increased effort to get to the right answer... which we could call the baseline, the position where cost is neutral or value is created from data.

Taking into account the operating cost of this example business (£450M pa), even a 1 percent reduction in operational cost or increase in revenue through minimizing confusion of business language and making better choices leads to a saving of £4.5M every year...and this is a very modest assessment of the scale of loss and/or missed opportunities here... but let us roll with it for now.

This gives us potentially another £22.5M of operational efficiency savings over the five years.
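The Part 2 figure is a single assumption applied to the stated operating cost, sketched here for completeness:

```python
# Part 2: value of reducing confusion, per the article's 1% assumption.
OPERATING_COST = 450_000_000           # annual operating cost (£450M)
annual_saving = OPERATING_COST * 0.01  # 1% efficiency gain = £4.5M per year
five_year_saving = annual_saving * 5   # £22.5M over five years
print(f"£{annual_saving/1e6:.1f}M/yr, £{five_year_saving/1e6:.1f}M over 5 years")
```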

Part 3 - The Cost of Technology Change

In the data world, change is constant. Businesses are bombarded with new technologies, methods, fads and hype cycles in a seemingly unending churn of reinvention. Adopting any new data technology comes with a multitude of costs... infrastructure, specific skills and time to adopt are just a few.

When such changes are result neutral, meaning they replace the old with the new but essentially do the same thing in a different way, the business case for that change is quite often overlooked... it is driven by a desire among technologists and data practitioners to get into the "next big thing," rather than by a more critical business lens of cost vs. value.

New technology and methods take time to embed themselves and create their promised value. Mistakes and poor implementation choices are made initially and then refined as skills are gained; this is a natural learning curve that all technological change systemically introduces to the data endeavor.

Also, the more specialized the new technology is to a specific data problem, the more items of data technology you need to cover the entire enterprise data management scope. Complexity and specialism are commonplace within data industry solutions and methods... and many solutions do not play nicely with each other.

We also need to consider the frequent and disconnected use of office productivity tools to manage the enterprise data landscape. As useful and accessible as these tools are, they create a fragmented, chaotic and disparate overall data management solution.

All this together creates a disparate set of data technologies, methods and tools, and a systemic data landscape interoperability issue. Choosing a more foundational and future-proof approach, however, can reduce this technology and method complexity footprint and also reduce the need to adopt every new invention as it hits the streets... choosing only the ones that uplift business value in real terms.

Attributing cost to this is based on the simplification and increased forward stability of the enterprise data management technology landscape... and therefore a reduced ongoing cost of enterprise data management activities.

Enterprise-class data solutions are not cheap; you would expect an annual cost in the region of £400K, and if you have a few of them, say five in total, each specializing in a specific area of data management, that soon adds up. That's £2M per year or £10M over five years.

So harmonizing them into a smaller number with greater foundational scope saves you money. Let us say we can remove three of them and replace them with a single solution at the same annual cost as one of them... that saves a further £800K per year, or £4M over five years, every time you can do this.
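The harmonization saving works out as follows; the £400K per-solution cost is the article's assumption:

```python
# Part 3: harmonising five specialised enterprise data solutions.
SOLUTION_COST = 400_000               # assumed annual cost per solution (GBP)
before = 5 * SOLUTION_COST            # £2.0M per year across five tools
after = (5 - 3 + 1) * SOLUTION_COST   # retire three, add one foundational replacement
annual_saving = before - after        # £800K per year
print(annual_saving, annual_saving * 5)  # 800000 4000000
```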

Part 4 - The Cost of Ignorance

In today's world of ever increasing data regulation and compliance, not knowing what data you have, where it lies, how it moves and how it is processed in detail is becoming a very risky position for a business to be in. Fines can run into the millions and the reputational damage can be severe.

According to a recent study by IBM, "Average Total Cost of a Data Breach Has Increased to £3 million – Global Study. The latest global study by IBM Security and Ponemon Institute found that the average total cost of a data breach is up 6.4 percent over the previous year and now costs businesses an average of £3 million each."

So, just having one of these events over the five-year period costs you £3M... let us assume this example business was unfortunate enough to suffer one.

A foundational approach would have greatly reduced the risk of this happening, so no breach... no cost.

At this point, a quick recap on where we are up to on the data debt tally... £39.7M of identified savings over five years... maybe investing £10M to save £29.7M does not look like quite as bad an option as you might first have thought, right?
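The running tally above can be checked by summing the four Parts so far:

```python
# Recap: the five-year data debt tally so far (figures in £M).
savings = {
    "Part 1 - excess (12% of the £85M estate cost)": 10.2,
    "Part 2 - confusion (1% of £450M opex, 5 years)": 22.5,
    "Part 3 - technology change (£800K/yr, 5 years)": 4.0,
    "Part 4 - ignorance (one avoided breach)": 3.0,
}
total = sum(savings.values())
print(f"Identified: £{total:.1f}M; net of a £10M investment: £{total - 10:.1f}M")
```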

Part 5 - The Cost of Apathy

So if we do nothing... the problem won't get any worse, right? Well, it certainly won't get any better. Apathy, especially by senior management, puts the technologists and vendors in control of the data endeavour... and we all know they love the next big shiny thing more than anything else... so the data technology churn will persist and costs will inevitably rise over time.

So let us adjust that total cost so far with an increment, year on year, to reflect continual technology demand and uplift, continued confusion and ever increasing data management overheads. £39.7M at an annual increase of, say, 2 percent year on year now becomes £41.3M.
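One way to reproduce that uplifted figure is to spread the £39.7M evenly across the five years and compound each year's share at 2 percent; the even spread is an assumption about how the article arrives at its number:

```python
# Part 5: apathy uplift - spread £39.7M evenly over five years and
# apply a 2% year-on-year increase to reflect rising costs.
base_total = 39.7e6
per_year = base_total / 5
uplifted = sum(per_year * 1.02 ** year for year in range(5))
print(f"£{uplifted / 1e6:.1f}M")  # ~£41.3M
```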

Adopting a more foundational approach puts the business back in charge, controlling future data technology uplift.

Part 6 - The Cost of Transformation

Programs of business and digital transformation are intrinsically linked with the enterprise data landscape. Knowing your starting point is key to establishing what must be changed and creating a smooth transition between what you have now and what your program is attempting to achieve.

If you do not have that foundational view of your current data landscape freely available and accessible to all transformation participants... they have to go looking for it, attempt to understand it and reconcile any gaps in knowledge. If you then do not keep your foundational view up to date, every new program will have to repeat this time-consuming and resource-heavy process.

Agreed, the process of keeping all this up to date comes with a cost... but it is well known that not having it leads to more cost in the long run. Unless you are not going to bother documenting it at all (which puts you at risk of non-compliance with a whole host of data regulations, and possibly that second data breach!), the general consensus is that documenting your enterprise data is better than not... and you would have done it in some way anyway, probably in architectural and design documents hidden away in your corporate document repository of choice.

To try to put a figure on this, let us remove the confusion angle, because this has been covered previously, and look instead at the effort taken to relearn what you have got and resolve issues within a program of change.

Certainly, my experience is that data investigation generally puts many stakeholders in a room many times to thrash out the data landscapes involved. Many of these will be bought-in consulting and systems integrator resources with a heavy daily price tag. With a foundation in place, it would simply be a case of identifying the impacted data and defining the changes. Without it, you have a much more significant job to do.

So let us say that every program requires one additional FTE (£600 p/d * 240 days) just dealing with this re-investigation and clarification activity and you spend four hours per week in conference calls and meetings to validate the results with 6 people participating (6 * £400 * 0.5 days * 52 weeks) in the discussion over a 1-year program timescale... and you have 10 ongoing transformations every year.

This comes to around £2M per year, or another £10M over the full five years, just spent re-investigating what you should already know... crazy, yes?
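The per-program arithmetic behind that figure, using the day rates and durations stated above:

```python
# Part 6: re-investigation cost per transformation program (GBP).
fte = 600 * 240                # one extra FTE at £600/day for 240 days = £144,000
meetings = 6 * 400 * 0.5 * 52  # six people, half a day a week at £400/day = £62,400
per_program = fte + meetings   # £206,400 per one-year program
annual = per_program * 10      # ten concurrent programs
print(f"£{annual / 1e6:.2f}M/yr, £{annual * 5 / 1e6:.1f}M over five years")
```

The exact five-year figure is nearer £10.3M; the article rounds down to £10M.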

The Final Total

The data debt total, the cost that represents money that does not need to be spent if you choose a foundational approach, now comes to a very significant £51.3M... or a net saving of £41.3M over five years if you spend £10M on a good foundational data fabric accessible to the entire business. And this saving is in a quite modestly sized business operation... just medium sized... as the size of the enterprise increases, you can pretty much scale this up proportionally, at least for most business industries.
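Checking the closing sum:

```python
# Final tally: uplifted running-cost debt plus transformation waste (£M).
debt = 41.3 + 10.0   # £51.3M avoidable spend over five years
investment = 10.0    # five years of the £2M-per-year foundational approach
print(f"£{debt:.1f}M debt, £{debt - investment:.1f}M net saving")
```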

So choose a foundational data approach and get control of your data debt. It may well be the best £10M you will ever invest.
