Over the past few years we’ve witnessed the rise and fall of big data. The Gartner hype cycle tells that story well: we’re finally past the immense inflation of big data, and in fact Gartner removed the term from its hype cycle entirely in 2015. Today, however, it’s not hype to say that data volumes are only moving in one direction: up.


In terms of data volume growth, not a single industry has escaped its impact. New data sources come online daily, and as this explosion in data outpaces Moore’s Law, its effects on our lives grow ever more pronounced. It’s clear this trend is not going away; it’s only going to intensify. And it isn’t just a growth problem: data complexity is also on the rise.

Not long ago, if you were an in-market customer seeking a vendor and had connected with either a business intelligence (BI) company or a services provider, there’s a chance you would have been told a story like this: the sky is falling, and if you don’t buy a Hadoop cluster, hire a busload of consultants, and pay for a multi-million-dollar engagement, your competition is going to gobble up your market share.

What type of results did that experience typically yield? Not all engagements are created equal, and while some businesses succeeded, more often than not they failed for a variety of reasons. From what I’ve seen in the field, much of the blame rests on expectations set too high by hyperbolic claims.

Because of this, much of the truth about traditional BI companies has come to light. Rather than relying on the latest techniques to collect, prepare, make sense of, and visualize data, organizations are still relying on a cadre of professionals to manually code solutions in SQL.

While this may have been an appropriate way to tackle a data bottleneck in the past, in today’s tumultuous environment it is no longer effective.

This is due to three primary problems:

1. It is too expensive. Taking on a difficult data problem the old-fashioned way requires a significant investment to pay for the professionals doing all of the manual coding. And don’t forget the on-site visits that incur travel costs, both during the consultative phase and again when something breaks; it’s only a matter of time, and it will.

2. This approach cannot scale as data growth and complexity continue to accelerate. Two things that will remain constant are data growth and increasing data complexity. What took a team of 10 in 2015 may require a team of 20 in 2016. That’s not even factoring in how data from all-new sources needs to be integrated, which neatly brings me to the next problem.

3. Time-to-value takes too long in the agile environment today’s businesses face. Remember, new data sources come online daily and must be integrated into a system before they deliver value. Going about this the traditional way means at least a couple of months of manual coding for a handful of data sources, if you’re lucky.

Clearly, the days of traditional BI solutions are numbered. If the three reasons above didn’t sway you, here’s one more.

Access to the best approach for solving today’s growing data problem is getting easier. Artificial intelligence, and more specifically its subset machine learning, has gained significant traction in recent years, and more professionals are now putting it to use to surface better insights for today’s businesses.

While big data may have garnered big headlines, it left a lot to be desired. Machine learning, on the other hand, has yet to be given the recognition it deserves. Machine learning is not a marketing fairytale like big data; it is rooted in computer science and delivers results when used correctly.

(About the author: Katrin Ribant is the co-founder and chief solutions officer at Datorama, a marketing analytics innovator. Katrin lives at the intersection of data, digital and marketing to help brands and agencies connect data to real-time business insights and actions. Katrin was previously EVP of data platforms at Havas Digital where she led the development effort for the industry’s first data management platform, Artemis. Fluent in seven languages, Katrin holds degrees in psychology, neuropsychology, and neurolinguistics and brings a wide perspective to data analytics. When Katrin is not helping Datorama’s global customers solve their biggest marketing data challenges, she says she can be found walking NYC’s busy streets scouting for street art.)
