Man vs. Machine: Is the Big Data Battle Valid?

Published
  • April 10 2015, 11:39am EDT

Ever since businesses started embracing big data analytics, a debate has raged over whether it will replace domain experts in the long run. Machine-learning algorithms are without doubt superior in computing power: they crunch enormous volumes of data, discover hidden insights and enable faster, more transparent decision making. However, this in no way implies that domain expertise is becoming redundant. On the contrary, what we see happening is that domain expertise and big data analytics are complementing each other to make a transformational impact in multiple industries.

Let me elucidate this with a few instances from the telecom world where a collaborative man-machine approach has been quite successful in getting decisions right.

A communication service provider (CSP) client in Africa was trying to increase the uptake of its voice and data services among existing subscribers with a campaign designed to entice them to use more. While the campaign initially took off very well and generated higher revenue for the CSP, the response drastically declined at a certain time of the year. No existing data in the system could explain this phenomenon. When a domain expert looked into the conundrum, it turned out that this period coincided with vacation time for local residents: people spent most of their time with family, and interactions on the phone remained very limited. This factor could not have been uncovered without the knowledge and experience of a domain expert. One could argue that unlimited historical data and processing power might have revealed it, but that is not a possibility in many practical scenarios.

Healthy Dose of Skepticism

While machines have tremendous computational power and the ability to parse enormous amounts of historical data, even with all that power they can reach sweeping conclusions that are logically correct but out of touch with reality. It is important to approach data with an informed skepticism.

Take, for example, another scenario where a communication service provider is trying to grow its 4G user base. Segmentation algorithms recommend promoting 4G packs to all medium- and high-value subscribers currently using 3G services. Here again, domain specialists can add value by refining the segmentation criteria to exclude 3G dongle users from the campaign, as including them may impact the CSP's net margin. So while the machine got the targeting broadly right, blindly executing its recommendation would have resulted in revenue loss for the CSP.
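The refinement described above amounts to layering a human-authored business rule on top of a machine-generated segment. As a minimal sketch (the field names and the exact rules are hypothetical, invented for illustration):

```python
# Illustrative sketch: refine a machine-generated 4G target segment
# with a domain-expert rule that excludes 3G dongle users.

def machine_segment(subscribers):
    """Machine rule: target medium- and high-value subscribers on 3G."""
    return [s for s in subscribers
            if s["value_tier"] in ("medium", "high") and s["network"] == "3G"]

def apply_domain_rules(segment):
    """Domain rule: dongle users would erode net margin on a 4G pack."""
    return [s for s in segment if s["device_type"] != "dongle"]

subscribers = [
    {"id": 1, "value_tier": "high",   "network": "3G", "device_type": "phone"},
    {"id": 2, "value_tier": "high",   "network": "3G", "device_type": "dongle"},
    {"id": 3, "value_tier": "low",    "network": "3G", "device_type": "phone"},
    {"id": 4, "value_tier": "medium", "network": "4G", "device_type": "phone"},
]

targets = apply_domain_rules(machine_segment(subscribers))
# Only subscriber 1 passes both the machine rule and the domain rule.
```

The point is not the code itself but the architecture: the algorithm proposes, and a domain-expert checkpoint disposes before the campaign executes.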

Yet another example relates to fraud management in telecom. Specific algorithms are designed to detect fraudulent activity in the network and raise alerts. An individual whose international calls suddenly spike to highly sensitive, terror-prone areas of the world could be flagged as a fraudster by such an algorithm. However, the said individual might simply be a newly posted diplomat of a mediating nation. In this case, acting on the machine's insight and recommendation alone would have resulted in a wrong decision that offended the political powers.
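The lesson of the fraud example can be expressed as a design choice: the spike-detection rule should route its alert to a human analyst rather than act automatically. A minimal sketch, with hypothetical thresholds, field names and region codes:

```python
# Illustrative sketch: a spike-detection rule flags suspicious
# international calling but escalates to human review instead of
# automatically blocking the subscriber.

SENSITIVE_REGIONS = {"region_x", "region_y"}  # placeholder region codes

def flag_for_review(profile):
    """Return an alert if calls to sensitive regions spiked, else None."""
    baseline = profile["avg_weekly_intl_calls"]
    current = profile["current_weekly_intl_calls"]
    spiked = current > max(5, 3 * baseline)  # hypothetical threshold
    to_sensitive = bool(profile["destinations"] & SENSITIVE_REGIONS)
    if spiked and to_sensitive:
        # A human analyst decides: fraud, or a legitimate case such as
        # a newly posted diplomat of a mediating nation.
        return {"subscriber": profile["id"], "action": "human_review"}
    return None

diplomat = {"id": 42,
            "avg_weekly_intl_calls": 1,
            "current_weekly_intl_calls": 30,
            "destinations": {"region_x"}}

alert = flag_for_review(diplomat)
```

Here the machine does what it is good at (spotting the anomaly at scale), while the final judgment stays with a person who can recognize the exception.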

Ask the Right Questions

Human expertise is not important only at the decision point. To ensure that an algorithm works well, the right questions must be asked at design time: a number of concepts, hypotheses, regional factors, regulations and so forth go into designing the right algorithm. Without a domain expert, these inputs may not be accurate and the algorithm may not work as expected. So while algorithms have certainly become smarter, faster and even better than humans in some areas, there are still voids in the system that only humans can fill.

Journalist Tim Wu of The New Yorker weighed in on the same debate in his story on how Netflix scores big on viewership with the help of algorithms, and ended up conceding that the secret algorithm is actually a human: its chief content officer, Ted Sarandos. He sums up his story well with this exchange with the man in charge of content decisions at Netflix:

I presented Sarandos with this theory at a Sundance panel called “How I Learned to Stop Worrying and Trust the Algorithm,” moderated by Jason Hirschhorn, formerly of MySpace. Sarandos, very agreeably, wobbled a bit. “It is important to know which data to ignore,” he conceded, before saying, at the end, “In practice, it’s probably a 70-30 mix.” But which is the 70 and which is the 30? “Seventy is the data, and 30 is judgment,” he told me later. Then he paused, and said, “But the 30 needs to be on top, if that makes sense.”

What is true for one industry may not entirely hold true for another, and the weight given to human expertise may vary from one enterprise to the next. However, for a big data strategy to succeed, it is important to ensure that the necessary checkpoints, in the form of domain expertise, are incorporated into the plan.
