© 2019 SourceMedia. All rights reserved.

3 key elements to make data monetization possible

Data is a company’s most valuable resource and sits at the core of modern business success. It is widely considered the ‘holy grail’ of enterprise analytics initiatives, which are crucial for digital transformation and market differentiation.

However, according to a recent McKinsey study, many companies lack a data-and-analytics strategy even when they recognize the need for one. Sixty-one percent of respondents who acknowledge that data and analytics have affected their core business practices say their companies have either not responded to these changes or have taken only ad hoc actions, rather than developing a comprehensive, long-term analytics strategy.

Businesses that are not realizing the full potential value of data are leaving untapped opportunities on the table and are at real risk of being disrupted by companies that are driving forward with an analytics agenda.

While the value of data is undeniable, achieving transformational success can be tricky, and navigating the landscape of infrastructure, vendor solutions, processes, techniques and tools is challenging. Organizations pursuing analytics often make the critical mistake of focusing on the technology rather than starting with the strategy and desired business outcomes, which ultimately hinders their ability to monetize their data, that is, to get value from it.


In my experience, there are three key elements needed to become more effective at leveraging big data and advanced analytics to achieve data monetization and gain competitive advantage: people, process and technology.

Building High Performing, Integrated Teams

Monetization of data relies on the people performing the analytics and building the systems that operationalize them. There are several different teams that need to work together to make that happen. The folks you hear the most about these days are the data scientists, who collect, analyze and interpret massive amounts of data to identify patterns using advanced analytics technologies.

There are also data stewards, who use data governance processes to ensure fitness of data elements and understand what the data means. Data analysts do more traditional analytics and provide the operational reporting that keeps the organization on track. An R&D function is also critical for identifying potential new innovations, along with agile development teams who put the data products and solutions into production. Successful organizations have these groups working together on projects from start to finish so that everyone is working toward the same end goal in a cohesive, streamlined fashion.

Establishing Industrialized, Scalable Processes

As organizations gain maturity in data science and analytics, there is no shortage of use cases and ideas. It’s imperative that organizations establish a process upfront to identify and prioritize the business use cases that offer the greatest benefit with the lowest barrier to implementation.

To choose the right initiatives to pursue, make data-driven decisions rather than going on gut instinct. Look at the data you have, analyze it, evaluate the technical feasibility and then determine the value, ensuring there is a business case for moving the idea forward. Go for the quick wins first! The worst-case scenario is operationalizing the analytics only to discover they have no measurable effect on the business.
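To make this concrete, the prioritization step can be sketched as a simple scoring exercise: rate each candidate use case on business value and technical feasibility, then rank by the combined score so that "quick wins" surface first. The use cases, scores and weighting below are purely illustrative, not from any study cited here.

```python
# Illustrative sketch: rank candidate analytics use cases so that
# "quick wins" (high value AND high feasibility) come first.
# All names and scores are hypothetical examples.

use_cases = [
    {"name": "churn prediction",        "value": 8, "feasibility": 9},
    {"name": "dynamic pricing",         "value": 9, "feasibility": 4},
    {"name": "report automation",       "value": 5, "feasibility": 9},
    {"name": "supply-chain simulation", "value": 9, "feasibility": 2},
]

def priority(uc):
    # Multiplying the two scores penalizes ideas with a high barrier
    # to implement, even when their potential value is high.
    return uc["value"] * uc["feasibility"]

ranked = sorted(use_cases, key=priority, reverse=True)
for uc in ranked:
    print(f'{uc["name"]}: score {priority(uc)}')
```

Under this weighting, a moderately valuable but easily implemented idea (churn prediction) outranks a high-value idea with a steep barrier (supply-chain simulation), which is exactly the "quick wins first" ordering described above. Real prioritization would of course involve richer criteria, but the discipline of scoring before building is the point.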

Another critical area for process excellence is enabling data scientists to be productive, day in and day out. Success depends on their ability to find and apply patterns in data – so get them what they need quickly, then get out of their way! This means provisioning analytics environments in minutes, not months, and having a safe space for them to test out ideas without breaking anything.

To achieve this, organizations need to automate the end-to-end process and proactively identify and eliminate any gaps or delays in the discovery and monetization lifecycle. For more on these topics, see these related blogs: “Industrializing the Data Value Creation Process” and “Applying a Factory Model to Artificial Intelligence and Machine Learning”.

Providing a Modern Architecture

Modern teams require a modern architecture. Without getting into specific technologies, in my experience, the enabling big data technology should be architected with the following design principles:

  • Data that is made available easily and freely, but in a controlled and secured manner, without creating physical copies that sprawl throughout the IT estate.
  • Secluded “sandbox” environments so users can freely experiment with the data using tools of their choice, without the risk of corrupting the “master” or production data.
  • Environment elasticity that scales up and down based on the user workload and analytical requirements.
  • Compliance with all governance, security, and regulatory rules and processes.
  • A cloud-ready design, whether for private, public or hybrid cloud deployments.

Rather than provisioning static, monolithic environments with a limited set of tools, IT should focus on providing the core infrastructure as a service and give users the freedom to choose their own tools so that they can innovate at the pace the business requires.

Monetizing data efficiently requires the right people with the right skills and industrialized, enterprise-wide processes. Additionally, to scale without excessive cost, the right architecture and infrastructure must be deployed. Bringing these three equally important components together is the key for organizations to successfully monetize their data and build competitive advantage.
