In his book, “The Singularity Is Near,” Raymond Kurzweil envisions a future in which information technology advances so far that it allows the human race to break the barriers of our biological limitations.

The result is a world where the use of technology to enhance our bodies and our surroundings produces a civilization in which problems such as pollution, disease and hunger are solved. Kurzweil’s ability to see how information technology can be used beyond the bounds of its initial purpose yields solutions to problems once considered unsolvable.

The thought process Kurzweil practices can teach us a lot about applying big data to business, specifically when it comes to combining data sources (traditional, new and unconventional) and breaking down business-as-usual processes. Similar to Kurzweil’s vision of the singularity, the rapid acceleration of technological capabilities, the affordability of the technology and the volume of information produced by big data sources open up a new world of possibilities; the merging of data can solve problems that were traditionally viewed as unsolvable. With these capabilities in place, a singularity of data integration can occur, and a business can transcend its traditional models and processes for solving problems, developing new products, increasing revenues and operating more efficiently.
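To make this kind of data merging concrete, the sketch below joins a traditional data set (sales records) with an unconventional one (aggregated social-media sentiment) on a shared region key. Everything here — the data sets, the keys and the figures — is a hypothetical placeholder invented purely for illustration, not a prescribed approach.

```python
# Hypothetical illustration: merging a traditional data source (sales records)
# with an unconventional one (social-media sentiment) on a shared "region" key.
# All data sets, field names and figures are invented for this sketch.

sales = [  # traditional, structured business data
    {"region": "north", "revenue": 120_000},
    {"region": "south", "revenue": 95_000},
]

sentiment = [  # unconventional source, e.g. aggregated social-media scores
    {"region": "north", "avg_sentiment": 0.62},
    {"region": "south", "avg_sentiment": -0.18},
]

def merge_on_region(left, right):
    """Inner-join two lists of records on their shared 'region' key."""
    index = {row["region"]: row for row in right}
    return [
        {**row, **index[row["region"]]}  # combine fields from both sources
        for row in left
        if row["region"] in index        # keep only regions present in both
    ]

merged = merge_on_region(sales, sentiment)
for row in merged:
    print(row["region"], row["revenue"], row["avg_sentiment"])
```

In a real project the join would more likely run through a data warehouse or a library such as pandas, but the principle is the same: once disparate sources share a common key, questions that neither source could answer alone become answerable.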

This is not to say organizations aren’t integrating data; they do it all the time. But what is often missing is creativity and a willingness to try new ways of solving old problems. The lack of creativity is often driven by a myopic view of data and its value to the business. This can be a function of data silos that are highly complex to integrate, but more often it stems from a reluctance to search for new solutions, given the time and cost associated with big data projects.

However, successful organizations have learned to explore and experiment with a fast-fail mindset. They take a business problem or hypothesis; work with small, well-defined data sets; and model, integrate, analyze and examine the results of the experiment. They then seek to understand what worked and what didn’t, refine the process and repeat. This methodology is especially useful when applied to big data applications such as social media analytics, competitive intelligence and sensor analytics. Because these applications can require the integration of many disparate and unconventional data sets, they are prime targets for fast-fail. Fast-fail also addresses one of the biggest challenges facing big data singularity: we intuit that there is business value in these applications, but without some empirical data to prove the worth of a large investment, these programs are unlikely to launch.
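The experiment-refine-repeat loop described above can be sketched as a simple procedure. The hypothesis, the scoring function, the success threshold and the refinement step below are all hypothetical stand-ins; a real project would supply its own versions of each.

```python
# A minimal sketch of the fast-fail loop: run a small experiment, examine
# the result, learn from the failure, refine and repeat. All parameters
# and callables here are hypothetical placeholders for illustration only.

def fast_fail_loop(hypothesis, experiment, refine, max_rounds=5, target=0.8):
    """Run small, contained experiments until the result is good enough
    or the round budget is exhausted; each failure feeds the next round."""
    for round_num in range(1, max_rounds + 1):
        score = experiment(hypothesis)          # model, integrate, analyze
        if score >= target:                     # good enough: stop early
            return hypothesis, score, round_num
        hypothesis = refine(hypothesis, score)  # learn from the failure
    return hypothesis, score, max_rounds

# Toy usage: the "hypothesis" is just a number we nudge toward a target.
result, score, rounds = fast_fail_loop(
    hypothesis=0.2,
    experiment=lambda h: h,        # stand-in for a real analysis step
    refine=lambda h, s: h + 0.25,  # stand-in for a real refinement step
)
print(result, score, rounds)
```

The point of the structure is that every round is cheap and bounded: a failed round costs one small experiment, not a large program, and its lessons are carried forward explicitly through the refinement step.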

Learn from every failure. Failure will happen, but it will be contained and understood. Knowledge will be gained and applied to the business and to future big data experiments. Business value will be derived from every experiment, and the insights and innovations that accumulate over time will often completely change business models and processes.

With the surge in the amount of data and the increased variety of data sources in the world of big data, the ability to look beyond the obvious uses of traditional data sets to unconventional ones is a real competitive advantage for organizations savvy enough to invest in data exploration. They are able to envision business uses of big data driven by points of data integration, and to articulate that vision to a level where integration can be automated. This doesn’t happen overnight and there is no set recipe for success, but it is possible. Organizations that embrace data as an asset and invest in the exploration of big data will have a huge advantage over their competition. So look beyond the current and traditional, as Kurzweil does, to transcend the bounds of the accepted. Begin to experiment and explore with big data. The results might completely change your business.