Everywhere one looks, sensors in myriad devices are generating massive volumes of data.

Businesses that harness this sensor data and turn it into usable information gain a huge competitive advantage, opening up new ways to operate efficiently and innovate. The old barriers to analyzing this data are falling and, with them, so are the excuses for not focusing on it. In the current technology landscape, every organization can put this data to use and turn it into an asset.

In recent years, there has been a well-documented big data explosion, and a primary driver of it is the growth in data that sensors create. In nearly every industry and many business units, sensors are being deployed to monitor and report on specific events. Common examples include utility smart meters, health care biosensors such as EKGs, HVAC monitors, traffic sensors, insurance company automobile sensors and smart appliances for the home.

The value of analyzing this data is undeniable. It can be used in externally facing applications for customers and vendors, as well as for internal efforts. It can be used to quickly discover faulty equipment, fraud and operational inefficiencies, as well as for predictive modeling and forecasting. With deeper data mining, one can unearth trends and outliers that reveal untapped markets and drive new business models.

Sensor data has value in and of itself, but tying it to other sources makes that value grow exponentially. Combining sensor data with complementary sensor feeds or operational data provides value well beyond the sum of its parts. For example, by combining data from automobile sensors with weather readings and traffic data, insurance companies can gain a better situational understanding of the conditions surrounding a claim. Similarly, combining smart meter data with billing data can provide rich customer reporting that helps customers optimize their usage.
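
To make the smart-meter example concrete, here is a minimal sketch of that kind of join using pandas. The file names and column layouts (meter_id, month, kwh_used, account_id, rate_plan) are hypothetical placeholders, not anything prescribed by the article.

```python
# A minimal sketch, assuming two hypothetical extracts:
#   meter_monthly_usage.csv  -> meter_id, month, kwh_used
#   billing_accounts.csv     -> meter_id, account_id, rate_plan
import pandas as pd

usage = pd.read_csv("meter_monthly_usage.csv")
billing = pd.read_csv("billing_accounts.csv")

# Joining the two sources is what turns raw readings into a customer
# report: usage per account and rate plan, not just per anonymous meter.
report = (usage.merge(billing, on="meter_id")
               .groupby(["account_id", "rate_plan", "month"], as_index=False)
               ["kwh_used"].sum())

print(report.head())
```

The same pattern applies to the insurance example: the sensor feed supplies the events, and the join against weather or traffic data supplies the context that makes those events meaningful.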

In the past, technical challenges prevented companies from analyzing the atomic data. There was simply too much data being created, too quickly, for storing and analyzing it all to be realistic. Companies either analyzed small windows of the data, looked at statistical samples, aggregated the data to usable levels or simply ignored it altogether.

New technologies have removed these technical barriers. Now companies can quickly achieve significant value at a low cost. MPP databases, Hadoop-based systems and cloud storage can all play a role in storing this data cost-effectively and processing it efficiently in parallel. On the front end, new analysis and reporting tools make use of these technologies and allow both business users and data scientists to dig into the information. Most importantly, it’s possible to do all of this without a large financial commitment.
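
As one hedged illustration of the parallel-processing point, the sketch below uses PySpark (one of the Hadoop-ecosystem options mentioned above) to roll raw meter readings up to daily usage. The HDFS paths and column names are assumptions for the example, not part of the article.

```python
# A minimal sketch, assuming a Spark cluster (or local mode) and a
# hypothetical directory of meter readings with columns meter_id, ts, kwh.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sensor-rollup").getOrCreate()

# Files are read and processed in parallel across the cluster; the
# records are simple enough that schema inference is sufficient.
readings = spark.read.csv("hdfs:///data/meter_readings/*.csv",
                          header=True, inferSchema=True)

# Aggregate raw readings to daily usage per meter as a distributed job
# rather than on a single machine.
daily = (readings
         .withColumn("day", F.to_date("ts"))
         .groupBy("meter_id", "day")
         .agg(F.sum("kwh").alias("kwh_used")))

daily.write.mode("overwrite").parquet("hdfs:///data/meter_daily/")
```

The point is not the specific tool: an MPP database would express the same rollup as a SQL aggregation, and cloud object storage would simply replace the HDFS paths.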

Furthermore, the makeup of the data allows for quick deployment of these systems. For one, the data structures are often quite simple, so the sensor data itself does not need a complex model. In addition, many sensor data systems follow consistent paradigms, so common storage and analysis patterns can be used.
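
To show why the model stays simple, here is a sketch of the flat record shape most sensor feeds share. The field names are illustrative assumptions, not a standard.

```python
# A minimal sketch of a generic sensor reading: one flat record of
# (sensor_id, timestamp, metric, value) covers most feeds, which is
# why a complex data model is rarely needed.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class SensorReading:
    sensor_id: str   # which meter, biosensor or monitor reported
    ts: datetime     # when the reading was taken
    metric: str      # e.g. "kwh", "heart_rate", "temperature"
    value: float     # the measured value itself

# The same record shape serves smart meters, HVAC monitors and traffic
# sensors alike, so one storage and analysis pattern can cover them all.
reading = SensorReading("meter-0042", datetime(2014, 5, 1, 12, 30), "kwh", 1.7)
```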

When these advantages are coupled with agile development techniques and prototyping, it is realistic for a business to see real value within a month or two and to set up a program that delivers continual added value by expanding the scope of the data and performing additional analysis.

With the value that can be achieved, along with the low risk and quick turnaround, it’s hard to justify not starting a sensor data analytics program. There’s a very good chance that the competition already has.
