Data pioneers promised us the three V's: the volume, variety and velocity of data. Yet more than a decade after these "transformative" concepts were introduced, most businesses have yet to master the hat trick. And they simply cannot afford to wait much longer.

Earlier this year, I wrote about how in-memory computing helps executives keep pace with their data. In this column I would like to look at why failing to use technologies and tools that harness the velocity of data is a critical mistake.

Business leaders understand that soaring volumes of data have the potential to offer valuable insights to their enterprises. They also realize the remarkable variety of data – be it financial information, customer tweets or mobile phone GPS signals – represents an enormous opportunity.

Yet most leaders are only just beginning to grasp the concept of data velocity. The pace at which data can be gathered, sorted and analyzed to produce actionable insights is increasingly becoming a determinant of success. Those who take too long to generate insights from the data they acquire – to both identify and exploit opportunities at speed – will fall behind their agile rivals.

For example, a major beverage brand has gained an edge in the Chinese market purely by quickly determining which stores are selling out of its goods and restocking their shelves more rapidly than its rivals.

Understanding the Need for Speed

The need to access and act on data more quickly is due in part to heightened customer and business expectations. With ubiquitous access to data via smartphones and tablets, businesses no longer have an excuse not to make informed decisions in real time. Mobile technology also enables workers to track insights about customers, products, work orders and more from anywhere.

Moreover, business leaders increasingly understand what capabilities are possible from their experiences as consumers. Someone accustomed to tracking FedEx orders online will increasingly wonder why they can't do the same with their own company's supply chain processes. And as analytic skills and techniques become more accessible, businesses will want more employees to generate and exploit information-driven insights at the moment it matters.

The ability to harness data velocity is also crucial for businesses to remain competitive. In the retail sector, for example, companies have only seconds to connect with customers who land on their product sites. In this short window of opportunity, they must use all customer data available (e.g., past purchase history, social networking activity, recent customer support questions, etc.) to immediately generate customized and compelling messages for their consumers. Amazon’s recommendation engine is powerful because it makes relevant suggestions at the precise moment a customer is ready to purchase a product, not a day or two later.

In any industry, data velocity can be vital to maintenance and repair. For instance, the sooner data can be used to identify a problem at a malfunctioning oil rig – where the shutdown cost may be upwards of $1 million a day – the quicker it can be resolved. The same data can also be used to predict future breakdowns and schedule preventive maintenance. This is now a core business line for some of the world's major jet engine manufacturers, which supply live monitoring services to ensure engineering backup arrives precisely when it's needed.

Similar applications have arisen across every area and type of business. A company that spots a fall in productivity at an aging plant can address the problem quickly enough to prevent delays in customer orders.

The Tools to Deliver – Quickly

After a period in which innovation was concentrated on the scale and breadth of data, technology providers have begun to focus on velocity. Some vendors have developed solid-state-based appliances that are faster than ever. In-memory computing is ideal for filtering and analyzing streaming data. In fact, the increasing adoption rate of in-memory technology in corporate data centers – which Gartner predicts will grow threefold by 2015 as memory costs fall – represents an inflection point for business enterprise applications. For the first time, it’s possible to unify transactional and analytical processing, and business leaders can ask their databases specific, ad hoc questions and receive immediate answers.

Inevitably, greater speed costs more. For this reason, one challenge for CIOs and their colleagues is to determine when non-real-time data can be effectively applied. The aim should be to combine faster and slower technologies and systems to solve business problems cost-effectively. For example, a company's weekly analysis of one customer's spending habits might identify a potential loss, while data showing a live search for lower-cost service options on the company's website might prompt instant ideas for new sales.

Taking this hybrid data approach does not have to undermine a concerted push for faster data processing. In today’s global business environment, where volatility has become a constant state, data velocity is the key to securing a competitive advantage. Reducing “the time to insight” is a business necessity.

(Author's note: To read more about “data velocity” and other trends IT and business leaders need to prepare for, download Accenture’s recently released Technology Vision 2013.)
