Understanding the data downtime gap — and how to fix it
Today's society is so data-driven that organizations in many industries cannot operate efficiently without reliable information, and those that lack it are at an extraordinary disadvantage.
The growing dependence on this information, plus the sheer amount of it, led to something called the data downtime gap. Here's a look at what that is and how companies can address it.
What Is the Data Downtime Gap?
The data downtime gap occurs during periods when the information a company uses is missing, incomplete or contains errors. Unfortunately, the typical way to deal with these issues is reactive. That may mean a data-related problem persists for weeks or even months before someone catches it, at which point it can take a significant amount of time to fix.
There are several warning signs that a data downtime gap may exist. One is that people who work with or view the data contact a company to complain about discrepancies they found. That issue is severe because it makes people start to doubt the data as a whole, not just the parts with known faults.
Another sign that the data downtime gap is becoming problematic in an organization is if company leaders start to acknowledge their distrust in data or decide they are not ready to invest in becoming a more data-driven organization.
It can also become apparent if any team members at an organization spend a substantial amount of time working on problems associated with the data or trying to figure out where issues exist. In that case, the data specialists at an organization may waste so much of their workdays on tackling preventable matters that they cannot devote adequate resources to tasks that help the business grow.
How Does the Data Downtime Gap Affect Various Industries?
There are virtually no limits to how the data downtime gap could hinder a company's operations. It's also worth examining how data outages affect particular industries.
For example, Harvard Medical School researchers looked at medical records for more than 5,500 patients. The individuals studied had either bipolar disorder or major depression, and the goal of the study was to see how well electronic health records (EHRs) represented a complete and accurate picture of the care they received. The results showed that the EHRs did not capture all diagnoses, visits, hospitalizations or emergency room care.
The data downtime gap, then, could mean each provider who treats a patient may not have the correct details about what has happened with that person over time. Such omissions could negatively affect the quality of care.
Data problems could also spark larger issues. For example, if a manufacturer depends on predictive analytics to prevent machine breakdowns, but something about the collected data is wrong, production could halt when equipment fails without warning. Or, if a marketing company gathers data to make decisions about how to please its customers, the data downtime gap could cause the firm to spend massive amounts of money on a campaign that fails.
In 2018, scientists admitted that miscalculations in a study of how fast global warming heated the oceans meant they could not trust their findings as much as they previously thought. The way they quickly and publicly announced corrections set a good example for others who may make similar mistakes. But if the errors had taken longer to find, or had never been discovered at all, they would have hurt the researchers' credibility.
Ways to Fix the Data Downtime Gap
Using intelligent analytics can alert companies in any industry when things go wrong. For example, a product called MECO smartANALYTICS serves the water purification industry. It collects real-time system data, such as temperature and flow rate, then benchmarks those inputs thousands of times daily.
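As an illustration of the general idea (a minimal sketch, not MECO's actual product, whose internals are not public), an analytics check might compare each new sensor reading against a rolling baseline of recent values; the simulated flow-rate stream and the z-score threshold below are assumptions:

```python
from statistics import mean, stdev

def flag_anomalies(readings, window=50, z_threshold=3.0):
    """Flag readings that deviate sharply from the recent baseline.

    `readings` is a list of numeric sensor values (e.g. temperature or
    flow rate) in collection order. Returns the indices of values more
    than `z_threshold` standard deviations from the mean of the
    preceding `window` values.
    """
    anomalies = []
    for i in range(window, len(readings)):
        baseline = readings[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            anomalies.append(i)
    return anomalies

# Simulated flow-rate stream with one sudden spike
stream = [10.0 + 0.1 * (i % 5) for i in range(100)]
stream[80] = 25.0  # a value far outside the normal band
print(flag_anomalies(stream))  # [80]
```

A production system would run checks like this continuously and benchmark many signals at once, but the core idea is the same: establish what "normal" looks like and alert when new data departs from it.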
Quality tracking for every data-related job or table of information is another option, and it's one that is often embraced by companies in the latter stages of what's sometimes called the data reliability maturity curve. Being proactive with quality tracking is an excellent way to catch issues early before they affect multiple parts of an organization.
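To show what proactive quality tracking can look like in practice, here is a minimal sketch of automated checks run against a table before downstream teams use it; the field names, freshness window, and sample rows are hypothetical:

```python
import datetime

def check_table_quality(rows, required_fields, max_age_hours=24):
    """Run basic quality checks on a table loaded as a list of dicts.

    Checks: the table is nonempty, no required field contains nulls,
    and the newest `updated_at` timestamp is recent enough.
    Returns a list of human-readable issues (empty means it passes).
    """
    if not rows:
        return ["table is empty"]
    issues = []
    for field in required_fields:
        nulls = sum(1 for r in rows if r.get(field) is None)
        if nulls:
            issues.append(f"{nulls} null value(s) in required field '{field}'")
    newest = max(r["updated_at"] for r in rows)
    age = datetime.datetime.now() - newest
    if age > datetime.timedelta(hours=max_age_hours):
        issues.append(f"data is stale: last update {age} ago")
    return issues

# Hypothetical customer table with one missing email
rows = [
    {"customer_id": 1, "email": "a@example.com",
     "updated_at": datetime.datetime.now()},
    {"customer_id": 2, "email": None,
     "updated_at": datetime.datetime.now()},
]
print(check_table_quality(rows, ["customer_id", "email"]))
```

Running checks like these on a schedule, for every important table, is what lets a team catch a problem hours after it appears rather than weeks later.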
Creating a culture of data integrity at a company is also crucial for minimizing the data downtime gap. A data issue in a sector like the pharmaceutical industry is especially problematic because it could have life-threatening consequences. The U.S. Food and Drug Administration (FDA) published guidance for creating a data quality culture that any industry could adopt. It stresses that data integrity must be a top-down value that gets reinforced throughout the company.
Organizations can also get valuable tips for mitigating data downtime problems by looking deeper into the causes of recent outages, such as those from the past six to 12 months. Are patterns present in any of them? For example, maybe a particular tool, department or process is often part of an outage. If so, the best approach is to find out specifically what's going wrong and what's necessary to solve the problem.
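To make the pattern hunt concrete, a team could tally its incident log by tool or department and see which ones recur; the incident records below are entirely hypothetical:

```python
from collections import Counter

# Hypothetical incident log: each outage tagged with the tool and team involved
outages = [
    {"tool": "etl_pipeline", "team": "marketing"},
    {"tool": "etl_pipeline", "team": "finance"},
    {"tool": "crm_sync",     "team": "sales"},
    {"tool": "etl_pipeline", "team": "marketing"},
]

def top_offenders(incidents, key):
    """Count how often each value of `key` appears across incidents,
    most frequent first."""
    return Counter(i[key] for i in incidents).most_common()

print(top_offenders(outages, "tool"))
# [('etl_pipeline', 3), ('crm_sync', 1)]
```

Even a simple tally like this can reveal that one pipeline or one team's process accounts for most outages, which tells leaders where to dig deeper.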
Finally, company leaders who want to get to the heart of what's causing a data downtime gap must remember that the solutions may vary depending on how the enterprise uses data and for what purposes. That's why it may be useful to hire an outside organization to assess the company's infrastructure and practices for weaknesses that make downtime more likely.
If a company finds a data downtime gap, it must address it. However, a firm doesn't need to see this issue as an impossible one to overcome. By using some of the strategies mentioned here, plus having an overall willingness to change, enterprises can make meaningful progress.