When it comes to data quality best practices, it’s often argued, sometimes quite vehemently, that proactive defect prevention is far superior to reactive data cleansing. Advocates of defect prevention sometimes admit that data cleansing is a necessary evil. However, at least in my experience, most of the time they conveniently, and ironically, cleanse (i.e., drop) the word necessary.
Therefore, I thought I would share a story about how data cleansing saves lives, which I read about in the highly recommended book “Space Chronicles: Facing the Ultimate Frontier” by Neil deGrasse Tyson. “Soon after the Hubble Space Telescope was launched in April 1990, NASA engineers realized that the telescope’s primary mirror – which gathers and reflects the light from celestial objects into its cameras and spectrographs – had been ground to an incorrect shape. In other words, the two-billion dollar telescope was producing fuzzy images. That was bad. As if to make lemonade out of lemons, though, computer algorithms came to the rescue. Investigators at the Space Telescope Science Institute in Baltimore, Maryland, developed a range of clever and innovative image-processing techniques to compensate for some of Hubble’s shortcomings.”