What if algorithms did the hiring at Fox News?


(Bloomberg View) -- What if Fox News decided to address its gender and racial discrimination issues by entrusting personnel decisions to an algorithm? It’s a fascinating thought experiment -- and one that helps illustrate the dangers of putting too much trust in big data.

The channel’s problems won’t end with the passing of founder Roger Ailes, who resigned last year amid numerous allegations of sexual harassment. Just last week, a racial-harassment complaint implicated yet another Fox host, and three more women filed suit claiming that gender and racial discrimination derailed their careers.

There’s a widespread belief that algorithms can help address such human foibles. After all, computers rely on hard data and have no misogynistic axe to grind. With a truly objective robot in charge, the complaints would cease and the lawyers could sigh in relief.

OK, let’s try to imagine what would really happen.

Any algorithm requires two things: data (typically piles of historical data) and a definition of success. It parses the data to identify patterns that have led to success in the past, and then uses that information to figure out what will bring success in the future.
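To make those two ingredients concrete, here is a minimal sketch in Python. The data, field names and success thresholds are all made up for illustration; any real hiring model would be far more elaborate, but it would rest on the same two pieces: a pile of historical records and a hard-coded definition of success.

```python
from collections import defaultdict

def success(record):
    # The "definition of success": stayed 5+ years, promoted 2+ times.
    return record["years"] >= 5 and record["promotions"] >= 2

def success_rate_by(records, feature):
    """Find which values of `feature` were correlated with past success."""
    hits, totals = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r[feature]] += 1
        hits[r[feature]] += success(r)  # True counts as 1
    return {value: hits[value] / totals[value] for value in totals}

# Hypothetical historical hires:
history = [
    {"degree": "journalism", "years": 7, "promotions": 3},
    {"degree": "journalism", "years": 6, "promotions": 2},
    {"degree": "law",        "years": 2, "promotions": 0},
]

print(success_rate_by(history, "degree"))
# Journalism degrees look "predictive" of success -- whatever
# actually caused those people to stay and get promoted.
```

The algorithm has no idea *why* a trait correlates with success; it only sees that it does, and recommends accordingly.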

If Fox were building its own human-resources algorithm, the relevant data would probably be the profiles of past hires. The algorithm would want to know which characteristics were correlated with success -- defined, say, as staying at the channel for at least five years and getting promoted at least twice.

Given Fox’s history, what hiring recommendations would the algorithm make? Well, if we believe the lawsuits claiming that women and minorities were systematically discriminated against, the data would show that women and black employees weren’t particularly successful. So the computer would filter them out, because they wouldn’t fit the profile of a desirable hire.

The result: Fox’s robot would be highly likely to hire precisely the same kinds of white males who caused all the problems in the first place.

Machine-learning algorithms are not wise. They don’t see through bias; they pick it up and repeat it. They automate the status quo. So if the status quo at Fox consistently prevents women from succeeding, the algorithm will perpetuate that pattern -- only with greater statistical precision.
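The mechanism can be shown in a few lines. In this toy illustration (the data and group labels are invented, and a one-feature threshold rule stands in for a real model), the “succeeded” labels reflect past discrimination rather than talent -- and the learned screen dutifully filters out the group that was discriminated against.

```python
from collections import defaultdict

def learn_filter(history, threshold=0.5):
    """Learn each group's past success rate, then return a screening
    function that admits only groups whose rate clears the bar."""
    hits, totals = defaultdict(int), defaultdict(int)
    for person in history:
        totals[person["group"]] += 1
        hits[person["group"]] += person["succeeded"]
    rate = {g: hits[g] / totals[g] for g in totals}
    return lambda candidate: rate.get(candidate["group"], 0.0) >= threshold

# History shaped by discrimination, not ability:
history = [
    {"group": "white male", "succeeded": True},
    {"group": "white male", "succeeded": True},
    {"group": "white male", "succeeded": False},
    {"group": "woman",      "succeeded": False},  # harassed out
    {"group": "woman",      "succeeded": False},  # passed over
]

screen = learn_filter(history)
print(screen({"group": "white male"}))  # True  -- recommended for hire
print(screen({"group": "woman"}))       # False -- filtered out
```

Nothing in the code is malicious; it simply projects yesterday’s outcomes onto tomorrow’s candidates, which is exactly the problem.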

The lesson here is that if companies or organizations want to rehabilitate their cultures or business practices, they’ll have to entirely rethink how they hire, whom they value and what qualities they reward. Instead of relying on the past, they need to reinvent the future.

(About the author: Cathy O'Neil is a mathematician who has worked as a professor, hedge-fund analyst and data scientist. She founded ORCAA, an algorithmic auditing company, and is the author of "Weapons of Math Destruction.")
