There is a stir this week over a report at Politico.com that active and retired CIA agents are moonlighting for hedge funds and financial firms to profile the words and physical tics of executives delivering earnings calls in the name of "deception detection."

According to Politico, an outfit called Business Intelligence Advisors is at the root of this service. BIA (note the similarity to CIA) "agents" listen for qualifying wording that prefaces statements (e.g. "basically," "honestly"), detour phrases ("as I already said...") or hostile answers to judge whether deceit might be present. They also monitor for physical tipoffs -- hand gestures or other behavior that might suggest a misleading statement or an outright lie.
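To make that concrete, here is a rough sketch of the kind of linguistic cue-counting Politico describes -- purely illustrative, with phrase lists and a tally of my own invention rather than anything BIA actually uses:

    # Illustrative only: count hedging qualifiers and "detour" phrases in a
    # transcript. The phrase lists below are my own guesses, not BIA's method.
    import re
    from collections import Counter

    QUALIFIERS = ["basically", "honestly", "frankly", "to be honest"]
    DETOURS = ["as i already said", "as i said before", "like i said"]

    def count_cues(transcript: str) -> Counter:
        """Tally occurrences of qualifying and detour phrases."""
        text = transcript.lower()
        counts = Counter()
        for phrase in QUALIFIERS:
            counts["qualifier"] += len(re.findall(r"\b" + re.escape(phrase) + r"\b", text))
        for phrase in DETOURS:
            counts["detour"] += text.count(phrase)
        return counts

    sample = ("Honestly, as I already said, our margins are basically "
              "in line with guidance.")
    print(count_cues(sample))  # Counter({'qualifier': 2, 'detour': 1})

A real system would weigh far more than a word list -- prosody, a baseline for the individual speaker, and so on -- but the counting above captures the basic idea of flagging verbal tics.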

The Politico report includes enough sourcing to have been picked up by The New York Times, and it has already prompted a question of propriety from Senator Dianne Feinstein of California.

The moonlighting CIA operatives are probably the bigger story here. But what immediately struck me is that if the report signals the start of something bigger, speech analytics vendors -- who have traditionally addressed contact center interactions -- are going to have a field day with a new outbound line of business: measuring the veracity of just about anything said in public.

I'm neither credulous nor well-schooled on the topic, so I called my favorite non-resident expert, Donna Fluss at DMG Consulting, who is very well informed in the area of speech analytics and contact center technology generally.

I asked Donna about this idea of flipping profiling for sentiment or emotional outbursts -- or even lies -- outward from the contact center into the world at large. "Well, it's already out there and productized in speech analytics -- and that already includes emotion detection." It's advanced enough that Fluss is willing to bet that speech analytics will be "at least as accurate as a top CIA-type profiler for looking at any kind of spoken activity."

While it may not be evidence you could take into a courtroom, financial analysts are always looking for a leg up -- just like businesses. And in that sense, an earnings conference call is really no different from a service inquiry.

Fluss recalled listening to a new CEO at a big company deliver his first financial call. "Before he did this, I said to the company, 'you need to train this person.' They didn't really listen and then he gets on the call and answers questions he shouldn't have been expected to know the answers to. He made every mistake he could have, and if you didn't know he was a new CEO you'd have thought he was lying."

An interpretation like that would have been helped by some background information to cast what was said in the proper light. Within contact centers, a newer discipline called desktop analytics is coming into play, in which agent activity during a customer interaction is studied to bring context to the event, Fluss says. "Whether it's a CEO or a contact center, you want to bring metadata to the table, and the more context you have for an interaction, the better you can understand what's really going on."
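In code terms, that amounts to attaching metadata to the raw cue counts before drawing any conclusion. A toy sketch, with field names and an adjustment rule I've made up for illustration:

    # Illustrative only: pair linguistic cues with context (role, tenure).
    # The fields and thresholds are assumptions, not any vendor's schema.
    from dataclasses import dataclass

    @dataclass
    class Interaction:
        speaker_role: str      # e.g. "CEO" or "contact center agent"
        months_in_role: int    # a brand-new CEO reads differently
        qualifier_count: int   # cue counts from a transcript analysis
        detour_count: int

    def flag_for_review(event: Interaction) -> bool:
        """Require a higher cue score when short tenure could explain
        hesitant or evasive-sounding language."""
        raw_score = event.qualifier_count + 2 * event.detour_count
        threshold = 6 if event.months_in_role < 6 else 3
        return raw_score >= threshold

    print(flag_for_review(Interaction("CEO", 2, 3, 1)))   # False: brand-new CEO, cut some slack
    print(flag_for_review(Interaction("CEO", 36, 3, 1)))  # True: a veteran gets less benefit of the doubt

The point is simply that the same utterance reads differently once context is on the table.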

That makes sense. But back to the BIA/CIA story: I am sure there will be both good news and bad news in the wake of this profiling. Where does it lead? And as Donna pointed out, what is the value of Sarbanes-Oxley reform if we now have to parse the utterances of executives for subtext?

Speech analytics and profiling are very real, and that's why there are high-end tools and real profilers out there working for a living. BIA, according to Politico, is already charging high six figures for its services. But if these techniques are used indiscriminately, I can anticipate a bit of a circus down this path among people with a predisposed agenda. I already chuckle at news outlets that bring body-language experts on air to decipher the gestures of political opponents, and at TV-show FBI profilers who can discern that an "unsub" criminal hates his mother and likes to garden before they've seen any evidence at a crime scene.

Moonlighting by CIA profilers is another subject entirely, and I'll leave that to others. Uncovering financial deception is one thing, but I worry that another layer of perceptive modeling will lead to the abuse of data of questionable quality at the individual (as opposed to group) level, and to more insularity among people already overmined for their thoughts and habits.

What do you think?