When IBM's Watson challenge goes live on television's Jeopardy game show, a 50-year slog of futurist computing credulity will again hang in the balance. The demonstration, billed alternately as man versus machine or as a natural-language breakthrough, is also another stab at a durable vision of how people and computers ought to interact.
Anyone who works with data and information will see Watson in part for what it is: a speech-processing, question-answering talking machine built over four years at stunning cost to play the Jeopardy game show against human competitors. Reports so far have stopped short of calling it artificial intelligence, a term that industry watchers have been burned by too often, one that has, by definition, failed to meet expectations in a single commercial product.
But even jaded technology analysts say Watson is doing something genuinely new, something that goes far beyond a chess match or computational exercise, and they are tantalized by a new potential angle for the computing industry.
"Watson is a state of the industry example of what a vendor with a lot of imagination, R&D, experience and investment can do and I'm pretty sure it is going to be surprising and even shocking to people when they see it," says Charles King, principal analyst at Pund-IT Inc.
The upcoming TV event, already taped, pits Jeopardy uber-champs Ken Jennings and Brad Rutter against the computer in a three-match series. In a demo taped for reporters and analysts before the actual shows were recorded, the computer beat out the two champions.
On the surface, the event has the makings of a familiar gimmick: a talking video monitor atop a podium between two human foes. It evokes an episode of Star Trek or HAL from 2001: A Space Odyssey, not to mention Forbidden Planet and Robby the Robot. IBM teased that imagery with a bit of anthropomorphic Jeopardy-ese in the demo at a point when the computer tells the host, "Let's finish 'Chicks Dig Me,'" and draws a laugh from the audience before correctly closing the category.
For all that, long-time observers are struck by what behaves like a domain-specific but real personal digital assistant: authoritative, instantaneous, adaptive and standing by to take orders and answer questions with all the information at its disposal. What is surprising is how responsive, informed and hands-off the system is, with none of the awkward, desocializing heads-down fixation consumers display with their personal devices.
"I'd almost call it a baroque approach to technology," King says, "a lot of different pieces pulled together on the computing side plus some enormous work on syntactical analysis and voice recognition and natural language."
Bits of all these technologies are common in smart phones and other types of computers, but they are far more esoteric at the depth Watson delivers. As a fine-tuned collection of parts, Watson is part hardware and processing, part software and natural-language processing, heavy on algorithms that parse vernacular and build associative relationships among the data stored within it. IBM claims to have spent $3.2 billion over the last four years developing the products behind Watson.
Some have remarked that IBM didn't build a true computerized answer machine, but rather one that can play a game like Jeopardy at an expert level. But that is precisely the type of opportunity that has analysts excited.
Richard Doherty, who co-founded the consultancy Envisioneering with Steve Wozniak in the early '90s, attended the live demo and came away convinced that the architecture of Watson is loaded with potential.
"It's a finite data set that does not include things outside the scope of the Jeopardy game, but it is such an amazing demonstration of near real-time competency that the mind just goes afire with the opportunity for other finite data sets," says Doherty. "You immediately think of construction or civil planning or health care planning where this kind of preloaded data can very quickly post answers." Third-party opportunities, he says, could be "phenomenal" for creating data sets that could open up "a whole new IT industry."
Analysts picture such a system in the hands of a physician seeking help with a diagnosis. The benefits could be staggering if a doctor were able to instantly inspect every bit of medical research published in the last three years and draw conclusions, each with an attached degree of confidence to weigh against the doctor's own hunches.
More than speed, it is the confidence-building ability of Watson that has analysts seduced by the demonstration. Watson doesn't just hunt through materials like a search engine. It builds on its understanding of clues in the vernacular of Jeopardy the same way a puzzler would solve a Will Shortz New York Times crossword. It looks for associations, gathering evidence and sureness along the way. Unlike a search engine, the machine will not even offer an answer below a preset threshold of confidence.
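The answer-with-confidence behavior described above can be sketched in a few lines of code. This is a toy illustration only, not IBM's DeepQA pipeline: the miniature knowledge base, the overlap-based scoring, and the threshold value are all invented for the example. The point it demonstrates is the contrast with a search engine: each candidate answer gets a confidence score built from accumulated evidence, and below a preset threshold the machine simply declines to answer.

```python
# Toy "knowledge base": candidate answers mapped to terms they are
# associated with. (Invented data, purely for illustration.)
KNOWLEDGE = {
    "Chicago": {"illinois", "windy", "city", "lake", "michigan"},
    "Toronto": {"ontario", "canada", "city", "lake"},
    "Paris": {"france", "city", "seine", "louvre"},
}

CONFIDENCE_THRESHOLD = 0.5  # below this, the machine won't buzz in


def answer(clue: str):
    """Score each candidate by the fraction of clue words it is
    associated with; answer only if confidence clears the threshold."""
    words = {w.strip(".,?!'").lower() for w in clue.split()}
    best, best_conf = None, 0.0
    for candidate, associations in KNOWLEDGE.items():
        overlap = words & associations          # evidence gathered
        conf = len(overlap) / len(words) if words else 0.0
        if conf > best_conf:
            best, best_conf = candidate, conf
    if best_conf < CONFIDENCE_THRESHOLD:
        return None, best_conf                  # not sure enough to answer
    return best, best_conf


print(answer("windy city on lake michigan"))   # confident -> ('Chicago', 0.8)
print(answer("capital of mongolia"))           # declines  -> (None, 0.0)
```

A real system would weigh many independent evidence sources and learn the threshold from training games, but the decision structure, score the candidates and stay silent under the cutoff, is the same idea the analysts found so striking.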
Longtime data analyst Merv Adrian, who's now working for market researcher Gartner Inc., sees the Jeopardy wordplay as a classic learning challenge for man or machine. "Each column on a Jeopardy board gives an answer that suggests a type of question. You might get the first two answers wrong before you figure out the right way to ask the question."