Sarcastic Computers On The Horizon
Computers may soon be responding with witty or sarcastic remarks, a feat that requires reading context, something computers cannot currently do. In a new paper, researchers describe a mathematical model that allows a computer to practice pragmatic reasoning.
Context is a key component of understanding another human being. For instance, a sign outside a children's store might read "Baby sale - this weekend only!" Without the context of the store, it might seem that the store is selling babies. Likewise, the simple baseball phrase "Man on first" is close to meaningless unless it's said during a baseball game.
While humans can understand and contextualize these scenarios, doing so still requires experience. Computers draw on pre-programmed decision rules, and having never been to a baseball game or a children's boutique, they cannot infer the meaning. For now, all we have are direct voice-command programs, like Siri for the iPhone.
"If you've ever called an airline, you know the computer voice recognizes words but it doesn't necessarily understand what you mean," says Assistant Professor Michael Frank, one of the authors of the paper. "That's the key feature of human language. In some sense it's all about what the other person is trying to tell you, not what they're actually saying."
To give computers more of this awareness, the researchers have developed a quantitative theory of pragmatics that would help machines understand social cues and interface better with humans and with each other.
To find a way to quantify human pragmatic reasoning, the two Stanford researchers asked 745 participants to take simple tests.
One group was shown a set of objects, for instance a blue square, a blue circle, and a red square, and asked how they would describe one of them: the blue circle, say, could be called "blue" or "a circle." The other group was asked to identify which of the objects they thought was being referred to by "blue" or "circle."
"We modeled how a listener understands a speaker and how a speaker decides what to say," explained Noah Goodman, co-author of the paper and assistant professor.
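The basic idea described above can be sketched in a few lines of code. In this illustrative (not the authors' actual) model, a speaker prefers more specific words, weighting each true description by one over the number of objects it applies to, and a listener inverts that speaker model with Bayes' rule under a uniform prior over objects. The object names and word extensions below are assumptions chosen to match the blue square / blue circle / red square example:

```python
from fractions import Fraction

# Hypothetical object set from the example in the article.
OBJECTS = ["blue square", "blue circle", "red square"]

# Each word's extension: the objects it truthfully describes.
WORDS = {
    "blue":   ["blue square", "blue circle"],
    "red":    ["red square"],
    "square": ["blue square", "red square"],
    "circle": ["blue circle"],
}

def speaker(obj):
    """P(word | object): weight each applicable word by the
    inverse size of its extension, then normalize."""
    scores = {w: Fraction(1, len(ext))
              for w, ext in WORDS.items() if obj in ext}
    total = sum(scores.values())
    return {w: s / total for w, s in scores.items()}

def listener(word):
    """P(object | word): invert the speaker model with Bayes' rule,
    assuming a uniform prior over objects."""
    scores = {o: speaker(o).get(word, Fraction(0)) for o in OBJECTS}
    total = sum(scores.values())
    return {o: s / total for o, s in scores.items()}

print(listener("blue"))
```

Under these assumptions, a listener who hears "blue" leans toward the blue square rather than the blue circle, because a speaker who meant the circle had a more informative word ("circle") available. That kind of graded, context-sensitive inference is what the participants' responses were compared against.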
"Before, you couldn't take these informal theories of linguistics and put them into a computer. Now we're starting to be able to do that."