Science Sushi


Real science. Served raw.
AI Takes Baby Steps: RoboBaby Learns Words

The views expressed are those of the author and are not necessarily those of Scientific American.

In 1998, a strange fad swept the nation. Standing a mere 5 inches tall, the gremlin-esque talking robots known as Furbies became the season’s must-have toys for kids (much to Hasbro’s delight). The most compelling aspect of Furbies wasn’t their strange, half-hamster, half-owl aesthetic or even their ability to talk; it was that, from the beginning, Furbies were advertised as learning robots. A newly purchased Furby started out speaking an entirely made-up language called Furbish but, over time, was said to ‘learn’ English by talking to its owner. As the instruction manual touted: “The more time you spend with me, the sooner I will be able to speak your language.”

While it was a neat trick, the fact is, Furbies didn’t really learn. English phrases were in their memory from the get-go – their programming simply dictated that the use of these phrases increase over time. Still, even faked robot learning was cool enough to sell over 40 million Furbies in the first three years.

While the Furby fad might seem silly in retrospect, the toys’ sudden popularity revealed a lot about how people feel about robotic brains. Whether through fear or fascination, we are fixated on the idea of true artificial intelligence (AI). Our obsession with AI is why we applauded when Deep Blue beat chess champion Garry Kasparov in 1997, and why we were transfixed as we watched IBM’s Watson trounce both the biggest all-time money winner and the record holder for the longest championship streak on Jeopardy!.

Now, the Adaptive Systems Research Group at the University of Hertfordshire has created a real-life (and significantly less creepy) Furby of sorts. In a new article published today in PLoS ONE, its researchers reveal DeeChee – a robot that, like a small child, can learn words through human interaction.

Most robots outperform people in tasks that require extreme levels of computation. That’s how Deep Blue won at chess – the machine was simply able to calculate possible follow-up moves at alarming speed. The Holy Grail of AI, though, has always been language. When you break it down, language is difficult to learn. Even we, as intelligent beings, have trouble learning other languages. The constant shift of context and structure makes defining language in an algorithmic way nearly impossible, and thus robots can’t learn to talk simply by being supercomputers.

A six- to fourteen-month-old child, though, learns to discriminate between words and phrases readily with time. It was this early stage of language learning, when children first turn babble into words, that the research team from Hertfordshire sought to emulate with the iCub robot DeeChee.

“Since our work concerns the acquisition of a human language by a robot we are inspired by the process in humans,” explain the authors. “The basis of our experimental work is a real-time interactive situation where a human participant talks to a robot, using his or her own spontaneous words.”

The team told volunteers – who varied in age, occupation, gender, experience with children and familiarity with computers – to talk to DeeChee exactly as they would if they wanted to teach a real child the words for colors and patterns. A YouTube video accompanying the post shows an example interaction between DeeChee and a volunteer.

DeeChee, in turn, was programmed to hear the teacher’s speech as distinct sounds called phonemes, not syllables. To DeeChee, the phrase “a red box” might contain any of the following phoneme sequences: a, ar, re, red, e, ed, bo, box, o, ox. By listening for and responding to praise in response to its babble, DeeChee attempted to piece together which phoneme sequences corresponded to the words the teachers were trying to get it to say.
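The gist of this mechanism – tally how often chunks of sound recur in the teacher’s speech, babble the most frequent ones, and reinforce whatever earns praise – can be sketched as a toy frequency model. To be clear, this is an illustrative sketch, not the authors’ actual system: the `BabbleLearner` class, the substring length limit, and the praise bonus are all invented for the example.

```python
from collections import Counter


def substrings(phonemes, max_len=3):
    """All contiguous phoneme chunks up to max_len, joined into candidate word forms."""
    out = []
    for i in range(len(phonemes)):
        for j in range(i + 1, min(i + 1 + max_len, len(phonemes) + 1)):
            out.append("".join(phonemes[i:j]))
    return out


class BabbleLearner:
    """Toy frequency-based learner: candidate word forms gain weight when
    heard often, and gain extra weight when babbling them earns praise."""

    def __init__(self):
        self.weights = Counter()

    def hear(self, phonemes):
        # Every short substring of the teacher's utterance is a candidate word form.
        for form in substrings(phonemes):
            self.weights[form] += 1

    def babble(self, n=1):
        # Babble the currently strongest candidates.
        return [form for form, _ in self.weights.most_common(n)]

    def praised(self, babbled_forms):
        # Praise reinforces whatever was just babbled -- which is also why
        # indiscriminate praise skews learning toward non-words.
        for form in babbled_forms:
            self.weights[form] += 5


learner = BabbleLearner()
# "a red box" as a rough phoneme-ish stream (illustrative, not real phonetics)
for _ in range(3):
    learner.hear(["a", "r", "e", "d", "b", "o", "x"])
learner.praised(["red"])
print(learner.babble(1))  # "red" now outweighs every other chunk
```

Note how the last comment in `praised` mirrors the failure mode the researchers observed: if the teacher praises everything, non-word chunks like “ar” or “ed” get reinforced right alongside real words.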

What was interesting about these experiments was not only whether or not DeeChee would succeed at learning words, but also how the volunteers themselves varied in their abilities. “We wanted to explore human-robot interaction and were deliberately not prescriptive,” explain the authors. “However, leaving participants to talk naturally opened up possibilities of a wide range of behaviour, possibilities that were certainly realized.”

As in a real teaching setting, the teachers – and DeeChee’s learning – varied. “Some participants were better teachers than others: some of the less good produced very sparse utterances, while other talkative participants praised DeeChee whatever it did, which skewed the learning process towards non-words.”

Overall, though, DeeChee learned. As you can see in the video, the robo-baby was able to pick up simple, one-syllable words like red, green, and heart. DeeChee’s success suggests that similar mechanisms may explain how human babies learn to talk.

Modeling children is only the first step in creating truly intelligent robots, of course. There are still a number of hurdles between cooing the names of colors and shapes and being able to learn language to fluency. But DeeChee is non-living proof that computers may be capable of the complex and intricate process of learning language.

“It is known that infants are sensitive to the frequency of sounds in speech, and these experiments show how this sensitivity can be modelled and contribute to the learning of word forms by a robot,” the authors conclude.

I, for one, welcome our infant robot overlords.

 
Reference: Lyon C, Nehaniv CL, Saunders J (2012) Interactive Language Learning by Robots: The Transition from Babbling to Word Forms. PLoS ONE 7(6): e38236. doi:10.1371/journal.pone.0038236

Furby image c/o Wikimedia Commons

About the Author: Christie Wilcox is a science writer and blogger who moonlights as a PhD student in Cell and Molecular Biology at the University of Hawaii. Follow on Google+. Follow on Twitter @NerdyChristie.

