Robot programmed to learn words by listening, just like infants
18 June 2012

Los Angeles Times: Caroline Lyon of the University of
Hertfordshire, UK, and her colleagues have conducted a study
with a 1-meter-tall humanoid robot called DeeChee that could
produce all the English syllables needed for speech, but was
incapable of uttering any English
words, at least at first. Lyon's team
then programmed DeeChee's computer brain to listen to what
humans told it, break up the audio streams into individual
syllables, and then rank the syllables by frequency. DeeChee
was also programmed to react positively to "Good!," "Well
done!," and other encouraging remarks. Top-ranked syllables
formed the robot's nascent vocabulary. DeeChee quickly picked
up common nouns and adjectives, but struggled with common
prepositions, even though they occur frequently in speech.
That difference in comprehension, Lyon speculates, arises
because words such as "at" or "in" are used in more ways than,
say, "blue" or "house." A paper about the project appeared
last week in PLoS ONE.
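
For readers who want a concrete picture of the frequency-ranking scheme described above, here is a minimal sketch in Python. The hyphen-based syllable segmentation, the praise list, the bonus weight, and the function name are illustrative assumptions, not details taken from the paper.

```python
from collections import Counter

# Hypothetical list of encouraging remarks; the actual phrases DeeChee
# responded to are not specified here beyond "Good!" and "Well done!".
ENCOURAGEMENT = {"good", "well done"}

def update_vocabulary(utterances, vocab_size=5):
    """Rank heard syllables by frequency, giving extra weight to
    syllables heard just before an encouraging remark (assumed bonus)."""
    counts = Counter()
    previous_syllables = []
    for utterance in utterances:
        text = utterance.lower().strip()
        if text in ENCOURAGEMENT:
            # Reward the most recently heard syllables.
            for syllable in previous_syllables:
                counts[syllable] += 2  # assumed bonus weight
            continue
        syllables = text.split("-")  # assume pre-segmented "syl-la-bles"
        counts.update(syllables)
        previous_syllables = syllables
    # The top-ranked syllables form the nascent vocabulary.
    return [syl for syl, _ in counts.most_common(vocab_size)]

if __name__ == "__main__":
    heard = ["the-box-is-red", "good", "red-box", "blue-cup", "well done"]
    print(update_vocabulary(heard))  # five top-ranked syllables
```

In this toy run, syllables that appear often and are followed by praise ("box", "red") rise to the top of the ranking, mimicking the way common, consistently used words would dominate the robot's early vocabulary.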
DOI: https://doi.org/10.1063/PT.5.026108
© 2012 American Institute of Physics