You can’t help but smile when a good-looking robot makes goo-goo eyes at you — even though you know it’s simply a matter of 24 mechanical actuators pulling a foam-rubber face in just the right way. That’s the trick that robotics experts will be trying to perfect as they develop new strains of companion contraptions.
The new breeds of sociable robots could serve as experimental subjects in the quest to understand human cognition, as a friendly mechanical face at the ticket booth or teller’s window, or even as faux friends for shut-ins.
But that’s probably decades down the line. For today, K-Bot is the state of the art for David Hanson, an artist and graduate student at the University of Texas at Dallas. The robotic face is set on a wooden stand with a tangle of multicolored electrical wires streaming down to electrical gadgets and a laptop computer. It ... she ... was built from about $400 worth of parts, with a face sculpted from soft, skinlike foam rubber material of Hanson’s own design, dubbed “f’ubber.”
K-Bot, whose name and feminine visage were inspired by lab assistant Kristen Nelson, can be programmed to follow humans with its camera-equipped eyes and mimic their facial expressions, Hanson said. Twenty-four actuators control the eyes, the face muscles and the tilt of the head.
The total effect might remind you of the animatronic mannequins made famous by Disney, which is where Hanson once worked as a consultant.
“The main difference is the intelligence — putting the ability to socialize in these devices, to make the appropriate social facial expressions, to recognize facial expressions in real time,” he said.
K-Bot wasn’t in full form for Monday’s demonstration at the annual meeting of the American Association for the Advancement of Science, since Hanson didn’t have time to prepare the full setup. But he did put it ... her ... through a repertoire of expressions ranging from a smile to a grimace of repulsion. All it took was a furrowed brow here, downturned mouth corners there, a raised lower lip and a pursed upper lip.
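Hanson’s description above suggests how an expression decomposes into actuator settings. As a purely illustrative sketch — K-Bot’s actual control software isn’t described in the article, so every name and value here is invented — one could model an expression as a table of normalized actuator positions and blend smoothly between them:

```python
# Hypothetical sketch: facial expressions as actuator positions.
# Actuator names and values are invented for illustration; they do not
# reflect K-Bot's real 24-actuator layout.

# Each expression maps a (hypothetical) actuator name to a normalized
# position in [0.0, 1.0], where 0.0 means fully relaxed.
EXPRESSIONS = {
    "smile": {
        "mouth_corner_left": 0.8,
        "mouth_corner_right": 0.8,
        "cheek_raise": 0.5,
    },
    "repulsion": {
        "brow_furrow": 0.9,         # furrowed brow
        "mouth_corner_left": 0.2,   # downturned mouth corners
        "mouth_corner_right": 0.2,
        "lower_lip_raise": 0.7,     # raised lower lip
        "upper_lip_purse": 0.6,     # pursed upper lip
    },
}

def blend(expr_a, expr_b, t):
    """Linearly interpolate between two expressions, t in [0, 1].

    Actuators absent from one expression are treated as relaxed (0.0),
    so the face eases into and out of each pose.
    """
    keys = set(expr_a) | set(expr_b)
    return {k: (1 - t) * expr_a.get(k, 0.0) + t * expr_b.get(k, 0.0)
            for k in keys}

# Halfway between a smile and a grimace of repulsion:
halfway = blend(EXPRESSIONS["smile"], EXPRESSIONS["repulsion"], 0.5)
print(round(halfway["mouth_corner_left"], 2))  # → 0.5
```

Interpolating between poses like this is a standard animation trick; a real controller would also rate-limit each actuator so the foam-rubber face moves at lifelike speeds.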
“In terms of mechanical expressivity, we’re matching the human face, one to one,” Hanson said, although he admitted that adding three or four more actuators around the mouth would come in handy for mimicking speech.
For now, K-Bot is mute, but eventually Hanson plans to add a speaker for speech synthesis. That’s more efficient than trying to add a mouth cavity, larynx and all the other plumbing required for speaking the way humans do it. He also plans to add much more smarts to the software.
“The next step is to begin to design the psyche of the robot,” Hanson said, so that K-Bot knows when to respond to an angry human with anger, and when to flash a conciliatory smile.
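The choice Hanson describes — anger back, or a conciliatory smile — amounts to a response policy conditioned on social context. A toy rule-based sketch of that idea (the rules, names, and the `rapport` parameter are all invented for illustration, not Hanson’s design) might look like:

```python
# Hypothetical sketch of a minimal robot "psyche": picking a facial
# response to a detected human expression. All names and rules here are
# invented for illustration.

def choose_response(human_expression, rapport):
    """Pick a facial response.

    rapport is a score in [0, 1]; higher means a friendlier, more
    established relationship with this person.
    """
    if human_expression == "anger":
        # With a friendly acquaintance, try to defuse the anger;
        # with a stranger, mirror it.
        return "conciliatory_smile" if rapport > 0.5 else "anger"
    if human_expression == "smile":
        return "smile"  # mirror positive expressions directly
    return "neutral"    # default for anything unrecognized

print(choose_response("anger", rapport=0.9))  # → conciliatory_smile
```

The adaptive version Hanson envisions would go further: when a response lands badly, the robot would adjust these rules itself rather than rely on hand-written ones.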
For Hanson, K-Bot is a step down a decades-long path in cognitive science. Future robo-faces could be used to test theories about how humans come up with acceptable responses to social cues. Eventually, the robot itself might recognize when it has flashed an inappropriate expression or made an ill-timed remark, then adjust its own software accordingly.
There may even be occasions when humans who have a psychological problem with socializing could learn a thing or two from K-Bot’s descendants.
Many other robotics experts are working on their own brands of sociable machines.
Cynthia Breazeal, a professor at the Massachusetts Institute of Technology, was a pioneer in the field, by virtue of a cute contraption called Kismet. The machine, which looks something like a partially disassembled Furby toy, had big eyes, ears and lips that responded to grown-ups and toys much as a baby might.
Now she’s working on a furry, lop-eared robot named Leonardo, which was designed with the aid of experts in animatronics.
“There are many, many, many, many possible applications,” she said. Sociable robots could serve as entertainers, nursemaids, servants or surrogate friends. The software advances could also lead to better on-screen “virtual humans” in situations where the physical form isn’t needed — say, providing a friendly “face” at automatic teller machines.
Meanwhile, Yoseph Bar-Cohen, a researcher at NASA’s Jet Propulsion Laboratory, is working on electrically activated plastic muscles that could be used in the next generation of prosthetic and robotic limbs. Eventually, he’d like to see a human-vs.-machine arm-wrestling challenge, on a par with the recent chess challenges involving human champions and computerized competitors.
“Competitions drive a lot of good stuff for us,” he said.
Looking beyond the science and engineering, the effort to construct more humanlike robots has a philosophical point as well, the researchers said.
“Robots have always been an intriguing mirror to our own conception of what it means to be a human,” Breazeal said.
“The ultimate goal is to create a compassionate, sociable robot that begins to approach various aspects of human intelligence, and someday becomes our peer,” Hanson said.
For some, that may sound like either a fantasy or a nightmare. Could a robot really develop intelligence on a human scale?
For the purposes of the debate, Hanson defines intelligence as an individual’s ability to adapt his responses to an often-unpredictable environment.
“Happily, biology has tackled many of these problems very effectively, so by modeling biology in robotics, you’re going to make robotics more adaptive, more intelligent,” he said. “Will robotics become a living intelligence at some point? That’s a question that has profound possible implications. It’s very difficult to answer conclusively. My answer would be yes. But there would be a lot of people who would debate that.”