
Making 'Second Life' more like real life

Tokyo University researcher Katsunori Tanaka crouches down with a Web camera on a large mat with specially coded leaf patterns. A new position-tracking system developed by the university could make it easier for players to navigate virtual worlds. (Katsumi Kasahara / AP)
Source: The Associated Press

You can always spot the novices in the virtual reality world of "Second Life": Their online characters — or avatars — stumble around awkwardly and walk into objects, as their real-world users fumble with the keyboard controls.

Now, technology from Japan could help make navigating online virtual worlds simpler by letting players use their own bodies — or even brain waves — to control their avatars.

Take the new position-tracking system developed by Tokyo University, which uses a mat printed with colorful codes and an ordinary Web camera to calculate the player's position in three dimensions.

The user turns left, and the avatar turns left. The user crouches down, and the avatar follows.

Navigating a virtual world
"This technology lets you use take the actions you'd use in real life and transpose them to the virtual world," said research leader Michitaka Hirose. "It could make maneuvering much, much easier."

"Second Life," the virtual universe run by San Francisco-based Linden Lab, boasts more than 11 million registered users worldwide. People can design online characters that meet and chat with other avatars, go shopping or party.

But the online world isn't as easy as the real world to navigate — especially for beginners.

At a recent demonstration in Tokyo, researcher Katsunori Tanaka strapped a Web camera to his hip, lens down, and walked around on a large mat with specially coded patterns on it. On a large screen was the computer graphic-generated 3-D world of his avatar.

As Tanaka moved across the mat, the view on the screen shifted perspective. When he crouched down to peer under a virtual parked car, the image swerved to show what his avatar would "see" — the vehicle's underside.

The system can track movement in 3-D because, as the user moves, the patterns on the mat shift in the camera's perspective, and those images can be processed to calculate vertical distance and tilt, Hirose said.
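As a rough illustration of that idea only (the article does not describe Tokyo University's actual algorithm), the sketch below uses OpenCV's standard pose-estimation routine to recover a camera's position and tilt over a patterned mat from reference points with known locations. The marker coordinates, pixel positions and camera parameters here are all made-up values.

```python
# Illustrative sketch, not the university's code: recover the Web camera's
# 3-D position and tilt over the mat from detected pattern corners, using
# OpenCV's pose-estimation routine (cv2.solvePnP).
import numpy as np
import cv2

# Hypothetical example: four corners of one coded leaf pattern, in the
# mat's own coordinate system (meters). In practice many such patterns
# would be detected across the mat.
object_points = np.array([
    [0.00, 0.00, 0.0],
    [0.10, 0.00, 0.0],
    [0.10, 0.10, 0.0],
    [0.00, 0.10, 0.0],
], dtype=np.float64)

# Where those corners appear in the camera image (pixels); values are
# invented for illustration.
image_points = np.array([
    [320.0, 240.0],
    [400.0, 238.0],
    [402.0, 318.0],
    [322.0, 320.0],
], dtype=np.float64)

# Assumed pinhole-camera intrinsics (focal length, principal point).
camera_matrix = np.array([
    [800.0,   0.0, 320.0],
    [  0.0, 800.0, 240.0],
    [  0.0,   0.0,   1.0],
])
dist_coeffs = np.zeros(5)  # assume no lens distortion

ok, rvec, tvec = cv2.solvePnP(object_points, image_points,
                              camera_matrix, dist_coeffs)
if ok:
    # Inverting the estimated pose gives the camera's (and thus the
    # user's) height and orientation over the mat, which could then
    # drive the avatar's viewpoint.
    rotation, _ = cv2.Rodrigues(rvec)
    camera_position = -rotation.T @ tvec
    print("Estimated camera position over mat (m):", camera_position.ravel())
```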

Tapping into brain waves
Across Tokyo at Keio University, another research team is offering a virtual experience that reaches even more deeply into the user.

Junichi Ushiba's technology monitors brain activity so players can make their avatars move in "Second Life" just by thinking of commands like forward, right or left.

The interface uses electrodes attached to the user's scalp to sense activity in the brain's sensory-motor cortex, which controls body motions, according to Ushiba. Software then translates the brain activity into signals that control the avatar.

The technology can detect what a user is thinking because when people imagine moving their right arm, the brain's left hemisphere is activated — and vice versa. When people think about moving their feet, the top part of the brain is used.
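Purely as a hedged sketch (the article gives no detail of Ushiba's software), the mapping it outlines could look something like this; the electrode readings, threshold and command names are hypothetical.

```python
# Illustrative sketch only: map (hypothetical) band-power readings from
# scalp electrodes over the sensorimotor cortex to avatar commands, using
# the pattern the article describes -- imagined right-arm movement shows up
# in the left hemisphere (and vice versa), imagined foot movement at the
# top of the motor cortex.
def classify_motor_imagery(left_hemisphere_power: float,
                           right_hemisphere_power: float,
                           midline_power: float,
                           threshold: float = 1.0) -> str:
    readings = {
        "turn_right": left_hemisphere_power,   # imagined right-arm movement
        "turn_left": right_hemisphere_power,   # imagined left-arm movement
        "walk_forward": midline_power,         # imagined foot movement
    }
    command, power = max(readings.items(), key=lambda kv: kv[1])
    # Below the threshold, treat the signal as "no command" -- stopping,
    # as the researchers note, is the hard part.
    return command if power > threshold else "stop"

# Made-up readings for illustration: strong left-hemisphere activity.
print(classify_motor_imagery(2.3, 0.8, 0.5))  # -> "turn_right"
```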

"The difficult part is to stop thinking," said research student Takashi Ono as he made his avatar stroll through a virtual Tokyo neighborhood in "Second Life."

"I want to go left, so I think, 'left' — but then the avatar turns too far to the left before I can get rid of the command in my head," he said.

Both Hirose and Ushiba said they had no immediate plans to commercialize their technology, though they are applying for patents.

Hirose said he envisioned combining avatar-control systems with video game consoles.

"That would be the ultimate interactive virtual experience," Hirose said. "That's where we're heading to."