Distance driving can be mind-numbingly boring, but looking away from the road to text or change songs is a life-or-death gamble. Plus, buttons embedded in the wheel only control a fraction of a car's functionality. Now German researchers have a wheel prototype that puts everything within reach — no glancing needed.
"If you have gestures on the steering wheel, you spend more time looking at the street," said Albrecht Schmidt, a computer science professor specializing in human-computer interaction at the University of Stuttgart in Germany who worked on the prototype.
The team, which includes University of Duisburg-Essen researchers Tanja Döring, Dagmar Kern, Max Pfeiffer, and Volker Gruhn, as well as Johannes Schöning of the German Research Center for Artificial Intelligence, came up with the idea for a multi-touch steering wheel interface while thinking about driving and mobile technology.
Their prototype is made from 11-millimeter-thick clear acrylic ringed in infrared LEDs. An infrared camera attached to the bottom picks up the reflections made when the surface is touched. A driver can control a radio or navigate a map with simple movements along the surface. Those gestures can be made with the thumbs while still gripping the wheel and looking at the road.
"We use a standard tracking framework, very much like Microsoft Surface and those interactive tables," Schmidt said.
To identify intuitive gestures, the researchers conducted a study asking participants what movements they'd make for each of 20 commands. Döring said many of the proposed gestures were borrowed from familiar mobile devices. Participants pinched two fingers together to zoom in on a map and traced a triangle for "play." When they couldn't come up with an abstract gesture, they traced the first letter of the word.
Once they had established a set of gestures, the team tested the impact on driving performance in a simulator with new participants. According to Paul Marshall, a research fellow at the University of Warwick who analyzed the study data, the wheel substantially reduced visual demand compared to a conventional console.
A full multi-touch steering wheel would mean re-imagining the traditional steering column.
"Somebody in the '30s decided it's good to have the instruments behind the steering wheel," Schmidt said. "It may be time now, having entered the computing age, where we have to rethink if it's very clever to hide one display underneath controls."
Schmidt said the wheel could be combined with automotive head-up display technology that projects info directly onto the windshield. He added that the wheel could also work with a sensor system that detects traffic and encourages drivers to focus on steering. The team envisions a touch surface thin enough to break and allow an airbag to deploy.
The steering wheel was presented recently at the ACM CHI Conference on Human Factors in Computing Systems in Vancouver. Schmidt said the group is currently talking with automotive companies to see what could be put into cars in the near term, even if it's just a small multi-touch surface.
Andrew Kun is an associate professor of electrical and computer engineering at the University of New Hampshire and principal investigator for Project54, which integrated electronics into police cars so officers can tell vehicles to turn on lights or run a license through a remote database.
Reducing glances away from the road is important, he said. "Certainly it's exciting to be utilizing this space," he said of the German researchers' focus on the steering wheel. "This is nice because it's a look ahead."
Kun pictures displays everywhere in the future. "You could see how that would make a big difference in the car…assuming we're still driving then," he said with a chuckle. "[But] I think we'll be driving for a while."