A new supercomputer that "sees" the world very much like humans do could allow cars to drive themselves someday, researchers say.
The supercomputer, dubbed NeuFlow, is based on the mammalian visual system and mimics its neural network to quickly interpret the surrounding environment.
NeuFlow is embedded on a single chip, making the system much smaller and yet more powerful and efficient than a full-scale computer.
"The complete system is going to be no bigger than a wallet, so it could easily be embedded in cars and other places," said Eugenio Culurciello, an associate professor of electrical engineering at Yale University who helped develop NeuFlow.
To recognize the various objects encountered on the road – other cars, people, stoplights, sidewalks, not to mention the road itself – NeuFlow processes images of tens of megapixels in real time.
The system is also extremely efficient. Running more than 100 billion operations per second, it draws only a few watts – less power than a cell phone uses – to accomplish what a bench-top computer with multiple graphics processors needs more than 300 watts to do.
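To put those figures in perspective, here is a rough back-of-the-envelope comparison of energy efficiency in billions of operations per second per watt (GOPS/W). The exact wattage for NeuFlow is not given in the article ("a few watts"), so the 3 W figure below is an assumed value for illustration; the GPU system is taken at the stated 300 watts and the same throughput.

```python
# Rough efficiency comparison based on the figures in the article.
# Assumption (not stated exactly): NeuFlow draws ~3 W ("a few watts");
# the bench-top GPU machine draws ~300 W for the same vision workload.
ops_per_sec = 100e9      # >100 billion operations per second
neuflow_watts = 3.0      # assumed value for "a few watts"
gpu_watts = 300.0        # "more than 300 watts"

neuflow_gops_per_watt = ops_per_sec / 1e9 / neuflow_watts
gpu_gops_per_watt = ops_per_sec / 1e9 / gpu_watts

ratio = neuflow_gops_per_watt / gpu_gops_per_watt
print(f"NeuFlow:  {neuflow_gops_per_watt:.1f} GOPS/W")
print(f"GPU rig:  {gpu_gops_per_watt:.2f} GOPS/W")
print(f"Efficiency advantage: ~{round(ratio)}x")
```

Under these assumptions, the on-chip system comes out roughly two orders of magnitude more energy-efficient than the GPU workstation at the same throughput.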
"One of our first prototypes of this system is already capable of outperforming graphic processors on vision tasks," Culurciello said.
Beyond autonomous car navigation, the system could help robots navigate dangerous or difficult-to-reach locations, provide 360-degree synthetic vision for soldiers in combat, or support assisted living by monitoring motion and calling for help if, for example, an elderly person falls.
Culurciello presented the results Sept. 15 at the High Performance Embedded Computing (HPEC) workshop in Boston, Mass.