WASHINGTON — When nanotech researcher Luke Lee is looking for inspiration for the next generation of optical gadgets, he ponders the lobster. And the house fly. And the octopus.
Lee and other “bioengineers” are borrowing ideas from all corners of the animal kingdom to design artificial vision systems that could be used for high-tech cameras, motion detectors, navigation devices in unmanned vehicles, or perhaps even synthetic retinal implants.
Lee and his colleague at the University of California, Berkeley, Robert Szema, describe the latest progress on this front in an article in the November 18 issue of the journal Science, published by AAAS, the nonprofit science society.
Researchers have been fascinated with the phenomenon of vision since the days of the ancient Greeks, who thought sight involved rays emanating from the eye to touch surrounding objects. It wasn’t until fifteen hundred years later that the Arab scholar al-Haytham described how lenses can focus and magnify images.
Today, we know that natural selection has produced at least ten animal vision systems, each tailored to fit the specific needs of its owner. Eyes for different species are adapted for seeing in the day or night, short or long distances, with wide or narrow fields of view, etc.
But all of these systems capture light and use it to produce some sort of picture in the brain representing the surrounding environment. In many ways, these systems outstrip the imaging technologies that humans have produced: they can be more efficient and powerful, and are often simpler and more elegant, than their synthetic counterparts.
How animals see
Animals have two main types of vision systems: camera-type eyes, which use a single lens to focus images onto a retina, and compound eyes, which have multiple lenses — sometimes thousands of them.
Animals with camera-type eyes use a variety of ways to focus the lenses so they can see objects at various distances. Humans and birds have specialized muscles that change the lens’ curvature. The single lens in the octopus eye has layers like an onion, each with slightly different optical properties, to help sharply focus the light, even with a wide field of view.
In whales, a chamber behind the lens fills and empties with fluid to move the lens closer or farther from the retina, while in some amphibians a muscle moves the lens back and forth.
Lee and other researchers have made lenses similar to those in camera-type eyes, whose focus can be tuned by changing the pressure of fluid in special chambers. These so-called “microdoublet” lenses can assume two different shapes — with both sides either bulging away from each other or curving in the same direction — to help adjust the focal length and field of view. Lenses like these may be useful for applications such as cell phone cameras, medical imaging inside the body, or optical data storage.
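The connection between a lens's shape and its focal length can be sketched with the thin-lens lensmaker's equation. The sketch below is purely illustrative: the refractive index and surface radii are hypothetical values, not the actual parameters of the microdoublet lenses described above, but it shows why switching between a biconvex shape and a meniscus shape (both surfaces curving the same way) changes the focal length.

```python
def focal_length_mm(n: float, r1_mm: float, r2_mm: float) -> float:
    """Thin-lens focal length from the lensmaker's equation.

    n      -- refractive index of the lens fluid
    r1_mm  -- radius of the first surface (positive if it bulges toward the light)
    r2_mm  -- radius of the second surface (negative if it bulges away)
    """
    power = (n - 1.0) * (1.0 / r1_mm - 1.0 / r2_mm)
    return 1.0 / power

# Hypothetical fluid with n = 1.41 and 2 mm surface radii.
# Biconvex shape (surfaces bulging away from each other):
f_biconvex = focal_length_mm(1.41, 2.0, -2.0)
# Meniscus shape (both surfaces curving in the same direction):
f_meniscus = focal_length_mm(1.41, 2.0, 4.0)
print(f"biconvex: {f_biconvex:.2f} mm, meniscus: {f_meniscus:.2f} mm")
```

With the same fluid, the meniscus configuration yields a substantially longer focal length than the biconvex one, which is the basic principle behind tuning focus by reshaping the fluid chambers.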
Researchers are also working on building synthetic retinas. This is a challenge because retinas in nature are typically curved, while conventional arrays of light sensors, including those inside cameras, are flat and rigid.
Scientists also haven’t yet figured out how to put all the parts together in order to produce a full-fledged artificial eye of this type, according to Lee.
Research is progressing much faster with compound eyes, which are made up of many individual lenses (as many as ten thousand in some dragonflies) and found in insects and other arthropods.
The lenses are part of separate imaging units called “ommatidia,” each of which provides a fragment of the image. In many cases, the ommatidia send their signals simultaneously, enabling the fast motion detection and image recognition that lets flies, for example, evade fly-swatters time and again.
Lee and his colleagues have made artificial ommatidia, each with a tiny lens connected to a tube-like “waveguide” that directs the light down to an optoelectronic imaging device. The ommatidia can be arranged around a dome, projecting outwards in all directions. Putting two of these structures back-to-back could hypothetically allow for a device with 360-degree vision.
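A rough geometric sketch shows why arranging ommatidia around a dome gives wide coverage. If each unit on a hemispherical dome is assigned an equal share of the hemisphere's solid angle, the acceptance half-angle each unit must cover shrinks as the unit count grows; two such domes back-to-back would tile the full sphere. The function and numbers below are a hypothetical illustration of that geometry, not a model of the actual devices.

```python
import math

def acceptance_half_angle_deg(n_ommatidia: int) -> float:
    """Half-angle each ommatidium must cover for n units to tile a hemisphere.

    Each unit gets an equal share of the hemisphere's 2*pi steradians;
    a circular cap of solid angle omega has half-angle acos(1 - omega / (2*pi)).
    """
    omega = (2 * math.pi) / n_ommatidia        # solid angle per unit, in steradians
    return math.degrees(math.acos(1 - omega / (2 * math.pi)))

# More ommatidia means each one covers a narrower cone (finer angular resolution):
for n in (180, 1000, 10000):
    print(f"{n:>5} ommatidia -> {acceptance_half_angle_deg(n):.2f} degrees each")
```

At the dragonfly-like count of ten thousand units per dome, each unit would only need to cover a cone well under a degree wide, which hints at how compound eyes combine wide total coverage with many small, simple imaging units.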
“The whole thing could potentially be smaller than a vitamin. What if you could swallow it to get a look inside the body? That’s just a concept right now, though,” Lee said.
One of the most well-known examples of technology imitating biology involves the compound eyes of the lobster. In the 1960s, researchers figured out how to build x-ray telescopes, which capture incoming x-rays and focus them as they bounce off specialized mirrors.
In 1979, it became clear that nature had been way ahead of the game. Researchers discovered that natural selection had endowed lobsters with pretty much the same type of system for vision millions of years ago.
Researchers are also learning optics lessons from animals that, until recently, weren’t thought to have vision-like abilities at all.
Several years ago, scientists discovered that a relative of the starfish called the brittlestar has arrays of calcite crystals throughout its skeleton that correct certain light-distorting effects and send vision-related signals to the nervous system. These crystals, which collectively form a system akin to a compound eye, may offer new leads for improving optical fibers or developing optical computers.
Another curious creature, the beetle Melanophila acuminata, can detect forest fires from around 80 kilometers away. Female beetles lay eggs in burnt trees, and they detect the fires using specialized pit organs tuned to a specific frequency of infrared light. Researchers are now developing new materials that behave similarly for use in heat detection.
One of the main reasons that researchers are making such progress in imitating living eyes is that they are able to build them out of flexible synthetic polymers or plastics. Living eyes and other body parts are made of naturally occurring, chain-like molecules, so the use of synthetic polymers makes plenty of sense.
In conventional optoelectronics, researchers use flat wafers and create patterns on top of them. A number of recently developed techniques for processing polymers, including rapid templating and thin-film multilayers, have made it possible to create complex and curved shapes. Lee and other researchers make polymer structures from three-dimensional molds, allowing them to produce curved structures and stretchy components for their devices.
“All of our devices are fabricated with soft-state polymers, not solid-state semiconductors, metal or glass. It’s a completely different way of creating complex integrated systems with precision optical alignment in three dimensions using polymers,” Lee said.
© 2013 American Association for the Advancement of Science