A year from now, capturing a crisp, clear image of a candlelit birthday party could be a piece of cake — even with a camera phone.
Eastman Kodak Co. said Thursday it has developed a color-filter technology that at least doubles the light sensitivity of the image sensor found in every digital camera, enabling shutterbugs to take better pictures in poor light.
“Low light can mean trying to get a good image indoors of your kid blowing out the birthday candles. It can mean you want to take a photograph on a street corner in Paris at midnight,” said Chris McNiffe, general manager of the photography company’s image sensor business. “We’re talking about a 2-to-4-times improvement in (light) sensitivity.”
Analyst Chris Chute doesn’t doubt that the new filter system, intended to supplant an industry-standard filter pattern designed by Kodak scientist Bryce Bayer in 1976, represents a breakthrough in boosting photo quality — especially when light conditions are not ideal.
“It’s often the most simple concepts that can have the most profound impact,” said Chute of IDC, a market research firm near Boston. “This could be revolutionary in terms of just changing that very simple filter on top of the sensor and basically allowing companies to use it in all different kinds of cameras.”
Kodak expects to provide samples of its new technology to a variety of camera manufacturers in the first quarter of 2008. The technology is likely to be incorporated first in mass-market point-and-shoot cameras and camera-equipped mobile phones beginning sometime next year.
“Typically new features like this would be more likely to show up in high-end products and then trickle down,” said analyst Steve Hoffenberg of Lyra Research Inc. “But I think the biggest potential benefit of this may come in the camera phone environment. Camera phones are using smaller sensors to begin with and smaller sensors generally mean smaller pixels, which means lower sensitivity.”
How it works
When the shutter opens on a digital camera, an image is projected onto the sensor, which converts light into an electric charge.
Most sensors use the Bayer mask: Half of the millions of cells on a checkerboard grid are filtered to collect green light, and a quarter each are filtered to let through red and blue light.
A computer chip then reconstructs a full color signal for each pixel in the final image.
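The checkerboard layout described above can be sketched in a few lines of code. This is an illustrative model only, using the common "RGGB" tiling of the Bayer pattern; the function name and grid size are ours, not from Kodak or the article.

```python
from collections import Counter

def bayer_color(row, col):
    """Return which color a Bayer-masked cell at (row, col) collects,
    using the common RGGB tiling: even rows alternate red/green,
    odd rows alternate green/blue."""
    if row % 2 == 0:
        return "R" if col % 2 == 0 else "G"
    return "G" if col % 2 == 0 else "B"

# Tally filter colors over a 4x4 patch of the sensor grid:
# half the cells are green, a quarter red, a quarter blue.
counts = Counter(bayer_color(r, c) for r in range(4) for c in range(4))
print(counts)  # Counter({'G': 8, 'R': 4, 'B': 4})
```

Because each cell records only one color, the chip must interpolate the two missing color values at every pixel from neighboring cells — the reconstruction step the article describes next.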
The new method, which has been under development for more than five years, adds “panchromatic” cells that are sensitive to all wavelengths of visible light and collect a larger amount of light striking the sensor.
Tailoring software algorithms to the new pattern also enables faster shutter speeds, which reduce blurring when capturing a moving subject, McNiffe said.
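A back-of-envelope calculation suggests where a gain in the quoted 2-to-4-times range could come from. The numbers below are our assumptions for illustration, not Kodak's published specifications: we assume a color-filtered cell passes roughly a third of visible light, a panchromatic cell passes nearly all of it, and half the cells in the new pattern are panchromatic.

```python
# Rough model: fraction of incoming visible light each cell type collects.
color_pass = 1 / 3   # assumed transmission of an R, G, or B filtered cell
pan_pass = 1.0       # assumed transmission of an unfiltered, panchromatic cell

bayer_avg = color_pass                          # every Bayer cell is color-filtered
mixed_avg = 0.5 * pan_pass + 0.5 * color_pass   # hypothetical half-panchromatic grid

gain = mixed_avg / bayer_avg
print(f"approximate sensitivity gain: {gain:.1f}x")  # → 2.0x
```

Since exposure time scales inversely with sensitivity, a sensor that collects twice the light can use a shutter speed twice as fast for the same image brightness, which is the blur reduction McNiffe describes.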