Looking for a new Ultra HD TV or a top-of-the-line smartphone? Experts agree: tech fans crazy for sharper resolution are sometimes paying for more pixels than their eyes can actually see.
When it comes to televisions touting new 4K technology, "a regular human isn't going to see a difference," said Raymond Soneira, head of display-testing firm DisplayMate Technologies.
In 2010, when Apple unveiled the iPhone 4, Steve Jobs explained that with the phone's breakthrough "Retina" screen, the eye could no longer distinguish between individual pixels on the display when viewed from an ordinary distance. The promise wasn't just a sharp screen, but a screen so sharp that further refinements would be unnoticeable.
Yet pixel density on mobile devices, measured in pixels per inch (PPI), keeps climbing. The iPhone has stayed at 326 PPI, but Android-powered competitors such as the HTC One and the LG G2 have screens that rate well over 400 PPI.
Meanwhile, as shoppers line up with their holiday carts, stores are starting to carry "Ultra HD" TVs — also called 4K. These sets have a resolution of 3840 x 2160, or four times as many pixels as ordinary high-definition TVs. But even those standard HD sets, at the distance viewers regularly watch them, can be considered "retina" resolution. The number of pixels is quadrupled for 4K TVs, but experts say that in most cases, the human eye cannot even perceive the difference.
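The "four times as many pixels" claim is simple arithmetic, as a quick sketch shows (the resolutions are the ones quoted above):

```python
# 4K "Ultra HD" vs. standard 1080p HD, by raw pixel count.
uhd = 3840 * 2160      # 8,294,400 pixels
hd = 1920 * 1080       # 2,073,600 pixels
print(uhd // hd)       # 4 -- each HD pixel becomes a 2x2 block of 4K pixels
```

Doubling the resolution along each axis quadruples the total pixel count, which is why "4K" sets are marketed as having four times the detail of HD.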
"There's going to be some density beyond which you can't do any better because of the limits of your eye," said Don Hood, a professor of ophthalmology at Columbia University, in a phone interview with NBC News.
Manufacturers like Sony and Samsung bill their new 4K TVs as a revolution in imaging. Sony's website describes its displays, which range in price from $3,000 to $25,000, as "four times clearer than HD." Samsung's $40,000 85-inch set promises "a new form of fulfillment" with its "simply breathtaking resolution."
"Sony believes that the 4K picture quality difference is evident when seen in person, and we invite consumers to see and experience the difference for themselves because seeing is believing," Sony said in an email to NBC News in response to experts who questioned the practicality of the company's 4K displays. Samsung did not respond to similar inquiries.
Are these marketing claims plausible? And just what are the limits of the human eye's ability to perceive resolution? Here's an easy way to visualize it:
A person's field of vision covers about 200 degrees, a little more than a semicircle. Held at arm's length, their index fingernail appears to be about the width of one of those degrees. Imagine that fingernail covered in 120 alternating black and white stripes; being able to discern those stripes at that distance is just about the theoretical limit of the human eye.
[Image caption: The same image displayed at 100 percent of its resolution on a smartphone with 1136 x 640 pixels and a TV with 1920 x 1080 pixels.]
In reality, though, hardly anyone has such superb vision. In fact, most people would be unable to discern pixels or lines twice that size. And whether a phone or tablet display meets that standard depends on how far it is from the viewer. In a living room, a 40- to 60-inch TV sits at a fixed distance, probably seven to nine feet from the couch. Unless pixel-hungry TV fans buy a far larger set, or push their couches much closer, any increase in resolution simply won't be perceived.
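The reasoning above can be put into numbers. The 120-stripes-per-degree limit works out to roughly one arcminute per stripe, so a display is effectively "retina" once each pixel subtends less than about one arcminute at the viewing distance. This back-of-the-envelope Python sketch applies that rule; the 44 PPI figure is an assumption for a typical 50-inch 1080p set (a 1920-pixel-wide image spread across a screen about 43.6 inches wide):

```python
import math

# Theoretical acuity limit: ~120 alternating stripes per degree of visual
# field, i.e. roughly 1 arcminute per stripe (and thus per pixel).
ACUITY_ARCMIN = 1.0

def pixel_angle_arcmin(ppi, distance_in):
    """Angle subtended by a single pixel, in arcminutes."""
    pixel_pitch = 1.0 / ppi                        # inches per pixel
    angle_rad = math.atan2(pixel_pitch, distance_in)
    return math.degrees(angle_rad) * 60.0

def is_retina(ppi, distance_in):
    """True if individual pixels fall below the ~1 arcmin acuity limit."""
    return pixel_angle_arcmin(ppi, distance_in) < ACUITY_ARCMIN

# A 326 PPI phone held about 12 inches from the eye:
print(is_retina(326, 12))   # True -- pixels are indistinguishable

# A 50-inch 1080p TV (~44 PPI) viewed from 8 feet (96 inches):
print(is_retina(44, 96))    # True -- already past the limit, so
                            # quadrupling the pixels adds nothing visible
```

By this rule of thumb, a standard HD set at normal couch distance is already at "retina" resolution, which is exactly why the experts quoted here doubt viewers will see the extra pixels in 4K.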
So why are companies pushing for the extra pixels? Are the extra dots really going to make 'Law and Order: SVU' any more entertaining?
"History has shown that people make something technologically possible, then someone figures out how to capitalize on it," said University of Utah neuroscientist Bryan Jones, who was among the first to put Apple's original Retina display under the microscope.
"But for TVs," he continued, "I don't see a point."
Other experts NBC News spoke to concurred:
"It's barking up the wrong tree," said Hood.
"It's a waste of time, if you ask me," said New York University neuroscientist Michael Landy.
"There was a bigger case for 3D than there is for 4K," said Soneira.
"And consumers will soon realize that they aren't seeing much, if any, visual resolution and sharpness improvements," Soneira continued. The sets will likely be better than today's in other ways, he noted, "but the higher pixel count will not be the reason."
So if piling on more pixels isn't the next big thing — despite what TV makers and retailers will try to tell shoppers over and over again — what is? Experts said there are plenty of ways displays could improve.
Soneira pointed to newly developed "quantum dot" technology for displays that is already leading to far better color representation on some devices. Jones and Landy favor advancements in dynamic range, leading to displays capable of showing light and shadow in movies and games the way we see them in real life.
"When you're in a scene where there's indoor stuff, outdoor stuff, glossy materials reflecting other lights ... that dynamic range is huge," explained Landy. "Consumer-grade displays don't get that stuff right."
"Some of the great masters, the painters, they knew things about light and shadow," said Jones. "They kind of knew instinctively how the retina works." In other words, perhaps the secret to a better TV is hidden in the smile of the Mona Lisa.
Devin Coldewey is a contributing writer for NBC News Digital. His personal website is coldewey.cc.
First published December 15 2013, 1:11 PM