
Mars revealed through human-like eyes

Scientists can determine distances and object sizes on Mars through Opportunity's 3-D vision, which overcomes the limits of depth perception.
Source: Space.com

When geologists first saw pictures of the rock outcropping at the Opportunity landing site on Mars, they thought the mini-cliff was perhaps as tall as a person. Some started calling it the "Great Wall." Then the robot's 3-D cameras, a pair of eyes mounted at about human height, showed it was all a bluff.

Seen in stereo, the stack of rocks shrank to the height of a house cat, and the public never heard the catchy nickname.

At a time when digital cameras are fueling a renaissance of three-dimensional picture taking on Earth, scientists are using the technology to estimate distances to Martian science targets and size them up from afar. The capability is crucial to making decisions on what objects the rover should visit, how long each would take to reach, and whether the path contains oversized obstacles to avoid.

With two eyes atop its camera mast, the rover's 3-D vision overcomes limits of depth perception that would plague a single-camera setup.

No secret
"There’s no secret about making 3-D photographs," wrote Joe Farace in the December issue of Shutterbug magazine. "You make an image as seen by the left eye, then one by the right eye, and then — here’s the hard part — use some kind of system to put them together."

Hard or simple, a lot of people don't understand 3-D. Back in June of 2002, Sam Ramada didn't have an answer when his 8-year-old son asked how it worked.

"I'm supposed to have all the answers," Ramada said in a telephone interview last week. "I'm Dad, you know. But I really had no idea."

Two months of research later, he'd figured it out and started Mission 3D (www.mission3-d.com), which makes its own version of glasses as well as kits designed to help any owner of a digital camera do 3-D magic at home.

Ramada's timing could not have been better. Several industry analysts say amateur 3-D photography experienced a comeback last year, spurred both by the availability of inexpensive digital cameras and the summer blockbuster Spy Kids 3-D, said to be the first popular 3-D feature film in decades.

There are various methods for making and viewing 3-D images. In essence, all of them take advantage of the separation between two cameras, or between two positions of one camera that has been moved to take a second picture of the same scene. The two camera positions and an object in the distance form a triangle whose dimensions can be worked out. When the cameras are our eyes, our brains use the geometry of these triangles to calculate depth, height, or both.
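That triangle can be worked out in a few lines of code. Below is a minimal Python sketch of the idea, using the standard stereo relation (distance = baseline × focal length ÷ disparity); the baseline matches the Pancam's 30-centimeter separation mentioned later in this story, while the focal length and pixel disparity are hypothetical numbers chosen purely for illustration.

```python
# A minimal sketch of the triangulation idea, not any mission code.
# The formula is the standard pinhole-stereo relation.

def distance_from_disparity(baseline_m: float, focal_px: float,
                            disparity_px: float) -> float:
    """Distance (meters) to an object, given the camera separation and
    the pixel shift (disparity) of the object between the two images."""
    return baseline_m * focal_px / disparity_px

# Example: a 0.30 m baseline (the Pancam's separation), an assumed
# 1500-pixel focal length, and an object shifted 15 pixels between views.
print(distance_from_disparity(0.30, 1500, 15))  # -> 30.0 meters
```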

Ramada said human eyes can't judge depth accurately beyond about 30 to 50 feet (9-15 meters) without familiar cues, such as objects of known size.

No cues on Mars
"If you are looking at a person in the distance, you can judge how far they are by their size in your field of view," explained Cornell University's Jonathan Joseph, leader of a team that developed the software that interprets 3-D images from the Mars rover's panoramic cameras.

The brain picks up another clue from head movement, which reveals how much each object in the field of view shifts: things farther away move less.

Geologists have no innate sense of scale at a Martian landing site they're seeing for the first time. There are no houses, cars or trees. "If we're looking at rocks on Mars, we can't judge," Joseph said.

That led to some misinterpretation when Opportunity first photographed the intriguing rock outcropping at its landing site. At first, scientists thought it was 3 to 6 feet (1-2 meters) tall.

Then with the help of 3-D data they were able to better determine the distance from the rover to the ledge, and the feature's true height became clear. "Instead of a meter or two, it's 10 or 20 centimeters," he said. That's just 4 to 8 inches.

How it works
The panoramic camera, or Pancam, is two devices in one. A pair of digital eyes sits atop a mast that's about the height of a human. Only the left camera is equipped with the filters required to make true-color, "normal" photographs in visible light; the right camera instead carries the filters needed for multi-color infrared imaging.

Both cameras are employed for 3-D images, which are gathered in black-and-white. Joseph explained what's then done back here on Earth to produce the 3-D photos, also called stereoscopic images or anaglyphs.

The image from the left camera is put into the red channel of an image in a program like Photoshop, which can read the standard red-green-blue (RGB) color scheme known to digital photography experts and printing companies. An image of the same scene from the right camera is put into both the blue and green channels, which together make cyan (a light blue). The two images — left and right — are then blended together into one.
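Readers with a digital camera can reproduce the channel trick at home. Here is a rough Python sketch using the Pillow imaging library in place of Photoshop; the file names are placeholders, not actual rover products.

```python
# A rough sketch of the red-cyan channel assignment described above.
from PIL import Image

left = Image.open("left_eye.png").convert("L")    # left view, grayscale
right = Image.open("right_eye.png").convert("L")  # right view, grayscale

# The left image feeds the red channel; the right image feeds both the
# green and blue channels, which together read as cyan.
anaglyph = Image.merge("RGB", (left, right, right))
anaglyph.save("anaglyph.png")
```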

The result is a picture that looks largely gray and rather fuzzy, often with a few obvious red and cyan patches jutting this way and that.

A viewer must wear special 3-D glasses to see the glory of the image's depth.

A red filter on the left eye lets only the red light in, absorbing all the other colors of the spectrum. A cyan filter on the right eye lets only the cyan in. Two views of one scene are sent to the brain, which processes the result as though it has been viewing the scene with two eyes.
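In software terms, the glasses simply undo the channel merge from the sketch above. A short Python illustration, again with a placeholder file name:

```python
from PIL import Image

# The red filter passes only the red channel (the left camera's view);
# the cyan filter passes green and blue (both holding the right view).
anaglyph = Image.open("anaglyph.png").convert("RGB")
red, green, blue = anaglyph.split()

left_eye_sees = red     # what gets through the red filter
right_eye_sees = green  # green and blue carry the same right-eye view
```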

"The brain mixes all the colors," said Ramada of Mission 3D. "It's one of the wonders of the brain."

There's a big difference, though.

A giant's perspective
Instead of the human-eye separation of about 2.5 inches (6.4 centimeters) center-to-center, the Pancam's digital wonders are 11.8 inches (30 centimeters) apart.

Joseph explained that the wide separation lets researchers do simple geometry with the image data to calculate the distance to objects and their heights, as with the rock ledge studied by Opportunity.
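The arithmetic behind that claim is simple trigonometry. A hedged Python sketch follows, with hypothetical numbers chosen to echo the ledge example earlier in the story:

```python
import math

# Once stereo triangulation gives the distance to an object, its height
# follows from the vertical angle it subtends in the image. The numbers
# here are illustrative, not measured values.

def height_from_angle(distance_m: float, angular_height_deg: float) -> float:
    """Height (meters) of an object at a known distance that spans the
    given vertical angle in the camera's view."""
    return distance_m * math.tan(math.radians(angular_height_deg))

# A ledge 8 meters away that spans about 1.4 degrees vertically:
print(height_from_angle(8.0, 1.4))  # ~0.20 meters, in 10-20 cm territory
```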

"Just making a stereo anaglyph and looking at it with 3-D glasses, you can still be fooled about the actual size and distance," he said.

In fact, because the cameras are farther apart than human eyes, the brain can be quite deceived.

"It either makes you feel like you are a giant person on Mars," Ramada said, "or it makes Mars seem like a small model."