Getting test subjects lost in a virtual building could reveal a lot about how to construct more people-friendly hospitals, schools and other spaces, according to a unique collaboration between California neuroscientists and architectural designers.
The merging of neuroscience, architecture, psychology and virtual reality is allowing researchers to track the brain signals of study participants as they navigate through a simulated building within a high-tech room called the StarCAVE.
“Our goal is to measure the human response to architectural features in a way that we’ve been unable to measure before,” said Eve Edelstein, the project’s intermediary and senior vice president of research and design for Ontario, Calif.-based HMC Architects.
The project should provide a more realistic understanding of how people experience real-world spaces — before a single brick is ever laid, said Edelstein, a trained neurophysiologist and a visiting scholar at the University of California at San Diego.
The inclusion of electroencephalography (EEG) measurements will allow researchers to look at how brain signals change when people know where they are versus when they’re utterly lost.
Beyond the cost advantages of determining before construction begins whether a proposed layout is hopelessly confusing, the science could say plenty about how people navigate through, interact with, and form “cognitive maps” of physical spaces and their virtual stand-ins.

For a pilot study testing the feasibility of such an approach, Edelstein joined collaborators at the university’s Swartz Center for Computational Neuroscience and at the California Institute for Telecommunications and Information Technology, abbreviated Calit2. The first order of business: designing a virtual replica of Calit2’s campus headquarters and projecting it inside the StarCAVE, a five-sided chamber the size of a small bedroom. Within the room’s slightly inward-tilting space, stereoscopic images displayed on 15 large wall panels and two floor screens immerse viewers wearing polarized glasses in the virtual environment.
The researchers also outfitted study participants with a swim-cap-like hat connected to 256 dangling EEG electrodes to measure brain activity. A tracking device on the cap pinpointed each volunteer’s position, while a set of cameras captured head movements to follow their gaze.
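The article does not describe the team’s recording software, but pairing a high-density EEG stream with position and gaze tracking is, at its core, a timestamp-alignment problem: the two instruments sample at different rates and their readings must be matched in time before brain responses can be tied to where a person is in the building. The following Python sketch is purely illustrative; the sampling rates, data layout and variable names are assumptions, not the Swartz Center’s actual pipeline.

```python
# Hypothetical sketch: align an EEG stream with a head-tracker stream by
# matching each tracker sample to the nearest EEG sample in time.
# Sampling rates and the synthetic data are illustrative assumptions only.
import numpy as np

EEG_RATE = 1000.0      # EEG samples per second (assumed)
TRACKER_RATE = 60.0    # head-tracker samples per second (assumed)
N_CHANNELS = 256       # electrodes, as described in the article

# Synthetic stand-ins for the two recorded streams.
eeg_times = np.arange(0, 10.0, 1.0 / EEG_RATE)               # seconds
eeg_data = np.random.randn(len(eeg_times), N_CHANNELS)        # synthetic microvolts
trk_times = np.arange(0, 10.0, 1.0 / TRACKER_RATE)
positions = np.cumsum(np.random.randn(len(trk_times), 3) * 0.01, axis=0)  # x, y, z

# For each tracker timestamp, find the index of the closest EEG sample.
idx = np.clip(np.searchsorted(eeg_times, trk_times), 1, len(eeg_times) - 1)
closer_to_prev = (trk_times - eeg_times[idx - 1]) < (eeg_times[idx] - trk_times)
nearest_eeg_idx = idx - closer_to_prev.astype(int)

# A combined record: where the volunteer stood in the virtual building,
# plus the EEG frame recorded at (approximately) the same instant.
aligned = [
    {"t": float(t), "position": pos, "eeg": eeg_data[i]}
    for t, pos, i in zip(trk_times, positions, nearest_eeg_idx)
]
print(f"{len(aligned)} aligned samples, first position: {aligned[0]['position']}")
```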
“It gives us an opportunity to look at an interesting brain response and ask what the subject is looking at,” Edelstein said.
Alternatively, researchers can detect when an architectural feature registers with an observer and how the brain analyzes it.
Neuroscience, she said, traditionally taught that humans could not grow new nerve cells through adulthood. But more recent research suggests that the adult brain is still malleable, spurring researchers like her to ask how architecture can influence the formation of new nerve cells in areas such as the brain’s memory center.
From a practical standpoint, Edelstein said, more scientifically grounded data could be critical in addressing priorities in a hospital, like the goal of dramatically reducing patient injuries, medical errors and infection rates cited by the Institute for Healthcare Improvement’s 100,000 Lives Campaign.
Most medical centers and healthcare facilities rely on signs to help people find their way, she said, while some hospitals and other public spaces use colored stripes on the floors as navigational aids.
Studies suggest that ineffective visual cues can cost a hospital hundreds of thousands of dollars annually as staff members take time from their jobs to redirect lost patients. Even more ominously, Edelstein said, “the cost of getting lost in a healthcare setting can be life-threatening.” Someone with an infectious disease could wander into a hospital area that should be a clean environment, for example, or a desperately ill patient may be unable to find the appropriate caregiver in time.
Getting lost and getting a cue
For her group’s proof-of-principle study, the Calit2 space took the place of a hospital, with its virtual replica featuring the building’s lobby, exterior courtyards and some rooms and corridors. In the lobby, the researchers added a few details, including a colored door, projected shadows, and a version of an outdoor teddy bear sculpture made of eight massive granite boulders. The intent was to make the lobby as photorealistic as possible, Edelstein said. “And so it’s rich with visual cues that could assist a person in navigation.”
In contrast, the researchers successively removed visual cues in the building’s south corridor. Study volunteers were then given navigational tasks and remote controls to help them get through the virtual building, and the scientific team pored over the brain responses as the participants found their way.
“The first thing that was very fascinating to us occurred before the analysis of the brain wave response,” Edelstein said. “It was an observation of the increasingly subtle cues that people used.”
The angle of incoming sunlight, the researchers discovered, was a major cue for many participants.
“That’s what humans and animals have been using for millennia and we actually remove that in most architecture,” Edelstein said. “And that was one of the first things that people told us they were using.”
With that cue removed in the virtual corridor, people began looking for cues as fine as the carpet pattern.
Although the team is still analyzing the results, Edelstein said the experiment supported the concept that scientists could synchronously record the brainwaves of individuals moving within a real-time virtual reality environment and correlate their brain activity with their travel patterns in that virtual world. A larger-scale study, she hopes, will expand on the results and delve deeper into how people behave as they navigate.
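The published account stops short of the analysis details, but correlating brain activity with travel patterns typically means comparing some measure of the EEG signal across different navigational situations, such as moving confidently toward a goal versus backtracking after a wrong turn. The Python sketch below shows one generic way such a comparison could be set up; the alpha-band choice, the two-second windows and the synthetic data are assumptions for illustration, not the team’s reported method.

```python
# Hypothetical sketch: compare alpha-band (8-12 Hz) EEG power between epochs
# recorded while a participant heads toward the goal and epochs recorded
# while backtracking. All signals are synthetic; the analysis choices are
# illustrative assumptions only.
import numpy as np

FS = 250.0                     # EEG sampling rate in Hz (assumed)
EPOCH = int(2 * FS)            # 2-second analysis windows

def alpha_power(epoch, fs=FS, band=(8.0, 12.0)):
    """Mean spectral power in the alpha band for one epoch of one channel."""
    freqs = np.fft.rfftfreq(len(epoch), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2 / len(epoch)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].mean()

rng = np.random.default_rng(0)
# Synthetic stand-ins for epochs sorted by navigational behavior.
toward_goal = [rng.standard_normal(EPOCH) for _ in range(40)]
backtracking = [rng.standard_normal(EPOCH) for _ in range(40)]

p_toward = np.array([alpha_power(e) for e in toward_goal])
p_lost = np.array([alpha_power(e) for e in backtracking])

print(f"mean alpha power, toward goal:  {p_toward.mean():.3f}")
print(f"mean alpha power, backtracking: {p_lost.mean():.3f}")
```

In a real study the epochs would come from the aligned EEG-and-position record rather than random noise, and the comparison would be run per channel and per participant with appropriate statistics.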
The rich complexity of a healthcare environment, with the contrasting needs of specialists and patients, young and old, sick and well, may be the best place to begin sorting out which cues the brain recognizes and which it seems to ignore.
But Edelstein says the same questions could be addressed in educational environments or commercial spaces within a city.
“It’s about looking at the human response beneath the level of culture,” she said. “If we can answer questions for healthcare settings, I argue that we are answering questions for all spaces that serve the breadth of needs.”
Gregory Berns, a neuroeconomist at Emory University in Atlanta who studies how neuronal firing patterns affect decision-making, praised the study as “a perfect use of neuroscience to peer into someone’s brain while they do something important.”
Berns, who has followed a similar thread with his new book, "Iconoclast: A Neuroscientist Reveals How to Think Differently," said the relative mobility of EEG technology could lend itself to poring over the brain waves of people in existing buildings as well.
“I think virtual reality is a helpful starting point for design,” he said, adding, “I’m an advocate of getting a person into a physical space.”
A before-and-after test could measure the success of a hallway designed to be navigation-friendly, for example.
“If this worked, potentially an even more beneficial use would be in urban planning,” he said. “Getting lost in a building is one thing, but getting lost in a city is another.”
Beyond navigation, Berns said he could imagine the technique being used to record responses to a space intended to be inspiring or surprising — and perhaps to prevent the design from going overboard.
“I think it’s a bit of a fine line between inspiring because it affords surprises in the environment,” he said, “and one that’s completely disorienting.”