Are gadgets making us dumber? Two new studies suggest they might be. One found that people who are interrupted by technology score 20 percent lower on a standard cognition test. A second demonstrated that some students, even when on their best behavior, can't concentrate on homework for more than two minutes without distracting themselves by using social media or writing an email.
Interruptions are the scourge of modern life. Our days and nights are full of gadgets that ping, buzz and beep their way into our attention, taking us away from whatever we are doing.
We've known for a while that distractions hurt productivity at work. Depressing research by Gloria Mark at the University of California, Irvine, says that typical office workers get only 11 continuous minutes to work on a task before being interrupted. With smartphones reaching near ubiquity, the problem of tech-driven multitasking — juggling daily tasks with email, text messages, social media and the like — is coming to a head.
Multitasking has been the subject of popular debate, but among neuroscientists, there is very little of that. Brain researchers say that what many people call multitasking should really be called “rapid toggling” between tasks, as the brain focuses quickly on one topic, then switches to another, and another. As all economics students know, switching is not free. It involves "switching costs" — in this case, the time it takes to re-immerse your mind in one topic or another.
Researchers say only the simplest of tasks are candidates for multitasking, and all but one of those tasks must involve automaticity. If you are good at folding laundry, you can probably fold laundry and watch TV at the same time, for example.
Despite this concern among brain scientists, many people overestimate their ability to multitask, such as the college student who thinks he can text and listen to a lecture simultaneously. He cannot, says brain expert Annie Murphy Paul, who writes "The Brilliant Blog."
"Multitasking while doing academic work — which is very, very common among young people — leads to spottier, shallower, less flexible learning," Paul warned in a recent column.
The two studies mentioned above underscore this point.
In the first, Alessandro Acquisti and Eyal Peer at Carnegie Mellon University's Human Computer Interaction lab recruited 136 college students to take a standard test of cognitive abilities, and invented a controlled method of distraction. During the exam, test-takers were interrupted by instant messages that, they were told, contained important additional instructions.
(The research was conducted in concert with research for The Plateau Effect, a book I recently co-authored with Hugh Thompson.)
The interrupted group answered correctly 20 percent less often than members of a control group.
The Carnegie Mellon test might seem a bit contrived, however, because the control group was pretty unrealistic. It's hard to find a group of college students who could take a test without being interrupted by gadgets.
Larry Rosen, a professor at California State University-Dominguez Hills, published a study in the May issue of Computers in Human Behavior that attempted to quantify how often students of all ages are distracted by technology while studying. Even under ideal circumstances, the results were dismal.
Rosen's observers followed 263 students into their normal study environments — bedroom, library, den — and told them to work on an important school assignment for 15 minutes. Even knowing they were being watched, the students couldn't resist texting or using social media. So-called "on-task" behavior started declining at about the two-minute mark, and overall, only 65 percent of the time was spent on schoolwork.
"We really assumed we set up a situation where people would try to impress us," said Rosen, an expert in the psychology of technology. "Frankly, I was appalled at how quickly they became distracted."
'Problem built into the brain'
The two studies, published closely together, generated strong reaction, particularly from students.
"Yes, we text in class, but if my grade in that class is an A or a B I don't see why it's a problem," wrote one student to Paul.
It's a big problem for both students and adults, Paul counters, for plenty of reasons. Assignments inevitably take longer when learners split their time between tasks, she says. All that task-switching wears out the brain and makes learners more tired and less competent. Most important, several studies have shown that information learned while partially distracted is often quickly forgotten, so the learning is tragically shallow.
The key to transferring new information from the brain's short-term to long-term memory is a process called "encoding." Without deep concentration, encoding is unlikely to occur, explained Nicholas Carr in his book “The Shallows: What the Internet is Doing to Our Brains.”
So Paul is among a group of researchers who worry that the real digital divide is not between gadget haves and have-nots, but between those who can resist the constant distracting tug of technology and those who cannot. She compares it to the famous marshmallow test, which shows that children who can delay eating one marshmallow for 10 or 15 minutes on the promise of gaining a second one are the most likely to succeed later in life. In a new "marshmallow" test, educators or employers might measure how long people can resist "a blinking inbox or a buzzing phone."
"There are those people who think that multitasking is simply the way life is now and we should be focusing on getting better at it ... that we are a bunch of old fogies who don't understand," Paul said. "But scientifically, there is no evidence for that. There are fundamental biological limits to what the brain can pay attention to. This is a problem built into the brain."
Follow Bob Sullivan on Facebook or Twitter.