The deadline approaches.
I’ve seen it coming for weeks but keep getting pulled away by the incessant distractions of my modern, connected office life: desk phone, cell phone, window, and the eternal promise of serendipitous discovery offered by what I call the TV on my desk. At any moment, my email inbox, blog reader, news sites, social networks, and several hundred fellow Twitterers might thrill me with a heretofore unknown fact.
The late Mariners broadcaster Dave Niehaus was the first to call Alex Rodriguez “A-Rod.”
Alaskan sled dogs perform best at 0 degrees F.
A Facebook friend in a Dallas diner has posted a picture of his breakfast “schnitzel”—a breaded pork medallion with grits, sautéed spinach and hollandaise.
This just in: NPR has a books page.
It’s almost enough to make me miss the days when a writer’s main distraction was a pencil in need of sharpening. But not quite. Unstated in most complaints about distractions is that we like many of them, which is why we let them distract us.
But they do cut into a precious but largely unappreciated sector of human capital: our attention.
Attention is the portal of our experience, the gatekeeper of all we know and love and hope to avoid. It keeps us alive, tipping us off to tempting foodstuffs and warning of dangers hurtling at us from the corner of our eye. It alerts us to possible mates.
All while weeding out billions of bits of information a second to focus on just enough to think, react, and remember.
The “skillful management of attention is the sine qua non of the good life,” writes Winifred Gallagher in Rapt: Attention and the Focused Life, “and the key to improving virtually every aspect of your experience, from mood to productivity to relationships.”
Our attention is also central to our economic and political life. Whole industries exist simply to bring products to your attention. Our democracy is based on the premise that, when you raise your voice, someone will listen.
But now it seems our attention is atomized among an ever greater set of options.
“There’s so much more coming at us all the time from so many different kinds of communication platforms,” says Erica Austin, a Murrow College of Communication professor who has looked at the ways marketers catch the eyes of children. “You think about it: If you’re just walking down the street, people have their iPods. There are cars going by all the time. Some of them are vans that have advertising emblazoned on them. You’ve got billboards. You may be walking into a store where they’ve got stuff playing, promotional end caps of merchandise. There’s just stuff everywhere.”
Our growing attention deficit is insidious: It’s hard to know you’re not paying attention to something when, well, you’re not paying attention to it. That’s how you get scads of people talking and texting while they drive. They think they’re paying attention to two things at once. They aren’t. They can’t be. They could drive past a gorilla and not see it.
We might have seen this coming. The social scientist Herbert Simon did in 1971, a quarter century before the flowering of the Internet, when he noted “a wealth of information creates a poverty of attention.”
Information is the new coin of our realm, but we see it in a fleeting, superficial, “ooh, shiny” way. Vast quantities of knowledge wash over us and we lie mute and mouth-breathing on the couch, our hand paralyzed on the remote. We may even be witnessing the defeat of the information age, but only if we are paying attention.
Calling All Neurons
Some things require little or no attention, like tying a shoe for the 4,000th time.
Then there are situations in which attention must be paid, like when a siren passes outside the window, activating our automatic orienting response. Try to ignore it, but it’s already caught your attention. The modern world is lousy with such distractions, distractions for which we are not quite evolved.
“You have sounds, you have smells, you have sights that are all going to command and fight for your attention,” says Craig Parks, a professor of psychology whose work on groups has him interested in the role of attention in social settings. “In prehistory you just didn’t have that. And one argument at least on the socio-psychological side that you sometimes see is that, in an evolutionary sense, our ability to adapt to competing demands on our attention has not evolved as rapidly as those competing sources have evolved.”
Between the extremes of minimal, automatic attention (shoe tying) and mandatory attention (sirens) lies a world in which our attention is, millisecond by millisecond, weighing its options. It can only handle so much, so it plays the odds.
This is a big concern of Lisa Fournier, an associate professor of psychology whose research focuses on selective attention, perception, and action. When she told test subjects that the odds were even that they would see an object in one of two places, they divided their attention evenly between the two. But when the odds went to 70–30, they tended to focus on the higher probability place.
“There seems to be some biases in the system that you can actually set up in terms of what it is you’re going to attend to and then those things you’re more susceptible to see,” says Fournier. If you’re driving a car, she says, you might expect oncoming traffic. Or in a rural area, you might expect an animal to jump out into the road.
But if things get challenging—an unexpected bicyclist appears, or an angry friend calls on the cell—it’s going to get harder to see.
In a widely cited Science study, British researchers had subjects watch a screen on which dots expanded outward. When the dots froze after a minute or so, the test subjects got the illusion that the dots had reversed direction. This is called the motion after-effect.
Then researchers put words over the dots and asked participants to press a key when an upper-case word appeared, or when a two-syllable word appeared.
The second task was harder—a “high-load linguistic task”—as each word had to be read and evaluated. And that’s when researchers saw the motion after-effect decrease. Through functional brain imaging, they also saw reduced motion processing when subjects were busy recognizing the two-syllable words.
The study, says Fournier, shows that the cells sensitive to motion are actually less sensitive when you’re attending to something else.
This could have implications for driving, where more than one-third of accidents are due to inattention or errors of perception. If you’re engaged in something a bit challenging, like a cell phone conversation, you may be less sensitive to motion outside the windshield, says Fournier. You may try to give equal weight to watching the road and talking, but talking is a high-load linguistic task and can be a complicated act of social navigation.
“It can be very engaging,” says Fournier, so you have a lot less of your attention dedicated to driving. “We do know that when someone is super-engaged with a cell phone, his or her driving is similar to that of a drunk driver.”
In a practical sense, the cell-phone yakking driver may be functionally blind.
One form of this is “change blindness.” When test subjects look at a series of pictures with subtle changes between them, they have a hard time picking up differences that don’t alter the picture’s meaning. You can perform this test yourself. Watch a movie, then look up its “goofs” page on the Internet Movie Database. Dozens of mistakes will have eluded you: sudden changes to Richard Gere’s wardrobe in Pretty Woman, an electric lamp in the nineteenth-century setting of Gone with the Wind, a ’60s-era Cessna in the ’40s-era Godfather.
“What this suggested to researchers is that attention itself is very important for perception,” says Fournier. “It’s not just what hits the eye and hits the visual cortex. What you select out, what you’re looking at, how you’re interpreting the picture, etc., is very important.”
Then there’s “inattention blindness.” In one of the most popular attention studies ever, test subjects watched videos of people passing a basketball and were told to count the passes of either a team in white shirts or a team in black shirts. Afterwards, they were asked if they saw anything unusual.
One-third of the time, they had not noticed a woman walking across the screen with an umbrella. More than half the time, their attention was so absorbed by the ball passing that they didn’t see a person walk by in a gorilla suit.
Fournier likens attention to a spotlight centering on one thing. The more demanding the task, the narrower and more concentrated the light’s focus, to the exclusion of things on the darkened periphery.
“There’s only so much you can process at one time,” she says.
In other words, we have precious little attention to give—much less than we even think we have.
Compounding this, our high-tech world throws things at us at an ever greater rate.
“Sometimes it can be just overwhelming to stay on a certain task, particularly when you have to write, to really focus,” says Fournier, whose own distractions include seven-year-old twins. “When you have an idea, and you’re interrupted, that idea can just be lost.”
This spring, Erica Austin’s son got a video game that flashed a lot. It was a study in that automatic, unavoidable orienting response, over and over and over.
“You’re constantly reorienting, reorienting, reorienting,” says Austin. She figured the game makers meant to do that, the better to hold the player’s attention.
“We actually had a malfunctioning unit,” she says, “and didn’t realize it for a week.”
As our world has filled with attention-grabbing technologies, the people whose jobs rely on getting our attention—TV outlets, radio stations, websites, advertisers—have been scrambling to, as they say, cut through the clutter. A lot is at stake, from whole product lines to the education of students to the marketplace of ideas in a democracy. But oftentimes, the clamor for eyeballs has only added to the clutter. In some instances, like the video game, media have succeeded in drawing our attention to the shine and not the substance, catching our eyes while our minds stay stuck in neutral.
“It makes it hard for you to process information,” says Austin. “If you’re orienting all the time, you’re not actually doing any deeper processing. There’s a balance between getting people’s attention and being able to keep sustained attention and get people to think about whatever the message might be.”
It’s the Tower of Babel, with a light show, and it poses problems for both consumers and those who want their message seen and heard.
Consumers, children included, can be the first to suffer.
Austin keeps by her desk a necklace with a small, beer-bottle light at the end. There’s no name on the bottle, but it’s a dead ringer for a Corona.
“It’s shiny,” says Austin. “It flashes. It was given to a five-year-old on a field trip.”
It’s also an example of “cradle-to-grave” marketing, which tries to capture the attention and loyalty of consumers early on in the hopes they will stay with the product for life. Much of Austin’s research has looked at how even tobacco and alcohol companies will target young people toward this end.
Meanwhile, young children lack the judgment to tell if an ad is for their benefit or someone else’s—information or persuasion.
“Young people pretty much think everything is information,” she says.
Moreover, says Tom Power, chair of WSU’s Department of Human Development, attention regulation is not completely developed in children.
The children’s television show Sesame Street picked up on this and was among the first to figure out how to hold a child’s attention by keeping segments short and changing them often. Power sees yet another technique in video games, which can hold a player’s attention for hours by scaffolding the action, luring participants from one level to the next.
Keep in mind, Power is the holder of a doctoral degree, a class of person with phenomenal powers of self-direction and focus. He also sees a model of attentiveness in Thomas Edison, whose superhuman attention to thousands of possible light bulb filaments led at last to one that worked.
But he has a point: Attention is often born of an individual’s values and motivations. In the case of a scaffolded video game, it’s the machine that maintains attention, not the person.
“So you’re becoming a pawn that’s being manipulated by this machine,” Power says.
There is one place where it is generally accepted that it’s good to get your audience’s attention: the classroom. It’s becoming a tough room to work.
“There’s a whole range of things that are in play,” says Olusola Adesope, an assistant professor in educational psychology whose interests include learning with multimedia resources.
“Nothing is controlled,” he says. “You have hungry kids. You have kids whose parents have just broken up. You have kids who, maybe their sister just had a kid last week and they’re thinking about that. You have someone who was just asked out on a date last night. You’re dealing with these varied abilities in terms of interest.”
On top of that, students can text on cell phones and take notes on wirelessly connected laptops.
When I visit Adesope, he has up on his computer screen a paper called “How seductive details do their damage.” Seductive details are basically the carnival barker that gets you into the tent—interesting, but of little educational value. But such research is generally done in controlled laboratory settings without the distractions of social lives or technology. To punch through that, says Adesope, teachers need to rise above the muted technique of the cool, gentle professor who “runs the risk of losing these kids.”
As a child in Nigeria, Adesope himself learned the periodic table of elements from a chemistry teacher who translated it into songs.
“I’ve seen professors, I’ve seen good teachers, interject their lectures with music,” he says. “They’re singing, or they start playing. These are little tricks that could potentially work. Dealing with these students, we can’t force them to shut up their computers. We can seduce them.”
Now That I Have Your Attention
Laura Sample ’89 runs a Yakima advertising firm called Attention Marketing and uses many of the latest tools and techniques to be seen and heard: websites, TV commercials, social media. She knows a bit about giving and getting attention and appreciates an ad that, Sesame Street-style, moves fast between edits.
“Two seconds is about right,” she says. “It’s about the attention span of adults today.”
But it could be that the fast-moving stream of media images and ideas can only do so much to get its point across, even when it does catch our eyes and ears.
In 2003, Paul Bolls (’95 MA Comm.) and Darrel Muehling in the College of Business looked at how the number of visual cuts in a television commercial affected viewers’ attention. They showed subjects twelve 30-second advertisements—six fast-paced ones with 11 or more cuts and six slow-paced ones with three or fewer cuts. The participants’ automatic orienting response was monitored by electrodes that detect subtle changes in the skin’s sweat-driven electrical conductance, an indicator of arousal. Afterwards, they were asked to recall what they saw.
The faster paced the commercial, the more it captured the viewers’ orienting response. But it was only drawing attention to the ad’s execution, as the viewers struggled to recall its content. The medium’s flash had overwhelmed its message.
If an advertiser aims “to clearly communicate product benefits,” the authors wrote, “a slower-paced advertisement (i.e. one with fewer ‘bits’ of information presented) would appear to be the advertisement execution of choice.”
“If you’re in the business of trying to get people to learn information, you have to be careful trying to have that, ‘oh, shiny’ road too much,” says Bolls, now an associate professor of strategic communication at the Missouri School of Journalism. “While that strategy is good at capturing this short-term working memory attention, oftentimes, particularly in the media world, it actually interferes with learning.”
Bolls is now working on ways to make journalism rise above the din by being more compelling, engaging, and memorable. In unpublished research presented to an International Communication Association conference, he found that some of the most captivating and memorable political ads “consisted of a candidate just simply talking to the camera, having a discussion.”
He was surprised, extrapolating from the early work showing faster pacing captured attention.
“In terms of media production, one thing that will also reliably, automatically engage attention is someone looking at a camera as if they’re talking to you,” he says. “We’re hard-wired to pay attention when it looks like people are talking to us. Which makes perfect common sense. And that carries over into the media world.”
While machines seem to be overloading us in an arcade of lights and noise, a human face and voice can cut through the din and actually be memorable. It’s a hard-wired thing: Our brain is designed to focus attention on social things, so even if something is fast-moving and electronic, we respond if people are at the end of it.
“Facebook is an example of that,” says Bolls. “Twitter is an example of that. LinkedIn. All forms of social media. That’s why that’s so compelling.”
It’s a reassuring thought: The original social media—a person looking you in the eye and talking—still works. Especially when we look away from the screen, or the person talking to us stops texting and puts away the phone.
On the web
Stanford multitasking study (PNAS, “Cognitive control in media multitaskers”, July 20, 2009)
Eyal Ophir on the Science of Multitasking (Boing Boing, Nov. 7, 2011. Eyal Ophir was primary researcher on the Stanford multitasking study)
Perpetual Inattentional Blindness (Linda Stone’s blog)