What Matters In The Age Of Distraction

On a subway train not long ago, I had the familiar, unsettling experience of standing behind a fellow-passenger and watching everything that she was doing on her phone. It was a crowded car, rush hour, with the dim but unwarm lighting of the oldest New York City trains. The stranger’s phone was bright, and as I looked on she scrolled through a waterfall of videos that other people had filmed in their homes. She watched one for four or five seconds, then dispatched it by twitching her thumb. She flicked to a text message, did nothing with it, and flipped back. The figures on her screen, dressed carefully and mugging at the camera like mimes, seemed desperate for something that she could not provide: her sustained attention. I felt mortified, not least because I saw on both sides of the screen symptoms I recognized too clearly in myself.

For years, we have heard a litany of reasons why our capacity to pay attention is disturbingly on the wane. Technology—the buzzing, blinking pageant on our screens and in our pockets—hounds us. Modern life, forever quicker and more scattered, drives concentration away. For just as long, concerns of this variety could be put aside. Television was described as a force against attention even in the nineteen-forties. A lot of focussed, worthwhile work has taken place since then.

But alarms of late have grown more urgent. Last year, the Organization for Economic Cooperation and Development reported a huge ten-year decline in reading, math, and science performance among fifteen-year-olds globally, a third of whom cited digital distraction as an issue. Clinical presentations of attention problems have climbed (a recent study of data from the medical-software company Epic found an over-all tripling of A.D.H.D. diagnoses between 2010 and 2022, with the steepest uptick among elementary-school-age children), and college students increasingly struggle to get through books, according to their teachers, many of whom confess to feeling the same way. Film pacing has accelerated, with the average length of a shot decreasing; in music, the mean length of top-performing pop songs declined by more than a minute between 1990 and 2020. A study conducted in 2004 by the psychologist Gloria Mark found that participants kept their attention on a single screen for an average of two and a half minutes before turning it elsewhere. These days, she writes, people can pay attention to one screen for an average of only forty-seven seconds.

“Attention as a category isn’t that salient for younger folks,” Jac Mullen, a writer and a high-school teacher in New Haven, told me recently. “It takes a lot to show that how you pay attention affects the outcome—that if you focus your attention on one thing, rather than dispersing it across many things, the one thing you think is hard will become easier—but that’s a level of instruction I often find myself giving.” It’s not the students’ fault, he thinks; multitasking and its euphemism, “time management,” have become goals across the pedagogic field. The SAT was redesigned this spring to be forty-five minutes shorter, with many reading-comprehension passages trimmed to two or three sentences. Some Ivy League professors report being counselled to switch up what they’re doing every ten minutes or so to avoid falling behind their students’ churn. What appears at first to be a crisis of attention may be a narrowing of the way we interpret its value: an emergency about where—and with what goal—we look.

“In many ways, it’s the oldest question in advertising: how to get attention,” an executive named Joanne Leong told me one afternoon, in a conference room on the thirteenth floor of the midtown office of the Dentsu agency. We were speaking about a new attention market. Slides were projected on the wall, and bits of conversation rattled like half-melted ice cubes in the corridor outside. For decades, what was going on between an advertisement and its viewers was unclear: there was no consensus about what attention was or how to quantify it. “The difference now is that there’s better tech to measure it,” Leong said.

Dentsu is one of the world’s leading advertising agencies, running accounts for Heineken, Hilton, Kraft Heinz, Microsoft, Subway, and other global corporations. In 2019, the firm began using digital technology to gather data that showed not only how many people attended to its ads but in what ways they did—information that could be applied to derive a quantitative unit of attention value. In 1997, the technology pundit Michael Goldhaber had envisaged a world in which attention supplanted money as a dominant currency. (“If you have enough attention, you can get anything you want,” he lamented.) Since then, advertising has caught up with his vision.

“Six years ago, the question was around ‘Can this usefully be measured?’ ” Leong said. Now it’s a circus. “There are companies that use eye tracking. There are companies that do facial coding”—reading emotions through micro-expressions. “It’s no longer a matter of convincing clients that this is something they should lean into—it’s how.”

There is a long-standing, widespread belief that attention carries value. In English, attention is something that we “pay.” In Spanish, it is “lent.” The Swiss literary scholar Yves Citton, whose study of the digital age, “The Ecology of Attention,” argues against reducing attention to economic terms, suggested to me that it was traditionally considered valuable because it was capable of bestowing value. “By paying attention to something as if it’s interesting, you make it interesting. By evaluating it, you valorize it,” he said. To treat it as a mere market currency, he thought, was to undersell what it could do.

Advertisers’ interest in attention as a measure was sharpened with the publication of “The Attention Economy” (2001), by Thomas H. Davenport and John C. Beck, which offered a theory of attention as a prelude to action: we pay attention in order to do (or buy). But there have long been varied views. The neuroscientist Karl Friston has suggested that attention is a way of prioritizing and tuning sensory data. Simone Weil, one of attention’s eloquent philosophers, also resisted the idea of attention as subject to economic measure.

In the Dentsu office, Leong, who had her hair in a neat ponytail and wore a sweater with wide, simple horizontal stripes, sat beside the company’s head of research and measurement, Celeste Castle, an executive who oversees the math behind Dentsu’s own answer to the question of attention’s worth—the “effective attention cost per thousand” impressions. The old metrics used in advertising were based on an opportunity to see. “An ‘impression’ is just a measure that the ad was served,” Leong said. But recent data revealed that even most supposedly “viewable” ads weren’t being viewed. “Consumers’ span of attention is now believed to be less than eight seconds,” Raja Rajamannar, the chief marketing officer of Mastercard, a Dentsu client, told me. “That is less than the attention span of a goldfish.”

At Dentsu, as elsewhere, the aim has become to get more from these shrinking slivers—an endeavor some outsiders liken to fracking, the process used to force lingering pockets of fossil fuels out of the earth. When I asked whether these efforts would dissipate people’s focus further, Castle said that optimizing would result in ads being even more precisely tailored to entice their audiences. “As attention measurement matures, things will fall by the wayside and we can eliminate some of the waste,” she said.

In “Scenes of Attention,” a collection of scholarly essays published last year, the editors, D. Graham Burnett and Justin Smith-Ruiu, challenge the idea that shortened attention spans came about because of technological acceleration alone. True, tools and lives are faster, they write. But claiming innovation as the original cause is backward: “Human beings make the technologies—and they make them in the context of other human beings needing and wanting various things.” It wasn’t as though people, after millennia of head-scratching, suddenly “discovered” the steam engine, the spinning jenny, and the telegraph, and modernity unspooled. Rather, people’s priorities underwent a sea change with the onset of the modern age, turning to efficiency, objective measurement, and other goals that made such inventions worthwhile. The acceleration of life isn’t an inevitability, in that sense, but an ideological outcome.

Burnett, a historian of science at Princeton, is the author of five books, ranging in subject from seventeenth-century lens-making to New York’s judicial system. For the past several years, he has been working on a history of the scientific study of attention. I went one day to the main branch of the New York Public Library to hear him speak at the invitation of the New York Institute for the Humanities. “It was the sciences that sliced and diced this nebulous, difficult-to-define feature of our conscious and sensory life so that the market could price it,” Burnett said.

As an academic at the lectern, Burnett cut a curious figure. He was tall, with a graying backpacker’s beard and light-brown hair pulled into a topknot. He wore sixteen silver rings, gunmetal nail polish, and an outfit—T-shirt, V-neck sweater-vest, climbing pants—entirely in shades of light gray. He looked as if he had arrived from soldering metal in an abandoned loft. Scientific models of attention, he argued, had been products of their eras’ priorities, too. So-called “vigilance studies,” which figured attention in terms of cognitive alertness, had coincided with the rise of monotonous control-panel jobs in the years after the Second World War. When soldiers began having to deal with multiple directives over the wire, attention science became preoccupied with simultaneous inputs.

[Cartoon by Amy Hwang: a woman cooking over a campfire asks a man standing beside it, “Can we set the flame to medium high?”]

It was a short leap from there to attention-chasing advertising. Companies that once resigned themselves to using billboards and print ads to appeal to a large American public now target us in private moments. The legal scholar Tim Wu, in his book “The Attention Merchants,” notes, “Without express consent, most of us have passively opened ourselves up to the commercial exploitation of our attention just about anywhere and any time.” No wonder young people struggle. Burnett, in an opinion piece that he co-wrote in the Times last fall, argued that schools, rather than just expecting students to pay attention, should teach them how.

I visited Burnett one afternoon in Washington Heights, where he lives with his partner, the filmmaker Alyssa Loh, and his two teen-age children. The windows of his living room were open; breezes off the Hudson River twirled silver spiral streamers hanging from the ceiling. A sideboard featured a blown ostrich egg, delicately etched with an image of the bird’s skeleton—a gift from a student.

“It’s a perfect mix of scrimshaw technique and X-ray of the form of the bird,” Burnett commented from an open kitchen. He was chopping radishes for a salad.

The rest of the living room was artily posed, as if presented for study by visitors. There was a faded dhurrie rug and a dining-room table made from a single slab of tree trunk. In one corner, a kind of altar had been assembled with peculiar objects: a feather-trimmed bow and arrow from Guyana; a bird skeleton; and a short stack of old leather-bound books, such as the first English edition of “L’Oiseau” (“The Bird”), a nineteenth-century study of birds by the historian Jules Michelet, and “Canaries and Cage-Birds,” by an ornithologist named George H. Holden. I opened the Holden. “The lectures on which these chapters are based were appropriately announced as given under the auspices of one of our bird clubs,” the book read, “for the word auspices comes from the Latin avis,—a bird,—and spicere,—to look at.”
