Attention Deficit: A Bigger Disorder

Our behaviors are leading to a clinical disorder of attention deficit, and it is all, spectacularly, backfiring. It’s time to start reversing course, while we still have a choice to make.

ADHD and TikTok © Anthony Fieldman 2020

In 1971, long before we first surfed the Internet, and years before he went on to win both a Nobel Prize and a Turing Award (aka the ‘Nobel Prize for computing’) — a feat no one else has duplicated — Herbert Simon, one of the world’s great experts in “the architecture of complexity”, and a pioneer of many domains, including artificial intelligence, organizational theory, decision-making, problem-solving and information processing, warned us:

“In an information-rich world, the wealth of information means a dearth of something else: a scarcity of whatever it is that information consumes. What information consumes is rather obvious: it consumes the attention of its recipients. Hence a wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it.”

If only he could see us now.

Back in November, I used part of this quote to open a provocation on the myth of multi-tasking: the truth is, we cannot. Some of us, rather, are experts at shifting our focus between things rapidly; but the more we attempt to multi-task in our lives, the harder it becomes for our brains to reach their own baseline of human attention. Dan Goleman and Richard Davidson, each a best-selling author focused on brain science, and co-authors of the book Altered Traits, found in their research that “heavy multi-taskers are more easily distracted in general, and when they have to focus on one thing, their brains activate many more areas than just those relevant to the task at hand — a neural indicator of distraction.”

In other words, multi-taskers have to use other parts of their brains to compensate for their reduced ability to focus ‘normally’.

I also wrote about research by Stanford University’s Eyal Ophir — a cognitive scientist — who in 2009 found that multi-tasking (what he colorfully calls ‘self-distraction’) leads to “reduced ability to filter out interference from the irrelevant data set.” Said another way, our ability to make sense of things — to understand our world — suffers.

What all of this means is that not only does an overabundance of inputs and stimuli lead, as Simon warned us, to a ‘poverty of attention’; we also suffer because we can no longer make sense of things as easily, and our processing ability is in fact atrophying, requiring us to compensate with other parts of our brains.

We are turning to mush, at “the hand of our own thumbs and fingers”.

(Yeah; I know that sounds funny.)

My daughter’s ‘shield’ © Anthony Fieldman 2020

What else is funny about this — by which I really mean alarming — is that most of us have been duped into believing we can multi-task (we cannot). Not only is the habit tolerated in social circumstances and encouraged at work; shockingly, it is often worn as a badge of honor by those who do it, instead of being understood for the pathology — the illness — that it is.

Do you know anyone who crows about their ability to multi-task? I know tons of them. It’s a delusion.

As Goleman and Davidson write, multi-tasking “creates confusion about what’s important, and so a drop in our ability to retain what matters,” supporting Ophir’s findings about our “reduced ability to filter out interference” from the irrelevant.

Our incessant multi-tasking is making many of us sick and dumb.

Or, at the very least, we must work increasingly hard to regain the same baseline of capacity that our pre-technology ancestors — you know, way back in the ’70s — enjoyed without special effort.

The Illusion of Work

Try this: the next time you’re on a video call (or in a conference room; remember those?), look at what the other people in the meeting are doing. Are any of them looking at their screens rather than at whoever is speaking? (Even on Zoom or Teams or WebEx, if their cameras are on, you can tell.) Of course they are. They’re not listening to you. Or, they’re hearing every fifth word. As a result, those invited to participate in resolving issues are not fully present in the task at hand — certainly not efficiently. And because of this, we create additional post-meeting work for ourselves, including more emails and more meetings; and of course, we usually do both simultaneously, which is what led to the issue in the first place.

Crazy.

Because of this, I have become intransigent about a few things when it comes to work. When we met in person, before the pandemic sent us home, I began asking people in meetings I ran to close their laptops and pay attention. Not children — adults. It had become alarming to me that everyone’s — everyone’s — default behavior was to walk into a meeting with a laptop, flip it open, and tap away on it while they sat there, pausing occasionally to look at the person speaking.

Not only did I find that offensive, but it was incredibly counterproductive. I would often have to repeat what I was saying, and on occasion, would pointedly ask someone staring at their screen what they thought, knowing full well that they’d look up, apologize and ask me to repeat myself.

It is no different from what we do socially. At restaurants, look around you and see how many people spend the evening swiping away at their phones, only occasionally looking up at their meal-mates before returning their eyes to their screens.

Video calls are no different; it’s simply easier to hide, because we cannot see (as easily) what one another is staring at on those calls. Moreover, there are still people who will not turn their cameras on, a full half-year into this, which perplexes me, and rankles. Six months in, they will tell you their internet is slow, or they’re not properly dressed, or that the sticker is still on their camera. More likely, they are multi-tasking and don’t want you to know: doing emails, or walking the dog, or driving, or listening while making food or a drink; and, more than occasionally, they have run out of the room, and when you call their name, there is nothing but silence.

And so the rest of us guess where they went, and move on with the meeting.

If the alarm bells aren’t going off in your head yet, they should be.

Technology Use Causes ADHD Symptoms

That’s a crazy statement, I know. My wife will assuredly take exception to it. Our son has clinical ADHD — the hyperactive-impulsive type. It’s a real problem and a real disorder, and it consumes an enormous amount of our combined energy. It also causes him a ton of social fallout, to most of which he remains oblivious, because of his ADHD. Beyond years of executive function coaching, classes and targeted parenting, to little material effect, he is almost continually medicated. When he is not, things go awry, quickly, in a number of disruptive ways that prevent the rest of us from focusing for very long before we must intervene. That is, unless he’s ‘on tech’. Then, he’s quiet. Which is problematic, as you’ll see in a moment.

Then, there’s his sister. While she has not been diagnosed with ADHD, she, too, exhibits at least a few of the hallmark symptoms of the inattentive type; and she, too, has received coaching, along with regular daily reminders to redirect her forgetfulness and distraction toward accomplishing her chores, or her work. My wife, who is an occupational therapist by training and runs a large hospital for a living, has read exhaustively about ADHD, and invested years of effort in using that knowledge to help her children, long before I arrived in their lives.

And finally, there is my own daughter from a previous marriage. She, too, has been clinically diagnosed with ADHD of the inattentive type; but it’s relatively mild, and at fourteen, it doesn’t manifest until she is in a classroom rife with distraction, for which medication helps.

The term is not one we use loosely.

How, then, is technology causing ADHD symptoms, when it is a clinical disorder, not an epithet to be thrown around without care?

JAMA — the Journal of the American Medical Association, and the profession’s apex periodical — published a study two years ago, finding that “there was a significant association between higher frequency of modern digital media use and subsequent symptoms of ADHD over a 24-month follow-up” among adolescents. These were teens — 2,587 of them — chosen because they did not have ADHD at the outset. Moreover, the symptoms manifested as both types of ADHD: inattention, such as difficulty organizing or completing tasks; and hyperactivity-impulsivity, such as having trouble sitting still, or having outbursts.

The most frequent technology users were twice as likely to exhibit those symptoms.

It’s no wonder that the AAP — the American Academy of Pediatrics — has called on parents to keep children under two away from screens almost entirely, to limit preschoolers to one hour a day, and to set firm, consistent limits on media use until children reach adulthood.

Future addict © Anthony Fieldman 2013

Then, there’s the NIH. Back in 2011, seven years before JAMA’s study, the National Institutes of Health published an article on ADHD and technology use. It’s worth sharing part of the abstract here:

“Various studies confirm that psychiatric disorders, and ADHD in particular, are associated with overuse [of technology], with severity of ADHD specifically correlated with the amount of use. ADHD children may be vulnerable since these games operate in brief segments that are not attention demanding. In addition, they offer immediate rewards with a strong incentive to increase the reward by trying the next level. The time spent on these games may also exacerbate ADHD symptoms, if not directly then through the loss of time spent on more developmentally challenging tasks. While this is a major issue for many parents, there is no empirical research on effective treatment.”

No effective treatment. Other, that is, than limiting tech.

Let’s ask ourselves a simple question. In the context of medical science, what, then, is the practical difference between clinical ADHD and ADHD-like symptoms, as they manifest in someone’s life? If tech makes us unable to focus, or fidgety, or more prone to outbursts, and less effective at whatever we turn our minds to — whether we are children or adults — what does it matter if the cause is an inherited chemical-neurological condition, or one brought on by technology and its rampant use?

Outcomes are what counts, are they not?

We are all, increasingly, suffering from the outcomes of ADHD-like behavior.

Health Costs

In the United States alone, the National Safety Council reports that “distracted driving” — a euphemism for staring at one’s phone — leads to 1.6 million car crashes every year, with texting alone causing 390,000 injuries. That’s one in every four accidents. Compared to ‘the good ol’ days’, texting while driving is six times more likely to cause an accident than driving drunk.

Insurance provider Simply Insurance conducted a survey based on data from the US Department of Transportation, to “benchmark deaths related to cell phone use and driving habits.” They found that 88% of drivers — nine in ten of us! — use their phones while driving; that 68% of drivers text while driving; and that 21% of teen fatalities involved distraction by phone.

It’s not just drivers.

You could argue that death is final, but that tapping away at emails in a meeting isn’t on the same playing field. You could equally argue that in the case of driving, one behavior leads to the other. And while relatively few people still perish directly as a result of phone use (it’s ‘only’ about 4,000 people per year — roughly a third more than perished in the 9/11 attacks), everyone’s lives are still greatly impacted by our pathological addiction to screen-watching, as I tried to highlight earlier.

Economic Costs

In an article on lifehack.org citing McKinsey & Company, high-skilled workers are reported to spend 28% of their working hours reading and replying to emails; and if we mono-tasked more, in lieu of pretending to multi-task while in fact being simply inattentive to whatever is in front of us, it could save the US economy between $900 billion and $1.3 trillion per year. Moreover, according to Mashable, distraction due to social media alone — all the swipes and clicks — costs the same economy $650 billion annually. How is that possible? Well, it’s not hard to understand if you consider that in the US alone, 12,207,423,487 (that’s over twelve billion) collective hours are spent browsing social networks, every. single. day.

And guess what? We know it’s bad for us! According to research by online learning platform Udemy, more than 70% of working adults report feeling distracted on the job, while 16% of them say they almost always feel unfocused.

Our attention is, increasingly, at a glaring deficit.

Conclusions

The dictionary definition of the word ‘disorder’ is “a state of confusion”. Is there a more appropriate term to describe what we are increasingly feeling? Our sense-making is in the toilet. We don’t know what — or who — to believe anymore. Because of this, we feel confused, and don’t know who to blame for our malaise. And it is coming out, in ugly fashion, on news cycles, in politics, and on social media. As I wrote in Why Communications Matters, “We can see this in Twitter wars, trolling, cyberbullying, extreme polarization and the alarming rise of depression, anxiety and stress. As I wrote on July 17, rates of all three have increased by 55% in just seven years, according to HCUP.gov, while suicide death rates are up 34% since 2000, according to businessinsider.com. In all categories, young people (ages 12–25) lead the upswing, logging 50–100% increases over an eight-year period.”

We are spending more time distracted, angry, depressed, anxious, unmoored, feeling alone, hurting ourselves or others, getting into accidents and losing productivity than we ever have, while developing symptoms in the process that make our behaviors largely indistinguishable from those who suffer from clinical ADHD.

Put bluntly, our behaviors are leading to a disorder of attention deficit; and in the process, we are losing ourselves.

There is only one answer — only one antidote, in spite of the NIH’s claim that there is no ‘effective treatment’. That’s because the only effective treatment comes in the form of advice that Founding Father Benjamin Franklin gave Philadelphians in 1736, when their houses kept burning down. He said, “An ounce of prevention is worth a pound of cure.”

In other words, limit tech use, and we won’t have to fix anything.

No Phone Zone © VectorStock.com/21515066

When you do use it, use it at allotted times when you can focus on it. When you do use it, use it to enrich your understanding of things, not to distract yourself. When you do use it, use it to communicate effectively, in the spirit of problem-solving. When you do use it, remember that it is a tool intended to help amplify your reach, not to entertain you. Just as we’d be hard-pressed to use a hammer for anything other than driving or removing nails, as an improvement over our fingers, we should see technology not as a replacement for human interaction or thinking, but as a tool for productivity. When McKinsey & Company, the gold standard in management consulting, says our productivity tools — as used — are costing us a trillion dollars a year, “something ain’t right”. When 88% of us are causing 1.6 million annual vehicular accidents in the US alone because of our phones, “something ain’t right”. When perfectly healthy teenagers are developing ADHD symptoms that get in their way, “something ain’t right”. When they’re depressed and killing themselves in an alarming upward trajectory, “something ain’t right”. And when we can no longer make sense of the world in spite of the fact that all of human history and knowledge is literally a click away, “something ain’t right”.

It is all, spectacularly, backfiring. And it’s time to start correcting our behaviors, while we still have a choice to make.

Put your laptops away in meetings, or if you must stare at a screen, mono-task. Your brain will thank you. Limit your tech hours, and those of your children. Take, as my friend Jamie Wheal just suggested yesterday, the Hot Tub Time Machine Challenge — aka an ‘Internet Sabbath’ — to escape the lure of what he calls ‘dopamine-loop hijacks’. I call it ‘Switched-off Sundays’ when I do it with my daughter. Practice mono-tasking as often as you can. When you drive, drive. When you meet, meet. When you converse, converse. You’ll do everything better, and your relationships, mood and mental capacity will all benefit.

Jamie even created a video on Vimeo — two years old now — dispensing advice about ‘digital hygiene’ strategies to help us recalibrate. They’re good suggestions. When he talks, I listen, and not just because he is a globally leading expert on ‘flow dynamics’, peak performance and leadership, or because his advice is sought by the US Naval War College, the NFL, NBA and MLB, Google, Goldman Sachs, and so on.

I listen, because he’s right.

And I value my relationships, children, productivity, sense-making and brain too much to lose all of it to tech.

You should, too.
