The Crazy World of Our Media Obsessions

Are you watching the news less? Me too. Now that the grownups are back in charge, I’m spending much less time checking my news feed.

Whatever you might say about the last four years, it certainly was good for the news business. It was one long, endless loop of driving past a horrific traffic accident. Try as we might, we just couldn’t avoid looking.

But according to Internet analysis tool Alexa.com, that may be over. I ran some traffic rank reports for the major news portals, and they all look the same: a ramp-up over the 90 days leading into February, and then a drop off a cliff.

While all the top portals have a similar pattern, it’s most obvious on Foxnews.com.

It was as if someone said, “Show’s over folks. There’s nothing to see here. Move along.” And after we all exhaled, we did!

Not surprisingly, we watch the news more when something terrible is happening. It’s an evolved, hardwired response called negativity bias.

Good news is nice. But bad news can kill you. So it’s not surprising that bad news tends to catch our attention.

But this was more than that. We were fixated on Trump. If it were just our bias toward bad news, we would eventually have tired of it.

That’s exactly what happened with the news on COVID-19. We worked through the initial uncertainty and fear, where we were looking for more information, and at some point moved on to the subsequent psychological stages of boredom and anger. As we did that, we threw up our hands and said, “Enough already!”

But when it comes to Donald Trump, there was something else happening.

It’s been said that Trump might have been the best instinctive communicator to ever take up residence in the White House. We might not agree with what he said, but we certainly were listening.

And while we — and by we, I mean me — think we would love to put him behind us, I believe it behooves us to take a peek under the hood of this particular obsession. Because if we fell for it once, we could do it again.

How the F*$k did this guy dominate our every waking, news-consuming moment for the past four years?

We may find a clue in Bob Woodward’s book on Trump, Rage. He explains that he was looking for a “reflector” — a person who knew Trump intimately and could provide some relatively objective insight into his character.

Woodward found a rather unlikely candidate for his reflector: Trump’s son-in-law, Jared Kushner.

I know, I know — “Kushner?” Just bear with me.

In Woodward’s book, Kushner says there were four things you needed to read and “absorb” to understand how Trump’s mind works.

The first was an op-ed piece in The Wall Street Journal by Peggy Noonan called “Over Trump, We’re as Divided as Ever.” It is not complimentary to Trump. But it does begin to provide a possible answer to our ongoing fixation. Noonan explains: “He’s crazy…and it’s kind of working.”

The second was the Cheshire Cat in Alice in Wonderland. Kushner paraphrased: “If you don’t know where you’re going, any path will get you there.” In other words, in Trump’s world, it’s not direction that matters, it’s velocity.

The third was Chris Whipple’s book, The Gatekeepers: How the White House Chiefs of Staff Define Every Presidency. The insight here is that no matter how clueless Trump was about how to do his job, he still felt he knew more than his chiefs of staff.

Finally, the fourth was Win Bigly: Persuasion in a World Where Facts Don’t Matter, by Scott Adams. That’s right — Scott Adams, the same guy who created the “Dilbert” comic strip. Adams calls Trump’s approach “Intentional Wrongness Persuasion.”

Remember, this is coming from Kushner, a guy who says he worships Trump. This is not apologetic. It’s explanatory — a manual on how to communicate in today’s world. Kushner is embracing Trump’s instinctive, scorched-earth approach to keeping our attention focused on him.

It’s — as Peggy Noonan realized — leaning into the “crazy.”  

Trump represented the ultimate political tribal badge. All you needed to do was read one story on Trump, and you knew exactly where you belonged. You knew it in your core, in your bones, without any shred of ambiguity or doubt. There were few things I was as sure of in this world as where I stood on Donald J. Trump.

And maybe that was somehow satisfying to me.

There was something about standing on one side or the other of the divide created by Trump that was tribal in nature.

It was probably the clearest ideological signal about what was good and what was bad that we’ve seen for some time, perhaps since World War II or the ’60s — two eras that predate most of our lifetimes.

Trump’s genius was that he somehow made both halves of the world believe they were the good guys.

In 2018, Peggy Noonan said that “Crazy won’t go the distance.” I’d like to believe that’s so, but I’m not so sure. There are certainly others who are borrowing a page from Trump’s playbook. Right-wing Republicans Marjorie Taylor Greene and Lauren Boebert are both doing “crazy” extraordinarily well. The fact that almost none of you had to Google them to know who they are proves this.

Whether we’re loving to love, or loving to hate, we are all fixated on crazy.

The problem here is that our media ecosystem has changed. “Crazy” used to be filtered out. But somewhere along the line, news outlets discovered that “crazy” is great for their bottom lines.

As former CBS Chairman and CEO Leslie Moonves said when Trump became the Republican presidential front-runner back in 2016, “It may not be good for America, but it’s damn good for CBS.”

Crazy draws eyeballs like, well, like crazy. It certainly generates more user views than “normal” or “competent.”

In our current media environment — densely intertwined with the wild world of social media — we have no crazy filters. All we have now are crazy amplifiers.

And the platforms that allow all this try to crowd onto the same shaky piece of moral high ground.

According to them, it’s not their job to filter out crazy. It’s anti-free speech. It’s un-American. We should be smart enough to recognize crazy when we see it.

Hmmm. Well, we know that’s not working.

Connected Technologies Are Leaving Our Seniors Behind

One of my pandemic projects has been editing a video series of oral history interviews we did with local seniors in my community. Last week, I finished the first video in the series. The original plan, pre-pandemic, was to unveil the video as a special event at a local theater, with the participants attending. Obviously, given our current reality, we had to change our plans.

We, like the rest of the world, moved our event online. As I started working through the logistics of this, I quickly realized something: Our seniors are on the other side of a wide and rapidly growing chasm. Yes, our society is digitally connected in ways we never were before, but those connections are not designed for the elderly. In fact, if you were looking for something that seems deliberately designed to disadvantage a segment of our population, it would be hard to find a better example than Internet connectivity and the elderly.

I have to admit, for much of the past year, I have been pretty focused on what I have sacrificed because of the pandemic. But I am still a pretty connected person. I can Zoom and have a virtual visit with my friends. If I wonder how my daughters are doing, I can instantly text them. If I miss their faces, I can FaceTime them. 

I have been able to take on projects like this one thanks to the privilege of being wired into the virtual world. I can even go on a virtual bike ride with my friends through the streets of London, courtesy of Zwift.

Yes, I have given up things, but I have also been able to find digital substitutes for many of those things. I’m not going to say it’s been perfect, but it’s certainly been passable.

My stepdad, who is turning 86, has been able to do none of those things. He is in a long-term care home in Alberta, Canada. His only daily social connections consist of brief interactions with staff during mealtime and when they check his blood sugar levels and give him his medication. All the activities that used to give him a chance to socialize are gone. Imagine life for him, where his sum total of connection is probably less than 30 minutes a day. And, on most days, none of that connecting is done with the people he loves.

Up until last week, family couldn’t even visit him. He was locked down due to an outbreak at his home. For my dad, there were no virtual substitutes available. He is not wired in any way for digital connection. If anyone has paid the social price of this pandemic, it’s been my dad and people like the seniors I interviewed, for whom I was desperately trying to find a way to simply watch the 13-minute video they had starred in.

A recent study by mobile technology manufacturer Ericsson looked specifically at the relationship between technology and seniors during the pandemic. The study focused on what the company termed the “young-old” seniors, those aged 65-74. They didn’t deal with “middle-old” (aged 75-85) or “oldest-old” (86 plus) because — well, probably because Ericsson couldn’t find enough who were connected to act as a representative sample.

But they did find that even the “young old” were falling behind in their ability to stay connected thanks to COVID-19. These are people who have owned smartphones for at least a decade, many of whom had to use computers and technology in their jobs. Up until a year ago, they were closing the technology gap with younger generations. Then, last March, they started to fall behind.

They were still using the internet, but younger people were using it even more. And, as they got older, they were finding it increasingly daunting to adopt new platforms and technology. They didn’t have the same access to the “family tech support” of children and grandchildren to help get them over the learning curve. They were sticking to the things they knew how to do as the rest of the world surged forward and started living their lives in a digital landscape.

But this was not the group that was part of my video project. My experience had been with the “middle old” and the “oldest old”: half of my interviewees fell into each group. Of the eight seniors I was dealing with, only two had email addresses. If the “young old” are being left behind by technology, these people were never in the race to begin with. As the world was forced to reset to an online reality, these people were never given the option. They were stranded in a world suddenly disconnected from everything they knew and loved.

Predictably, the Ericsson study proposes smartphones as the solution for many of the problems of the pandemic, giving seniors more connection, more confidence and more capabilities. If only they got connected, the study says, life would be better.

But that’s not a solution with legs. It won’t go the distance. And to understand why, we just have to look at the two age cohorts the study didn’t focus on, the “middle old” and the “oldest old.”

Perhaps the hardest hit have been the “oldest old,” who have sacrificed both physical and digital connection, as this Journals of Gerontology article notes. Four from my group lived in long-term care facilities. Many of these were locked down at some point due to local outbreaks within the facility. Suddenly, the family support they required to connect with family and friends was no longer available. The technological tools that we take for granted — which we were able to slot in to take the place of things we were losing — were unimaginable to them. They were effectively sentenced to solitary confinement.

A recent study from Germany found that only 3% of those living in long-term care facilities used an internet-connected device. A lot of the time, cognitive declines, even when they’re mild, can make trying to use technology an exercise in frustration.

When my dad went into his long-term care home, my sister and I gave him one of our old phones so he could stay connected. We set everything up and did receive a few experimental texts from him. But soon, it just became too confusing and frustrating for him to use without our constant help. He played solitaire on it for a while, then it ended up in a drawer somewhere. We didn’t push the issue. It just wasn’t the right fit.

But it’s not just my dad who struggled with technology. Even if an aging population starts out as reasonably proficient users, it can be overwhelming to keep up with new hardware, new operating systems and new security requirements. I’m not even “young old” yet, and I’ve worked with technology all my life. I owned a digital marketing company, for heaven’s sake. And even for me, it sometimes seems like a full-time job staying on top of the constant stream of updates and new things to learn and troubleshoot. As connected technology leaps forward, it does not seem unduly concerned that it’s leaving the most vulnerable segment of our population behind.

COVID-19 has pushed us into a virtual world where connection is not just a luxury, but a condition of survival. We need to connect to live. That is especially true for our seniors, who have had all the connections they relied on taken from them. We can’t leave them behind. Connected technology can no longer ignore them.

This is one gap we need to build a bridge over.

Our Disappearing Attention Spans

Last week, MediaPost Editor in Chief Joe Mandese mused about our declining attention spans. He wrote,

“while in the past, the most common addictive analogy might have been opiates — as in an insatiable desire to want more — these days [consumers] seem more like speed freaks looking for the next fix.”

Mandese cited a couple of recent studies, saying that more than half of mobile users tend to abandon any website that takes longer than three seconds to load. That

“has huge implications for the entire media ecosystem — even TV and video — because consumers increasingly are accessing all forms of content and commerce via their mobile devices.”

The question that needs to be asked here is, “Is a short attention span a bad thing?” The famous comparison is that we are now more easily distracted than a goldfish. But does a shorter attention span negatively impact us, or is it just our brain changing to be a better fit with our environment?

Academics have been debating the impact of technology on our ability to cognitively process things for some time. Journalist Nicholas Carr sounded the warning in his 2010 book, “The Shallows,” where he wrote, 

“(Our brains are) very malleable, they adapt at the cellular level to whatever we happen to be doing. And so the more time we spend surfing, and skimming, and scanning … the more adept we become at that mode of thinking.”

Certainly, Carr is right about the plasticity of our brains. It’s one of their most advantageous features. But is our digital environment forever pushing our brains to the shallow end of the pool? Well, it depends. Context is important. One of the biggest factors in determining how we process the information we’re seeing is the device on which we’re seeing it.

Back in 2010, Microsoft did a large-scale ethnographic study on how people searched for information on different devices. The researchers found those behaviors differed greatly depending on the platform being used and the intent of the searcher. They found three main categories of search behaviors:

  • Missions are looking for one specific answer (for example, an address or phone number) and often happen on a mobile device.
  • Excavations are widespread searches that need to combine different types of information (for example, researching an upcoming trip or major purchase). They are usually launched on a desktop.
  • Finally, there are Explorations: searching for novelty, often to pass the time. These can happen on all types of devices and can often progress through different devices as the exploration evolves. The initial search may be launched on a mobile device, but as the user gets deeper into the exploration, she may switch to a desktop.

The important thing about this research was that it showed our information-seeking behaviors are very tied to intent, which in turn determines the device used. So, at a surface level, we shouldn’t be too quick to extrapolate behaviors seen on mobile devices with certain intents to other platforms or other intents. We’re very good at matching a search strategy to the strengths and weaknesses of the device we’re using.

But at a deeper level, if Carr is right (and I believe he is) about our constant split-second scanning of information to find items of interest making permanent changes in our brains, what are the implications of this?

For such a fundamentally important question, there is only a small, though rapidly growing, body of academic research that has tried to answer it. To add to the murkiness, many of the studies contradict each other. The best summary I could find of academia’s quest to determine if “the Internet is making us stupid” was a 2015 article in the academic journal The Neuroscientist.

The authors sum up by essentially saying both “yes” — and “no.” We are getting better at quickly filtering through reams of information. We are spending fewer cognitive resources memorizing things we know we can easily find online, which theoretically leaves those resources free for other purposes. Finally, for this post, I will steer away from commenting on multitasking, because the academic jury is still very much out on that one.

But the authors also say that 

“we are shifting towards a shallow mode of learning characterized by quick scanning, reduced contemplation and memory consolidation.”

The fact is, we are spending more and more of our time scanning and clicking. There are inherent benefits to us in learning how to do that faster and more efficiently. The human brain is built to adapt and become better at the things we do all the time. But there is a price to be paid. The brain will also become less capable of doing the things we don’t do as much anymore. As the authors said, this includes actually taking the time to think.

So, in answer to the question “Is the Internet making us stupid?,” I would say no. We are just becoming smart in a different way.

But I would also say the Internet is making us less thoughtful. And that brings up a rather worrying prospect.

As I’ve said many times before, the brain thinks both fast and slow. The fast loop is brutally efficient. It is built to get stuff done in a split second, without having to think about it. Because of this, the fast loop has to be driven by what we already know or think we know. Our “fast” behaviors are necessarily bounded by the beliefs we already hold. It’s this fast loop that’s in control when we’re scanning and clicking our way through our digital environments.

But it’s the slow loop that allows us to extend our thoughts beyond our beliefs. This is where we’ll find our “open minds,” if we have such a thing. Here, we can challenge our beliefs and, if presented with enough evidence to the contrary, willingly break them down and rebuild them to update our understanding of the world. In the sense-making loop, this is called reframing.

The more time we spend “thinking fast” at the expense of “thinking slow,” the more we will become prisoners to our existing beliefs. We will be less able to consolidate and consider information that lies beyond those boundaries. We will spend more time “parsing” and less time “pondering.” As we do so, our brains will shift and change accordingly.

Ironically, our minds will change in such a way to make it exceedingly difficult to change our minds.

The Ebbs and Flows of Consumerism in a Post-Pandemic World

As MediaPost’s Joe Mandese reported last Friday, advertising was, quite literally, almost decimated worldwide in 2020. If you look at the forecasts of the top agency holding companies, ad spends were trimmed by an average of 6.1%. It’s not quite one dollar in 10, but it’s close.

These same companies are forecasting a relative bounceback in 2021, starting slow and accelerating quarter by quarter through the year — but that still leaves the 2021 spend forecast back at 2018 levels.

And as we know, everything about 2021 is still very much in flux. If the year 2021 were a pack of cards, almost every one of them would be wild.

This — according to physician, epidemiologist and sociologist Nicholas Christakis — is not surprising.

Christakis is one of my favorite observers of network effects in society. His background in epidemiological science gives him a unique lens to look at how things spread through the networks of our world, real and virtual. It also makes him the perfect person to comment on what we might expect as we stagger out of our current crisis.

In his latest book, “Apollo’s Arrow,” he looks back to look forward to what we might expect — because, as he points out, we’ve been here before.

While the scope and impact of this one are unusual, such health crises are nothing new. Dozens of epidemics and a few pandemics have happened in my lifetime alone, according to this Wikipedia chart.

This post goes live on Groundhog Day, perhaps the most appropriate of all days for it to run. Today, however, we already know what the outcome will be. The groundhog will see its shadow and there will be six more months (at least) of pandemic to deal with. And we will spend that time living and reliving the same day in the same way with the same routine.

Christakis expects this phase to last through the rest of this year, until the vaccines are widely distributed, and we start to reach herd immunity.

During this time, we will still have to psychologically “hunker down” like the aforementioned groundhog, something we have been struggling with. “As a society we have been very immature,” said Christakis. “Immature, and typical as well, we could have done better.”

This phase will be marked by a general conservatism that will go in lockstep with fear and anxiety, a reluctance to spend and a trend toward risk aversion and religion.

Add to this the fact that we will still be dealing with widespread denialism and anger, which will lead to a worsening vicious circle of loss and crisis. The ideological cracks in our society have gone from annoying to deadly.

Advertising will have to somehow negotiate these choppy waters of increased rage and reduced consumerism.

Then, predicts Christakis, starting some time in 2022, we will enter an adjustment period where we will test and rethink the fundamental aspects of our lives. We will be learning to live with COVID-19, which will be less lethal but still very much present.

We will likely still wear masks and practice social distancing. Many of us will continue to work from home. Local flare-ups will still necessitate intermittent school and business closures. We will be reluctant to be inside with more than 20 or 30 people at a time. It’s unlikely that most of us will feel comfortable getting on a plane or embarking on a cruise ship. This period, according to Christakis, will last for a couple years.

Again, advertising will have to try to thread this psychological needle between fear and hope. It will be a fractured landscape on which to build a marketing strategy. Any pretense of marketing to the masses, a concept long in decline, will now be truly gone. The market will be rife with confusing signals and mixed motivations. It will be incumbent on advertisers to become very, very good at “reading the room.”

Then, starting in 2024, we will finally put the pandemic behind us. Four years of pent-up demand, says Christakis, will suddenly burst through the dam of our delayed gratification. We will likely follow the same path taken a century ago, when we were coming out of a war and another pandemic, in the period we call the “Roaring Twenties.”

Christakis explained: “What typically happens is people get less religious. They will relentlessly seek out social interactions in nightclubs and restaurants and sporting events and political rallies. There’ll be some sexual licentiousness. People will start spending their money after having saved it. There’ll be joie de vivre and a kind of risk-taking, a kind of efflorescence of the arts, I think.”

Of course, this burst of buying will be built on the foundation of what came before. The world will likely be very different from its pre-pandemic version. It will be hard for marketers to project demand in a straight line from what they know, because the experiences they’ve been using as their baseline are no longer valid. Some things may remain the same, but some will be changed forever.

COVID-19 will have pried many of the gaps in our society further apart — most notably those of income inequality and ideological difference. A lingering sense of nationalism and protectionism born from dealing with a global emergency could still be in place.

Advertising has always played an interesting role in our lives. It both motivates and mirrors us.

But the reflection it shows is like a funhouse mirror: It distorts some aspects of our culture and ignores others. It creates demand and hides inconvenient truths. It professes to be noble, while it stokes the embers of our ignobility. It amplifies the duality of our human nature.

Interesting times lie ahead. It remains to be seen how that is reflected in the advertising we create and consume.

The Academics of Bullsh*t

“One of the most salient features of our culture is that there is so much bullshit. Everyone knows this. Each of us contributes his share. But we tend to take the situation for granted.”

— from “On Bullshit,” an essay by philosopher Harry Frankfurt.

Would it surprise you to know that I have found not one, but two academic studies on organizational bullshit? And I mean that non-euphemistically. The word “bullshit” is actually in the title of both studies. I B.S. you not.

In fact, organizational bullshit has become a legitimate field of study. Academics are being paid to dig into it — so to speak. There are likely bullshit grants, bullshit labs, bullshit theories, bullshit paradigms and bullshit courses. There are definitely bullshit professors. There is even an OBPS — the Organizational Bullshit Perception Scale — a way to academically measure bullshit in a company.

Many years ago, when I was in the twilight of my time with the search agency I had founded, I had had enough of the bullshit I was being buried under, shoveled there by the company that had acquired us. I was drowning in it. So I vented right here, on MediaPost. I dared you to imagine what it would be like to actually do business without bullshit getting in the way.

My words fell on deaf ears. Bullshit has proliferated since that time. It has been enshrined up and down our social, business and governmental hierarchies, becoming part of our “new” organizational normal. It has picked up new labels, like “fake news” and “alternative facts.” It has proven more dangerous than I could have ever imagined. And it has become this dangerous because we are ignoring it, and by ignoring it, we are legitimizing it.

Harry Frankfurt defined the concept and set it apart from lying. Liars know the truth and are trying to hide it. Bullshitters don’t care if what they say is true or false. They only care if their listener is persuaded. That’s as good a working definition of the last four years as any I’ve heard.

But at least one study indicates bullshit may have a social modality — acceptable in some contexts, but corrosive in others. Marketing, for example, is highlighted by the authors as an industry built on a foundation of bullshit:

“advertising and public relations agencies and consultants are likely to be ‘full of It,’ and in some cases even make the production of bullshit an important pillar of their business.”

In these studies, researchers speculate that bullshit might actually serve a purpose in organizations. It may allow for strategic motivation before there is an actual strategy in place. This brand of bullshit is otherwise known as “blue-sky thinking” or “out-of-the-box thinking.”

But if this is true, there is a very narrow window indeed where this type of bullshit could be considered beneficial. The minute there are facts to deal with, they should be dealt with. But the problem is that the facts never quite measure up to the vision of the bullshit. Once you open the door to allowing bullshit, it becomes self-perpetuating.

I grew up in the country. I know how hard it is to get rid of bullshit.

The previous example is what I would call strategic bullshit — a way to “grease the wheels” and get the corporate machine moving. But it often leads directly to operational bullshit — which is toxic to an organization, serving to “gum up the gears” and prevent anything real and meaningful from happening. This was the type of bullshit that was burying me back in 2013 when I wrote that first column. It’s also the type of bullshit that is paralyzing us today.

According to the academic research into bullshit, when we’re faced with it, we have four ways to respond: exit, voice, loyalty or neglect. Exit means we try to escape from the bullshit. Loyalty means we wallow in it, spreading it wider and thicker. Neglect means we just ignore it. And voice means we stand up to the bullshit and confront it. I’m guessing you’ve already found yourself in one of those four categories.

Here’s the thing. As marketers and communicators, we have to face the cold, ugly truth of our ongoing relationship with bullshit. We all have to deal with it. It’s the nature of our industry.

But how do we deal with it? Most times, in most situations, it’s just easier to escape or ignore it. Sometimes it may serve our purpose to jump on the bullshit bandwagon and spread it. But given the overwhelming evidence of where bullshit has led us in the recent past, we all should be finding our voice to call bullshit on bullshit.

Missing the Mundane

I’ve realized something: I miss the mundane.

Somewhere along the line, mundanity got a bad rap. It became a synonym for boring. But it actually means worldly. It refers to the things you experience when you’re out in the world.

And I miss that — a lot.

There is a lot of stuff that happens when we’re living our lives that we don’t give enough credit to: Petting a dog being taken for a walk. A little flirting with another human we find attractive. Doing some people-watching while we eat our bagel in a mall’s food court. Random situational humor that plays itself out on the sidewalk in front of us. Discovering that the person cutting your hair is also a Monty Python fan. Snippets of conversation — either ones we’re participating in, or ones we overhear while we wait for the bus. Running into an old acquaintance. Even being able to smile at a stranger and have them smile back at you.

The mundane is built of all those hundreds of little, inconsequential social exchanges that happen daily in a normal world that we ordinarily wouldn’t give a second thought to.

And sometimes, serendipitously, we luck upon the holy grail of mundanity — that random “thing” that makes our day.

These are the things we live for. And now, almost all of these things have been stripped from our lives.

I didn’t realize I missed them because I never assigned any importance to them. If I did a signal-to-noise ratio analysis of my life, all these things would fall in the latter category. Most of the time, I wasn’t even fully aware that they were occurring. But I now realize when you add them all up, they’re actually a big part of what I’m missing the most. And I’ve realized that because I’ve been forced to subtract them — one by one — from my life.

I have found that the mundane isn’t boring. It’s the opposite — the seasoning that adds a little flavor to my day-to-day existence.

For the past 10 months, I thought the problem was that I was missing the big things: travel, visiting loved ones, big social gatherings. And I do miss those things. But those things are the tentpoles – the infrequent yet consequential things that we tend to hang our happiness on. We failed to realize that in between those tentpoles there is also the fabric of everyday life, and that fabric has been eliminated as well.

It’s not just that we don’t have them. It’s also that we’ve tried to substitute other things for them. And those other things may be making it worse. Things like social media and way too much time spent looking at the news. Bingeing on Netflix. Forcing ourselves into awkward online Zoom encounters just because it seems like the thing to do. A suddenly developed desire to learn Portuguese, or how to bake sourdough bread.

It’s not that all these things are bad. It’s just that they’re different from what we used to consider normal, and doing them reinforces the gap that lies between then and now. They add to that gnawing discontent we have with our new forced coping mechanisms.

The mundane has always leavened our lives. But now, we’ve swapped the living of our lives for being entertained — and whether it’s the news or the new show we’re bingeing, entertainment has to be overplayed. It is nothing but peaks and valleys, with no middle ground. When we actually do the living, rather than the watching, we spend the vast majority of our time in that middle ground — the mundane, which is our emotional reprieve.

I’ve also noticed my social muscles have atrophied over the past several months due to lack of exercise. It’s been ages since I’ve had to make small talk. Every encounter now — as infrequent as they are — seems awkward. Either I’m overeager, like a puppy that’s been left alone in a house all day, or I’m just not in any mood to palaver.  

Finally, it’s these everyday mundane encounters that used to give me anecdotal evidence that not all people were awful. Every day I used to see examples of small kindnesses, unexpected generosity and just plain common courtesy. Yes, there were also counterpoints to all of these, but it almost always netted out to the good. It used to reaffirm my faith in people on a daily basis.

With that source of reaffirmation gone, I have to rely on the news and social media. And — given what those two things are — I know I will only see the extremes of human nature. It’s my “angel and asshole” theory: that we all lie on a bell curve somewhere between the two, and our current situation will push us from the center closer to those two extremes. You also know that the news and social media are going to be biased towards the “asshole” end of the spectrum.

There’s a lot to be said for the mundane — and I have. So I’ll just wrap up with my hope that my life — and yours — will become a little more mundane in the not-too-distant future.

The Timeline of Factfulness

After last Wednesday, when it seemed that our reality was splitting at the seams, I was surprised to see that financial markets seemed to ignore what was happening in Washington. The market racked up a 1% gain. I later learned that financial markets have a history of being rather oblivious to social upheaval.

Similarly, a newsletter I subscribe to about recent academic research was packed with new discoveries. Not one of the 35 links in that day’s edition pointed to anything remotely relevant to what was happening at that time in Washington, D.C. (or in various state capitals around the country). That was less surprising to me than the financial markets’ collective shrug, but it still threw an interesting contrast into relief.

These two corners of the world are not tied to the happenings of today. Markets look forward and lay economic bets on what will be. And apparently they had bet that the events of January 6 wouldn’t have any lasting impact.

Scientific journals look backward and report on what has already happened in the world of academic research. Neither is very focused on today.

But there is another reason why these two corners of the world seemed unfazed by the news headlines of January 6. Science and the markets are two examples of things driven by facts and data. Yes, emotion certainly plays a part. Investors have long known that irrational exuberance or fear can drive artificial bubbles or crashes. And the choice of research paths to take is a human one, which means it’s inevitably driven by emotions.

But both these ecosystems try to systematically reduce the role of emotion as much as possible by relying on facts and data. And because facts and data do not reveal their stories immediately but rather over time in the form of trends, they have to take a longer view of the world.

Therefore, these things operate on different timelines from the news. Financial markets use what’s happening now – today – as just one of many inputs into a calculated bet that will pay off weeks or months in the future.

Science takes a longer view, using the challenges of today to set a research agenda that may be years away from realizing its payoff. Both finance and science use what’s happening right now as one input to determine what will be in the future, but neither focuses exclusively on today.

In contrast, that’s exactly what the news has to do. And it hyperbolizes the now, stripping away the ordinary, picking out the extraordinary and concentrating it for our consumption.

The fact is, both markets and science have to operate by Factfulness, to use the term coined by the late Hans Rosling, the Swedish physician and well-known TED speaker. To run like the rest of the world, over-focused on the amplified volatility of the here and now that fills our news feeds, would be to render them dysfunctional. They couldn’t operate. They would be in a constant state of anxiety.

Increasingly, the engines that drive our world – such as science and financial markets – have to decouple themselves from the froth and frenzy of the immediate. They do so because the rest of the world is following a very different path – one where hyper-emotionality and polarized news outlets whip us back and forth like a rag doll caught by a Doberman.

This decoupling has accelerated thanks to the role of technology in compressing the timelines of the worldview of most of us. We are instantly alerted to what’s happening now and are then ushered into a highly biased bubble from which we look at the world. Our worldview is not only formed by emotion; it is deliberately designed to manipulate those emotions.

Emotions are our instant response to the world. They run fully hot or cold, with nary a nuance of reason to modulate them. Also, because emotions are our natural early warning systems, we tend to be hyper-aware of them and are immediately drawn to anything that promises to push our emotional buttons. As such, they are a notoriously inaccurate lens through which to look at reality. That is why efforts are made to minimize their impact in the worlds of science and finance.

We should hold other critical systems to the same standards. Take government, for instance. Now, more than ever, we need those who govern us to be clear-eyed and dealing in facts. Unfortunately, as we saw last week, they’re running as fast as they can in the opposite direction.

Happy New Year?

“Speaking of the happy new year, I wonder if any year ever had less chance of being happy. It’s as though the whole race were indulging in a kind of species introversion — as though we looked inward on our neuroses. And the thing we see isn’t very pretty… So we go into this happy new year, knowing that our species has learned nothing, can, as a race, learn nothing — that the experience of ten thousand years has made no impression on the instincts of the million years that preceded.”

That sentiment, relevant as it is to today, was not written about 2021. It was actually written 80 years ago — in 1941 — by none other than John Steinbeck.

John was feeling a little down. I’m sure we can all relate.

It’s pretty easy to say that we have hopefully put the worst year ever behind us. I don’t know about your news feed, but mine has been like a never-ending bus tour of Dante’s nine circles of hell — and I’m sitting next to the life insurance salesman from Des Moines who decided to have a Caesar salad for lunch.

An online essay by Umair Haque kind of summed up 2020 for me: “The Year of the Idiot.” In it, Haque doesn’t pull any punches:

“It was the year that a pandemic searched the ocean of human stupidity, and found, to its gleeful delight, that it appeared to be bottomless. 2020 was the year that idiots wrecked our societies.”

In case you’re not catching the drift yet, Haque goes on to say, “The average person is a massive, gigantic, malicious, selfish idiot.”

Yeah. That pretty much covers it.

Or does it? Were our societies wrecked? Is the average person truly that shitty? Is the world a vast, soul-sucking, rotten-cabbage-reeking dumpster fire? Or is it just the lens we’re looking through?

If you search hard enough, you can find those who are looking through a different lens — one that happens to be backed by statistical evidence rather than what bubbles to the top of our newsfeed. One of those people is Ola Rosling. He’s carrying on the mission of his late father, Hans Rosling, who was working on the book “Factfulness” when he passed away in 2017. Bill Gates called it “one of the most educational books I’ve ever read.” And Bill reads a lot of books!

Believe it or not, if you remove a global pandemic from the equation (which, admittedly, is a whole new scale of awful) the world may actually be in better shape than it was 12 months ago. And even if you throw the pandemic into the mix, there are some glimmers of silver peeking through the clouds.

Here are some things you may have missed in your news feed:

Wild polio was eradicated from Africa. That’s big news. It’s a massive achievement that had its to-do box ticked last August. And I’m betting you never heard about it.

Also, the medical and scientific world has never before mobilized and worked together the way it did on the new COVID mRNA vaccines now rolling out. Again, this is a huge step forward that will have far-reaching impacts on healthcare in the future. But that’s not what the news is talking about.

Here’s another thing. At long last, it looks like the world may finally be ready to start tearing apart the layers that hide systemic racism. What we’re learning is that it may not be the idiots  — and, granted, there are many, many idiots — who are the biggest problem. It may be people like me, who have unknowingly perpetuated the system and are finally beginning to see the endemic bias baked into our culture.

These are just three big steps forward that happened in 2020. There are others. We just aren’t talking about them.

We always look on the dark side. We’re a “glass half-empty” species. That’s what Rosling’s book is about: our tendency to skip over the facts to rush to the worst possible view of things. We need no help in that regard — but we get it anyway from the news business, which, run by humans and aimed at humans, amplifies our proclivity for pessimism.

I’m as glad as anyone to see 2020 in my rear-view mirror. But I am carrying something of that year forward with me: a resolution to spend more time looking for facts and relying less on media “spun” for profit to understand the state of the world.

As we consume media, we have to remember that good news is just not as profitable as bad news. We need to broaden our view to find the facts. Hans Rosling warned us, “Forming your worldview by relying on the media would be like forming your view about me by looking only at a picture of my foot.”

Yes, 2020 was bad, but it was also good. And because there are forces that swing the pendulum both ways, many of the things that were good may not have happened without the bad. In the same letter in which Steinbeck expressed his pessimism about 1941, he went on to say this:

“Not that I have lost any hope. All the goodness and the heroisms will rise up again, then be cut down again and rise up. It isn’t that the evil thing wins — it never will — but that it doesn’t die. I don’t know why we should expect it to. It seems fairly obvious that two sides of a mirror are required before one has a mirror, that two forces are necessary in man before he is man.”

There are two sides to every story, even when it’s a horror story like 2020.

Second Thoughts about the Social Dilemma

Watched “The Social Dilemma” yet? I did, a few months ago. The Netflix documentary sets off all kinds of alarms about social media and how it’s twisting the very fabric of our society. It’s a mix of standard documentary fodder — a lot of tell-all interviews with industry insiders and activists — with an (ill-advised, in my opinion) dramatization of the effects of social media addiction in one particular family.

The one most affected is a male teenager who is suddenly drawn, zombie-like, by his social media feed into an ultra-polarized political activist group. Behind the scenes, operating in a sort of evil-empire control room setting, there are literally puppet masters pulling his strings.

It’s scary as hell. But should we be scared? Or — at least — should we be that scared?

Many of us are sounding alarms about social media and how it nets out to be a bad thing. I’m one of the worst. I am very concerned about the impact of social media, and I’ve said so many, many times in this column. But I also admit that this is a social experiment playing out in real time, so it’s hard to predict what the outcome will be. We should keep our minds open to new evidence.

I’ve also said that younger generations seem to be handling this in stride. At least, they seem to be handling it better than those in my generation. They’re quicker to adapt and to use new technologies natively to function in their environments, rather than fumble as we do, searching for some corollary to the world we grew up in.

I’ve certainly had pushback on this observation. Maybe I’m wrong. Or maybe, like so many seemingly disastrous new technological trends before it, social media may turn out to be neither bad nor good. It may just be different.

That certainly seems to be the case if you read a new study from the Institute for Family Studies and Brigham Young University’s Wheatley Institution.

One of the lead authors of the study, Jean Twenge, previously rang the alarm bells about how technology was short-circuiting the mental wiring of our youth. In a 2017 article in The Atlantic titled “Have Smartphones Destroyed a Generation?” she made this claim:

“It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.”

The article describes a generation of zombies mentally hardwired to social media through their addiction to their iPhone. One of the more startling claims was this:

“Teens who spend three hours a day or more on electronic devices are 35 percent more likely to have a risk factor for suicide, such as making a suicide plan.”

Again, scary as hell, right? This sounds frighteningly similar to the scenarios laid out in “The Social Dilemma.”

But what if you take this same group and this same author, fast-forward three years to the middle of the worst pandemic in our lifetimes, and check in with over 1,500 teens to see how they’re doing in a time where they have every right to be depressed? Not only are they locked inside, they’re also processing societal upheavals and existential threats like systemic racial inequality, alt-right political populism and climate change. If 2017-2018 was scary for them, 2020 is a dumpster fire.

Surprisingly, those same teens appear to be doing better than they were two years ago. The study had four measures of ill-being: loneliness, life dissatisfaction, unhappiness and depression. The results were counterintuitive, to say the least. The number of teens indicating they were depressed dropped substantially, from 27% in 2018 to 17% among the teens quarantined during the 2020 school year. Fewer said they were lonely as well.

The study indicated that the reasons for this could be because teens were getting more sleep and were spending more time with family.

But what about smartphones and social media? Wouldn’t a quarantined teen (a Quaran-teen?) be spending even more time on his or her phone and social media? 

Well, yes – and no. The study found screen time didn’t really go up, but the way that time was spent did shift. Surprisingly, time spent on social media went down, while time spent on video chats with friends or watching online streaming entertainment went up.

As I shared in my column a few weeks ago, this again indicates that it’s not how much time we spend on social media that determines our mental state. It’s how we spend that time. If we spend it looking for connection, rather than obsessing over social status, it can be a good thing. 

Another study, from the University of Oxford, examined data on more than 300,000 adolescents and found that increased screen time has no more impact on teenagers’ mental health than eating more potatoes. Or wearing glasses.

If you’re really worried about your teen’s mental health, make sure they have breakfast. Or get enough sleep. Or just spend more time with them. All those things are going to have a lot more impact than the time they spend on their phone.

To be clear, this is not me becoming a fan of Facebook or social media in general. There are still many things to be concerned about. But let’s also realize that technology — any technology — is a tool. It is not inherently good or evil. Those qualities can be found in how we choose to use technology.

Have More People Become More Awful?

Is it just me, or do people seem a little more awful lately? There seems to be a little more ignorance in the world, a little less compassion, a little more bullying and a lot less courtesy.

Maybe it’s just me.

It’s been a while since I’ve checked in with eternal optimist Steven Pinker.  The Harvard psychologist is probably the best-known proponent of the argument that the world is consistently trending towards being a better place.  According to Pinker, we are less bigoted, less homophobic, less misogynist and less violent. At least, that’s what he felt pre-COVID lockdown. As I said, I haven’t checked in with him lately, but I suspect he would say the long-term trends haven’t appreciably changed. Maybe we’re just going through a blip.

Why, then, does the world seem to be going to hell in a handcart? Why do people — at least some people — seem so awful?

I think it’s important to remember that our brain likes to play tricks on us. It’s in a never-ending quest to connect cause and effect. Sometimes, to do so, the brain jumps to conclusions. Unfortunately, it is aided in this unfortunate tendency by a couple of accomplices — namely news reporting and social media. Even if the world isn’t getting shittier, it certainly seems to be. 

Let me give you one example. In my local town, an anti-masking rally was recently held at a nearby shopping mall. Local news outlets jumped on it, with pictures and video of non-masked, non-socially distanced protesters carrying signs and chanting about our decline into Communism and how their rights were being violated.

What a bunch of boneheads — right? That was certainly the consensus in my social media circle. How could people care so little about the health and safety of their community? Why are they so awful?

But when you take the time to unpack this a bit, you realize that everyone is probably overplaying their hands. I don’t have exact numbers, but I don’t think there were more than 30 or 40 protesters at the rally. The population of my city is about 150,000, so those protesters represented about 0.03% of the total population.

Let’s say that for every person at the rally, there were 10 who felt the same way but weren’t there. That’s still less than 1%. Even if you multiplied the number of protesters by 100, it would still be under 3% of my community. We’re still talking about a tiny fraction of all the people who live in my city.
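If you want to sanity-check that back-of-the-envelope math, here is a minimal sketch in Python using the same round numbers (about 40 protesters, a city of roughly 150,000):

```python
# Back-of-the-envelope check using the round numbers above:
# about 40 protesters in a city of roughly 150,000 people.
population = 150_000
protesters = 40

for multiplier in (1, 10, 100):
    share = protesters * multiplier / population * 100
    print(f"{multiplier:>3}x the protesters -> {share:.2f}% of the city")

# Prints:
#   1x the protesters -> 0.03% of the city
#  10x the protesters -> 0.27% of the city
# 100x the protesters -> 2.67% of the city
```

However generously you multiply, the protesters remain a rounding error in the city’s population.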

But both the news media and my social media feed have ensured that these people are highly visible. And because they are, our brain likes to use that small and very visible sample and extrapolate it to the world in general. It’s called availability bias, a cognitive shortcut where the brain uses whatever’s easy to grab to create our understanding of the world.

But availability bias is nothing new. Our brains have always done this. So, what’s different about now?

Here, we have to understand that the current reality may be leading us into another “mind-trap.” A 2018 study from Harvard introduced something called “prevalence-induced concept change,” which gives us a better understanding of how the brain focuses on signals in a field of noise. 

Basically, when signals of bad things become less common, the brain works harder to find them. We expand our definition of what is “bad” to include more examples so we can feel more successful in finding them.
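To make that mechanism concrete, here is a toy simulation in Python. It is my own illustrative sketch, not the Harvard study’s method (the study used human judges), and every name in it is invented for illustration. The simulated observer flags a stimulus as “bad” when it scores well above the average of the stimuli it has seen recently, so its standard adapts to whatever is prevalent:

```python
import random

random.seed(1)

def mild_stimuli_judged_bad(n, bad_rate):
    """Count mild stimuli misjudged as 'bad' by an adaptive observer."""
    window, false_alarms = [], 0
    for _ in range(n):
        truly_bad = random.random() < bad_rate
        # Severe stimuli score around 0.8, mild ones around 0.2.
        score = random.gauss(0.8 if truly_bad else 0.2, 0.05)
        window = (window + [score])[-50:]  # short memory of recent stimuli
        # Adaptive criterion: "bad" means well above the recent average.
        threshold = sum(window) / len(window) + 0.1
        if score > threshold and not truly_bad:
            false_alarms += 1
    return false_alarms

# As genuinely bad stimuli get rarer, the adaptive bar drops and more
# mild stimuli get flagged: the concept of "bad" expands.
for rate in (0.5, 0.1, 0.01):
    print(f"true bad rate {rate:>4}: "
          f"{mild_stimuli_judged_bad(4000, rate)} mild stimuli judged bad")
```

The only thing that changes across the three runs is how rare the genuinely bad stimuli are; the observer’s shifting standard does the rest.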

I’m probably stretching beyond the limits of the original study here, but could this same thing be happening now? Are we all super-attuned to any hint of what we see as antisocial behavior so we can jump on it? 

If this is the case, social media is again largely to blame. It’s another example of our current toxic mix of dog whistles, cancel culture, virtue signaling and pseudo-reality that is being driven by social media.

Those are two possible things that are happening. But if we add one more, it becomes a perfect storm of perceived awfulness.

In a normal world, we all have different definitions of the ethical signals we’re paying attention to. What you are focused on right now in your balancing of what is right and wrong is probably different from what I’m currently focused on. I may be thinking about gun control while you’re thinking about reducing your carbon footprint.

But now, we’re all thinking about the same thing: surviving a pandemic. And this isn’t just some theoretical mind exercise. This is something that surrounds us, affecting us every single day. When it comes to this topic, our nerves have been rubbed raw and our patience has run out. 

Worst of all, we feel helpless. There seems to be nothing we can do to edge the world toward being a less awful place. Behaviors that in another reality and on another topic would have never crossed our radar now have us enraged. And, when we’re enraged, we do the one thing we can do: We share our rage on social media. Unfortunately, by doing so, we’re not part of the solution. We are just pouring fuel on the fire.

Yes, some people probably are awful. But are they more awful than they were this time last year? I don’t think so. I also can’t believe that the essential moral balance of our society has collectively nosedived in the last several months. 

What I do believe is that we are living in a time where we’re facing new challenges in how we perceive the world. Now, more than ever before, we’re on the lookout for what we believe to be awful. And if we’re looking for it, we’re sure to find it.