Connected Technologies are Leaving Our Seniors Behind

One of my pandemic projects has been editing a video series of oral history interviews we did with local seniors in my community. Last week, I finished the first video in the series. The original plan, pre-pandemic, was to unveil the video as a special event at a local theater, with the participants attending. Obviously, given our current reality, we had to change our plans.

We, like the rest of the world, moved our event online. As I started working through the logistics of this, I quickly realized something: Our seniors are on the other side of a wide and rapidly growing chasm. Yes, our society is digitally connected in ways we never were before, but those connections are not designed for the elderly. In fact, if you were looking for something that seems deliberately designed to disadvantage a segment of our population, it would be hard to find a better example than internet connectivity and the elderly.

I have to admit, for much of the past year, I have been pretty focused on what I have sacrificed because of the pandemic. But I am still a pretty connected person. I can Zoom and have a virtual visit with my friends. If I wonder how my daughters are doing, I can instantly text them. If I miss their faces, I can FaceTime them. 

I have taken on the projects I’ve been able to do thanks to the privilege of being wired into the virtual world. I can even go on a virtual bike ride with my friends through the streets of London, courtesy of Zwift.

Yes, I have given up things, but I have also been able to find digital substitutes for many of those things. I’m not going to say it’s been perfect, but it’s certainly been passable.

My stepdad, who is turning 86, has been able to do none of those things. He is in a long-term care home in Alberta, Canada. His only daily social connections consist of brief interactions with staff during mealtime and when they check his blood sugar levels and give him his medication. All the activities that used to give him a chance to socialize are gone. Imagine life for him, where his sum total of connection is probably less than 30 minutes a day. And, on most days, none of that connecting is done with the people he loves.

Up until last week, family couldn’t even visit him. He was locked down due to an outbreak at his home. For my dad, there were no virtual substitutes available. He is not wired in any way for digital connection. If anyone has paid the social price of this pandemic, it’s been my dad and people like the seniors I interviewed, for whom I was desperately trying to find a way just to watch a 13-minute video that they had starred in.

A recent study by mobile technology manufacturer Ericsson looked specifically at the relationship between technology and seniors during the pandemic. The study focused on what the company termed the “young-old” seniors, those aged 65-74. They didn’t deal with “middle-old” (aged 75-85) or “oldest-old” (86 plus) because — well, probably because Ericsson couldn’t find enough who were connected to act as a representative sample.

But they did find that even the “young old” were falling behind in their ability to stay connected thanks to COVID-19. These are people who have owned smartphones for at least a decade, many of whom had to use computers and technology in their jobs. Up until a year ago, they were closing the technology gap with younger generations. Then, last March, they started to fall behind.

They were still using the internet, but younger people were using it even more. And, as they got older, they were finding it increasingly daunting to adopt new platforms and technology. They didn’t have the same access to “family tech support” of children or grandchildren to help get them over the learning curve. They were sticking to the things they knew how to do as the rest of the world surged forward and started living their lives in a digital landscape.

But this was not the group that was part of my video project. My experience had been with the “middle old” and “oldest old.” Half fell into the “middle old” group and half fell into the “oldest old” group. Of the eight seniors I was dealing with, only two had email addresses. If the “young old” are being left behind by technology, these people were never in the race to begin with. As the world was forced to reset to an online reality, these people were never given the option. They were stranded in a world suddenly disconnected from everything they knew and loved.

Predictably, the Ericsson study proposes smartphones as the solution for many of the problems of the pandemic, giving seniors more connection, more confidence and more capabilities. If only they got connected, the study says, life would be better.

But that’s not a solution with legs. It won’t go the distance. And to understand why, we just have to look at the two age cohorts the study didn’t focus on, the “middle old” and the “oldest old.”

Perhaps the hardest hit have been the “oldest old,” who have sacrificed both physical and digital connection, as this Journals of Gerontology article notes. Four from my group lived in long-term care facilities. Many of these were locked down at some point due to local outbreaks within the facility. Suddenly, the family support they required to connect with their family and friends was no longer available. The technological tools that we take for granted — which we were able to slot in to take the place of things we were losing — were unimaginable to them. They were effectively sentenced to solitary confinement.

A recent study from Germany found that only 3% of those living in long-term care facilities used an internet-connected device. A lot of the time, cognitive declines, even when they’re mild, can make trying to use technology an exercise in frustration.

When my dad went into his long-term care home, my sister and I gave him one of our old phones so he could stay connected. We set everything up and did receive a few experimental texts from him. But soon, it just became too confusing and frustrating for him to use without our constant help. He played solitaire on it for a while, then it ended up in a drawer somewhere. We didn’t push the issue. It just wasn’t the right fit.

But it’s not just my dad who struggled with technology. Even if an aging population starts out as reasonably proficient users, it can be overwhelming to keep up with new hardware, new operating systems and new security requirements. I’m not even “young old” yet, and I’ve worked with technology all my life. I owned a digital marketing company, for heaven’s sake. And even for me, it sometimes seems like a full-time job staying on top of the constant stream of updates and new things to learn and troubleshoot. As connected technology leaps forward, it does not seem unduly concerned that it’s leaving the most vulnerable segment of our population behind.

COVID-19 has pushed us into a virtual world where connection is not just a luxury, but a condition of survival. We need to connect to live. That is especially true for our seniors, who have had all the connections they relied on taken from them. We can’t leave them behind. Connected technology can no longer ignore them.

This is one gap we need to build a bridge over.

Our Disappearing Attention Spans

Last week, MediaPost Editor in Chief Joe Mandese mused about our declining attention spans. He wrote,

“while in the past, the most common addictive analogy might have been opiates — as in an insatiable desire to want more — these days [consumers] seem more like speed freaks looking for the next fix.”

Mandese cited a couple of recent studies, saying that more than half of mobile users tend to abandon any website that takes longer than three seconds to load. That

“has huge implications for the entire media ecosystem — even TV and video — because consumers increasingly are accessing all forms of content and commerce via their mobile devices.”

The question that begs to be asked here is, “Is a short attention span a bad thing?” The famous comparison is that we are now more easily distracted than a goldfish. But does a shorter attention span negatively impact us, or is it just our brain changing to be a better fit with our environment?

Academics have been debating the impact of technology on our ability to cognitively process things for some time. Journalist Nicholas Carr sounded the warning in his 2010 book, “The Shallows,” where he wrote, 

“(Our brains are) very malleable, they adapt at the cellular level to whatever we happen to be doing. And so the more time we spend surfing, and skimming, and scanning … the more adept we become at that mode of thinking.”

Certainly, Carr is right about the plasticity of our brains. It’s one of their most advantageous features. But is our digital environment forever pushing our brains to the shallow end of the pool? Well, it depends. Context is important. One of the biggest factors in determining how we process the information we’re seeing is the device where we’re seeing it.

Back in 2010, Microsoft did a large-scale ethnographic study on how people searched for information on different devices. The researchers found those behaviors differed greatly depending on the platform being used and the intent of the searcher. They found three main categories of search behaviors:

  • Missions are searches for one specific answer (for example, an address or phone number) and often happen on a mobile device.
  • Excavations are widespread searches that need to combine different types of information (for example, researching an upcoming trip or major purchase). They are usually launched on a desktop.
  • Explorations are searches for novelty, often to pass the time. These can happen on all types of devices and can often progress through different devices as the exploration evolves. The initial search may be launched on a mobile device, but as the user gets deeper into the exploration, she may switch to a desktop.

The important thing about this research was that it showed our information-seeking behaviors are very tied to intent, which in turn determines the device used. So, at a surface level, we shouldn’t be too quick to extrapolate behaviors seen on mobile devices with certain intents to other platforms or other intents. We’re very good at matching a search strategy to the strengths and weaknesses of the device we’re using.

But at a deeper level, if Carr is right (and I believe he is) about our constant split-second scanning of information to find items of interest making permanent changes in our brains, what are the implications of this?

For such a fundamentally important question, there is only a small but rapidly growing body of academic research that has tried to answer it. To add to the murkiness, many of those studies contradict each other. The best summary I could find of academia’s quest to determine if “the Internet is making us stupid” was a 2015 article in the academic journal The Neuroscientist.

The authors sum up by essentially saying both “yes” — and “no.” We are getting better at quickly filtering through reams of information. We are spending fewer cognitive resources memorizing things we know we can easily find online, which theoretically leaves those resources free for other purposes. Finally, for this post, I will steer away from commenting on multitasking, because the academic jury is still very much out on that one.

But the authors also say that 

“we are shifting towards a shallow mode of learning characterized by quick scanning, reduced contemplation and memory consolidation.”

The fact is, we are spending more and more of our time scanning and clicking. There are inherent benefits to us in learning how to do that faster and more efficiently. The human brain is built to adapt and become better at the things we do all the time. But there is a price to be paid. The brain will also become less capable of doing the things we don’t do as much anymore. As the authors said, this includes actually taking the time to think.

So, in answer to the question “Is the Internet making us stupid?,” I would say no. We are just becoming smart in a different way.

But I would also say the Internet is making us less thoughtful. And that brings up a rather worrying prospect.

As I’ve said many times before, the brain thinks both fast and slow. The fast loop is brutally efficient. It is built to get stuff done in a split second, without having to think about it. Because of this, the fast loop has to be driven by what we already know or think we know. Our “fast” behaviors are necessarily bounded by the beliefs we already hold. It’s this fast loop that’s in control when we’re scanning and clicking our way through our digital environments.

But it’s the slow loop that allows us to extend our thoughts beyond our beliefs. This is where we’ll find our “open minds,” if we have such a thing. Here, we can challenge our beliefs and, if presented with enough evidence to the contrary, willingly break them down and rebuild them to update our understanding of the world. In the sense-making loop, this is called reframing.

The more time we spend “thinking fast” at the expense of “thinking slow,” the more we will become prisoners to our existing beliefs. We will be less able to consolidate and consider information that lies beyond those boundaries. We will spend more time “parsing” and less time “pondering.” As we do so, our brains will shift and change accordingly.

Ironically, our minds will change in such a way to make it exceedingly difficult to change our minds.

The Ebbs and Flows of Consumerism in a Post-Pandemic World

As MediaPost’s Joe Mandese reported last Friday, advertising was almost decimated worldwide in 2020 — in the literal sense of the word. If you look at the forecasts of the top agency holding companies, ad spends were trimmed by an average of 6.1%. It’s not quite one dollar in 10, but it’s close.

These same companies are forecasting a relative bounceback in 2021, starting slow and accelerating quarter by quarter through the year — but that still leaves the 2021 spend forecast back at 2018 levels.

And as we know, everything about 2021 is still very much in flux. If the year 2021 was a pack of cards, almost every one of them would be wild.

This — according to physician, epidemiologist and sociologist Nicholas Christakis — is not surprising.

Christakis is one of my favorite observers of network effects in society. His background in epidemiological science gives him a unique lens to look at how things spread through the networks of our world, real and virtual. It also makes him the perfect person to comment on what we might expect as we stagger out of our current crisis.

In his latest book, “Apollo’s Arrow,” he looks back to look forward to what we might expect — because, as he points out, we’ve been here before.

While the scope and impact of this one is unusual, such health crises are nothing new. Dozens of epidemics and a few pandemics have happened in my lifetime alone, according to this Wikipedia chart.

This post goes live on Groundhog Day, perhaps the most appropriate of all days for it to run. Today, however, we already know what the outcome will be. The groundhog will see its shadow and there will be six more months (at least) of pandemic to deal with. And we will spend that time living and reliving the same day in the same way with the same routine.

Christakis expects this phase to last through the rest of this year, until the vaccines are widely distributed and we start to reach herd immunity.

During this time, we will still have to psychologically “hunker down” like the aforementioned groundhog, something we have been struggling with. “As a society we have been very immature,” said Christakis. “Immature, and typical as well, we could have done better.”

This phase will be marked by a general conservatism that will go in lockstep with fear and anxiety, a reluctance to spend and a trend toward risk aversion and religion.

Add to this the fact that we will still be dealing with widespread denialism and anger, which will lead to a worsening vicious circle of loss and crisis. The ideological cracks in our society have gone from annoying to deadly.

Advertising will have to somehow negotiate these choppy waters of increased rage and reduced consumerism.

Then, predicts Christakis, starting some time in 2022, we will enter an adjustment period where we will test and rethink the fundamental aspects of our lives. We will be learning to live with COVID-19, which will be less lethal but still very much present.

We will likely still wear masks and practice social distancing. Many of us will continue to work from home. Local flare-ups will still necessitate intermittent school and business closures. We will be reluctant to be inside with more than 20 or 30 people at a time. It’s unlikely that most of us will feel comfortable getting on a plane or embarking on a cruise ship. This period, according to Christakis, will last for a couple years.

Again, advertising will have to try to thread this psychological needle between fear and hope. It will be a fractured landscape on which to build a marketing strategy. Any pretense of marketing to the masses, a concept long in decline, will now be truly gone. The market will be rife with confusing signals and mixed motivations. It will be incumbent on advertisers to become very, very good at “reading the room.”

Then, starting in 2024, we will finally put the pandemic behind us. Now, says Christakis, four years of pent-up demand will suddenly burst through the dam of our delayed self-gratification. We will likely follow the same path taken a century ago, when we were coming out of a war and another pandemic, in the period we call the “Roaring Twenties.”

Christakis explained: “What typically happens is people get less religious. They will relentlessly seek out social interactions in nightclubs and restaurants and sporting events and political rallies. There’ll be some sexual licentiousness. People will start spending their money after having saved it. There’ll be joie de vivre and a kind of risk-taking, a kind of efflorescence of the arts, I think.”

Of course, this burst of buying will be built on the foundation of what came before. The world will likely be very different from its pre-pandemic version. It will be hard for marketers to project demand in a straight line from what they know, because the experiences they’ve been using as their baseline are no longer valid. Some things may remain the same, but some will be changed forever.

COVID-19 will have pried many of the gaps in our society further apart — most notably those of income inequality and ideological difference. A lingering sense of nationalism and protectionism born from dealing with a global emergency could still be in place.

Advertising has always played an interesting role in our lives. It both motivates and mirrors us.

But the reflection it shows is like a funhouse mirror: It distorts some aspects of our culture and ignores others. It creates demand and hides inconvenient truths. It professes to be noble, while it stokes the embers of our ignobility. It amplifies the duality of our human nature.

Interesting times lie ahead. It remains to be seen how that is reflected in the advertising we create and consume.

The Academics of Bullsh*t

“One of the most salient features of our culture is that there is so much bullshit. Everyone knows this. Each of us contributes his share. But we tend to take the situation for granted.”

— from “On Bullshit,” an essay by philosopher Harry Frankfurt.

Would it surprise you to know that I have found not one, but two academic studies on organizational bullshit? And I mean that non-euphemistically. The word “bullshit” is actually in the title of both studies. I B.S. you not.

In fact, organizational bullshit has become a legitimate field of study. Academics are being paid to dig into it — so to speak. There are likely bullshit grants, bullshit labs, bullshit theories, bullshit paradigms and bullshit courses. There are definitely bullshit professors. There is even an OBPS — the Organizational Bullshit Perception Scale — a way to academically measure bullshit in a company.

Many years ago, when I was in the twilight of my time with the search agency I had founded, I had had enough of the bullshit I was being buried under, shoveled there by the company that had acquired us. I was drowning in it. So I vented right here, on MediaPost. I dared you to imagine what it would be like to actually do business without bullshit getting in the way.

My words fell on deaf ears. Bullshit has proliferated since that time. It has been enshrined up and down our social, business and governmental hierarchies, becoming part of our “new” organizational normal. It has picked up new labels, like “fake news” and “alternative facts.” It has proven more dangerous than I could have ever imagined. And it has become this dangerous because we are ignoring it, which legitimizes it.

Harry Frankfurt defined the concept and set it apart from lying. Liars know the truth and are trying to hide it. Bullshitters don’t care if what they say is true or false. They only care if their listener is persuaded. That’s as good a working definition of the last four years as any I’ve heard.

But at least one study indicates bullshit may have a social modality — acceptable in some contexts, but corrosive in others. Marketing, for example, is highlighted by the authors as an industry built on a foundation of bullshit:

“advertising and public relations agencies and consultants are likely to be ‘full of It,’ and in some cases even make the production of bullshit an important pillar of their business.”

In these studies, researchers speculate that bullshit might actually serve a purpose in organizations. It may allow for strategic motivation before there is an actual strategy in place. This brand of bullshit is otherwise known as “blue-sky thinking” or “out-of-the-box thinking.”

But if this is true, there is a very narrow window indeed where this type of bullshit could be considered beneficial. The minute there are facts to deal with, they should be dealt with. But the problem is that the facts never quite measure up to the vision of the bullshit. Once you open the door to allowing bullshit, it becomes self-perpetuating.

I grew up in the country. I know how hard it is to get rid of bullshit.

The previous example is what I would call strategic bullshit — a way to “grease the wheels” and get the corporate machine moving. But it often leads directly to operational bullshit — which is toxic to an organization, serving to “gum up the gears” and prevent anything real and meaningful from happening. This was the type of bullshit that was burying me back in 2013 when I wrote that first column. It’s also the type of bullshit that is paralyzing us today.

According to the academic research into bullshit, when we’re faced with it, we have four ways to respond: exit, voice, loyalty or neglect. Exit means we try to escape from the bullshit. Voice means we stand up to the bullshit and confront it. Loyalty means we wallow in it, spreading it wider and thicker. And neglect means we just ignore it. I’m guessing you’ve already found yourself in one of those four categories.

Here’s the thing. As marketers and communicators, we have to face the cold, ugly truth of our ongoing relationship with bullshit. We all have to deal with it. It’s the nature of our industry.

But how do we deal with it? Most times, in most situations, it’s just easier to escape or ignore it. Sometimes it may serve our purpose to jump on the bullshit bandwagon and spread it. But given the overwhelming evidence of where bullshit has led us in the recent past, we all should be finding our voice to call bullshit on bullshit.

Missing the Mundane

I realize something: I miss the mundane.

Somewhere along the line, mundanity got a bad rap. It became a synonym for boring. But it actually means worldly. It refers to the things you experience when you’re out in the world.

And I miss that — a lot.

There is a lot of stuff that happens when we’re living our lives that we don’t give enough credit to: Petting a dog being taken for a walk. A little flirting with another human we find attractive. Doing some people-watching while we eat our bagel in a mall’s food court. Random situational humor that plays itself out on the sidewalk in front of us. Discovering that the person cutting your hair is also a Monty Python fan. Snippets of conversation — either ones we’re participating in, or ones we overhear while we wait for the bus. Running into an old acquaintance. Even being able to smile at a stranger and have them smile back at you.

The mundane is built of all those hundreds of little, inconsequential social exchanges that happen daily in a normal world that we ordinarily wouldn’t give a second thought to.

And sometimes, serendipitously, we luck upon the holy grail of mundanity — that random “thing” that makes our day.

These are the things we live for. And now, almost all of these things have been stripped from our lives.

I didn’t realize I missed them because I never assigned any importance to them. If I did a signal-to-noise ratio analysis of my life, all these things would fall in the latter category. Most of the time, I wasn’t even fully aware that they were occurring. But I now realize when you add them all up, they’re actually a big part of what I’m missing the most. And I’ve realized that because I’ve been forced to subtract them — one by one — from my life.

I have found that the mundane isn’t boring. It’s the opposite — the seasoning that adds a little flavor to my day-to-day existence.

For the past 10 months, I thought the problem was that I was missing the big things: travel, visiting loved ones, big social gatherings. And I do miss those things. But those things are the tentpoles – the infrequent, yet consequential things that we tend to hang our happiness on. We failed to realize that in between those tentpoles, there is also the fabric of everyday life, which has been eliminated as well.

It’s not just that we don’t have them. It’s also that we’ve tried to substitute other things for them. And those other things may be making it worse. Things like social media and way too much time spent looking at the news. Bingeing on Netflix. Forcing ourselves into awkward online Zoom encounters just because it seems like the thing to do. A suddenly developed desire to learn Portuguese, or how to bake sourdough bread.

It’s not that all these things are bad. It’s just that they’re different from what we used to consider normal — and doing them reinforces the gap that lies between then and now. They add to that gnawing discontent we have with our new forced coping mechanisms.

The mundane has always leavened our lives. But now, we’ve swapped the living of our lives for being entertained — and whether it’s the news or the new show we’re bingeing, entertainment has to be overplayed. It is nothing but peaks and valleys, with no middle ground. When we actually do the living, rather than the watching, we spend the vast majority of our time in that middle ground — the mundane, which is our emotional reprieve.

I’ve also noticed my social muscles have atrophied over the past several months due to lack of exercise. It’s been ages since I’ve had to make small talk. My encounters now — as infrequent as they are — all seem awkward. Either I’m overeager, like a puppy that’s been left alone in a house all day, or I’m just not in any mood to palaver.

Finally, it’s these everyday mundane encounters that used to give me anecdotal evidence that not all people were awful. Every day I used to see examples of small kindnesses, unexpected generosity and just plain common courtesy. Yes, there were also counterpoints to all of these, but it almost always netted out to the good. It used to reaffirm my faith in people on a daily basis.

With that source of reaffirmation gone, I have to rely on the news and social media. And — given what those two things are — I know I will only see the extremes of human nature. It’s my “angel and asshole” theory: that we all lie on a bell curve somewhere between the two, and our current situation will push us from the center closer to those two extremes. You also know that the news and social media are going to be biased towards the “asshole” end of the spectrum.

There’s a lot to be said for the mundane — and I have. So I’ll just wrap up with my hope that my life — and yours — will become a little more mundane in the not-too-distant future.

The Timeline of Factfulness

After last Wednesday, when it seemed that our reality was splitting at the seams, I was surprised to see that financial markets seemed to ignore what was happening in Washington. The market racked up a 1% gain. I later learned that financial markets have a history of being rather oblivious to social upheaval.

Similarly, a newsletter I subscribe to about recent academic research was packed with new discoveries. Not one of the 35 links in that day’s edition pointed to anything remotely relevant to what was happening at that time in Washington, D.C. (or various other state capitals in the country). That was less surprising to me than the collective shrugging off of events by financial markets, but it still made an interesting contrast clear to me.

These two corners of the world are not tied to the happenings of today. Markets look forward and lay economic bets on what will be. And apparently they had bet that the events of January 6 wouldn’t have any lasting impact.

Scientific journals look backward and report on what has already happened in the world of academic research. Neither is very focused on today.

But there is another reason why these two corners of the world seemed unfazed by the news headlines of January 6th. Science and the markets are two examples of things driven by facts and data. Yes, emotion certainly plays a part. Investors have long known that irrational exuberance or fear can drive artificial bubbles or crashes. And the choice of research paths to take is a human one, which means it’s inevitably driven by emotions.

But both these ecosystems try to systematically reduce the role of emotion as much as possible by relying on facts and data. And because facts and data do not reveal their stories immediately but rather over time in the form of trends, they have to take a longer view of the world.

Therefore, these things operate on different timelines from the news. Financial markets use what’s happening now – today – as just one of many inputs into a calculated bet that will be weeks or months in the future.

Science takes a longer view, using the challenges of today to set a research agenda that may be years away from realizing its payoff. Both finance and science use what’s happening right now as one input to determine what will be in the future, but neither focuses exclusively on today.

In contrast, that’s exactly what the news has to do. And it hyperbolizes the now, stripping the extraordinary from the ordinary, separating it, picking it out, and concentrating it for our consumption.

The fact is, both markets and science have to operate by Factfulness, to use the term coined by the late Hans Rosling, the Swedish physician and well-known TED speaker. To run like the rest of the world, over-focused on the amplified volatility of the here and now that fills our news feeds, would be to render them dysfunctional. They couldn’t operate. They would be in a constant state of anxiety.

Increasingly, the engines that drive our world – such as science and financial markets – have to decouple themselves from the froth and frenzy of the immediate. They do so because the rest of the world is following a very different path – one where hyper-emotionality and polarized news outlets whip us back and forth like a rag doll caught by a Doberman.

This decoupling has accelerated thanks to the role of technology in compressing the timelines of our worldviews. We are instantly alerted to what’s happening now and are then ushered into a highly biased bubble from which we look at the world. Our worldview is not only formed by emotion, it is deliberately designed to manipulate those emotions.

Emotions are our instant response to the world. They run fully hot or cold, with nary a nuance of reason to modulate them. Also, because emotions are our natural early warning systems, we tend to be hyper-aware of them and are immediately drawn to anything that promises to push our emotional buttons. As such, they are a notoriously inaccurate lens through which to look at reality. That is why efforts are made to minimize their impact in the worlds of science and finance.

We should hold other critical systems to the same standards. Take government, for instance. Now, more than ever, we need those who govern us to be clear-eyed and dealing in facts. Unfortunately, as we saw last week, they’re running as fast as they can in the opposite direction.

Happy New Year?

“Speaking of the happy new year, I wonder if any year ever had less chance of being happy. It’s as though the whole race were indulging in a kind of species introversion — as though we looked inward on our neuroses. And the thing we see isn’t very pretty… So we go into this happy new year, knowing that our species has learned nothing, can, as a race, learn nothing — that the experience of ten thousand years has made no impression on the instincts of the million years that preceded.”

That sentiment, relevant as it is to today, was not written about 2021. It was actually written 80 years ago — in 1941 — by none other than John Steinbeck.

John was feeling a little down. I’m sure we can all relate.

It’s pretty easy to say that we have hopefully put the worst year ever behind us. I don’t know about your news feed, but mine has been like a never-ending bus tour of Dante’s nine circles of hell — and I’m sitting next to the life insurance salesman from Des Moines who decided to have a Caesar salad for lunch.

An online essay by Umair Haque kind of summed up 2020 for me: “The Year of the Idiot.” In it, Haque doesn’t pull any punches:

“It was the year that a pandemic searched the ocean of human stupidity, and found, to its gleeful delight, that it appeared to be bottomless. 2020 was the year that idiots wrecked our societies.”

In case you’re not catching the drift yet, Haque goes on to say, “The average person is a massive, gigantic, malicious, selfish idiot.”

Yeah. That pretty much covers it.

Or does it? Were our societies wrecked? Is the average person truly that shitty? Is the world a vast, soul sucking, rotten-cabbage-reeking dumpster fire? Or is it just the lens we’re looking through?

If you search hard enough, you can find those who are looking through a different lens — one that happens to be backed by statistical evidence rather than what bubbles to the top of our newsfeed. One of those people is Ola Rosling. He’s carrying on the mission of his late father, Hans Rosling, who was working on the book “Factfulness” when he passed away in 2017. Bill Gates called it “one of the most educational books I’ve ever read.” And Bill reads a lot of books!

Believe it or not, if you remove a global pandemic from the equation (which, admittedly, is a whole new scale of awful) the world may actually be in better shape than it was 12 months ago. And even if you throw the pandemic into the mix, there are some glimmers of silver peeking through the clouds.

Here are some things you may have missed in your news feed:

Wild polio was eradicated from Africa. That’s big news. It’s a massive achievement that had its to-do box ticked last August. And I’m betting you never heard about it.

Also, the medical and scientific world has never before mobilized and worked together on a project like the new COVID mRNA vaccines now rolling out. Again, this is a huge step forward that will have far reaching impacts on healthcare in the future. But that’s not what the news is talking about.

Here’s another thing. At long last, it looks like the world may finally be ready to start tearing apart the layers that hide systemic racism. What we’re learning is that it may not be the idiots  — and, granted, there are many, many idiots — who are the biggest problem. It may be people like me, who have unknowingly perpetuated the system and are finally beginning to see the endemic bias baked into our culture.

These are just three big steps forward that happened in 2020. There are others. We just aren’t talking about them.

We always look on the dark side. We’re a “glass half-empty” species. That’s what Rosling’s book is about: our tendency to skip over the facts to rush to the worst possible view of things. We need no help in that regard — but we get it anyway from the news business, which, run by humans and aimed at humans, amplifies our proclivity for pessimism.

I’m as glad as anyone to see 2020 in my rear-view mirror. But I am carrying something of that year forward with me: a resolution to spend more time looking for facts and relying less on media “spun” for profit to understand the state of the world.

As we consume media, we have to remember that good news is just not as profitable as bad news. We need to broaden our view to find the facts. Hans Rosling warned us, “Forming your worldview by relying on the media would be like forming your view about me by looking only at a picture of my foot.”

Yes, 2020 was bad, but it was also good. And because there are forces that swing the pendulum both ways, many of the things that were good may not have happened without the bad. In the same letter in which Steinbeck expressed his pessimism about 1941, he went on to say this:

“Not that I have lost any hope. All the goodness and the heroisms will rise up again, then be cut down again and rise up. It isn’t that the evil thing wins — it never will — but that it doesn’t die. I don’t know why we should expect it to. It seems fairly obvious that two sides of a mirror are required before one has a mirror, that two forces are necessary in man before he is man.”

There are two sides to every story, even when it’s a horror story like 2020.

Facebook Vs. Apple Vs. Your Privacy

As I was writing last week’s words about Mark Zuckerberg’s hubris-driven view of world domination, little did I know that the next chapter was literally being written. The very next day, a full-page ad from Facebook ran in The New York Times, The Washington Post and The Wall Street Journal attacking Apple for building privacy protection prompts into iOS 14.

It will come as a surprise to no one that I line up firmly on the side of Apple in this cat fight. I have always said we need to retain control over our personal data, choosing what’s shared and when. I also believe we need to have more control over the nature of the data being shared. iOS 14 is taking some much-needed steps in that direction.

Facebook is taking a stand that sadly underlines everything I wrote just last week — a disingenuous stand for a free-market environment — by unfurling the “Save small business” banner. Zuckerberg loves to stand up for “free” things — be it speech or markets — when it serves his purpose.

And the hidden agenda here is not really hidden at all. It’s not the small business around the corner Mark is worried about. It’s the 800-billion-dollar business in which he owns 60% of the voting shares.

The headline of the ad reads, “We’re standing up to Apple for small businesses everywhere.”

Ummm — yeah, right.

What you’re standing up for, Mark, is your revenue model, which depends on Facebook’s being free to hoover up as much personal data on you as possible, across as many platforms as possible.

The only thing you care about when it comes to small businesses is that they spend as much with Facebook as possible. What you’re trying to defend is not “free” markets or “free” speech. What you’re defending is about the furthest thing imaginable from “free”: more than $70 billion in revenues and $18.5 billion in profits. What you’re trying to protect is your number-five slot on the Forbes list of the richest people in the world, with your net worth of $100 billion.

Then, on the very next day, Facebook added insult to injury with a second ad, this time defending the “Free Internet,”  saying Apple “will change the internet as we know it” by forcing websites and blogs “to start charging you subscription fees.”

Good. The “internet as we know it” is a crap sandwich. “Free” has led us to exactly where we are now, with democracy hanging on by a thread, with true journalism in the last paroxysms of its battle for survival, and with anyone with half a brain feeling like they’re swimming in a sea of stupidity.

Bravo to Apple for pushing back against the toxicity of “free” — the enthralled reverence for “free” things that props up a rapidly disintegrating information marketplace. If we accept a free model for our access to information, we must also accept advertising that will become increasingly intrusive, with even less regard for our personal privacy. We must accept all the things that come with “free”: the things that have proven to be so detrimental to our ability to function as a caring and compassionate democratic society over the past decade.

In doing the research for this column, I ran into an op-ed piece that ran last year in The New York Times. In it, Facebook co-founder Chris Hughes lays out the case for antitrust regulators dismantling Facebook’s dominance in social media.

This is a guy who was one of Zuckerberg’s best friends in college, who shared in the thrill of starting Facebook, and whose name is on the patent for Facebook’s News Feed algorithm. It’s a major move when a guy like that, knowing what he knows, says, “The most problematic aspect of Facebook’s power is Mark’s unilateral control over speech. There is no precedent for his ability to monitor, organize and even censor the conversations of two billion people.”

Hughes admits that the drive to break up Facebook won’t be easy. In the end, it may not even be successful. But it has to be attempted.

Too much power sits in Zuckerberg’s hands. An attempt has to be made to break down the walls behind which our private data is being manipulated. We cannot trust Facebook — or Mark Zuckerberg — to do the right thing with the data. It would be so much easier if we could, but it has been proven again and again and again that our trust is misplaced.

The very fact that those calling the shots at Facebook believe you’ll fall for yet another public appeal — wrapped in altruistic bullshit about protecting “free” that’s as substantial as Saran Wrap — should be taken as an insult. It should make you mad as hell.

And it should put Apple’s stand to protect your privacy in the right perspective: a long overdue attempt to stop the runaway train that is social media.

Looking At The World Through Zuckerberg-Colored Glasses

Mark Zuckerberg has managed to do something almost no one else has been able to do. He has actually been able to find one small patch of common ground between the far right and the far left in American politics. It seems everybody hates Facebook, even if it’s for different reasons.

The right hates the fact that they’re not given free rein to say whatever they want without Facebook tagging their posts as misinformation. The left worries about the erosion of privacy. And antitrust legislators feel Facebook is just too powerful and dominant in the social media market. Mark Zuckerberg has few friends in Washington — on either side of the aisle.

The common denominator here is control. Facebook has too much of it, and no one likes that. The question on the top of my mind is, “What is Facebook intending to do with that control?” Why is dominance an important part of Zuckerberg’s master plan?

Further, just what is that master plan?  Almost four years ago, in the early days of 2017, Zuckerberg issued a 6,000-word manifesto. In it, he addressed what he called “the most important question of all.” That question was, “Are we building the world we all want?”

According to the manifesto, the plan for Facebook includes “spreading prosperity and freedom, promoting peace and understanding, lifting people out of poverty, and accelerating science.”

Then, two years later, Zuckerberg issued another lengthy memo about his vision regarding privacy and the future of communication, which “will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure.” He explained that Facebook and Instagram are like a town square, a public place for communication. But WhatsApp and Messenger are like your living room, where you can have private conversations without worrying about who else might see them.

So, how is all that wonderfulness going, anyway?

Well, first of all, there’s what Mark says, and what Facebook actually does. When he’s not firing off biennial manifestos promising a cotton-candy-colored world, he’s busy assembling all the pieces required to suck up as much data on you as possible, and fighting lawsuits when he gets caught doing something he shouldn’t be.

You have to understand that for Zuckerberg, all these plans are built on a common foundation: Everything happens on a platform that Facebook owns. And those platforms are paid for by advertising. And advertising needs data. And therein lies the problem: What the hell is Facebook doing with all this data?

I’m pretty sure it’s not spreading prosperity and freedom or promoting peace and understanding. Quite the opposite. If you look at Facebook’s fingerprints that are all over the sociological dumpster fire that has been the past four years, you could call them the Keyser Söze of shit disturbing.

And it’s only going to get worse. Facebook and other participants in the attention economy are betting heavily on facial recognition technology. This effectively eliminates our last shred of supposed anonymity online. It forever links our digital dust trail with our real-world activities. And it dumps even more information about you into the voracious algorithms of Facebook, Google and other data devourers. Again, what might be the plans for this data: putting in place the pieces of a more utopian world, or meeting next quarter’s revenue projections?

Here’s the thing. I don’t think Zuckerberg is being wilfully dishonest when he writes these manifestos. I think — at the time — he actually believes them. And he probably legitimately thinks that Facebook is the best way to accomplish them. Zuckerberg always believes he’s the smartest one in the room. And he — like Steve Jobs — has a reality distortion field that’s always on. In that distorted reality, he believes Facebook — a company that is entirely dependent on advertising for survival — can be trusted with all our data. If we just trust him, everything will be okay.

The past four years have proven over and over again that that’s not true. It’s not even possible. No matter how good the intentions you go in with, the revenue model that fuels Facebook will subvert those intentions and turn them into something corrosive.

I think David Fincher summed up the problem nicely in his movie “The Social Network.” There, screenwriter Aaron Sorkin hit the nail on the head when he wrote the scene in which Zuckerberg’s lawyer says to him, “You’re not an asshole, Mark. You’re just trying so hard to be.”

Facebook represents a lethal mixture that has all the classic warning signs of an abusive relationship:

  • A corporation that can survive only when its advertisers are happy.
  • Advertisers that are demanding more and more data they can use to target prospects.
  • A bro-culture where Facebook folks think they’re smarter than everyone else and believe they can actually thread the needle between being fabulously successful as an advertising platform and not being complete assholes.
  • And an audience of users who are misplacing their trust by buying into the occasional manifesto, while ignoring the red flags that are popping up every day.

Given all these factors, the question becomes: Will splitting up Facebook be a good or bad thing? It’s a question that will become very pertinent in the year to come. I’d love to hear your thoughts.

Second Thoughts about the Social Dilemma

Watched “The Social Dilemma” yet? I did, a few months ago. The Netflix documentary sets off all kinds of alarms about social media and how it’s twisting the very fabric of our society. It’s a mix of standard documentary fodder — a lot of tell-all interviews with industry insiders and activists — with an (ill-advised, in my opinion) dramatization of the effects of social media addiction in one particular family.

The one most affected is a male teenager who is suddenly drawn, zombie-like, by his social media feed into an ultra-polarized political activist group. Behind the scenes, operating in a sort of evil-empire control room setting, there are literally puppet masters pulling his strings.

It’s scary as hell. But should we be scared? Or — at least — should we be that scared?

Many of us are sounding alarms about social media and how it nets out to be a bad thing. I’m one of the worst. I am very concerned about the impact of social media, and I’ve said so many, many times in this column. But I also admit that this is a social experiment playing out in real time, so it’s hard to predict what the outcome will be. We should keep our minds open to new evidence.

I’ve also said that younger generations seem to be handling this in stride. At least, they seem to be handling it better than those in my generation. They’re quicker to adapt and to use new technologies natively to function in their environments, rather than fumble as we do, searching for some corollary to the world we grew up in.

I’ve certainly had pushback on this observation. Maybe I’m wrong. Or maybe, like so many seemingly disastrous new technological trends before it, social media may turn out to be neither bad nor good. It may just be different.

That certainly seems to be the case if you read a new study from the Institute for Family Studies and Brigham Young University’s Wheatley Institution.

One of the lead authors of the study, Jean Twenge, previously rang the alarm bells about how technology was short-circuiting the mental wiring of our youth. In a 2017 article in The Atlantic titled “Have Smartphones Destroyed a Generation?” she made this claim:

“It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.”

The article describes a generation of zombies mentally hardwired to social media through their addiction to their iPhones. One of the more startling claims was this:

“Teens who spend three hours a day or more on electronic devices are 35 percent more likely to have a risk factor for suicide, such as making a suicide plan.”

Again, scary as hell, right? This sounds frighteningly similar to the scenarios laid out in “The Social Dilemma.”

But what if you take this same group and this same author, fast-forward three years to the middle of the worst pandemic in our lifetimes, and check in with over 1,500 teens to see how they’re doing in a time where they have every right to be depressed? Not only are they locked inside, they’re also processing societal upheavals and existential threats like systemic racial inequality, alt-right political populism and climate change. If 2017-2018 was scary for them, 2020 is a dumpster fire.

Surprisingly, those same teens appear to be doing better than they were two years ago. The study had four measures of ill-being: loneliness, life dissatisfaction, unhappiness and depression. The results were counterintuitive, to say the least. The number of teens indicating they were depressed dropped substantially, from 27% in 2018 to 17% among those quarantined during the school year in 2020. Fewer said they were lonely as well.

The study suggested the reasons for this could be that teens were getting more sleep and spending more time with family.

But what about smartphones and social media? Wouldn’t a quarantined teen (a Quaran-teen?) be spending even more time on his or her phone and social media? 

Well, yes – and no. The study found screen time didn’t really go up, but the way that time was spent did shift. Surprisingly, time spent on social media went down, while time spent on video chats with friends or watching online streaming entertainment went up.

As I shared in my column a few weeks ago, this again indicates that it’s not how much time we spend on social media that determines our mental state. It’s how we spend that time. If we spend it looking for connection, rather than obsessing over social status, it can be a good thing. 

Another study, from the University of Oxford, examined data on more than 300,000 adolescents and found that increased screen time has no more impact on teenagers’ mental health than eating more potatoes. Or wearing glasses.

If you’re really worried about your teen’s mental health, make sure they have breakfast. Or get enough sleep. Or just spend more time with them. All those things are going to have a lot more impact than the time they spend on their phone.

To be clear, this is not me becoming a fan of Facebook or social media in general. There are still many things to be concerned about. But let’s also realize that technology — any technology — is a tool. It is not inherently good or evil. Those qualities can be found in how we choose to use it.