The Crazy World of Our Media Obsessions

Are you watching the news less? Me too. Now that the grownups are back in charge, I’m spending much less time checking my news feed.

Whatever you might say about the last four years, it certainly was good for the news business. It was one long endless loop of driving past a horrific traffic accident. Try as we might, we just couldn’t avoid looking.

But according to Internet analysis tool Alexa.com, that may be over. I ran some traffic rank reports for major news portals and they all look the same: a ramp-up over the past 90 days to the beginning of February, and then a precipitous drop off a cliff.

While all the top portals have a similar pattern, it’s most obvious on Foxnews.com.

It was as if someone said, “Show’s over folks. There’s nothing to see here. Move along.” And after we all exhaled, we did!

Not surprisingly, we watch the news more when something terrible is happening. It’s an evolved hardwired response called negativity bias.

Good news is nice. But bad news can kill you. So it’s not surprising that bad news tends to catch our attention.

But this was more than that. We were fixated on Trump. If it were just our bias toward bad news, we would still eventually get tired of it.

That’s exactly what happened with the news on COVID-19. We worked through the initial uncertainty and fear, where we were looking for more information, and at some point moved on to the subsequent psychological stages of boredom and anger. As we did that, we threw up our hands and said, “Enough already!”

But when it comes to Donald Trump, there was something else happening.

It’s been said that Trump might have been the best instinctive communicator to ever take up residence in the White House. We might not agree with what he said, but we certainly were listening.

And while we — and by we, I mean me — think we would love to put him behind us, I believe it behooves us to take a peek under the hood of this particular obsession. Because if we fell for it once, we could do it again.

How the F*$k did this guy dominate our every waking, news-consuming moment for the past four years?

We may find a clue in Bob Woodward’s book on Trump, Rage. He explains that he was looking for a “reflector” — a person who knew Trump intimately and could provide some relatively objective insight into his character.

Woodward found a rather unlikely candidate for his reflector: Trump’s son-in-law, Jared Kushner.

I know, I know — “Kushner?” Just bear with me.

In Woodward’s book, Kushner says there were four things you needed to read and “absorb” to understand how Trump’s mind works.

The first was an op-ed piece in The Wall Street Journal by Peggy Noonan called “Over Trump, We’re as Divided as Ever.” It is not complimentary to Trump. But it does begin to provide a possible answer to our ongoing fixation. Noonan explains: “He’s crazy…and it’s kind of working.”

The second was the Cheshire Cat in Alice in Wonderland. Kushner paraphrased: “If you don’t know where you’re going, any path will get you there.” In other words, in Trump’s world, it’s not direction that matters, it’s velocity.

The third was Chris Whipple’s book, The Gatekeepers: How the White House Chiefs of Staff Define Every Presidency. The insight here is that no matter how clueless Trump was about how to do his job, he still felt he knew more than his chiefs of staff.

Finally, the fourth was Win Bigly: Persuasion in a World Where Facts Don’t Matter, by Scott Adams. That’s right — Scott Adams, the same guy who created the “Dilbert” comic strip. Adams calls Trump’s approach “Intentional Wrongness Persuasion.”

Remember, this is coming from Kushner, a guy who says he worships Trump. This is not apologetic. It’s explanatory — a manual on how to communicate in today’s world. Kushner is embracing Trump’s instinctive, scorched-earth approach to keeping our attention focused on him.

It’s — as Peggy Noonan realized — leaning into the “crazy.”  

Trump represented the ultimate political tribal badge. All you needed to do was read one story on Trump, and you knew exactly where you belonged. You knew it in your core, in your bones, without any shred of ambiguity or doubt. There were few things I was as sure of in this world as where I stood on Donald J. Trump.

And maybe that was somehow satisfying to me.

There was something about standing on one side or the other of the divide created by Trump that was tribal in nature.

It was probably the clearest ideological signal about what was good and what was bad that we’ve seen for some time, perhaps since World War II or the ’60s — two eras that predate most of our lifetimes.

Trump’s genius was that he somehow made both halves of the world believe they were the good guys.

In 2018, Peggy Noonan said that “Crazy won’t go the distance.” I’d like to believe that’s so, but I’m not so sure. There are certainly others who are borrowing a page from Trump’s playbook. Right-wing Republicans Marjorie Taylor Greene and Lauren Boebert are both doing “crazy” extraordinarily well. The fact that almost none of you had to Google them to know who they are proves this.

Whether we’re loving to love or loving to hate, we are all fixated on crazy.

The problem here is that our media ecosystem has changed. “Crazy” used to be filtered out. But somewhere along the line, news outlets discovered that “crazy” is great for their bottom lines.

As former CBS Chairman and CEO Leslie Moonves said when Trump became the Republican presidential front-runner back in 2016, “It may not be good for America, but it’s damn good for CBS.”

Crazy draws eyeballs like, well, like crazy. It certainly generates more user views than “normal” or “competent.”

In our current media environment  — densely intertwined with the wild world of social media — we have no crazy filters. All we have now are crazy amplifiers.

And the platforms that allow this all try to crowd on the same shaky piece of moral high ground.

According to them, it’s not their job to filter out crazy. It’s anti-free speech. It’s un-American. We should be smart enough to recognize crazy when we see it.

Hmmm. Well, we know that’s not working.

Connected Technologies Are Leaving Our Seniors Behind

One of my pandemic projects has been editing a video series of oral history interviews we did with local seniors in my community. Last week, I finished the first video in the series. The original plan, pre-pandemic, was to unveil the video as a special event at a local theater, with the participants attending. Obviously, given our current reality, we had to change our plans.

We, like the rest of the world, moved our event online. As I started working through the logistics of this, I quickly realized something: Our seniors are on the other side of a wide and rapidly growing chasm. Yes, our society is digitally connected in ways we never were before, but those connections are not designed for the elderly. In fact, if you were looking for something that seems to be deliberately designed to disadvantage a segment of our population, it would be hard to find a better example than Internet connection and the elderly.

I have to admit, for much of the past year, I have been pretty focused on what I have sacrificed because of the pandemic. But I am still a pretty connected person. I can Zoom and have a virtual visit with my friends. If I wonder how my daughters are doing, I can instantly text them. If I miss their faces, I can FaceTime them. 

The projects I have taken on have only been possible thanks to the privilege of being wired into the virtual world. I can even go on a virtual bike ride with my friends through the streets of London, courtesy of Zwift.

Yes, I have given up things, but I have also been able find digital substitutes for many of those things. I’m not going to say it’s been perfect, but it’s certainly been passable.

My stepdad, who is turning 86, has been able to do none of those things. He is in a long-term care home in Alberta, Canada. His only daily social connections consist of brief interactions with staff during mealtime and when they check his blood sugar levels and give him his medication. All the activities that used to give him a chance to socialize are gone. Imagine life for him, where his sum total of connection is probably less than 30 minutes a day. And, on most days, none of that connecting is done with the people he loves.

Up until last week, family couldn’t even visit him. He was locked down due to an outbreak at his home. For my dad, there were no virtual substitutes available. He is not wired in any way for digital connection. If anyone has paid the social price of this pandemic, it’s been my dad and people like the seniors I interviewed, whom I was desperately trying to help find a way just to watch a 13-minute video they had starred in.

A recent study by mobile technology manufacturer Ericsson looked specifically at the relationship between technology and seniors during the pandemic. The study focused on what the company termed the “young-old” seniors, those aged 65-74. They didn’t deal with “middle-old” (aged 75-85) or “oldest-old” (86 plus) because — well, probably because Ericsson couldn’t find enough who were connected to act as a representative sample.

But they did find that even the “young old” were falling behind in their ability to stay connected thanks to COVID-19. These are people who have owned smartphones for at least a decade, many of whom had to use computers and technology in their jobs. Up until a year ago, they were closing the technology gap with younger generations. Then, last March, they started to fall behind.

They were still using the internet, but younger people were using it even more. And, as they got older, they were finding it increasingly daunting to adopt new platforms and technology. They didn’t have the same access to “family tech support” from children or grandchildren to help get them over the learning curve. They were sticking to the things they knew how to do as the rest of the world surged forward and started living their lives in a digital landscape.

But this was not the group that was part of my video project. My experience had been with the “middle old” and “oldest old.” Half fell into the “middle old” group and half fell into the “oldest old” group. Of the eight seniors I was dealing with, only two had emails. If the “young old” are being left behind by technology, these people were never in the race to begin with. As the world was forced to reset to an online reality, these people were never given the option. They were stranded in a world suddenly disconnected from everything they knew and loved.

Predictably, the Ericsson study proposes smartphones as the solution for many of the problems of the pandemic, giving seniors more connection, more confidence and more capabilities. If only they got connected, the study says, life would be better.

But that’s not a solution with legs. It won’t go the distance. And to understand why, we just have to look at the two age cohorts the study didn’t focus on, the “middle old” and the “oldest old.”

Perhaps the hardest hit have been the “oldest old,” who have sacrificed both physical and digital connection, as this Journals of Gerontology article notes. Four from my group lived in long-term care facilities. Many of these were locked down at some point due to local outbreaks within the facility. Suddenly, that family support they required to connect with their family and friends was no longer available. The technological tools that we take for granted — which we were able to slot in to take the place of things we were losing — were unimaginable to them. They were literally sentenced to solitary confinement.

A recent study from Germany found that only 3% of those living in long-term care facilities used an internet-connected device. A lot of the time, cognitive declines, even when they’re mild, can make trying to use technology an exercise in frustration.

When my dad went into his long-term care home, my sister and I gave him one of our old phones so he could stay connected. We set everything up and did receive a few experimental texts from him. But soon, it just became too confusing and frustrating for him to use without our constant help. He played solitaire on it for a while, then it ended up in a drawer somewhere. We didn’t push the issue. It just wasn’t the right fit.

But it’s not just my dad who struggled with technology. Even if an aging population starts out as reasonably proficient users, it can be overwhelming to keep up with new hardware, new operating systems and new security requirements. I’m not even “young old” yet, and I’ve worked with technology all my life. I owned a digital marketing company, for heaven’s sake. And even for me, it sometimes seems like a full-time job staying on top of the constant stream of updates and new things to learn and troubleshoot. As connected technology leaps forward, it does not seem unduly concerned that it’s leaving the most vulnerable segment of our population behind.

COVID-19 has pushed us into a virtual world where connection is not just a luxury, but a condition of survival. We need to connect to live. That is especially true for our seniors, who have had all the connections they relied on taken from them. We can’t leave them behind. Connected technology can no longer ignore them.

This is one gap we need to build a bridge over.

The Academics of Bullsh*t

“One of the most salient features of our culture is that there is so much bullshit. Everyone knows this. Each of us contributes his share. But we tend to take the situation for granted.”

— from “On Bullshit,” an essay by philosopher Harry Frankfurt.

Would it surprise you to know that I have found not one, but two academic studies on organizational bullshit? And I mean that non-euphemistically. The word “bullshit” is actually in the title of both studies. I B.S. you not.

In fact, organizational bullshit has become a legitimate field of study. Academics are being paid to dig into it — so to speak. There are likely bullshit grants, bullshit labs, bullshit theories, bullshit paradigms and bullshit courses. There are definitely bullshit professors. There is even an OBPS — the Organizational Bullshit Perception Scale — a way to academically measure bullshit in a company.

Many years ago, when I was in the twilight of my time with the search agency I had founded, I had had enough of the bullshit I was being buried under, shoveled there by the company that had acquired us. I was drowning in it. So I vented right here, on MediaPost. I dared you to imagine what it would be like to actually do business without bullshit getting in the way.

My words fell on deaf ears. Bullshit has proliferated since that time. It has been enshrined up and down our social, business and governmental hierarchies, becoming part of our “new” organizational normal. It has picked up new labels, like “fake news” and “alternative facts.” It has proven more dangerous than I could have ever imagined. And it has become this dangerous because we ignore it, which legitimizes it.

Harry Frankfurt defined the concept and set it apart from lying. Liars know the truth and are trying to hide it. Bullshitters don’t care if what they say is true or false. They only care if their listener is persuaded. That’s as good a working definition of the last four years as any I’ve heard.

But at least one study indicates bullshit may have a social modality — acceptable in some contexts, but corrosive in others. Marketing, for example, is highlighted by the authors as an industry built on a foundation of bullshit:

“advertising and public relations agencies and consultants are likely to be ‘full of It,’ and in some cases even make the production of bullshit an important pillar of their business.”

In these studies, researchers speculate that bullshit might actually serve a purpose in organizations. It may allow for strategic motivation before there is an actual strategy in place. This brand of bullshit is otherwise known as “blue-sky thinking” or “out-of-the-box thinking.”

But if this is true, there is a very narrow window indeed where this type of bullshit could be considered beneficial. The minute there are facts to deal with, they should be dealt with. But the problem is that the facts never quite measure up to the vision of the bullshit. Once you open the door to allowing bullshit, it becomes self-perpetuating.

I grew up in the country. I know how hard it is to get rid of bullshit.

The previous example is what I would call strategic bullshit — a way to “grease the wheels” and get the corporate machine moving. But it often leads directly to operational bullshit — which is toxic to an organization, serving to “gum up the gears” and prevent anything real and meaningful from happening. This was the type of bullshit that was burying me back in 2013 when I wrote that first column. It’s also the type of bullshit that is paralyzing us today.

According to the academic research into bullshit, when we’re faced with it, we have four ways to respond: exit, voice, loyalty or neglect. Exit means we try to escape from the bullshit. Loyalty means we wallow in it, spreading it wider and thicker. Neglect means we just ignore it. And Voice means we stand up to the bullshit and confront it.  I’m guessing you’ve already found yourself in one of those four categories.

Here’s the thing. As marketers and communicators, we have to face the cold, ugly truth of our ongoing relationship with bullshit. We all have to deal with it. It’s the nature of our industry.

But how do we deal with it? Most times, in most situations, it’s just easier to escape or ignore it. Sometimes it may serve our purpose to jump on the bullshit bandwagon and spread it. But given the overwhelming evidence of where bullshit has led us in the recent past, we all should be finding our voice to call bullshit on bullshit.

Missing the Mundane

I realize something: I miss the mundane.

Somewhere along the line, mundanity got a bad rap. It became a synonym for boring. But it actually means worldly. It refers to the things you experience when you’re out in the world.

And I miss that — a lot.

There is a lot of stuff that happens when we’re living our lives that we don’t give enough credit to: Petting a dog being taken for a walk. A little flirting with another human we find attractive. Doing some people-watching while we eat our bagel in a mall’s food court. Random situational humor that plays itself out on the sidewalk in front of us. Discovering that the person cutting your hair is also a Monty Python fan. Snippets of conversation — either ones we’re participating in, or ones we overhear while we wait for the bus. Running into an old acquaintance. Even being able to smile at a stranger and have them smile back at you.

The mundane is built of all those hundreds of little, inconsequential social exchanges that happen daily in a normal world that we ordinarily wouldn’t give a second thought to.

And sometimes, serendipitously, we luck upon the holy grail of mundanity — that random “thing” that makes our day.

These are the things we live for. And now, almost all of these things have been stripped from our lives.

I didn’t realize I missed them because I never assigned any importance to them. If I did a signal-to-noise ratio analysis of my life, all these things would fall in the latter category. Most of the time, I wasn’t even fully aware that they were occurring. But I now realize when you add them all up, they’re actually a big part of what I’m missing the most. And I’ve realized that because I’ve been forced to subtract them — one by one — from my life.

I have found that the mundane isn’t boring. It’s the opposite — the seasoning that adds a little flavor to my day-to-day existence.

For the past 10 months, I thought the problem was that I was missing the big things: travel, visiting loved ones, big social gatherings. And I do miss those things. But those things are the tentpoles – the infrequent yet consequential things we tend to hang our happiness on. We failed to realize that in between those tentpoles is the fabric of everyday life, and that has also been eliminated.

It’s not just that we don’t have them. It’s also that we’ve tried to substitute other things for them. And those other things may be making it worse. Things like social media and way too much time spent looking at the news. Bingeing on Netflix. Forcing ourselves into awkward online Zoom encounters just because it seems like the thing to do. A suddenly developed desire to learn Portuguese, or how to bake sourdough bread.

It’s not that all these things are bad. It’s just that they’re different from what we used to consider normal — and doing them reinforces the gap that lies between then and now. They add to that gnawing discontent we have with our new forced coping mechanisms.

The mundane has always leavened our lives. But now, we’ve swapped the living of our lives for being entertained — and whether it’s the news or the new show we’re bingeing, entertainment has to be overplayed. It is nothing but peaks and valleys, with no middle ground. When we actually do the living, rather than the watching, we spend the vast majority of our time in that middle ground — the mundane, which is our emotional reprieve.

I’ve also noticed my social muscles have atrophied over the past several months due to lack of exercise. It’s been ages since I’ve had to make small talk. Every encounter now — as infrequent as they are — seems awkward. Either I’m overeager, like a puppy that’s been left alone in a house all day, or I’m just not in any mood to palaver.  

Finally, it’s these everyday mundane encounters that used to give me anecdotal evidence that not all people were awful. Every day I used to see examples of small kindnesses, unexpected generosity and just plain common courtesy. Yes, there were also counterpoints to all of these, but it almost always netted out to the good. It used to reaffirm my faith in people on a daily basis.

With that source of reaffirmation gone, I have to rely on the news and social media. And — given what those two things are — I know I will only see the extremes of human nature. It’s my “angel and asshole” theory: that we all lie on a bell curve somewhere between the two, and our current situation will push us from the center closer to those two extremes. You also know that the news and social media are going to be biased towards the “asshole” end of the spectrum.

There’s a lot to be said for the mundane — and I have. So I’ll just wrap up with my hope that my life — and yours — will become a little more mundane in the not-too-distant future.

The Timeline of Factfulness

After last Wednesday, when it seemed that our reality was splitting at the seams, I was surprised to see that financial markets seemed to ignore what was happening in Washington. The market racked up a 1% gain. I later learned that financial markets have a history of being rather oblivious to social upheaval.

Similarly, a newsletter I subscribe to about recent academic research was packed with new discoveries. Not one of the 35 links in that day’s edition pointed to anything remotely relevant to what was happening at that time in Washington, D.C. (or various other state capitals in the country). That was less surprising to me than the collective shrugging off of events by financial markets, but it still made an interesting contrast clear to me.

These two corners of the world are not tied to the happenings of today. Markets look forward and lay economic bets on what will be. And apparently they had bet that the events of January 6 wouldn’t have any lasting impact.

Scientific journals look backward and report on what has already happened in the world of academic research. Neither is very focused on today.

But there is another reason why these two corners of the world seemed unfazed by the news headlines of January 6th. Science and the Markets are two examples of things driven by facts and data. Yes, emotion certainly plays a part. Investors have long known that irrational exuberance or fear can drive artificial bubbles or crashes. And the choice of research paths to take is a human one, which means it’s inevitably driven by emotions.

But both these ecosystems try to systematically reduce the role of emotion as much as possible by relying on facts and data. And because facts and data do not reveal their stories immediately but rather over time in the form of trends, they have to take a longer view of the world.

Therefore, these things operate on different timelines from the news. Financial markets use what’s happening now – today – as just one of many inputs into a calculated bet on what will happen weeks or months in the future.

Science takes a longer view, using the challenges of today to set a research agenda that may be years away from realizing its pay-off. Both finance and science use what’s happening right now as one input to determine what will be in the future, but neither focuses exclusively on today.

In contrast, that’s exactly what the news has to do. And it hyperbolizes the now, stripping the extraordinary from the ordinary, picking it out and concentrating it for our consumption.

The fact is, both markets and science have to operate by Factfulness, to use the term coined by the late Hans Rosling, Swedish physician and well-known TED speaker. To run like the rest of the world, over-focused on the amplified volatility of the here and now that fills our news feeds, would be to render them dysfunctional. They couldn’t operate. They would be in a constant state of anxiety.

Increasingly, the engines that drive our world  – such as science and financial markets – have to decouple themselves from the froth and frenzy of the immediate. They do so because the rest of the world is following a very different path – one where hyper-emotionality and polarized news outlets whip us back and forth like a rag-doll caught by a Doberman.

This decoupling has accelerated thanks to the role of technology in compressing the timelines of the worldview of most of us. We are instantly alerted to what’s happening now and are then ushered into a highly biased bubble from which we look at the world. Our worldview is not only formed by emotion, it is deliberately designed to manipulate those emotions.

Emotions are our instant response to the world. They run fully hot or cold, with nary a nuance of reason to modulate them. Also, because emotions are our natural early warning systems, we tend to be hyper-aware of them and are immediately drawn to anything that promises to push our emotional buttons. As such, they are a notoriously inaccurate lens through which to look at reality. That is why efforts are made to minimize their impact in the worlds of science and finance.

We should hold other critical systems to the same standards. Take government, for instance. Now, more than ever, we need those who govern us to be clear-eyed and dealing in facts. Unfortunately, as we saw last week, they’re running as fast as they can in the opposite direction.

Happy New Year?

“Speaking of the happy new year, I wonder if any year ever had less chance of being happy. It’s as though the whole race were indulging in a kind of species introversion — as though we looked inward on our neuroses. And the thing we see isn’t very pretty… So we go into this happy new year, knowing that our species has learned nothing, can, as a race, learn nothing — that the experience of ten thousand years has made no impression on the instincts of the million years that preceded.”

That sentiment, relevant as it is to today, was not written about 2021. It was actually written 80 years ago — in 1941 — by none other than John Steinbeck.

John was feeling a little down. I’m sure we can all relate.

It’s pretty easy to say that we have hopefully put the worst year ever behind us. I don’t know about your news feed, but mine has been like a never-ending bus tour of Dante’s nine circles of hell — and I’m sitting next to the life insurance salesman from Des Moines who decided to have a Caesar salad for lunch.

An online essay by Umair Haque kind of summed up 2020 for me: “The Year of the Idiot.” In it, Haque doesn’t pull any punches:

“It was the year that a pandemic searched the ocean of human stupidity, and found, to its gleeful delight, that it appeared to be bottomless. 2020 was the year that idiots wrecked our societies.”

In case you’re not catching the drift yet, Haque goes on to say, “The average person is a massive, gigantic, malicious, selfish idiot.”

Yeah. That pretty much covers it.

Or does it? Were our societies wrecked? Is the average person truly that shitty? Is the world a vast, soul sucking, rotten-cabbage-reeking dumpster fire? Or is it just the lens we’re looking through?

If you search hard enough, you can find those who are looking through a different lens — one that happens to be backed by statistical evidence rather than what bubbles to the top of our newsfeed. One of those people is Ola Rosling. He’s carrying on the mission of his late father, Hans Rosling, who was working on the book “Factfulness” when he passed away in 2017. Bill Gates called it “one of the most educational books I’ve ever read.” And Bill reads a lot of books!

Believe it or not, if you remove a global pandemic from the equation (which, admittedly, is a whole new scale of awful) the world may actually be in better shape than it was 12 months ago. And even if you throw the pandemic into the mix, there are some glimmers of silver peeking through the clouds.

Here are some things you may have missed in your news feed:

Wild polio was eradicated from Africa. That’s big news. It’s a massive achievement that had its to-do box ticked last August. And I’m betting you never heard about it.

Also, the medical and scientific world has never before mobilized and worked together on a project like the new COVID mRNA vaccines now rolling out. Again, this is a huge step forward that will have far reaching impacts on healthcare in the future. But that’s not what the news is talking about.

Here’s another thing. At long last, it looks like the world may finally be ready to start tearing apart the layers that hide systemic racism. What we’re learning is that it may not be the idiots  — and, granted, there are many, many idiots — who are the biggest problem. It may be people like me, who have unknowingly perpetuated the system and are finally beginning to see the endemic bias baked into our culture.

These are just three big steps forward that happened in 2020. There are others. We just aren’t talking about them.

We always look on the dark side. We’re a “glass half-empty” species. That’s what Rosling’s book is about: our tendency to skip over the facts to rush to the worst possible view of things. We need no help in that regard — but we get it anyway from the news business, which, run by humans and aimed at humans, amplifies our proclivity for pessimism.

I’m as glad as anyone to see 2020 in my rear-view mirror. But I am carrying something of that year forward with me: a resolution to spend more time looking for facts and relying less on media “spun” for profit to understand the state of the world.

As we consume media, we have to remember that good news is just not as profitable as bad news. We need to broaden our view to find the facts. Hans Rosling warned us, “Forming your worldview by relying on the media would be like forming your view about me by looking only at a picture of my foot.”

Yes, 2020 was bad, but it was also good. And because there are forces that swing the pendulum both ways, many of the things that were good may not have happened without the bad. In the same letter in which Steinbeck expressed his pessimism about 1941, he went on to say this:

“Not that I have lost any hope. All the goodness and the heroisms will rise up again, then be cut down again and rise up. It isn’t that the evil thing wins — it never will — but that it doesn’t die. I don’t know why we should expect it to. It seems fairly obvious that two sides of a mirror are required before one has a mirror, that two forces are necessary in man before he is man.”

There are two sides to every story, even when it’s a horror story like 2020.

Facebook Vs. Apple Vs. Your Privacy

As I was writing last week’s words about Mark Zuckerberg’s hubris-driven view of world domination, little did I know that the next chapter was literally being written. The very next day, a full-page ad from Facebook ran in The New York Times, The Washington Post and The Wall Street Journal attacking Apple for building privacy protection prompts into iOS 14.

It will come as a surprise to no one that I line up firmly on the side of Apple in this cat fight. I have always said we need to retain control over our personal data, choosing what’s shared and when. I also believe we need to have more control over the nature of the data being shared. iOS 14 is taking some much-needed steps in that direction.

Facebook is taking a stand that sadly underlines everything I wrote just last week — a disingenuous stand for a free-market environment — by unfurling the “Save small business” banner. Zuckerberg loves to stand up for “free” things — be it speech or markets — when it serves his purpose.

And the hidden agenda here is not really hidden at all. It’s not the small business around the corner Mark is worried about. It’s the 800-billion-dollar business that he owns 60% of the voting shares in.

The headline of the ad reads, “We’re standing up to Apple for small businesses everywhere.”

Ummm — yeah, right.

What you’re standing up for, Mark, is your revenue model, which depends on Facebook’s being free to hoover up as much personal data on you as possible, across as many platforms as possible.

The only thing that you care about when it comes to small businesses is that they spend as much with Facebook as possible. What you’re trying to defend is not “free” markets or “free” speech. What you’re defending is about the furthest thing imaginable from “free.” It’s $70 billion-plus in revenues and $18.5 billion in profits. What you’re trying to protect is your number-five slot on the Forbes list of the world’s richest people, with your net worth of $100 billion.

Then, on the very next day, Facebook added insult to injury with a second ad, this time defending the “Free Internet,”  saying Apple “will change the internet as we know it” by forcing websites and blogs “to start charging you subscription fees.”

Good. The “internet as we know it” is a crap sandwich. “Free” has led us to exactly where we are now, with democracy hanging on by a thread, with true journalism in the last paroxysms of its battle for survival, and with anyone with half a brain feeling like they’re swimming in a sea of stupidity.

Bravo to Apple for pushing us away from the toxicity of “free”: the enthralled reverence for “free” things that props up a rapidly disintegrating information marketplace. If we accept a free model for our access to information, we must also accept advertising that will become increasingly intrusive, with even less regard for our personal privacy. We must accept all the things that come with “free”: the things that have proven to be so detrimental to our ability to function as a caring and compassionate democratic society over the past decade.

In doing the research for this column, I ran into an op-ed piece that ran last year in The New York Times. In it, Facebook co-founder Chris Hughes lays out the case for antitrust regulators dismantling Facebook’s dominance in social media.

This is a guy who was one of Zuckerberg’s best friends in college, who shared in the thrill of starting Facebook, and whose name is on the patent for Facebook’s News Feed algorithm. It’s a major move when a guy like that, knowing what he knows, says, “The most problematic aspect of Facebook’s power is Mark’s unilateral control over speech. There is no precedent for his ability to monitor, organize and even censor the conversations of two billion people.”

Hughes admits that the drive to break up Facebook won’t be easy. In the end, it may not even be successful. But it has to be attempted.

Too much power sits in Zuckerberg’s hands. An attempt has to be made to break down the walls behind which our private data is being manipulated. We cannot trust Facebook — or Mark Zuckerberg — to do the right thing with the data. It would be so much easier if we could, but it has been proven again and again and again that our trust is misplaced.

The very fact that those calling the shots at Facebook believe you’ll fall for yet another public appeal, wrapped in some altruistic bullshit about protecting “free” that’s as substantial as Saran Wrap, should be taken as an insult. It should make you mad as hell.

And it should put Apple’s stand to protect your privacy in the right perspective: a long overdue attempt to stop the runaway train that is social media.

Looking At The World Through Zuckerberg-Colored Glasses

Mark Zuckerberg has managed to do something almost no one else has been able to do. He has actually been able to find one small patch of common ground between the far right and the far left in American politics. It seems everybody hates Facebook, even if it’s for different reasons.

The right hates the fact that they’re not given free rein to say whatever they want without Facebook tagging their posts as misinformation. The left worries about the erosion of privacy. And antitrust legislators feel Facebook is just too powerful and dominant in the social media market. Mark Zuckerberg has few friends in Washington — on either side of the aisle.

The common denominator here is control. Facebook has too much of it, and no one likes that. The question on the top of my mind is, “What is Facebook intending to do with that control?” Why is dominance an important part of Zuckerberg’s master plan?

Further, just what is that master plan?  Almost four years ago, in the early days of 2017, Zuckerberg issued a 6,000-word manifesto. In it, he addressed what he called “the most important question of all.” That question was, “Are we building the world we all want?”

According to the manifesto, the plan for Facebook includes “spreading prosperity and freedom, promoting peace and understanding, lifting people out of poverty, and accelerating science.”

Then, two years later, Zuckerberg issued another lengthy memo about his vision regarding privacy and the future of communication, which “will increasingly shift to private, encrypted services where people can be confident what they say to each other stays secure.” He explained that Facebook and Instagram are like a town square, a public place for communication. But WhatsApp and Messenger are like your living room, where you can have private conversations without worrying about who else might see them.

So, how is all that wonderfulness going, anyway?

Well, first of all, there’s what Mark says, and what Facebook actually does. When he’s not firing off biennial manifestos promising a cotton-candy-colored world, he’s busy assembling all the pieces required to suck up as much data on you as possible, and fighting lawsuits when he gets caught doing something he shouldn’t be.

You have to understand that for Zuckerberg, all these plans are built on a common foundation: Everything happens on a platform that Facebook owns. And those platforms are paid for by advertising. And advertising needs data. And therein lies the problem: What the hell is Facebook doing with all this data?

I’m pretty sure it’s not spreading prosperity and freedom or promoting peace and understanding. Quite the opposite. If you look at Facebook’s fingerprints that are all over the sociological dumpster fire that has been the past four years, you could call them the Keyser Söze of shit disturbing.

And it’s only going to get worse. Facebook and other participants in the attention economy are betting heavily on facial recognition technology. This effectively eliminates our last shred of supposed anonymity online. It forever links our digital dust trail with our real-world activities. And it dumps even more information about you into the voracious algorithms of Facebook, Google and other data devourers. Again, what might be the plans for this data: putting in place the pieces of a more utopian world, or meeting next quarter’s revenue projections?

Here’s the thing. I don’t think Zuckerberg is being willfully dishonest when he writes these manifestos. I think — at the time — he actually believes them. And he probably legitimately thinks that Facebook is the best way to accomplish them. Zuckerberg always believes he’s the smartest one in the room. And he — like Steve Jobs — has a reality distortion field that’s always on. In that distorted reality, he believes Facebook — a company that is entirely dependent on advertising for survival — can be trusted with all our data. If we just trust him, everything will be okay.

The past four years have proven over and over again that that’s not true. It’s not even possible. No matter how good the intentions you go in with, the revenue model that fuels Facebook will subvert those intentions and turn them into something corrosive.

I think David Fincher summed up the problem nicely in his movie “The Social Network.” There, screenwriter Aaron Sorkin hit the nail on the head when he wrote the scene where Zuckerberg’s lawyer said to him, “You’re not an asshole, Mark. You’re just trying so hard to be.”

Facebook represents a lethal mixture that has all the classic warning signs of an abusive relationship:

  • A corporation that can survive only when its advertisers are happy.
  • Advertisers that are demanding more and more data they can use to target prospects.
  • A bro-culture where Facebook folks think they’re smarter than everyone else and believe they can actually thread the needle between being fabulously successful as an advertising platform and not being complete assholes.
  • And an audience of users who are misplacing their trust by buying into the occasional manifesto, while ignoring the red flags that are popping up every day.

Given all these factors, the question becomes: Will splitting up Facebook be a good or bad thing? It’s a question that will become very pertinent in the year to come. I’d love to hear your thoughts.

Second Thoughts about the Social Dilemma

Watched “The Social Dilemma” yet? I did, a few months ago. The Netflix documentary sets off all kinds of alarms about social media and how it’s twisting the very fabric of our society. It’s a mix of standard documentary fodder — a lot of tell-all interviews with industry insiders and activists — with an (ill-advised, in my opinion) dramatization of the effects of social media addiction in one particular family.

The one most affected is a male teenager who is suddenly drawn, zombie-like, by his social media feed into an ultra-polarized political activist group. Behind the scenes, operating in a sort of evil-empire control room setting, there are literally puppet masters pulling his strings.

It’s scary as hell. But should we be scared? Or — at least — should we be that scared?

Many of us are sounding alarms about social media and how it nets out to be a bad thing. I’m one of the worst. I am very concerned about the impact of social media, and I’ve said so many, many times in this column. But I also admit that this is a social experiment playing out in real time, so it’s hard to predict what the outcome will be. We should keep our minds open to new evidence.

I’ve also said that younger generations seem to be handling this in stride. At least, they seem to be handling it better than those in my generation. They’re quicker to adapt and to use new technologies natively to function in their environments, rather than fumble as we do, searching for some corollary to the world we grew up in.

I’ve certainly had pushback on this observation. Maybe I’m wrong. Or maybe, like so many seemingly disastrous new technological trends before it, social media may turn out to be neither bad nor good. It may just be different.

That certainly seems to be the case if you read a new study from the Institute for Family Studies and the Wheatley Institution at Brigham Young University.

One of the lead authors of the study, Jean Twenge, previously rang the alarm bells about how technology was short-circuiting the mental wiring of our youth. In a 2017 article in The Atlantic titled “Have Smartphones Destroyed a Generation?” she made this claim:

“It’s not an exaggeration to describe iGen as being on the brink of the worst mental-health crisis in decades. Much of this deterioration can be traced to their phones.”

The article describes a generation of zombies mentally hardwired to social media through their addiction to their iPhone. One of the more startling claims was this:

“Teens who spend three hours a day or more on electronic devices are 35 percent more likely to have a risk factor for suicide, such as making a suicide plan.”

Again, scary as hell, right? This sounds frighteningly similar to the scenarios laid out in “The Social Dilemma.”

But what if you take this same group and this same author, fast-forward three years to the middle of the worst pandemic in our lifetimes, and check in with over 1,500 teens to see how they’re doing in a time where they have every right to be depressed? Not only are they locked inside, they’re also processing societal upheavals and existential threats like systemic racial inequality, alt-right political populism and climate change. If 2017-2018 was scary for them, 2020 is a dumpster fire.

Surprisingly, those same teens appear to be doing better than they were two years ago. The study had four measures of ill-being: loneliness, life dissatisfaction, unhappiness and depression. The results were counterintuitive, to say the least. The number of teens indicating they were depressed actually dropped substantially, from 27% in 2018 to 17% among those quarantined during the school year in 2020. Fewer said they were lonely as well.

The study indicated that the reasons for this could be because teens were getting more sleep and were spending more time with family.

But what about smartphones and social media? Wouldn’t a quarantined teen (a Quaran-teen?) be spending even more time on his or her phone and social media? 

Well, yes – and no. The study found screen time didn’t really go up, but the way that time was spent did shift. Surprisingly, time spent on social media went down, but time spent on video chats with friends or watching online streaming entertainment went up.

As I shared in my column a few weeks ago, this again indicates that it’s not how much time we spend on social media that determines our mental state. It’s how we spend that time. If we spend it looking for connection, rather than obsessing over social status, it can be a good thing. 

Another study, from the University of Oxford, examined data on more than 300,000 adolescents and found that increased screen time has no more impact on teenagers’ mental health than eating more potatoes. Or wearing glasses.

If you’re really worried about your teen’s mental health, make sure they have breakfast. Or get enough sleep. Or just spend more time with them. All those things are going to have a lot more impact than the time they spend on their phone.

To be clear, this is not me becoming a fan of Facebook or social media in general. There are still many things to be concerned about. But let’s also realize that technology — any technology — is a tool. It is not inherently good or evil. Those qualities can be found in how we choose to use technology.

Have More People Become More Awful?

Is it just me, or do people seem a little more awful lately? There seems to be a little more ignorance in the world, a little less compassion, a little more bullying and a lot less courtesy.

Maybe it’s just me.

It’s been a while since I’ve checked in with eternal optimist Steven Pinker.  The Harvard psychologist is probably the best-known proponent of the argument that the world is consistently trending towards being a better place.  According to Pinker, we are less bigoted, less homophobic, less misogynist and less violent. At least, that’s what he felt pre-COVID lockdown. As I said, I haven’t checked in with him lately, but I suspect he would say the long-term trends haven’t appreciably changed. Maybe we’re just going through a blip.

Why, then, does the world seem to be going to hell in a hand cart?  Why do people — at least some people — seem so awful?

I think it’s important to remember that our brain likes to play tricks on us. It’s in a never-ending quest to connect cause and effect. Sometimes, to do so, the brain jumps to conclusions. Unfortunately, it is aided in this unfortunate tendency by a couple of accomplices — namely news reporting and social media. Even if the world isn’t getting shittier, it certainly seems to be. 

Let me give you one example. In my local town, an anti-masking rally was recently held at a nearby shopping mall. Local news outlets jumped on it, with pictures and video of non-masked, non-socially distanced protesters carrying signs and chanting about our decline into Communism and how their rights were being violated.

What a bunch of boneheads — right? That was certainly the consensus in my social media circle. How could people care so little about the health and safety of their community? Why are they so awful?

But when you take the time to unpack this a bit, you realize that everyone is probably overplaying their hands. I don’t have exact numbers, but I don’t think there were more than 30 or 40 protesters at the rally. The population of my city is about 150,000. These protesters represented 0.03% of the total population.

Let’s say for every person at the rally, there were 10 who felt the same way but weren’t there. That’s still less than 1%. Even if you multiplied the number of protesters by 100, it would still be just 3% of my community. We’re still talking about a tiny fraction of all the people who live in my city.
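If you want to sanity-check that arithmetic, here is a minimal sketch in Python using the rough figures above (a 40-person rally in a city of about 150,000). The headcount and population are this column’s estimates, not official numbers.

    # Back-of-the-envelope math for the paragraph above.
    # The rally size and city population are this column's rough
    # estimates, not official figures.
    protesters = 40          # upper end of my guess at the rally crowd
    population = 150_000     # approximate population of my city

    for multiplier in (1, 10, 100):
        sympathizers = protesters * multiplier
        share = sympathizers / population * 100
        print(f"{multiplier:>3}x the crowd: {sympathizers:>5,} people = {share:.2f}% of the city")

    # Prints:
    #   1x the crowd:    40 people = 0.03% of the city
    #  10x the crowd:   400 people = 0.27% of the city
    # 100x the crowd: 4,000 people = 2.67% of the city

However generously you multiply, the visible “awful” group stays a sliver of the whole community.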

But both the news media and my social media feed have ensured that these people are highly visible. And because they are, our brain likes to use that small and very visible sample and extrapolate it to the world in general. It’s called availability bias, a cognitive shortcut where the brain uses whatever’s easy to grab to create our understanding of the world.

But availability bias is nothing new. Our brains have always done this. So, what’s different about now?

Here, we have to understand that the current reality may be leading us into another “mind-trap.” A 2018 study from Harvard introduced something called “prevalence-induced concept change,” which gives us a better understanding of how the brain focuses on signals in a field of noise. 

Basically, when signals of bad things become less common, the brain works harder to find them. We expand our definition of what is “bad” to include more examples so we can feel more successful in finding them.

I’m probably stretching beyond the limits of the original study here, but could this same thing be happening now? Are we all super-attuned to any hint of what we see as antisocial behavior so we can jump on it? 

If this is the case, again social media is largely to blame. It’s another example of our current toxic mix of dog whistles, cancel culture, virtue signaling and pseudo-reality that is being driven by social media.

That’s two possible things that are happening. But if we add one more, it becomes a perfect storm of perceived awfulness. 

In a normal world, we all have different definitions of the ethical signals we’re paying attention to. What you are focused on right now in your balancing of what is right and wrong is probably different from what I’m currently focused on. I may be thinking about gun control while you’re thinking about reducing your carbon footprint.

But now, we’re all thinking about the same thing: surviving a pandemic. And this isn’t just some theoretical mind exercise. This is something that surrounds us, affecting us every single day. When it comes to this topic, our nerves have been rubbed raw and our patience has run out. 

Worst of all, we feel helpless. There seems to be nothing we can do to edge the world toward being a less awful place. Behaviors that in another reality and on another topic would have never crossed our radar now have us enraged. And, when we’re enraged, we do the one thing we can do: We share our rage on social media. Unfortunately, by doing so, we’re not part of the solution. We are just pouring fuel on the fire.

Yes, some people probably are awful. But are they more awful than they were this time last year? I don’t think so. I also can’t believe that the essential moral balance of our society has collectively nosedived in the last several months. 

What I do believe is that we are living in a time where we’re facing new challenges in how we perceive the world. Now, more than ever before, we’re on the lookout for what we believe to be awful. And if we’re looking for it, we’re sure to find it.