The Split-Second Timing of Brand Trust

Two weeks ago, I talked about how brand trust can erode so quickly and cause so many issues. I intimated that advertising and branding have become decoupled — and advertising might even erode brand trust, leading to a lasting deficit.

Now I think that may be a little too simplistic. Brand trust is a holistic thing — the sum total of many moving parts. Taking advertising in isolation is misleading. Will one social media ad for a brand lead to broken trust? Probably not. But there may be a cumulative effect that we need to be aware of.

Looking more closely at the Edelman Trust Barometer study, a very interesting picture emerges. Essentially, the study shows there is a trust crisis. Edelman calls it information bankruptcy.

The slide in trust is probably not surprising. It’s hard to be trusting when you’re afraid, and if there’s one thing the Edelman Barometer shows, it’s that we are globally fearful. Our collective hearts are in our mouths. And when this happens, we are hardwired to respond by lowering our trust and raising our defenses.

But our traditional sources for trusted information — government and media — have also abdicated their responsibilities to provide it. They have instead stoked our fears and leveraged our divides for their own gains. NGOs have suffered the same fate. So, if you can’t trust the news, your leaders or even your local charity, who can you trust?

Apparently, you can trust a corporation. Edelman shows that businesses are now the most trusted organizations in North America. Media, especially social media, is the least trusted institution. I find this profoundly troubling, but I’ll put that aside for a future post. For now, let’s just accept it at face value.

As I said in that previous column, we want to trust brands more than ever. But we don’t trust advertising. This creates a dilemma for the marketer.

This all brings to mind a study I was involved with a little over 10 years ago. Working with Simon Fraser University, we wanted to know how the brain responded to trusted brands. The initial results were fascinating — but unfortunately, we never got the chance to do the follow-up study we intended.

This was an ERP study (event-related potential), where we looked at how the brain responded when we showed brand images as a stimulus. ERP studies are useful to better understand the immediate response of the brain to something — the fast loop I talk so much about — before the slow loop has a chance to kick in and rationalize things.

We know now that what happens in this fast loop really sets the stage for what comes after. It essentially makes up the mind, and then the slow loop adds rational justification for what has already been decided.

What we found was interesting: The way we respond to our favorite brands is very similar to the way we respond to pictures of our favorite people. The first hint of this occurred in just 150 milliseconds, less than one-sixth of a second. The next reinforcement was found at 400 milliseconds. In that time, less than half a second in total, our minds were made up. In fact, the mind was basically made up in about the same time it takes to blink an eye. Everything that followed was just window dressing.

This is the power of trust. It takes a split second for our brains to recognize a situation where it can let its guard down. This sets in motion a chain of neurological events that primes the brain for cooperation and relationship-building. It primes the oxytocin pump and gets it flowing. And this all happens just that quickly.

On the other side, if a brand isn’t trusted, a very different chain of events occurs just as quickly. The brain starts arming itself for protection. Our amygdala starts gearing up. We become suspicious and anxious.

This platform of brand trust — or lack of it — is built up over time. It is part of our sense-making machinery. Our accumulating experience with the brand either adds to our trust or takes it away.

But we must also realize that if we have strong feelings about a brand, one way or the other, it then becomes a belief. And once this happens, the brain works hard to keep that belief in place. It becomes virtually impossible at that point to change minds. This is largely because of the split-second reactions our study uncovered.

This sets very high stakes for marketers today. More than ever, we want to trust brands. But we also search for evidence that this trust is warranted in a very different way. Brand building is the accumulation of experience over all touch points. Each of those touch points has its own trust profile. Personal experience and word of mouth from those we know is the highest. Advertising on social media is one of the lowest.

The marketer’s goal should be to leverage trust-building for the brand in the most effective way possible. Do it correctly, through the right channels, and you have built trust that’s triggered in an eye blink. Screw it up, and you may never get a second chance.

I’m a Fan of Friction

Here in North America, we are waging a war on friction. We use technology like a universal WD-40, spraying it on everything that rubs, squeaks or grinds. We want to move faster, more efficiently, rushing through our to-do list to get to whatever lies beyond it.

We are the culture of “one-click” ordering. We are the people who devour fast food. We relentlessly use apps to make our lives easier — which is our euphemistic way of saying that we want a life with less friction.

Pre-pandemic, I was definitely on board this bandwagon. I, like many of you, always thought friction was a bad thing. I relentlessly hunted efficiency.

This was especially true when I was still in the working world. I started every day with an impossibly long to-do list, and I was always looking for ways to work my way through it faster. I believed the secret of life lay at the end of my to-do list.

But in the past 14 months, I’ve discovered that it’s friction that might be the secret of life.

There are bushels of newly budding life coaches telling us to be “mindful” and “live in the moment.” But we somehow believe those moments have to all be idyllic walks through a flower garden with those we love most, as the sun filters softly through the trees overhead.

Sometimes “in the moment” is looking for sandpaper at Home Depot. Sometimes it’s dropping our coffee as we rush to catch the bus. And sometimes it’s realizing that you’re sitting next to someone you really don’t like on that five-hour flight to Boston.

All those things are “in the moment,” and maybe — just maybe — that’s what life is all about. Call it friction if you wish, but it’s all those little things we think are annoying until they’re gone.

Friction has some unique physical properties that we tend to overlook as we try to eliminate it. It is, according to one site, “resistance to motion of one object moving relative to another.” It forces us to slow down our motion, whatever direction that motion may be taking us in. And — according to the same site — scientists believe it “is the result of the electromagnetic attraction between charged particles in two touching surfaces.”

Ah hah, so friction is about attraction and our attempts to overcome that attraction! It is about us fighting our social instincts to bond with each other to keep moving to accomplish … what, exactly? Free up time to spend on Facebook? Spend more time playing a game on our phones? Will those things make us happier?

Here’s the other thing about friction. It generates heat. It warms things up. Here in North America, we call it friction. In Denmark, they call it “hygge.”

Denmark is a pretty happy place. In fact, last year it was the second happiest place on earth, according to the United Nations. And a lot of that can be attributed to what the Danish call “hygge,” which roughly translates as “cozy.”

The Danish live for coziness. And yes, the idyllic picture of hygge is spending time in front of the fire in a candlelit cabin, playing a board game with your closest friends. But hygge comes in many forms.

I personally believe that Denmark is an environment that leads to hygge because Denmark is a place that is not afraid of friction. Allow me to explain.

The ultimate way to avoid friction is to be alone. You can’t have “resistance to motion of one object moving relative to another” when there is no other object.

As we emerge from a pandemic that has necessitated removing the objects around us (people) and replacing them with more efficient, less friction-prone substitutes (technology) — whether it’s in our jobs, our daily routines, our shopping trips or our community obligations — we seem to be finding ways to continue to make the world a more efficient place for ourselves.

This is putting us at the center of an optimized universe and ruthlessly eliminating any points of resistance — a life designed by a Silicon Valley engineer. And, more and more often, we find ourselves alone at the center of that universe.

But that’s not how the Danes do it. They have created an environment that leads to bumping into each other. And hygge — with all its warm fuzziness — might just be a product of that environment. I suspect that might not be by intention. It just worked out that way. But it does seem to work.

For example, Danes spend a lot of time riding the bus. Or riding a bike. Life in Copenhagen is full of bumping along in a meandering trip together to a destination somewhere in the future. The joy is found in the journey, as noted in this Medium post.

It seems to me that life in Denmark, or other perpetually happy countries like Finland, Switzerland, Iceland and Norway, has a lot to do with slowing down and actually embracing societal friction.

We just have to realize that we as a species evolved in an environment filled with friction. And evolution, in its blind wisdom, has made that friction a key part of how we find meaning and happiness. We find hygge when we slow down enough to notice it.

Facebook Friends Do Not Equal Real Friends

Last week, an acquaintance of mine posted on Facebook that he had just run his first 10K. He also included a photo of the face of his Apple Watch showing his time.

Inevitably, snarkiness ensued in the comments section.

There were a few genuine messages of congratulations, but there was more virtual alpha headbutting along the lines of “that’s the best you could do?”

Finally, the original poster did a follow-up post saying (and I paraphrase liberally), “Hey, relax! I’m not looking for attaboys or coaching advice. I just wanted to let you know I ran 10 km, and I’m kinda proud of myself. It was important to me.”

This points out something we don’t often realize about our virtual social networks: They just don’t operate in the same way they do in the real world. And there are reasons why they don’t.

In the 1990s, British anthropologist Robin Dunbar was spending a lot of time hanging around monkeys. And he noticed something: They groom each other. A lot. In fact, they spend a huge chunk of their day grooming each other.

Why?

Intrigued, he started correlating brain size with social behavior. He found that primates in particular have some pretty impressive social coordination machinery locked up there in their noggins. Humans, for instance, seem to be able to juggle about 150 reasonably active social connections. This is now called Dunbar’s Number, which has become a pseudoscience trope — an intellectual tidbit we throw out to sound erudite.

For proof that we really don’t understand Dunbar’s original insight, look at what’s happened to his number, now updated for the social media age. For example, according to Brandwatch, the average number of Facebook friends is 338. That would be more than twice Dunbar’s Number. And so, predictably, there are those who say Dunbar’s Number is no longer valid. We can now handle much bigger friend networks thanks to social media.

But we can’t. And my example at the top of this post shows that.

Maintaining a friendship requires cognitive effort. There is a big difference between a Facebook “friend” and a true friend. True friends will pick lice out of your fur — or they would, if they were monkeys. Facebook “friends” feel they’re entitled to belittle your 10K run. See the difference?

Let’s go back to Robin Dunbar’s original thesis. Dunbar actually mentioned many numbers (all are approximations):

— Five “intimate” friends. This is your support group — the people who know you best.

— 15 “sympathetic” friends whom you can confide in.

— 50 “close” friends. You may not see them all the time, but if you were having a milestone birthday party, they’d be on your guest list.

— Now we have the 150 “friends.” If you ran into them on the street, you’d probably suggest a cup of coffee (or, in my case, a beer) for a chance to catch up.

— The next circle out is 500 “acquaintances.” You probably know just the briefest of back stories about them — like how you know them.

— Finally, we have 1,500 as our cognitive limit. On a good day, we may remember their name if we see them.

Here’s a quick and clever thought exercise to sort your network into one of these groups (this courtesy of my daughter Lauren — I give credit where credit is due). Imagine someone walks up to you and asks, “How are you doing?”

How you answer this question will depend on which group the questioner falls into. The biggest group of 1,500 probably won’t ask. They don’t care. The group of 500 acquaintances will get a standard “Fine” in response. There will be no follow-up. The 150 will get a little more — a few details of a big event or life development if relevant. The 50 “close friends” will get slightly more honesty. Perhaps you’ll be willing to guardedly open up some sensitive areas. The 15 “sympathetic friends” are a safe zone. You’ll feel like you can open up completely. And the five “intimate friends” don’t have to ask. They know how you’re doing.

I’ve talked before about strong ties and weak ties in our social networks. Strong ties are built through shared experiences and understanding. You really have to know someone to have a strong tie with them. They are the ties that bind the first two of Dunbar’s circles.

As we move to the third circle, the “close friends,” we’re moving into the transition zone between strong ties and weak ties. From there on, it’s all weak ties. If you need a job or a recommendation of a good plumber, you’d reach out. Otherwise, you have little in common.

The stronger the tie, the more effort it takes to maintain it. These are the cognitive limits that Dunbar was talking about. You have to remember all those back stories, those things they love and hate, what motivates them, what makes them sad. It takes time to learn all those things. And it takes a frequency of connection to keep up with them as they change. We are not static creatures — as has been shown especially in the last year.

This is the problem with social media. When we post something, we generally don’t post just for our intimate friends, or our sympathetic friends. We post it across our whole network, bound with both strong and weak ties. We have lost the common social understanding that keeps us sane in the real world.

For my 10K runner, those in their closest circles would have responded appropriately. But most of those who did comment, the ones who had no strong ties to the poster, didn’t know the 10K was a big deal.

Facebook does have some tools for limiting posts to selected groups, but almost none of us use them or maintain them. We don’t have the time.

This is where Dunbar’s insight on our social capabilities breaks down when it comes to social media. In the real world, multiple factors — including physical proximity, shared circumstances and time spent with each other — naturally keep our network sorted into the right categories.

But these factors don’t apply in social media. We broadcast out to all circles at once. And those circles, in turn, feel entitled by the false intimacy of social media to respond without the context needed to do so appropriately.

Our current circumstances are exacerbating this problem. In normal times, we might not be posting as much as we currently are on social media. But for many of us, Facebook might be all we’ve got. We just have to realize that if we’re depending on it for social affirmation, this virtual world doesn’t play by the same rules as the physical one.

Social Media Reflects Rights Vs. Obligations Split

Last week MediaPost writer (and my own editor here on Media Insider) Phyllis Fine asked this question in a post: “Can Social Media Ease the Path to Herd Immunity?” The question is not only timely, but also indicative of the peculiar nature of social media that could be stated thus: for every point of view expressed, there is an equal — and opposite — point of view. Fine’s post quotes a study from the Institute of Biomedical Ethics and History of Medicine at the University of Zurich, which reveals, “Anti-vaccination supporters find fertile ground in particular on Facebook and Twitter.”

Here’s the thing about social media. No matter what the message might be, there will be multiple interpretations of it. Often, the most extreme interpretations will be diametrically opposed to each other. It’s stunning how the very same content can illustrate the vast ideological divides that separate us.

I’ve realized that the only explanation for this is that our brains must work differently. We’re not even talking apples and oranges here. This is more like ostrich eggs and vacuum cleaners.

This is not my own revelation. There’s a lot of science behind it. An article in Scientific American catalogs some of the differences between conservative and liberal brains. Even the actual structure is different. According to the article: “The volume of gray matter, or neural cell bodies, making up the anterior cingulate cortex, an area that helps detect errors and resolve conflicts, tends to be larger in liberals. And the amygdala, which is important for regulating emotions and evaluating threats, is larger in conservatives.”

We have to understand that a right-leaning brain operates very differently from a left-leaning brain. Recent neuroimaging studies have shown that when the two consider the very same piece of information, totally different sections of their respective brains light up. They process information differently.

In a previous post about this topic, I quoted biologist and author Robert Sapolsky as saying, “Liberals are more likely to process information systematically, recognize differences in argument quality, and to be persuaded explicitly by scientific evidence, whereas conservatives are more likely to process information heuristically, attend to message-irrelevant cues such as source similarity, and to be persuaded implicitly through evaluative conditioning. Conservatives are also more likely than liberals to rely on stereotypical cues and assume consensus with like-minded others.”

Or, to sum it up in plain language: “Conservatives start gut and stay gut; liberals go from gut to head.”

This has never been clearer than in the past year. Typically, the information being processed by a conservative brain would have little overlap with the information being processed by a liberal brain. Each would care and think about different things.

But COVID-19 has forced the two circles of this particular Venn diagram together, creating a bigger overlap in the middle. We are all focused on information about the pandemic. And this has created a unique opportunity to more directly compare the cognitive habits of liberals versus conservatives.

Perhaps the biggest difference is in the way each group defines morality. At the risk of a vast oversimplification, the right tends to focus on individual rights, especially those they feel they’re personally at risk of losing. The left thinks more in terms of societal obligations: What do we need to do — or not do — for the greater good of us all? To paraphrase John F. Kennedy, conservatives ask what their country can do for them; liberals ask what they can do for their country.

This theory is part of Jonathan Haidt’s Moral Foundations Theory. What Haidt, working with others, has found is that both the right and left have morals, but they are defined differently. This “moral pluralism” means that two people can look at the same social media post but take two entirely different messages from it. And both will insist their interpretation is the correct one. Liberals can see a post about getting a vaccine as an appeal to their concern for the collective well-being of their community. Conservatives see it as an attack on their personal rights.

So when we ask a question like “Can social media ease the path to herd immunity?” we run into the problem of message interpretation. For some, it will be preaching to the choir. For others, it will have the same effect as a red cape in front of a bull.

It’s interesting that the vaccine question is being road-blocked by this divide between rights and obligations. It shows just how far apart the two sides are. With a vaccine, at least both sides have skin in the game. Getting a vaccine can save your life, no matter how you vote. Wearing a face mask is a different matter.

In my lifetime, I have never seen a more overt signaling of ideological leanings than whether you choose to wear a face mask or not. When we talk about rights vs. obligations, this is the ultimate acid test. If I insist on wearing a mask, as I do, I’m not wearing it for me, I’m wearing it for you. It’s part of my obligation to my community. But if you refuse to wear a mask, it’s pretty obvious who you’re focused on.

The thing that worries me the most about this moral dualism is that a moral fixation on individual rights is not sustainable. It’s assuming that our society is a zero-sum game. In order for me to win, you must lose. If we focus instead on our obligations, we approach society with an abundance mentality. As we contribute, we all benefit.

At least, that’s how my brain sees it.

COVID And The Chasm Crossing

For most of us, it’s been a year living with the pandemic. I was curious what my topic was a year ago this week. It was the brand crisis at a certain Mexican brewing giant when its flagship brand was suddenly and unceremoniously linked with a global pandemic. Of course, we didn’t know just how “global” it would be back then.

Ahhh — the innocence of early 2020.

The past year will likely be an historic inflection point in many societal trend lines. We’re not sure at this point how things will change, but we’re pretty sure they will change. You can’t take what has essentially been a 12-month anomaly in everything we know as normal, plunk it down on every corner of the globe and expect everything just to bounce back to where it was.

If I could vault 10 years in the future and then look back at today, I suspect I would be talking about how our relationship with technology changed due to the pandemic. Yes, we’re all sick of Zoom. We long for the old days of actually seeing another face in the staff lunchroom. And we realize that bingeing “Emily in Paris” on Netflix comes up abysmally short of the actual experience of stepping in dog shit as we stroll along the Seine.

C’est la vie.

But that’s my point. For the past 12 months, these watered-down digital substitutes have been our lives. We were given no choice. And some of it hasn’t sucked. As I wrote last week, there are times when a digital connection may actually be preferable to a physical one.

There is now a whole generation of employees who are considering their work-life balance in the light of being able to work from home for at least part of the time. Meetings the world over are being reimagined, thanks to the attractive cost/benefit ratio of being able to attend virtually. And, for me, I may have permanently swapped spin classes in the gym for riding my bike trainer in my basement. It took me a while to get used to it, but now that I have, I think it will stick.

Getting people to try something new — especially when it’s technology — is a tricky process. There are a zillion places on the uphill slope of the adoption curve where we can get mired and give up. But, as I said, that hasn’t been an option for us in the past 12 months. We had to stick it out. And now that we have, we realize we like much of what we were forced to adopt. All we’re asking for is the freedom to pick and choose what we keep and what we toss away.

I suspect many of us will be a lot more open to using technology now that we have experienced the tradeoffs it entails between effectiveness and efficiency. We will make more room in our lives for a purely utilitarian use of technology, stripped of the pros and cons of “bright shiny object” syndrome.

Technology typically gets trapped at both the dread and pseudo-religious devotion ends of the Everett Rogers Adoption Curve. Either you love it, or you hate it. Those who love it form the market that drives the development of our technology, leaving those who hate it further and further behind.

As such, the market for technology tends to skew to the “gee whiz” end of the market, catering to those who buy new technology just because it’s new and cool. This bias has embedded an acceptance of planned obsolescence that just seems to go hand-in-hand with the marketing of technology. 

My previous post about technology leaving seniors behind is an example of this. Even if seniors start out as early adopters, the perpetual chase of the bright shiny object that typifies the tech market can leave them behind.

But COVID-19 changed all that. It suddenly forced all of us toward the hump that lies in the middle of the adoption curve. It has left the world no choice but to cross the “chasm” that Geoffrey Moore wrote about 30 years ago in his book “Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers.” He explained that the chasm was between “visionaries (early adopters) and pragmatists (early majority),” according to Wikipedia.

This has some interesting market implications. After I wrote my post, a few readers reached out saying they were working on solutions that addressed the need of seniors to stay connected with a device that is easier for them to use and is not subject to the need for constant updating and relearning. Granted, none of them were from Apple or Google, but at least someone was thinking about it.

As the pandemic forced the practical market for technology to expand, bringing customers who had everyday needs for their technology, it created more market opportunities. Those opportunities create pockets of profit that allow for the development of tools for segments of the market that used to be ignored.

It remains to be seen if this market expansion continues after the world returns to a more physically based definition of normal. I suspect it will.

This market evolution may also open up new business model opportunities — where we’re actually willing to pay for online services and platforms that used to be propped up by selling advertising. This move alone would take technology a massive step forward in ethical terms. We wouldn’t have this weird moral dichotomy where marketers are grieving the loss of data (as fellow Media Insider Ted McConnell does in this post) because tech is finally stepping up and protecting our personal privacy.

Perhaps — I hope — the silver lining in the past year is that we will look at technology more as it should be: a tool that’s used to make our lives more fulfilling.

To Be There – Or Not To Be There

According to Eventbrite, hybrid events are the hottest thing for 2021. So I started thinking, what would that possibly look like, as a planner or a participant?

The interesting thing about hybrid events is that they force us to really think about how we experience things. What process do we go through when we let the outside world in? What do we lose if we do that virtually? What do we gain, if anything? And, more importantly, how do we connect with other people during those experiences?

These are questions we didn’t think much about even a year ago. But today, in a reality that’s trying to straddle both the physical and virtual worlds, they are highly relevant to how we’ll live our lives in the future.

The Italian Cooking Lesson

First, let’s try a little thought experiment.

In our town, the local Italian Club — in which both my wife and I are involved — offered cooking lessons before we were all locked down. Groups of eight to 12 people would get together with an exuberant Italian chef in a large commercial kitchen, and together they would make an authentic dish like gnocchi or ravioli. There was a little vino, a little Italian culture and a lot of laughter. These classes were a tremendous hit.

That all ended last March. But we hope to start offering them again late in 2021 or 2022. And, if we do, would it make sense to offer them as a “hybrid” event, where you can participate in person or pick up a box of preselected ingredients and follow along in your own kitchen?

As an event organizer, this would be tempting. You could still charge the full price for physical attendance, where you’re restricted to 12 people, but you could create an additional revenue stream by introducing a virtual option that could involve any number of people. Even at a lower registration fee, it would still dramatically increase revenue at a relatively small incremental cost. It would be “molto” profitable.

But now consider this as an attendee. Would you sign up for a virtual event like that? If you had no other option to experience it, maybe. But what if you could actually be there in person? Then what? Would you feel relegated to a second-class experience by being isolated in your own kitchen, without many of the sensory benefits that go along with the physical experience?

The Psychology of Zoom Fatigue

When I thought about our cooking lesson example, I was feeling less than enthused. And I wondered why.

It turns out that there’s some actual brain science behind my digital ennui. In an article in the Psychiatric Times, Jena Lee, MD, takes us on a “Neuropsychological Exploration of Zoom Fatigue.”

A decade ago, I was writing a lot about how we balance risk and reward. I believe that a lot of our behaviors can be explained by how we calculate the dynamic tension between those two things. It turns out that it may also be at the root of how we feel about virtual events. Dr. Lee explains:

“A core psychological component of fatigue is a rewards-costs trade-off that happens in our minds unconsciously. Basically, at every level of behavior, a trade-off is made between the likely rewards versus costs of engaging in a certain activity.”

Let’s take our Italian cooking class again. Let’s imagine we’re there in person. For our brain, this would hit all the right “reward” buttons that come with being physically “in the moment.” Subconsciously, our brains would reward us by releasing oxytocin and dopamine along with other “pleasure” neurochemicals that would make the experience highly enjoyable for us. The cost/reward calculation would be heavily weighted toward “reward.”

But that’s not the case with the virtual event. Yes, it might still be considered “rewarding,” but on an entirely different — and lesser — scale than the in-person experience. In addition, we would bear the added costs of figuring out the technology required, logging into the lesson and trying to follow along. Our risk/reward calculator just might decide the tradeoffs required weren’t worth it.

Without me even knowing it, this was the calculation that was going on in my head that left me less than enthused.

But there is a flip side to this.

Reducing the Risk Virtually

Last fall, a new study from Oracle in the U.K. was published with the headline, “82% of People Believe Robots Can Support Their Mental Health Better than Humans.”

Something about that just didn’t seem right to me. How could this be? Again, we had the choice between virtual and physical connection, and this time the odds were overwhelmingly in favor of the virtual option.

But when I thought about it in terms of risk and reward, it suddenly made sense. Talking about our own mental health is a high-risk activity. It’s sad to say, but opening up to your manager about job-related stress could get you a sympathetic ear, or it could get you fired. We are taking baby steps towards destigmatizing mental health issues, but we’re at the beginning of a very long journey.

In this case, the risk/reward calculation is flipped completely around. Virtual connections, which rely on limited bandwidth — and therefore limited vulnerability on our part — seem like a much lower risk alternative than pouring our hearts out in person. This is especially true if we can remain anonymous.

It’s All About Human Hardware

The idea of virtual/physical hybrids with expanded revenue streams will be very attractive to marketers and event organizers. There will be many jumping on this bandwagon. But, like all the new opportunities that technology brings us, it has to interface with a system that has been around for hundreds of thousands of years — otherwise known as our brain.

The Crazy World of Our Media Obsessions

Are you watching the news less? Me too. Now that the grownups are back in charge, I’m spending much less time checking my news feed.

Whatever you might say about the last four years, it certainly was good for the news business. It was one long endless loop of driving past a horrific traffic accident. Try as we might, we just couldn’t avoid looking.

But according to Internet analysis tool Alexa.com, that may be over. I ran some traffic rank reports for major news portals and they all look the same: a ramp-up over the past 90 days to the beginning of February, and then a precipitous drop off a cliff.

While all the top portals have a similar pattern, it’s most obvious on Foxnews.com.

It was as if someone said, “Show’s over folks. There’s nothing to see here. Move along.” And after we all exhaled, we did!

Not surprisingly, we watch the news more when something terrible is happening. It’s an evolved hardwired response called negativity bias.

Good news is nice. But bad news can kill you. So it’s not surprising that bad news tends to catch our attention.

But this was more than that. We were fixated by Trump. If it were just our bias toward bad news, we would still eventually get tired of it.

That’s exactly what happened with the news on COVID-19. We worked through the initial uncertainty and fear, where we were looking for more information, and at some point moved on to the subsequent psychological stages of boredom and anger. As we did that, we threw up our hands and said, “Enough already!”

But when it comes to Donald Trump, there was something else happening.

It’s been said that Trump might have been the best instinctive communicator to ever take up residence in the White House. We might not agree with what he said, but we certainly were listening.

And while we — and by we, I mean me — think we would love to put him behind us, I believe it behooves us to take a peek under the hood of this particular obsession. Because if we fell for it once, we could do it again.

How the F*$k did this guy dominate our every waking, news-consuming moment for the past four years?

We may find a clue in Bob Woodward’s book on Trump, Rage. Woodward explains that he was looking for a “reflector” — a person who knew Trump intimately and could provide some relatively objective insight into his character.

Woodward found a rather unlikely candidate for his reflector: Trump’s son-in-law, Jared Kushner.

I know, I know — “Kushner?” Just bear with me.

In Woodward’s book, Kushner says there were four things you needed to read and “absorb” to understand how Trump’s mind works.

The first was an op-ed piece in The Wall Street Journal by Peggy Noonan called “Over Trump, We’re as Divided as Ever.” It is not complimentary to Trump. But it does begin to provide a possible answer to our ongoing fixation. Noonan explains: “He’s crazy…and it’s kind of working.”

The second was the Cheshire Cat in Alice in Wonderland. Kushner paraphrased: “If you don’t know where you’re going, any path will get you there.” In other words, in Trump’s world, it’s not direction that matters, it’s velocity.

The third was Chris Whipple’s book, The Gatekeepers: How the White House Chiefs of Staff Define Every Presidency. The insight here is that no matter how clueless Trump was about how to do his job, he still felt he knew more than his chiefs of staff.

Finally, the fourth was Win Bigly: Persuasion in a World Where Facts Don’t Matter, by Scott Adams. That’s right — Scott Adams, the same guy who created the “Dilbert” comic strip. Adams calls Trump’s approach “Intentional Wrongness Persuasion.”

Remember, this is coming from Kushner, a guy who says he worships Trump. This is not apologetic. It’s explanatory — a manual on how to communicate in today’s world. Kushner is embracing Trump’s instinctive, scorched-earth approach to keeping our attention focused on him.

It’s — as Peggy Noonan realized — leaning into the “crazy.”  

Trump represented the ultimate political tribal badge. All you needed to do was read one story on Trump, and you knew exactly where you belonged. You knew it in your core, in your bones, without any shred of ambiguity or doubt. There were few things I was as sure of in this world as where I stood on Donald J. Trump.

And maybe that was somehow satisfying to me.

There was something about standing one side or the other of the divide created by Trump that was tribal in nature.

It was probably the clearest ideological signal about what was good and what was bad that we’ve seen for some time, perhaps since World War II or the ’60s — two events that happened before most of our lifetimes.

Trump’s genius was that he somehow made both halves of the world believe they were the good guys.

In 2018, Peggy Noonan said that “Crazy won’t go the distance.” I’d like to believe that’s so, but I’m not so sure. There are certainly others who are borrowing a page from Trump’s playbook. Right-wing Republicans Marjorie Taylor Greene and Lauren Boebert are both doing “crazy” extraordinarily well. The fact that almost none of you had to Google them to know who they are proves this.

Whether we’re loving to love, or loving to hate, we are all fixated by crazy.

The problem here is that our media ecosystem has changed. “Crazy” used to be filtered out. But somewhere along the line, news outlets discovered that “crazy” is great for their bottom lines.

As former CBS Chairman and CEO Leslie Moonves said when Trump became the Republican presidential front-runner back in 2016, “It may not be good for America, but it’s damned good for CBS.”

Crazy draws eyeballs like, well, like crazy. It certainly generates more user views than “normal” or “competent.”

In our current media environment — densely intertwined with the wild world of social media — we have no crazy filters. All we have now are crazy amplifiers.

And the platforms that allow this all try to crowd on the same shaky piece of moral high ground.

According to them, it’s not their job to filter out crazy. It’s anti-free speech. It’s un-American. We should be smart enough to recognize crazy when we see it.

Hmmm. Well, we know that’s not working.

Connected Technologies are Leaving Our Seniors Behind

One of my pandemic projects has been editing a video series of oral history interviews we did with local seniors in my community. Last week, I finished the first video in the series. The original plan, pre-pandemic, was to unveil the video as a special event at a local theater, with the participants attending. Obviously, given our current reality, we had to change our plans.

We, like the rest of the world, moved our event online. As I started working through the logistics of this, I quickly realized something: Our seniors are on the other side of a wide and rapidly growing chasm. Yes, our society is digitally connected in ways we never were before, but those connections are not designed for the elderly. In fact, if you were looking for something that seems to be deliberately designed to disadvantage a segment of our population, it would be hard to find a better example than Internet connection and the elderly.

I have to admit, for much of the past year, I have been pretty focused on what I have sacrificed because of the pandemic. But I am still a pretty connected person. I can Zoom and have a virtual visit with my friends. If I wonder how my daughters are doing, I can instantly text them. If I miss their faces, I can FaceTime them. 

I have taken on the projects I’ve been able to do thanks to the privilege of being wired into the virtual world. I can even go on a virtual bike ride with my friends through the streets of London, courtesy of Zwift.

Yes, I have given up things, but I have also been able find digital substitutes for many of those things. I’m not going to say it’s been perfect, but it’s certainly been passable.

My stepdad, who is turning 86, has been able to do none of those things. He is in a long-term care home in Alberta, Canada. His only daily social connections consist of brief interactions with staff during mealtime and when they check his blood sugar levels and give him his medication. All the activities that used to give him a chance to socialize are gone. Imagine life for him, where his sum total of connection is probably less than 30 minutes a day. And, on most days, none of that connecting is done with the people he loves.

Up until last week, family couldn’t even visit him. He was locked down due to an outbreak at his home. For my dad, there were no virtual substitutes available. He is not wired in any way for digital connection. If anyone has paid the social price of this pandemic, it’s been my dad and people like the seniors I interviewed, for whom I was desperately trying to find a way simply to watch a 13-minute video they had starred in.

A recent study by mobile technology manufacturer Ericsson looked specifically at the relationship between technology and seniors during the pandemic. The study focused on what the company termed the “young-old” seniors, those aged 65-74. They didn’t deal with “middle-old” (aged 75-85) or “oldest-old” (86 plus) because — well, probably because Ericsson couldn’t find enough who were connected to act as a representative sample.

But they did find that even the “young old” were falling behind in their ability to stay connected thanks to COVID-19. These are people who have owned smartphones for at least a decade, many of whom had to use computers and technology in their jobs. Up until a year ago, they were closing the technology gap with younger generations. Then, last March, they started to fall behind.

They were still using the internet, but younger people were using it even more. And, as they got older, they were finding it increasingly daunting to adopt new platforms and technology. They didn’t have the same access to “family tech support” of children or grandchildren to help get them over the learning curve. They were sticking to the things they knew how to do as the rest of the world surged forward and started living their lives in a digital landscape.

But this was not the group that was part of my video project. My experience had been with the “middle old” and “oldest old”: of the eight seniors I was dealing with, half fell into each group, and only two had email addresses. If the “young old” are being left behind by technology, these people were never in the race to begin with. As the world was forced to reset to an online reality, they were never given the option. They were stranded in a world suddenly disconnected from everything they knew and loved.

Predictably, the Ericsson study proposes smartphones as the solution for many of the problems of the pandemic, giving seniors more connection, more confidence and more capabilities. If only they got connected, the study says, life would be better.

But that’s not a solution with legs. It won’t go the distance. And to understand why, we just have to look at the two age cohorts the study didn’t focus on, the “middle old” and the “oldest old.”

Perhaps the hardest hit have been the “oldest old,” who have sacrificed both physical and digital connection, as this Journals of Gerontology article notes. Four from my group lived in long-term care facilities. Many of these were locked down at some point due to local outbreaks within the facility. Suddenly, the family support they required to connect with their family and friends was no longer available. The technological tools that we take for granted — which we were able to slot in to take the place of things we were losing — were unimaginable to them. They were effectively sentenced to solitary confinement.

A recent study from Germany found that only 3% of those living in long-term care facilities used an internet-connected device. A lot of the time, cognitive declines, even when they’re mild, can make trying to use technology an exercise in frustration.

When my dad went into his long-term care home, my sister and I gave him one of our old phones so he could stay connected. We set everything up and did receive a few experimental texts from him. But soon, it just became too confusing and frustrating for him to use without our constant help. He played solitaire on it for a while, then it ended up in a drawer somewhere. We didn’t push the issue. It just wasn’t the right fit.

But it’s not just my dad who struggled with technology. Even if an aging population starts out as reasonably proficient users, it can be overwhelming to keep up with new hardware, new operating systems and new security requirements. I’m not even “young old” yet, and I’ve worked with technology all my life. I owned a digital marketing company, for heaven’s sake. And even for me, it sometimes seems like a full-time job staying on top of the constant stream of updates and new things to learn and troubleshoot. As connected technology leaps forward, it does not seem unduly concerned that it’s leaving the most vulnerable segment of our population behind.

COVID-19 has pushed us into a virtual world where connection is not just a luxury, but a condition of survival. We need to connect to live. That is especially true for our seniors, who have had all the connections they relied on taken from them. We can’t leave them behind. Connected technology can no longer ignore them.

This is one gap we need to build a bridge over.

Our Disappearing Attention Spans

Last week, Mediapost Editor in Chief Joe Mandese mused about our declining attention spans. He wrote, 

“while in the past, the most common addictive analogy might have been opiates — as in an insatiable desire to want more — these days [consumers] seem more like speed freaks looking for the next fix.”

Mandese cited a couple of recent studies, saying that more than half of mobile users tend to abandon any website that takes longer than three seconds to load. That

“has huge implications for the entire media ecosystem — even TV and video — because consumers increasingly are accessing all forms of content and commerce via their mobile devices.”

The question that begs to be asked here is, “Is a short attention span a bad thing?” The famous comparison is that we are now more easily distracted than a goldfish. But does a shorter attention span negatively impact us, or is it just our brain changing to be a better fit with our environment?

Academics have been debating the impact of technology on our ability to cognitively process things for some time. Journalist Nicholas Carr sounded the warning in his 2010 book, “The Shallows,” where he wrote, 

“(Our brains are) very malleable, they adapt at the cellular level to whatever we happen to be doing. And so the more time we spend surfing, and skimming, and scanning … the more adept we become at that mode of thinking.”

Certainly, Carr is right about the plasticity of our brains. It’s one of their most advantageous features. But is our digital environment forever pushing our brains to the shallow end of the pool? Well, it depends. Context is important. One of the biggest factors in determining how we process the information we’re seeing is the device where we’re seeing it.

Back in 2010, Microsoft did a large-scale ethnographic study on how people searched for information on different devices. The researchers found those behaviors differed greatly depending on the platform being used and the intent of the searcher. They found three main categories of search behaviors:

  • Missions are looking for one specific answer (for example, an address or phone number) and often happen on a mobile device.
  • Excavations are widespread searches that need to combine different types of information (for example, researching an upcoming trip or major purchase). They are usually launched on a desktop.
  • Finally, there are Explorations: searching for novelty, often to pass the time. These can happen on all types of devices and can often progress through different devices as the exploration evolves. The initial search may be launched on a mobile device, but as the user gets deeper into the exploration, she may switch to a desktop.

The important thing about this research was that it showed our information-seeking behaviors are very tied to intent, which in turn determines the device used. So, at a surface level, we shouldn’t be too quick to extrapolate behaviors seen on mobile devices with certain intents to other platforms or other intents. We’re very good at matching a search strategy to the strengths and weaknesses of the device we’re using.

But at a deeper level, if Carr is right (and I believe he is) about our constant split-second scanning of information to find items of interest making permanent changes in our brains, what are the implications of this?

For such a fundamentally important question, there is a small but rapidly growing body of academic research that has tried to answer it. To add to the murkiness, many of the studies done contradict each other. The best summary I could find of academia’s quest to determine if “the Internet is making us stupid” was a 2015 article in academic journal The Neuroscientist.

The authors sum up by essentially saying both “yes” — and “no.” We are getting better at quickly filtering through reams of information. We are spending fewer cognitive resources on memorizing things we know we can easily find online, which theoretically leaves those resources free for other purposes. Finally, for this post, I will steer away from commenting on multitasking, because the academic jury is still very much out on that one.

But the authors also say that 

“we are shifting towards a shallow mode of learning characterized by quick scanning, reduced contemplation and memory consolidation.”

The fact is, we are spending more and more of our time scanning and clicking. There are inherent benefits to us in learning how to do that faster and more efficiently. The human brain is built to adapt and become better at the things we do all the time. But there is a price to be paid. The brain will also become less capable of doing the things we don’t do as much anymore. As the authors said, this includes actually taking the time to think.

So, in answer to the question “Is the Internet making us stupid?,” I would say no. We are just becoming smart in a different way.

But I would also say the Internet is making us less thoughtful. And that brings up a rather worrying prospect.

As I’ve said many times before, the brain thinks both fast and slow. The fast loop is brutally efficient. It is built to get stuff done in a split second, without having to think about it. Because of this, the fast loop has to be driven by what we already know or think we know. Our “fast” behaviors are necessarily bounded by the beliefs we already hold. It’s this fast loop that’s in control when we’re scanning and clicking our way through our digital environments.

But it’s the slow loop that allows us to extend our thoughts beyond our beliefs. This is where we’ll find our “open minds,” if we have such a thing. Here, we can challenge our beliefs and, if presented with enough evidence to the contrary, willingly break them down and rebuild them to update our understanding of the world. In the sense-making loop, this is called reframing.

The more time we spend “thinking fast” at the expense of “thinking slow,” the more we will become prisoners to our existing beliefs. We will be less able to consolidate and consider information that lies beyond those boundaries. We will spend more time “parsing” and less time “pondering.” As we do so, our brains will shift and change accordingly.

Ironically, our minds will change in such a way to make it exceedingly difficult to change our minds.

The Ebbs and Flows of Consumerism in a Post-Pandemic World

As MediaPost’s Joe Mandese reported last Friday, advertising was, quite literally, almost decimated worldwide in 2020. If you look at the forecasts of the top agency holding companies, ad spends were trimmed by an average of 6.1%. It’s not quite one dollar in 10, but it’s close.

These same companies are forecasting a relative bounceback in 2021, starting slow and accelerating quarter by quarter through the year — but that still leaves the 2021 spend forecast back at 2018 levels.

And as we know, everything about 2021 is still very much in flux. If the year 2021 was a pack of cards, almost every one of them would be wild.

This — according to physician, epidemiologist and sociologist Nicholas Christakis — is not surprising.

Christakis is one of my favorite observers of network effects in society. His background in epidemiological science gives him a unique lens to look at how things spread through the networks of our world, real and virtual. It also makes him the perfect person to comment on what we might expect as we stagger out of our current crisis.

In his latest book, “Apollo’s Arrow,” he looks back to look forward to what we might expect — because, as he points out, we’ve been here before.

While the scope and impact of this one is unusual, such health crises are nothing new. Dozens of epidemics and a few pandemics have happened in my lifetime alone, according to this Wikipedia chart.

This post goes live on Groundhog Day, perhaps the most appropriate of all days for it to run. Today, however, we already know what the outcome will be. The groundhog will see its shadow and there will be six more months (at least) of pandemic to deal with. And we will spend that time living and reliving the same day in the same way with the same routine.

Christakis expects this phase to last through the rest of this year, until the vaccines are widely distributed, and we start to reach herd immunity.

During this time, we will still have to psychologically “hunker down” like the aforementioned groundhog, something we have been struggling with. “As a society we have been very immature,” said Christakis. “Immature, and typical as well, we could have done better.”

This phase will be marked by a general conservatism that will go in lockstep with fear and anxiety, a reluctance to spend and a trend toward risk aversion and religion.

Add to this the fact that we will still be dealing with widespread denialism and anger, which will lead to a worsening vicious circle of loss and crisis. The ideological cracks in our society have gone from annoying to deadly.

Advertising will have to somehow negotiate these choppy waters of increased rage and reduced consumerism.

Then, predicts Christakis, starting some time in 2022, we will enter an adjustment period where we will test and rethink the fundamental aspects of our lives. We will be learning to live with COVID-19, which will be less lethal but still very much present.

We will likely still wear masks and practice social distancing. Many of us will continue to work from home. Local flare-ups will still necessitate intermittent school and business closures. We will be reluctant to be inside with more than 20 or 30 people at a time. It’s unlikely that most of us will feel comfortable getting on a plane or embarking on a cruise ship. This period, according to Christakis, will last for a couple years.

Again, advertising will have to try to thread this psychological needle between fear and hope. It will be a fractured landscape on which to build a marketing strategy. Any pretense of marketing to the masses, a concept long in decline, will now be truly gone. The market will be rife with confusing signals and mixed motivations. It will be incumbent on advertisers to become very, very good at “reading the room.”

Then, starting in 2024, we will finally put the pandemic behind us. Now, says Christakis, four years of pent-up demand will suddenly burst through the dam of our delayed self-gratification. We will likely follow the same path taken a century ago, when we were coming out of a war and another pandemic, in the period we call the “Roaring Twenties.”

Christakis explained: “What typically happens is people get less religious. They will relentlessly seek out social interactions in nightclubs and restaurants and sporting events and political rallies. There’ll be some sexual licentiousness. People will start spending their money after having saved it. There’ll be joie de vivre and a kind of risk-taking, a kind of efflorescence of the arts, I think.”

Of course, this burst of buying will be built on the foundation of what came before. The world will likely be very different from its pre-pandemic version. It will be hard for marketers to project demand in a straight line from what they know, because the experiences they’ve been using as their baseline are no longer valid. Some things may remain the same, but some will be changed forever.

COVID-19 will have pried many of the gaps in our society further apart — most notably those of income inequality and ideological difference. A lingering sense of nationalism and protectionism born from dealing with a global emergency could still be in place.

Advertising has always played an interesting role in our lives. It both motivates and mirrors us.

But the reflection it shows is like a funhouse mirror: It distorts some aspects of our culture and ignores others. It creates demand and hides inconvenient truths. It professes to be noble, while it stokes the embers of our ignobility. It amplifies the duality of our human nature.

Interesting times lie ahead. It remains to be seen how that is reflected in the advertising we create and consume.