The Ten Day Tech Detox

I should have gone cold turkey on tech. I really should have.

It would have been the perfect time – should have been the perfect time.

But I didn’t. As I spent 10 days on BC’s gorgeous Sunshine Coast with family, I also lugged along my assortment of connected gadgets.

But I will say it was a partially successful detox. I didn’t crack open the laptop as much as I usually do. I generally restricted use of my iPad to reading a book.

But my phone – it was my phone, always within reach, that tempted me with social media’s siren call.

In a podcast, Andrew Selepak, social media professor at the University of Florida, suggests that rather than doing a total detox that is probably doomed to fail, you use vacations as an opportunity to use tech as a tool rather than an addiction.

I will say that for most of the time, that’s what I did. As long as I was occupied with something I was fine. 

Boredom is the enemy. It’s boredom that catches you. And the sad thing was, I really shouldn’t have been bored. I was in one of the most beautiful places on earth. I had the company of people I loved. I saw humpback whales – up close – for Heaven’s sake. If ever there was a time to live in the moment, to embrace the here and now, this was it. 

The problem, I realized, is that we’re not really comfortable any more with empty spaces – whether they be in conversation, in our social life or in our schedule of activities. We feel guilt and anxiety when we’re not doing anything.

It was an interesting cycle. As I decompressed after many weeks of being very busy, the first few days were fine. “I need this,” I kept telling myself. It’s okay just to sit and read a book. It’s okay not to have every half-hour slot of the day meticulously planned to jam as much in as possible.

That lasted about 48 hours. Then I started feeling like I should be doing something. I was uncomfortable with the empty spaces.

The fact is, as I learned, boredom has always been part of the human experience. It’s a feature – not a bug. Boredom creates the empty spaces that can be filled with creativity. Alicia Walf, a neuroscientist and senior lecturer in the Department of Cognitive Science at Rensselaer Polytechnic Institute, says it is critical for brain health to let yourself be bored from time to time.

“Being bored can help improve social connections. When we are not busy with other thoughts and activities, we focus inward as well as looking to reconnect with friends and family. 

Being bored can help foster creativity. The eureka moment when solving a complex problem when one stops thinking about it is called insight.

Additionally, being bored can improve overall brain health.  During exciting times, the brain releases a chemical called dopamine which is associated with feeling good.  When the brain has fallen into a predictable, monotonous pattern, many people feel bored, even depressed. This might be because we have lower levels of dopamine.”

That last bit, right there, is the clue to why our phones are so prone to being picked up in times of boredom. Actually, three things are at work here. The first is that our mobile devices let us carry an extended social network in our pockets. An article from Harvard explains: “Thanks to the likes of Facebook, Snapchat, Instagram, and others, smartphones allow us to carry immense social environments in our pockets through every waking moment of our lives.”

As Walf said, boredom is our brain’s way of cueing us to seek social interaction. Traditionally, that meant getting the hell out of our cave – or cabin, or castle – and getting some face time with other humans.

But technology has short-circuited that. Now, we get that social connection through the far less healthy substitute of a social media platform. And – in the most ironic twist – we get that social jolt not by interacting with the people we might happen to be with, but by each staring at a tiny little screen that we hold in our hand.

The second problem is that mobile devices are not designed to leave us alone, basking in our healthy boredom. They are constantly beeping, buzzing and vibrating to get our attention. 

The third problem is that – unlike a laptop or even a tablet – mobile devices are our device of choice when we are jonesing for a dopamine jolt. It’s our phones we reach for when we’re killing time in a lineup, riding the bus or waiting for someone in a coffee shop. This is why I had a hard time relegating my phone to being just a tool while I was away.

As a brief aside – even the term “killing time” shows how we are scared to death of being bored. That’s a North American saying – boredom is something to be hunted down and eradicated. You know what Italians call it? “Il dolce far niente” – the sweetness of doing nothing. Many are the people who try to experience life by taking endless photos and posting on various feeds, rather than just living it. 

The fact is, we need boredom. Boredom is good, but we are declaring war on it, replacing it with a destructive need to continually bathe our brains in the dopamine high that comes from checking our Facebook feed or the latest TikTok reel.

At least one of the architects of this vicious cycle feels some remorse (also from the Harvard article): “I feel tremendous guilt,” admitted Chamath Palihapitiya, former Vice President of User Growth at Facebook, to an audience of Stanford students. He was responding to a question about his involvement in exploiting consumer behavior. “The short-term, dopamine-driven feedback loops that we have created are destroying how society works.”

That is why we have to put the phone down and watch the humpback whales. That, miei amici, is il dolce far niente!

Of Streaming, Satellites and Sunsets

I’ve been out of the loop for the last three weeks as I actually did life stuff. Today, looking to get back into the loop so I could write a column about media, I ran through several emails from MediaPost to see what y’all have been talking about in my absence.

Two caught my eye. The first was a Media Insider from Dave Morgan titled “Cross-Training for Cross-Platform TV.” Dave’s gist, paraphrasing heavily, is that to get a decent audience for high-engagement video ads, we’ll have to get comfortable with fishing in a whole bunch of smaller ponds rather than casting our net in a single ocean.

That checks out. As our entertainment choices and information sources keep multiplying exponentially, it’s natural that the big blocks of purchasable attention advertisers used to rely on are getting split into smaller and smaller chunks. This is certainly true for video-based media. In the column, Morgan said the next decade will mean “navigating a mix of linear and streaming TV channels and platforms to have any hope of efficiently reaching audiences at scale.”

Now, I don’t pretend to know anything about buying video ads – Mr. Morgan has certainly forgotten more than I’ll ever know – but I do know this. I recently caught up on a network series by watching it on demand on the network’s streaming platform. The ad execution was abysmal, to say the least. The creative, the delivery and the viewer experience were excruciating to sit through. By the time I was done, I hated every brand that had placed ads through the channel.

If I had to guess, I would say this was treated like an advertising bargain bin – a last-minute throw-in for network advertisers that no one really thought or cared about. Some of the creative wasn’t even designed for the platform. The images didn’t render correctly on the screen (a tablet) I was watching on. Whatever it cost these advertisers for this exposure, it was completely wasted on this audience of one.

The other item was more of a WTF moment – a column by MediaPost staff writer Wayne Friedman. In the column – “Look Into The Night Sky – You Might See An Ad For Car Insurance” – Friedman tells of a recent study that “looked at the possibility of a ‘space advertising’ mission, where one could advertise in the twilight over a particular urban area or city.”

This would be done by launching a number of satellites into a stationary orbit and letting them literally unfurl an advertising banner every night just after sunset.

Again, WTF. Do I want an ad popping up after a spectacular sunset telling me said sunset was brought to me by the MyPillow guy? No.

And knowing that advertisers can be a little obtuse sometimes, I’ll repeat – a little more emphatically – “F*&k NO!”

I got just a little taste of this last week when I happened to see a Starlink train heading across the night sky above me. If you haven’t seen this, it’s a perfect row of SpaceX Starlink satellites in orbit that can be seen in just the right conditions. In my case, there were probably about 50 satellites in a row.

Was it cool? Sure. But it was also unsettling. The night sky is supposed to be messy and spectacular, not precisely lined up like a set of Christmas lights. It was disconcerting to see something so obviously man-made encroaching on nature’s firmament.

Look, advertisers, I get that it’s getting harder and harder to get our attention with your ads. That’s probably because we don’t want to give it to you, and – increasingly – we don’t have to. If that sounds harsh, it’s because you’ve burned through any goodwill you might have had by sledgehammering us over the head with poorly executed, ham-fisted ads delivered ad nauseam without any concern for our experience on the receiving end. That will be true on any platform you choose to deliver those ads on.

So, to circle back to Dave Morgan’s message, if you’re going to do it, at least try to do it well.

And finally, just so we’re clear, stay the hell out of my sunset!

Crisis? What Crisis?

Never let a good crisis go to waste.

— Winston Churchill, approximately 1944

Crisis? What crisis?

— Supertramp album, 1975

I’ll be honest. I was struggling to finish this column. It was actually heading for the digital dustbin when I happened on MediaPost Editor in Chief Joe Mandese’s excellent commentary, “It’s Time For A Change, And By That, I Mean A Crisis.”

Much as I respect Joe, whose heart and head are definitely in the right place, I think we may have to agree to disagree. He says,

“What the ad industry really needs to do is organize a massive global campaign to change the way people think, feel and behave about the climate — moving from a not-so-alarmist ‘change’ to an ‘our house is on fire’ crisis.”

Joe Mandese – MediaPost

But exactly how do you make people pay attention to an existential crisis? How do you communicate threat?

The problem may be that we can’t. It may simply not be possible.

That was crystallized in the scariest way possible recently on the U.K.’s GB News channel, where an anchor desperately tried to make light of the meteorologist’s dire predictions of potential fatalities ahead of an unprecedented heat wave in England.

Weather expert John Hammond issues a warning over the ‘extreme’ conditions expected next week – GB News – July 14, 2022

The Basics of Communication

There are typically four parts to any communication model: the sender, the message, the medium and the receiver. Joe’s post said the problem may be in the message — it hasn’t been urgent enough. I disagree. I think the problem is at the end of the chain, with the receiver. The message is already effective. It’s just not getting through.

In an online course on business communications, Lumen Learning lists a number of potential barriers to communication. I’d like to focus on three of them: filtering, bias and lack of trust.

The first one is the big one, but the last two contribute. And they all lie on the receiving end of the communication model, with the receiver, who just doesn’t want to receive the message.

The problem, most of all, is one of entitlement.

I’m not pointing fingers — unless I’m pointing at myself. I live a privileged lifestyle. I don’t think I’ve let the message, with all its implications, fully get through to me, because to accept that message is unimaginably depressing and scary. I fully admit I’m filtering, because I feel overwhelmed. Climate change has gone from being an inconvenient truth to something we’re determined to ignore, even if it kills us.

If I count all the people whose lifestyles I have some understanding of, it’s about a thousand people. I think an overwhelming majority of them grasp the massive implications of climate change. Yet of all those people, I can count on the fingers of one hand (maybe two) those who have truly made substantive changes in their lifestyle to address it. That’s, at best, 0.5% to 1% of everyone I know.

I’m not judging. I haven’t made the changes required myself. Not really.

I have done all 10 of the UN’s suggested ways to help fight the climate crisis, to one extent or another. But I can’t help feeling that even doing all 10 is like peeing on a forest fire. Given the high stakes we’re talking about here, I really don’t feel I’m making a meaningful difference. I haven’t sold either of my two vehicles, I haven’t stopped planning trips that involve air travel, or moved into a more energy-efficient house. I still eat red meat (although not as much as before).

The fact is, when a message is trying to tell us that our inevitable future means we’re going to have less than we have today, we will ignore that message. 

I get it. I truly do. I started and stopped this column several times because it depressed the hell out of me. But I am now determined to plow through to the end, so let’s talk about entitlement. We use this word a lot, especially lately. But what does it mean? It means we believe we have the right to the lifestyle we currently have.

But there’s no one to give us that right. Our lifestyle isn’t granted to us by anyone. If we live a good life, as I do, we like to think that it’s due to our hard work and wise choices – and that, therefore, we’re entitled to everything we have. But if we rationally pick apart our success, we find that plain old dumb luck plays a bigger role than we’d like to admit. In my case, I was born a white, anglo male in one of the richest countries in the world. I came out of the womb with advantages most of the world can only dream of.

Entitlement is actually the result of a cognitive bias – or rather, a bundle of cognitive biases that includes loss aversion and the endowment effect. It’s a quirk in our mental wiring. It’s a mistaken belief – an illusion. I’m not owed the life I have. I have that life because of a convergence of lucky factors, and it appears my luck may be running out. There is no arbitrator of privilege that has granted North America the right to be the single biggest consumer of natural resources (per capita) in the world. But we seem prepared to gamble our planet away on this mistaken belief about our own entitlement.

In psychology, there’s something called the Psychological Entitlement Scale. It measures the strength of this cognitive bias. A recent study showed just how strongly this was correlated with our ability to ignore messaging that we didn’t want to hear because we felt it interfered with our “rights.” In this case, the message was about health guidelines during COVID-19. And we all know how that turned out. Even something as ridiculously simple as wearing a face mask whipped up a shitstorm of entitlement. 

This is not a problem of messaging. We are not going to be persuaded to do the right thing.  We are being asked to give up too much.

Climate change can only be addressed by two things: legislation and a mobilization of the market. We cannot be left with the option of doing nothing — or too little — any longer.

We must be forced to be better. We need more massive omnibus bills, like the recent Manchin-Schumer deal, that mobilize industry and incentivize better behavior. I only hope my own Canadian government follows suit soon.

Much as I wish Joe Mandese were right that by turning up the intensity of the messaging, we could persuade consumers to really move the needle on the climate threat, I don’t think this would work. It’s not that we don’t know about climate change. It’s that we can’t let ourselves care, because our entitlement won’t let us.

Putting a Label on It

We know that news can be toxic. The state of affairs is so bad that many of the media sources we rely on for information have been demonstrated to be extremely harmful to our society. Misinformation, in its many forms, leads to polarization, the destruction of democracy, the engendering of hate and the devaluing of social capital. It is – quite likely – one of the most destructive forces we face today.

To make matters worse, a study conducted by Ben Lyons from the University of Utah found that we’re terrible at spotting misinformation, yet many of us think we can’t be fooled. Seventy-five percent of us overestimate our ability to spot fake news, by as much as 22 percentile points. And the more overconfident we are, the more likely we are to share false news.

Given the toxic effects of unreliable news reporting, it was only natural that – sooner or later – someone would come up with the logical idea of putting a warning label on it. And that’s exactly what NewsGuard does. Using “trained journalists” to review the most popular news platforms (they say they cover 95% of our news-source engagement), they give each source a badge, ranging from green to red, showing its reliability. In a recent report, they highlighted some of the U.S.’s biggest misinformation culprits (NewsMax.com, TheGatewayPundit.com and TheFederalist.com) and some of the sources that are most reliable (MSNBC.com, NYTimes.com, WashingtonPost.com and NPR.org).

But here’s the question. Just because you slap a warning label on toxic news sources, will it have any effect? That’s exactly what a group of researchers at New York University’s Center for Social Media and Politics wanted to find out. And the answer is both yes and no.

Kevin Aslett, lead author of the paper, said,

“While our study shows that, overall, credibility ratings have no discernible effect on misperceptions or online news consumption behavior of the average user, our findings suggest that the heaviest consumers of misinformation — those who rely on low-credibility sites — may move toward higher-quality sources when presented with news reliability ratings.”

Kevin Aslett, NYU Center for Social Media and Politics

This is interesting. In essence, this study is saying that if you run into the odd unreliable news source and you see a warning label, it will probably have no effect. But if you make a steady diet of unreliable news and see warning label after warning label, it may eventually sink in and cause you to improve your sources for news consumption. This seems to indicate warning labels might have a cumulative effect. The more you’re exposed to them, the more effective they become.

We are literally of two minds – one driven by reason and one by emotion. Warning labels try to appeal to one mind, but our tendency to ignore them comes from the other. The effectiveness of these labels depends on which mind is in the driver’s seat. There is a wide spectrum of circumstances that may bring you face to face with a warning label, and the effectiveness of that label may depend on some sort of cognitive “Russian roulette” – a game of odds to determine if the label will impact you. If this is the case, it makes sense that the more you see a warning label, the greater the odds that – at least one time – you might be of a mind to pay attention to it.

Up in Smoke

This might help explain the so-so track record of warning labels in other arenas. Probably the longest trial run of warning labels has been on cigarette packages. The United States started requiring these labels in 1966. In 2001, my own country – Canada – was the first country in the world to introduce graphic warning labels; huge and horrible pictures of the effects of smoking plastered across every pack of smokes.

This past week, we in Canada went one better. Again, we’re going to be the first country in the world to require warning labels on each and every cigarette. Apparently, our government has bought into the exposure effect of warning labels – more is better.

It seems to be working. In 1965 the smoking rate in Canada was 50%. In 2020 it was 13%.

But a recent study (Strong, Pierce, Pulvers et al) showed that if smokers aren’t ready to quit, warning labels may have “decreased positive perceptions of cigarettes associated with branded cigarette packs but without clearly increasing health concerns. They also increased quitting cognitions but did not affect either cigarette cessation or consumption levels.”

Like I said – just because you get through to one mind doesn’t mean you’ll have any luck with the other.

Side Effects May Include….

Perhaps the most interesting case of warnings in the consumer marketplace is prescription drugs. Because the United States is one of the few places in the world (New Zealand is the other) that allow prescription drugs to be advertised directly to consumers, the Food and Drug Administration has mandated that ads must include a fair balance of rewards and risks. Advertisers being advertisers, the rewards take up most of the ad, with sunlight-infused shots of people enjoying life thanks to the miracles of the drug in question. But at the end, there is a laundry list of side effects read in a voiceover, typically at breakneck pace in a deadly monotone.

It’s this example that highlights perhaps the main issue with warning labels; they require a calculation of risk vs reward. If this wasn’t true, we wouldn’t need a warning label. Nobody needs to tell us not to drink battery acid. That’s all risk and no reward. If there’s a label on it, it’s probably on something we want to do but know we shouldn’t.

A study of the effectiveness of these warnings in DTC prescription ads found they become less effective because of something called the argument dilution effect. Ads that include only the worst side effects are more effective than ads that include every potential side effect, even the minor ones. Hence the laundry list. If a drug could cause both sudden heart attacks and minor skin rashes, our mind tends to let these things cancel each other out.

This effect is an example of the heuristic nature of our risk vs reward decision making. It needs to operate quickly, so it relies on the irrational, instinctive part of our neural circuitry. We don’t take the time to weigh everything logically – we make a gut call. Marketers know the science behind this and continually use it to their advantage.

Warning labels are an easy legislative fix to try to plug this imperfectly human loophole. It seems to make sense, but it doesn’t really address the underlying factors. Given enough time and enough exposure, they can shift behaviors, but we shouldn’t rely on them too much.

Sarcastic Much?

“Sarcasm is the lowest form of wit, but the highest form of intelligence.”

Oscar Wilde

I fear the death of sarcasm is nigh. The alarm bells started going when I saw a tweet from John Cleese that referenced a bit from “The Daily Show.”  In it, Trevor Noah used sarcasm to run circles around the logic of Supreme Court Justice Brett Kavanaugh, who had opined that Roe v. Wade should be overturned, essentially booting the question down to the state level to decide.

Against my better judgement, I started scrolling through the comments on the thread — and, within the first couple, found that many of those commenting had completely missed Noah’s point. They didn’t pick up on the sarcasm — at all. In fact, to say they missed the point is like saying Columbus “missed” India. They weren’t even in the same ocean. Perhaps not the same planet.

Sarcasm is my mother tongue. I am fluent in it, and I tend to get nervous in overly sincere environments.

I find sarcasm requires almost a type of meta-cognition, where you have to be able to mentally separate the speaker’s intention from what they’re saying. If you can hold the two apart in your head, you can truly appreciate the art of sarcasm. It’s this finely balanced and recurrent series of contradictions — with tongue firmly placed in cheek — that makes sarcasm so potentially powerful. As used by Trevor Noah, it allows us to air out politically charged issues and consider them at a mental level at least one step removed from our emotional gut reactions.

As Oscar Wilde knew — judging by his quote at the beginning of the post — sarcasm can be a nasty form of humor, but it does require some brain work. It’s a bit of a mental puzzle, forcing us to twist an issue in our heads like a cognitive Rubik’s Cube, looking at it from different angles. Because of this, it’s not for everyone. Some people are just too earnest (again, with a nod to Mr. Wilde) to appreciate sarcasm.

The British excel at sarcasm. John Cleese is a high priest of sarcasm. That’s why I follow him on Twitter. Wilde, of course, turned sarcasm into art. But as Ricky Gervais (who has his own black belt in sarcasm) explains in this piece for Time, sarcasm — and, to be more expansive, all types of irony — have been built into the British psyche over many centuries. This isn’t necessarily true for Americans. 

“There’s a received wisdom in the U.K. that Americans don’t get irony. This is of course not true. But what is true is that they don’t use it all the time. It shows up in the smarter comedies but Americans don’t use it as much socially as Brits. We use it as liberally as prepositions in everyday speech. We tease our friends. We use sarcasm as a shield and a weapon. We avoid sincerity until it’s absolutely necessary. We mercilessly take the piss out of people we like or dislike basically. And ourselves. This is very important. Our brashness and swagger is laden with equal portions of self-deprecation. This is our license to hand it out.”

Ricky Gervais – Time, November 9, 2011

That was written just over a decade ago. I believe it’s even more true today. If you choose to use sarcasm in our age of fake news and social media, you do so at your peril. Here are three reasons why:

First, as Gervais points out, sarcasm doesn’t play equally across all cultures. Americans — as one example — tend to be more sincere and, as such, take many things meant as sarcastic at face value. Sarcasm might hit home with a percentage of a U.S. audience, but it will go over a lot of American heads. It’s probably not a coincidence that many of those heads might be wearing MAGA hats.

Also, sarcasm can be fatally hamstrung by our TL;DR rush to scroll to the next thing. Sarcasm typically saves its payoff until the end. It intentionally creates a cognitive gap, and you have to be willing to stay with it to realize that someone is, in the words of Gervais, taking the “piss out of you.” Bail too early and you might never recognize it as sarcasm. I suspect more than a few of those who watched Trevor Noah’s piece didn’t stick through to the end before posting a comment.

Finally, and perhaps most importantly, social media tends to strip sarcasm of its context, leaving it hanging out there to be misinterpreted. If you are a regular watcher of “The Daily Show with Trevor Noah,” or “Last Week Tonight with John Oliver,” or even “Late Night with Seth Meyers” (one American who is a master of sarcasm), you realize that sarcasm is part and parcel of it all. But when you repost any bit from any of these shows to social media, moving it beyond its typical audience, you have also removed all the warning signs that say “warning: sarcastic content ahead.” You are leaving the audience to their own devices to “get it.” And that almost never turns out well on social media.

You may say that this is all for the good. The world doesn’t really need more sarcasm. An academic study found that sarcastic messages can be more hurtful to the recipient than a sincere message. Sarcasm can cut deep, and because of this, it can lead to more interpersonal conflict.

But there’s another side to sarcasm. That same study also found that sarcasm can require us to be more creative. The mental mechanisms you use to understand sarcasm are the very same ones we need to use to be more thoughtful about important issues. It de-weaponizes these issues by using humor, while it also forces us to look at them in new ways.

Personally, I believe our world needs more Trevor Noahs, John Olivers and Seth Meyers. Sarcasm, used well, can make us a little smarter, a little more open-minded, and — believe it or not — a little more compassionate.

Two Timely Stories from my Royal 10 Typewriter

I love old typewriters. My favourite is a 1917 Royal 10, made by the Royal Typewriter Company of New York, New York. It is a beautiful piece of engineering, with intricate and mysterious mechanical connections that are revealed by two bevelled glass windows on each side. It was made in a factory in Hartford, Connecticut. Ian Fleming wrote his Bond books on a Royal. Ernest Hemingway also used one.

My Royal 10 is pretty beaten up. There are parts missing and it’s far from functional. But that only adds to its gravitas and charm. It sits beside me in a special place in my office as I write this on an Apple MacBook Pro, circa 2018 – two examples of technical elegance separated by a century. I suspect – though – that the MacBook won’t be on anyone’s desk in 2120.

Last week, I told you about the Lost Generation. If I had been part of that generation, I would probably be writing this on that typewriter.  Today, I wonder what other stories it might have to tell. With that, I’d like to pick up where I left off last week and share two stories that I like to believe lie buried in the Royal 10’s mechanical magnificence. I hope they are two stories that have something to relate to us in our current reality.

First, the Royal 10 was introduced in 1914, at the start of World War I. It was Royal’s first “upright” design and, along with the Underwood 5, it would become a trusted workhorse for many, including news reporters. Much of the coverage of the war, and later, the Spanish Flu epidemic of 1918 – 1920, was probably hammered out on a Royal 10.

But here’s the interesting thing. The 10 represented a pinnacle of typewriter design. Other than cosmetic changes, the inner workings of typewriters didn’t change for decades after its introduction. That meant that most of the major developments of the next 20 years would also be written on a typewriter like the one that sits in my office.

As the world rolled out of the ’Teens and into the Twenties, my Royal was one of the few things that didn’t change. The double gut punch of a global war and a massive pandemic rocked the world back on its heels, but not for long. After a brief but deep recession in 1920 and ’21, a global explosion of creativity and enterprise transformed everything. The Roarin’ Twenties ushered in technological revolutions on multiple fronts, including the automobile, aviation, movies, radio, telephones, electricity and medicine.

Society was radically transformed in the Twenties. Jazz and speakeasies flourished. Women marched for their rights. Culture was transformed by writers and artists who were both more liberal in their themes and more apt to criticize hypocrisy and greed. Art deco became the dominant design movement of the decade.

The story is this: after six years of devastation and destruction, the world entered eight years of explosive creativity and change.

That brings me to the second story. The Royal Typewriter Company had to shift gears in 1940. The factory in Hartford no longer built typewriters. They began manufacturing machine guns, rifles, bullets, propellers and spare parts for airplane engines as part of American industry’s support of the war effort. They wouldn’t go back to building typewriters until September 1945.

You, like me, may have become tragically addicted to pandemic models that show the importance of flattening the curve. These models have one thing in common: a flat horizontal line that represents our current capacity to handle the crisis.

But is that line necessarily flat? Just like the Royal Company in 1940, industry is once again ramping up to defeat a common enemy. This time, it’s face masks instead of machine guns; ventilators instead of propellers. The output is different, but the motivation is the same: we can beat this thing.

Again, history may provide us a little perspective. In 1939, the US was the furthest thing imaginable from being ready to go to war. General George Patton was given command of a unit of 325 tanks that were desperately in need of nuts and bolts in order to keep working. After unsuccessfully trying to order through official channels, he ended up ordering from a Sears catalogue and paying for them himself. The American army was so short of equipment that it would borrow Good Humor ice cream trucks to stand in as tanks when it practiced military manoeuvres.

Despite this, by 1945, American industry had produced two-thirds of all the weapons and equipment used by the Allies to win the war. That happened because thousands of companies like the Royal Typewriter Company unified behind a common goal. Chief executives worked for $1.00 a year. Volunteer laborers worked double shifts. Factories retooled their production lines. Capacity expanded exponentially when it had to, because it’s not a static flat line.

I think my Royal 10 is trying to tell me two things. One I need to remember going into this crisis, and one for when we eventually come out of it.

First, we can do amazing things in the face of incredible adversity when we have to. We just need a little time to be our best.

And secondly, humans tend to bounce higher when we’re on the rebound from adversity.

Social Network Nastiness is the New Normal

British ethologist and evolutionary biologist Richard Dawkins — he of “The Selfish Gene” fame — was pondering recently on Twitter:

“Curious about Twitter nastiness. In conversation, we say ‘I don’t agree because…’ On Twitter, ‘you vile piece of shit’ replaces ‘because…’ Why? Has it escalated like loudness of talk in crowded room? Starts quiet but escalates till all yell to be heard. Nastier than thou?”

Dawkins’ hunch, which is fairly self-explanatory, makes sense. It aligns with my own suspicion that in a world of hyperbolic noise, we are increasingly becoming desensitized to “normal” and drawn to messaging that is “jagged” and polarized enough to go viral.

But Dawkins’ tweet prompted speculation on some other possible hypotheses, which he summarized in a follow-up tweet:

“Thanks for interesting responses. Many favour ‘road rage’ theory, a version of the ‘anonymity protects cowards’ theory. Some support my ‘Nastier than thou’ escalation theory, perhaps reinforced by (I hadn’t thought of this) hunger for Likes. But who likes nastiness & why?”

There is a third possibility that also emerged: a tribal “us” vs. “them” chant, where the tweets are a way for tweeters to virtue-signal to their tribe and, in the process, self-identify their position in the strongest possible terms.

At the end of the day, all these theories point out an interesting tipping point in social discourse: We are now doing it with some physical distance between participants, and we are doing it for an audience. Both of these factors can lead to social behaviors that are new for us.

Let’s begin with the question of distance.  Our most noble instincts were built on the foundation of proximity. Empathy and caring began as a tribal exercise. We came prewired with a hierarchy of humanity — a ranking of whom we care most about.

At the top are those we share the most DNA with. Next come those we share our time and physical space with. One rung down are those that look most like ourselves. Everybody else falls somewhere under that.

We are not locked into this hierarchy, but we have to understand that it is the hair-trigger reaction that naturally fires before our brain gets a chance to mindfully ponder what an appropriate behavior might be. Much as this may not be the “woke” thing to admit, we do ourselves a disservice by not acknowledging it.

That brings us to the “audience” part of the new normal: We now broadcast our beliefs to our social network. Much of what we do online is done for the benefit of our audience. And when we become “de-individualized” as part of a group, our behavior changes. We rely less on our individual belief of what is right or wrong and more on what we believe the social norms of the group define as acceptable. We are bonding with our tribe and seeking acceptance.

If you combine both these factors, you have an environment where it is not only okay to be nasty to someone, it’s expected and encouraged. We say and do things to people that would be abhorrent to us if we were face-to-face with them. All three of Dawkins’ proposed explanations can and do arise.

The most discouraging part is that this is a Pandora’s box now open. There is no closing it again. This is the new normal, the new standard for discourse and debate.  Behavior stripped of any innate instinct for face-to-face civility is setting the stage for elections, governance, public debate and ideological alignment. It is defining our laws, our media and our culture. It is now part of our society.

I don’t know where it will lead. I only know there is no turning back.

The View from the Other Side

After a lifetime in marketing, I am now sitting on the other side of the table. Actually, I’m sitting on all sides of the table. In my newest venture it’s just me, so I have to do everything. And I don’t mind telling you I’m overwhelmed. These past few years have given me a whole new appreciation of how damned difficult it is to be a business owner. And my circumstances are probably better than 90% of others out there. This started as a hobby that – with surprisingly little direction from me – somehow grew into a business. There is no real financial pressure on me. There are no sales numbers I have to hit. I have no investors to answer to. I have no debt to service. My business is very limited in scope.

But still – somehow – I feel like I’m drowning. I couldn’t imagine doing this if the stakes were higher.

It’s Hard to Find the Time to Build a Better Mousetrap…

I’ve always been of the opinion that the core of the business and the marketing of that business should be inseparable. But as I’ve learned, that’s a difficult balancing act to pull off. Marketing is a vast black hole that can suck up all your time. And in any business, there is just a lot of stuff that requires a lot of time to do. It requires even more time if you want to do it well. Something has to give. So what should that something be? The question sounds trite, but it’s not.

Take me, for example. I decided to offer bike tours. Sounds simple enough, right? I had no idea how many permits, licenses and authorizations I needed to have. That all takes time. And it was time I had to spend before I could do anything else.

Like I said, to do things well takes time. Businesses naturally have to evolve. Almost none of us gets it right out of the gate. We make mistakes and then have to figure out how not to make those mistakes again. This is good and natural. I believe a good business has to have a leader that sweats the details, because the details are where shit goes wrong. I’m a big picture guy but I’ve discovered that big pictures are actually a mosaic of a million little pieces that someone has to pay attention to. And that takes time.

The Fear of Not Doing Everything Right Now

New companies used to have the luxury of time. No one expected them to hit a home run in their first year. Well, Google and Facebook screwed that up for everyone, didn’t they? We are now all supposed to operate within some ridiculously compressed timeline for success. Our business lives are all about rushing things to market, rapid iteration, agile development. And while we’re doing all that, we should also be keeping up with our Instagram posts and building a highly engaged online community. If we don’t successfully do all those things, we feel like we’ve failed.

I’m calling bullshit on that. Most studies on this subject put the odds of a new company surviving five years somewhere between 40% and 50%. That’s not great, but given that coin-toss survival rate, I have to believe there are a lot of companies without a fully optimized Facebook business page that have somehow managed to stay in business. And even the businesses that do wrap it up are not always financial failures. Many times it’s because the founder has just had enough.

I completely understand that. I started this business because I wanted to have fun. And while not many of us give that reason for starting a business, I don’t believe I’m the only one. If this isn’t fun, why the hell are we doing it? But juggling a zillion balls knowing that I’m guaranteed to drop many of them isn’t all that much fun. Each morning begins with a dropped-ball inventory. It seems that business today is all about reactive triage. What did I do? What didn’t I do? What might kill me and what’s only going to hurt for a while?

I’d like to end this column with some pat advice, some strategy to deal with the inevitable inundation of stuff that is demanding your time. But I’m struggling. I believe it’s hidden somewhere between my two previous points – deal with what’s potentially fatal and try to have some fun. At least, that’s what I’m trying to do.

The Comfort of our Tribe

It was a shattering blow to the very heart of Canada. On Friday, April 6, at an intersection in northern Saskatchewan, a semi-truck slammed into the side of a bus carrying the Humboldt Broncos, a junior hockey team. Sixteen people on that bus, including most of the team, are gone. We have been collectively staggered by the loss.

This column is not about the accident. It’s about how we’ve dealt with the grief that came from it. If you want to get to the heart of Canada, there is no more direct route than through hockey. At least half the country has had sons and daughters who have also been on buses or vans, riding through the Canadian winter with their team on their way to a tournament. The horror of the Humboldt Broncos was personal because it was so easily imagined.

Three things are helping us through. And these three things show that no matter how technology may have influenced how we define community, when the worst happens, we need a much more primal definition of connection.

One – Tribes

Make no mistake, tribalism is alive and well in Canada. The boundaries of the tribes are defined by the hockey teams we root for. We love to wear tribal colors. We just call them hockey jerseys. But for one day, all tribes were united. On April 12, we all wore our jerseys, no matter the team colors, as a show of solidarity for those most directly impacted by the loss of the Humboldt Broncos. Teachers, students, bankers, lawyers, civic workers, nurses, doctors, bus drivers – it didn’t matter who we were or the categories we normally belong to. On that day, we were all part of the same tribe, united in the same goal. We were honoring the Humboldt Broncos.

https://globalnews.ca/video/embed/4141267/

Two – Totems

Here in the Pacific Northwest, where I live, we’re very familiar with totems. But if you’re not, a totem is “a spirit being, sacred object, or symbol that serves as an emblem of a group of people, such as a family, clan, lineage, or tribe.”

In Canada for the past few weeks, our totems have been hockey sticks. We left them on our front doorsteps as a sign of remembrance. The symbolism was perfectly captured by a front-door cam when a young boy came home, found the stick on his parents’ stoop, played with it for a while, then gently kissed it and placed it back. If you want to understand the primal power of a totem, take 44 seconds to watch this video.

Three – Togetherness

We may choose to grieve alone, but we heal together. For Canada, as we came together, we adopted the team’s mantra – Humboldt Strong. We came together in arenas, churches, synagogues, parking lots and lobbies. It didn’t matter where. What mattered was that we were together – somewhere, anywhere – and that healing came from sharing the same space and the physicality of our grief.

Digital connection may be efficient, but it’s not effective. It’s why you don’t log in to a Google Hangout or a Facebook live feed for the funeral or the wedding of a loved one. You have to be there. You have to share. You have to connect – eye to eye and heart to heart. That’s how humans were built.

It’s how we – a nation in grief – are staying Humboldt Strong.

Watching TV Through The Overton Window

Tell me, does anyone else have a problem with this recent statement by HBO CEO Richard Plepler: “I am trying to build addicts — and I want people addicted to something every week”?

I read this in a MediaPost column about a month ago. At the time, I filed it away as something vaguely troubling. I just checked and found no one else had commented on it. Nothing. We all collectively yawned as we checked out the next series to binge watch. That’s just what we do now.

When did enabling addiction become a goal worth shooting for? What made the head of a major entertainment corporation think it was OK to use a term that is defined as “persistent, compulsive use of a substance known to the user to be harmful” to describe a strategic aspiration? And, most troubling of all, when did we all collectively decide that that was OK?

Am I overreacting? Is bulk consuming an entire season’s worth of “Game of Thrones” or “Big Little Lies” over a 48-hour period harmless?

Speaking personally, when I emerge from my big-screen basement cave after watching more than two episodes of anything in a row, I feel like crap. And there’s growing evidence that I’m not alone. I truly believe this is not a healthy direction for us.

But my point here is not to debate the pros and cons of binge watching. My point is that Plepler’s statement didn’t cause any type of adverse reaction. We just accepted it. And that may be because of something called the Overton Window.

The Overton Window was named after Joseph Overton, who developed the concept at a libertarian think tank — the Mackinac Center for Public Policy — in the mid-1990s.

Typically, the term is used to talk about the range of policies acceptable to the public in the world of politics. In the middle of the window lies current policy. Moving out from the center in both directions (right and left) are the degrees of diminishing acceptability. In order, these are: Popular, Sensible, Acceptable, Radical and Unthinkable.

The window can move, with ideas that were once unthinkable eventually becoming acceptable or even popular due to the shifting threshold of public acceptance. The concept, which has roots going back over 150 years, has again bubbled to the top of our consciousness thanks to Trumpian politics, which make “extreme things look normal,” according to a post on Vox.

Political strategists have embraced and leveraged the concept to try to bring their own agendas within the ever-moving window. Because here’s the interesting thing about the Overton Window: If you want to move it substantially, the fastest way to do it is to float something outrageous to the public and ask them to consider it. Once you’ve set a frame of consideration towards the outliers, it tends to move the window substantially in that direction, bringing everything less extreme suddenly within the bounds of the window.

This has turned the Overton Window into a strategic political tug of war, with the right and left battling to shift the window by moving ever further to the extremes.

What’s most intriguing about the Overton Window is how it reinforces the idea that much of our social sensibility is relative rather than absolute. Our worldview is shaped not only by what we believe, but what we believe others will find acceptable. Our perspective is constantly being framed relative to societal norms.

Perhaps — just perhaps — the CEO of HBO can now use the word “addict” when talking about entertainment because our perspective has been shifted toward an outlying idea that compulsive consumption is OK, or even desirable.

But I have to call bullshit on that. I don’t believe it’s OK. It’s not something we as an industry — whether that industry is marketing or entertainment — should be endorsing. It’s not ennobling us; it’s enabling us.

There’s a reason why the word “addict” has a negative connotation. If our “window” of acceptability has shifted to the point where we just blithely accept these types of statements and move on, perhaps it’s time to shift the window in the opposite direction.