Why Quitting Facebook is Easier Said than Done

Not too long ago, I was listening to an interview with a privacy expert about… you guessed it, Facebook. The gist of the interview was that Facebook can’t be trusted with our personal data, as it has proven time and again.

But when asked if she would quit Facebook completely because of this — as tech columnist Walt Mossberg did — the expert said something interesting: “I can’t really afford to give up Facebook completely. For me, being able to quit Facebook is a position of privilege.”

Wow!  There is a lot living in that statement. It means Facebook is fundamental to most of our lives — it’s an essential service. But it also means that we don’t trust it — at all.  Which puts Facebook in the same category as banks, cable companies and every level of government.

Facebook — in many minds anyway — became an essential service because of Metcalfe’s Law, which states that the value of a network is proportional to the square of the number of connected users of the system. More users = quadratically more value. Facebook has Metcalfe’s Law nailed. It has almost two and a half billion users.
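A quick back-of-the-envelope comparison shows how strongly that square term works for Facebook. The one-million-user network below is just a hypothetical yardstick, not a real competitor:

```latex
% Metcalfe's Law: network value grows with the square of the number of users.
% Comparing Facebook (~2.5 billion users) to a hypothetical 1-million-user network.
V \propto n^{2}
\qquad\Rightarrow\qquad
\frac{V_{\text{Facebook}}}{V_{\text{1M users}}}
\approx \left(\frac{2.5 \times 10^{9}}{10^{6}}\right)^{2}
= 2500^{2}
= 6.25 \times 10^{6}
```

On paper, that makes Facebook roughly six million times as valuable as the smaller network: the user count is 2,500 times larger, and under Metcalfe’s Law the value scales with the square of that factor.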

But it’s more than just sheer numbers. It’s the nature of engagement. Thanks to a premeditated addictiveness in Facebook’s design, its users are regular users. Of those 2.5 billion users, 1.6 billion log in daily. 1.1 billion log in daily from their mobile devices. That means that 15% of all the people in the world are constantly — addictively — connected to Facebook.

And that’s why Facebook appears to be essential. If we need to connect to people, Facebook is the most obvious way to do it. If we have a business, we need Facebook to let our potential customers know what we’re doing. If we belong to a group or organization, we need Facebook to stay in touch with other members. If we are social beasts at all, we need Facebook to keep our social network from fraying away.

We don’t trust Facebook — but we do need it.

Or do we? After all, we homo sapiens have managed to survive for 99.9925% of our collective existence without Facebook. And there is mounting research that indicates going cold turkey on Facebook is great for your own mental health. But like all things that are good for you, quitting Facebook can be a real pain in the ass.

Last year, New York Times tech writer Brian Chen decided to ditch Facebook. This is a guy who is fully conversant in tech — and even he found that making the break was much easier said than done. Facebook, in its malevolent brilliance, has erected some significant barriers to exit for users who try to make a break for it.

This is especially true if you have fallen into the convenient trap of using Facebook’s social sign-in on sites rather than juggling multiple passwords and user IDs. If you’re up for the challenge, Chen has put together a 6-step guide to making a clean break of it.

But what if you happen to use Facebook for advertising? You’ve essentially sold your soul to Zuckerberg. Reading through Chen’s guide, I’ve decided that it’s just easier to go into the Witness Protection Program. Even there, Facebook will still be tracking me.

By the way, after six months without Facebook, Chen did a follow-up on how his life had changed. The short answer is: not much, but what did change was for the better. His family didn’t collapse. His friends didn’t desert him. He still managed to have a social life. He spent a lot less on spontaneous online purchases. And he read more books.

The biggest outcome was that advertisers “gave up on stalking” him. Without a steady stream of personal data from Facebook, Instagram thought he was a woman.

Whether you’re able to swear off Facebook completely or not, I wonder what the continuing meltdown of trust in Facebook will do for its usage patterns. As in most things digital, young people seem to have intuitively stumbled on the best way to use Facebook. Use it if you must to connect to people when you need to (in their case, grandmothers and great-aunts) — but for heaven’s sake, don’t post anything even faintly personal. Never afford Facebook’s AI the briefest glimpse into your soul. No personal affirmations, no confessionals, no motivational posts and — for the love of all that is democratic — nothing political.

Oh, one more thing. Keep your damned finger off of the like button, unless it’s for your cousin Shermy’s 55th birthday celebration in Zihuatanejo.

Even then, maybe it’s time to pick up the phone and call the ol’ Shermeister. It’s been too long.

The Hidden Agenda Behind Zuckerberg’s “Meaningful Interactions”

It probably started with a good intention. Facebook – aka Mark Zuckerberg – wanted to encourage more “Meaningful Interactions”. And so, early last year, Facebook engineers started making some significant changes to the algorithm that determined what you saw in your News Feed. Here are some excerpts from Zuck’s post to that effect:

“The research shows that when we use social media to connect with people we care about, it can be good for our well-being. We can feel more connected and less lonely, and that correlates with long term measures of happiness and health. On the other hand, passively reading articles or watching videos — even if they’re entertaining or informative — may not be as good.”

That makes sense, right? It sounds logical. Zuckerberg went on to say how they were changing Facebook’s algorithm to encourage more “Meaningful Interactions.”

“The first changes you’ll see will be in News Feed, where you can expect to see more from your friends, family and groups.

As we roll this out, you’ll see less public content like posts from businesses, brands, and media. And the public content you see more will be held to the same standard — it should encourage meaningful interactions between people.”


Let’s fast-forward almost two years, and we now see the outcome of that good intention: an ideological landscape with a huge chasm where the middle ground used to be.

The problem is that Facebook’s algorithm naturally favors content from like-minded people. And it doesn’t take a very high degree of ideological homogeneity to create a highly polarized landscape. That may seem surprising, but it shouldn’t: American economist Thomas Schelling showed us how easily segregation can happen almost 50 years ago.

The Schelling Model of Segregation was created to demonstrate why racial segregation was such a chronic problem in the U.S., even given repeated efforts to desegregate. The model showed that even when we’re pretty open-minded about who our neighbours are, we will still tend to self-segregate over time.

The model works like this. A grid represents a population with two different types of agents: X and O. The square an agent sits in represents where it lives. If the agent is satisfied, it will stay put. If it isn’t satisfied, it will move to a new location. The variable here is the level of satisfaction, determined by what percentage of an agent’s immediate neighbours are the same type of agent as it is. For example, the satisfaction threshold might be set at 50%, meaning an X agent needs at least 50% of its neighbours to also be of type X. (If you want to try the model firsthand, Frank McCown, a computer science professor at Harding University, created an online version.)
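To make the mechanics concrete, here is a minimal Python sketch of that kind of model. The grid size, the share of empty cells and the 40% satisfaction threshold are illustrative values chosen for the demo; they are not taken from Schelling’s paper or from McCown’s online version.

```python
import random

SIZE = 30           # the grid is SIZE x SIZE
EMPTY_RATIO = 0.10  # fraction of cells left empty so unhappy agents have somewhere to go
THRESHOLD = 0.40    # an agent is satisfied if at least 40% of its neighbours share its type
ROUNDS = 100

def make_grid():
    """Randomly scatter X agents, O agents and empty cells across the grid."""
    cells = []
    for _ in range(SIZE * SIZE):
        r = random.random()
        if r < EMPTY_RATIO:
            cells.append(None)
        else:
            cells.append('X' if r < (1 + EMPTY_RATIO) / 2 else 'O')
    return [cells[i * SIZE:(i + 1) * SIZE] for i in range(SIZE)]

def neighbours(grid, row, col):
    """Return the occupied cells among the (up to) eight surrounding squares."""
    out = []
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            if dr == 0 and dc == 0:
                continue
            r, c = row + dr, col + dc
            if 0 <= r < SIZE and 0 <= c < SIZE and grid[r][c] is not None:
                out.append(grid[r][c])
    return out

def satisfied(grid, row, col):
    """Check whether the agent at (row, col) meets the satisfaction threshold."""
    near = neighbours(grid, row, col)
    if not near:
        return True  # an agent with no neighbours has nothing to be unhappy about
    same = sum(1 for n in near if n == grid[row][col])
    return same / len(near) >= THRESHOLD

def run_round(grid):
    """Move every unsatisfied agent to a random empty cell; return how many moved."""
    empties = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c] is None]
    moved = 0
    for r in range(SIZE):
        for c in range(SIZE):
            if grid[r][c] is not None and not satisfied(grid, r, c):
                dest = random.choice(empties)
                empties.remove(dest)
                empties.append((r, c))
                grid[dest[0]][dest[1]] = grid[r][c]
                grid[r][c] = None
                moved += 1
    return moved

if __name__ == "__main__":
    random.seed(42)
    grid = make_grid()
    for i in range(ROUNDS):
        if run_round(grid) == 0:  # nobody wants to move: the clusters have settled
            print(f"Everyone satisfied after {i} rounds")
            break
    for row in grid:              # crude visualization: look for the clumps of Xs and Os
        print(''.join(cell or '.' for cell in row))
```

Run it a few times and the “clumping” described below shows up within a few dozen rounds, even though no individual agent demands a like-minded majority around it.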

The most surprising thing that comes out of the model is that this threshold of satisfaction doesn’t have to be set very high at all for extensive segregation to happen over time. You start to see significant “clumping” of agent types at percentages as low as 25%. At 40% and higher, you see sharp divides between the X and O communities. Remember, even at 40%, that means agent X only needs 40% of its neighbours to also be of the X persuasion. It’s okay being surrounded by up to 60% Os. That is much more open-minded than most human agents I know.

Now, let’s move the Schelling Model to Facebook. We know from the model that even pretty open-minded people will physically segregate themselves over time. The difference is that on Facebook, they don’t move to a new part of the grid, they just hit the “unfollow” button. And the segregation isn’t physical – it’s ideological.

This natural behavior is then accelerated by Facebook’s “Meaningful Interactions” algorithm, which filters on the basis of the people you have connected with, setting in motion an ever-tightening spiral that eventually restricts your feed to a very narrow ideological horizon. The resulting cluster then becomes a segment used for ad targeting. We can quickly see how Facebook intentionally built these very homogeneous clusters by changing its algorithm, and then profits from them by providing advertisers the tools to micro-target them.

Finally, after doing all this, Facebook absolves itself of any responsibility to ensure subversive and blatantly false messaging isn’t delivered to these ideologically vulnerable clusters. It’s no wonder comedian Sacha Baron Cohen just took Zuck to task, saying “if Facebook were around in the 1930s, it would have allowed Hitler to post 30-second ads on his ‘solution’ to the ‘Jewish problem.’”

In rereading Mark Zuckerberg’s post from two years ago, you can’t help but start reading between the lines. First of all, there is mounting evidence that disproves his contention that meaningful social media encounters help your well-being. It appears that quitting Facebook entirely is much better for you.

And secondly, I suspect that – just like his defence of running false and malicious advertising by citing free speech – Zuck has a not-so-hidden agenda here. I’m sure Zuckerberg and his Facebook engineers weren’t oblivious to the fact that their changes to the algorithm would result in nicely segmented psychographic clusters that would be like catnip to advertisers – especially political advertisers. They were consolidating exactly the same vulnerabilities that were exploited by Cambridge Analytica.

They were building a platform that was perfectly suited to subvert democracy.

Running on Empty: Getting Crushed by the Crush It Culture

“Nobody ever changed the world on 40 hours a week.”

Elon Musk

Those damned Protestants and their work ethic. Thanks to them, unless you’re willing to put in a zillion hours a week, you’re just a speed bump on the road to all that is good in the world. Take Mr. Musk, for example. If you happen to work at Tesla, or SpaceX, or the Boring Company, Elon has figured out what your average work week should be: “(It) Varies per person, but about 80 sustained, peaking above 100 at times. Pain level increases exponentially above 80.”

“Pain level increases exponentially above 80”? WTF, Mr. Musk!

But he’s not alone. Google famously built its Mountain View campus so employees never had to go home. Alibaba Group founder Jack Ma calls the intense work culture at his company a “huge blessing.” He calls it the “996” work schedule: 9 am to 9 pm, 6 days a week. That’s 72 hours, if you’re counting. But even that wouldn’t cut it if you worked for Elon Musk. You’d be a deadbeat.

This is the “Crush It” culture, where long hours equate to dedication and – by extension – success. No pain, no gain.

We spend lots of time talking about the gain — so let me spend just one column talking about the pain. Pain such as mental illness, severe depression, long-term disabilities and strokes. Those who overwork are more likely to overeat, smoke, drink excessively and develop other self-destructive habits.

You’re not changing the world. You’re shortening your life. The Japanese call it karoshi: death by overwork.

Like so many things, this is another unintended consequence of a digitally mediated culture. Digital speeds everything up. But our bodies – and brains – aren’t digital. They burn out if they move too fast – or too long.

Overwork as a sign of superior personal value is a fairly new concept in the span of human history. It came from the Puritans who settled in New England. They believed that those who worked hard at their professions were the ones chosen to get into heaven. The more wealth you amassed from your work, the more evidence there was that you were one of the chosen.

Lately, the creeping Capitalist culture of over-working has most firmly embedded itself in the tech industry. There, the number of hours you work has become a proxy for your own worth. A twisted type of machismo has evolved and trapped us all into thinking that an hour not spent at our jobs is an hour wasted. We are looked down upon for wanting some type of balance in our lives.

Unfortunately for the Musks and Mas and other modern-day taskmasters, the biology just doesn’t support their proposed work schedules.

First, our brains need rest. Back in the 18th century when those Puritans proved their worth through work, earning a living was usually a physical endeavour. The load of overwork was spread amongst the fairly simple mechanical machinery of our own bodies. Muscles got sore. Joints ached. But they recovered.

The brain is a much more complex beast. When it gets overworked, it loses its executive ability to focus on the task at hand. When your work takes place on a desktop or laptop where there are unlimited diversions just a click away, you suddenly find yourself 45 minutes into an unplanned YouTube marathon or scrolling through your Facebook feed. It becomes a downward spiral that benefits no one.

An overworked mind also loses its ability to spin down in the evening so you can get an adequate amount of sleep. When your co-workers start boasting of being able to function on just 3 or 4 hours of sleep – they are lying. They are lying to you, but worse, they are lying to themselves. Very few of us can function adequately on less than 7 or 8 hours of sleep. For the rest of us, the negative effects start to accumulate. A study found that sleep deprivation has the same impact as drinking too much. Those who were getting less than 7 hours of sleep fared the same or worse on a cognitive test as those who had a 0.05% blood alcohol level. The legal limit in most states is 0.08%.

Finally, in an essay on Medium, Rachel Thomas points out that the Crush It Culture is discriminatory. Those who have a disability or chronic illness simply have fewer hours in the day to devote to work. They need time for medical support and usually require more sleep. In an industry like tech, where there is an unhealthy focus on the number of hours worked, these workers – who Thomas says make up at least 30% of the total workforce – are shut out.

The Crush It Culture is toxic. The science simply doesn’t support it. The only ones evangelizing it are those who directly benefit from this modernized version of feudalism. It’s time to call bullshit on them.

This Election, Canucks were “Zucked”

Note: I originally wrote this before results were available. Today, we know Trudeau’s Liberals won a minority government, but the Conservatives actually won the popular vote: 34.4% vs 33.06% for the Liberals. It was a very close election.

As I write this, Canadians are going to the polls in our national election. When you read this, the outcome will have been decided. I won’t predict — because this one is going to be too close to call.

For a nation that is often satirized for our tendencies to be nice and polite, this has been a very nasty campaign. So nasty, in fact, that in focusing on scandals and personal attacks, it forgot to mention the issues.

Most of us are going to the polls today without an inkling of who stands for what. We’re basically voting for the candidate we hate the least. In other words, we’re using the same decision strategy we used to pick the last guest at our grade 6 birthday party.

The devolvement of democracy has now hit the Great White North, thanks to Facebook and Mark Zuckerberg.

While the amount of viral vitriol I have seen here is still a pale shadow of what I saw from south of the 49th in 2016, it’s still jarring to witness. Canucks have been “Zucked.” We’re so busy slinging mud that we’ve forgotten to care about the things that are essential to our well-being as a nation.

It should come as news to no one that Facebook has been wantonly trampling the tenets of democracy. Elizabeth Warren recently ran a fake ad on Facebook just to show she could. Then Mark Zuckerberg defended Facebook last week when he said: “While I certainly worry about an erosion of truth, I worry about living in a world where you can only post things that tech companies decide to be 100 per cent true.”

Zuckerberg believes the onus lies with the Facebook user to judge what is false and what is not. This is a suspiciously convenient defense of Facebook’s revenue model wrapped up as a defense of freedom of speech. At best, it’s naïve and more than a little hypocritical, given that what we see is determined by Facebook’s algorithm. At worst, it’s misleading and malicious.

Hitting hot buttons tied to emotions is nothing new in politics. Campaign runners have been drawing out and sharpening the long knives for decades now. TV ads added a particularly effective weapon into the political arsenal. In the 1964 presidential campaign, it even went nuclear with Lyndon Johnson’s famous “Daisy” Ad.

But this is different. For many reasons.

First of all, there is the question of trust in the channel. We have been raised in a world where media channels historically take some responsibility to delineate between what they say is factual (i.e., the news) and what is paid persuasion (i.e., the ads).

In his statement, Zuckerberg is essentially telling us that providing some baseline of trust in political advertising is not Facebook’s job and not its problem. We should know better.

But we don’t. It’s a remarkably condescending and convenient excuse for Zuckerberg to appear to be telling us “You should be smarter than this” when he knows that this messaging has little to do with our intellectual horsepower.

This is messaging that is painstakingly designed to be mentally processed before the rational part of our brain even kicks in.

In a recent survey, three out of four Canadians said they had trouble telling which social media accounts were fake. And 40% of Canadians say they had found links to stories on current affairs that were obviously false. Those were only the links they knew were fake. I assume that many more snuck through their factual filters. By the way, people of my generation are the worst at sniffing out fake news.

We’ve all seen it, but only one third of Canadians 55 and over realize it. We can’t all be stupid.

Because social media runs on open platforms, with very few checks and balances, it’s wide open for abuse. Fake accounts, bots, hacks and other digital detritus litter the online landscape. There has been little effective policing of this. The issue is that cracking down on this directly impacts the bottom line. As Upton Sinclair said: “It is difficult to get a man to understand something, when his salary depends on his not understanding it.”

Even given these two gaping vulnerabilities, the biggest shift when we think of social media as an ad platform is that it is built on the complexity of a network. The things that come with this — things like virality, filter bubbles, threshold effects — have no corresponding rule book to play by. It’s like playing poker with a deck full of wild cards.

Now — let’s talk about targeting.

When you take all of the above and then factor in the data-driven targeting that is now possible, you light the fuse on the bomb nestled beneath our democratic platforms. You can now segment out the most vulnerable, gullible, volatile sectors of the electorate. You can feed them misinformation and prod them to action. You can then sit back and watch as the network effects play themselves out. Fan — meet shit. Shit — meet fan.

It is this that Facebook has wrought, and then Mark Zuckerberg feeds us some holier-than-thou line about freedom of speech.

Mark, I worry about living in a world where false — and malicious — information can be widely disseminated because a tech company makes a profit from it.

The Internet: Nasty, Brutish And Short

When the internet ushered in an explosion of information in the mid to late 90s there were many — I among them — who believed humans would get smarter. What we didn’t realize then is that the opposite would eventually prove to be true.

The internet lures us into thinking with half a brain. Actually, with less than half a brain – and the half we’re using is the least thoughtful, most savage half. The culprit is the speed of connection and reaction. We are now living in a pinball culture, where the speed of play determines that we have to react by instinct. There is no time left for thoughtfulness.

Daniel Kahneman’s monumental book, “Thinking, Fast and Slow,” lays out the two loops we use for mental processing. There’s the fast loop, our instinctive response to situations, and there’s the slow loop, our thoughtful processing of reality.

Humans need both loops. This is especially true in the complexity of today’s world. The more complex our reality, the more we need the time to absorb and think about it.

 If we could only think fast, we’d all believe in capital punishment, extreme retribution and eye-for-eye retaliation. We would be disgusted and pissed off almost all the time. We would live in the Hobbesian State of Nature (from English philosopher Thomas Hobbes): The “natural condition of mankind” is what would exist if there were no government, no civilization, no laws, and no common power to restrain human nature. The state of nature is a “war of all against all,” in which human beings constantly seek to destroy each other in an incessant pursuit for power. Life in the state of nature is “nasty, brutish and short.”

That is not the world I want to live in. I want a world of compassion, empathy and respect. But the better angels of our nature rely on thoughtfulness. They take time to come to their conclusions.

With its dense interconnectedness, the internet has created a culture of immediate reaction. We react without all the facts. We are disgusted and pissed off all the time. This is the era of “cancel” and “callout” culture. The court of public opinion is now less like an actual court and more like a school of sharks in a feeding frenzy.

We seem to think this is OK because for every post we see that makes us rage inside, we also see posts that make us gush and goo. Every hateful tweet we see is leavened with a link to a video that tugs at our heartstrings. We are quick to point out that, yes, there is the bad — but there is an equal amount of good. Either can go viral. Social media simply holds up a mirror that reflects the best and worst of us.

But that’s not really true. All these posts have one thing in common: They are digested too quickly to allow for thoughtfulness. Good or bad, happy or mad — we simply react and scroll down. FOMO continues to drive us forward to the next piece of emotionally charged clickbait. 

There’s a reason why social media is so addictive: All the content is aimed directly at our “Thinking Fast” hot buttons. And evolution has reinforced those hot buttons with generous discharges of neurochemicals that act as emotional catalysts. Our brain online is a junkie jonesing for a fix of dopamine or noradrenaline or serotonin. We get our hit and move on.

Technology is hijacking our need to pause and reflect. Marshall McLuhan was right: the medium is the message and, in this case, the medium is one that is hardwired directly to the inner demons of our humanity.

It took humans over five thousand years to become civilized. Ironically, one of our greatest achievements is dismantling that civilization faster than we think. Literally.

The Inevitability of the Pendulum Effect

In the real world, things never go in straight lines or predictable curves. The things we call trends are actually a sawtooth profile of change, reaction and upheaval. If you trace the path, you’ll see evidence of the Law of the Pendulum.

In the physical world, the Law is defined as: “the movement in one direction that causes an equal movement in a different direction.”

In the world of human behavior, it’s defined as: “the theory holding that trends in culture, politics, etc., tend to swing back and forth between opposite extremes.”

Politically and socially, we’re in the middle of a swing to the right. But this will be countered inevitably with a swing to the left. We could call it Newton’s Third Law of Social Motion: For every action there is an equal and opposite reaction.

Except that’s not exactly true. If it were, the swings would cancel each other out and we’d end in the same place we started from. And we know that’s not the case. Let me give you one example that struck me recently.

This past week, I visited a local branch of my bank. The entire staff were wearing Pride T-shirts in support of their employer’s corporate sponsorship of Pride Week. That is not really a cause for surprise in our world of 2019. No one batted an eye. But I couldn’t help thinking that it’s parsecs removed from the world I grew up in, in the late 60’s and early 70’s.

I won’t jump into the debate of the authenticity of corporate political correctness, but there’s no denying that when it comes to sexual preference, the world is a more tolerant place than it was 50 years ago. The pendulum has swung back and forth, but the net effect has been towards – to use Steven Pinker’s term – the better angels of our nature.

When talking about the Pendulum Effect, we also have to keep an eye on Overton’s window. This was something I talked about in a previous column some time ago. Overton’s window defines the frame of what the majority of us – as a society – find acceptable. As the pendulum swings back and forth between extremes, somewhere in the middle is a collective view that most of us can live with. But Overton’s window is always moving. And I believe that the window today frames a view of a more tolerant, more empathetic world than the world of 50 years ago – or almost any time in our past. That’s not true every day. Lately, it might not even be true most days. But this is probably a temporary thing. The pendulum will swing back eventually, and we’ll be in a better place.

My question is: why? Why – when we even out the swings – are we becoming better people? So far, this column has had little to do with media, digital or otherwise. But I think the variable here is information. Stewart Brand, founder of the Whole Earth Catalog, once said, “Information wants to be free.” But I think information also wants to set us free – free from the limitations of our gene-bound prejudice and pettiness. Wherever you find the pendulum swinging backwards, you’ll find a dearth of information. We need information to be thoughtful. And we need thoughtfulness to create a more just, more tolerant, more empathetic society.

We – in our industry – deal with information as our stock in trade. It is our job to ensure that information spreads as far as possible. It’s the one thing that will ensure that the pendulum swings in the right direction. Eventually. 

Data does NOT Equal People

We marketers love data. We treat it like a holy grail: a thing to be worshipped. But we’re praying at the wrong altar. Or, at the very least, we’re praying at a misleading altar.

Data is the digital residue of behavior. It is the contrail of customer intent — a thin, wispy proxy for the rich bandwidth of the real world. It does have a purpose, but it should be just one tool in a marketer’s toolbox. Unfortunately, we tend to use it as a Swiss army knife, thinking it’s the only tool we need.

The problem is that data is seductive. It’s pliable and reliable, luring us into manipulation because it’s so easy to do. It can be twisted and molded with algorithms and spreadsheets.

But it’s also sterile. There is a reason people don’t fit nicely into spreadsheets. There are simply not enough dimensions and nuances to accommodate real human behavior.

Data is great for answering the questions “what,” “who,” “when” and “where.” But they are all glimpses of what has happened. Stopping here is like navigating through the rear-view mirror.

Data seldom yields the answer to “why.” But it’s why that makes the magic happen, that gives us an empathetic understanding that helps us reliably predict future behaviors.

Uncovering the what, who, when and where makes us good marketers. But it’s “why” that makes us great. It’s knowing why that allows us to connect the distal dots, hacking out the hypotheses that can take us forward in the leaps required by truly great marketing. As Tom Goodwin, the author of “Digital Darwinism,” said in a recent post, “What digital has done well is have enough of a data trail to claim, not create, success.”

We as marketers have to resist stopping at the data. We have to keep pursuing why.

Here’s one example from my own experience. Some years ago, my agency did an eye-tracking study that looked at gender differences in how we navigate websites.

For me, the most interesting finding to fall out of the data was that females spent a lot more time than males looking at a website’s “hero” shot, especially if it was a picture that had faces in it. Males quickly scanned the picture, but then immediately moved their eyes up to the navigation menu and started scanning the options there. Females lingered on the graphic and then moved on to scan text immediately adjacent to it.

Now, I could have stopped at “who” and “what,” which in itself would have been a pretty interesting finding. But I wanted to know “why.” And that’s where things started to get messy.

To start to understand why, you have to rely on feelings and intuition. You also have to accept that you probably won’t arrive at a definitive answer. “Why” lives in the realm of “wicked” problems, which I defined in a previous column as “questions that can’t be answered by yes or no — the answer always seems to be maybe.  There is no linear path to solve them. You just keep going in loops, hopefully getting closer to an answer but never quite arriving at one. Usually, the optimal solution to a wicked problem is ‘good enough – for now.’”

The answer to why males scan a website differently than females is buried in a maze of evolutionary biology, social norms and cognitive heuristics. It probably has something to do with wayfinding strategies and hardwired biases. It won’t just “fall out” of data because it’s not in the data to begin with.

Even half-right “why” answers often take months or even years of diligent pursuit to reveal themselves. Given that, I understand why it’s easier to just focus on the data. It will get you to “good,” and maybe that’s enough.

Unless, of course, you’re aiming to “put a ding in the universe,” as Steve Jobs said in an inspirational commencement speech at Stanford University. Then you have to shoot for great.