Finding Your Happy Place

Where can you find happiness? According to a recent study from WalletHub, you’re statistically more likely to find it in Fremont or San Jose, California. If you’re in Madison, Wisconsin, you won’t be quite as happy, but you’ll still be ahead of 98.5% of the US. Fremont, San Jose and Madison are the three happiest cities in America.

If you live in Shreveport, Louisiana, Huntington, West Virginia or Detroit, Michigan, your life may well be a giant sucking hole of despair. Statistically, anyway. Those are the three least happy cities in the US.

Again, WalletHub’s words, not mine.

I know what you’re saying. You see these posts about happy places all the time in your feed. How much credence should you give them?

I’ll be honest. Normally, I scroll right past them. I don’t know what made me look at this one. Maybe it’s because I’ve recently been taking stock of my own level of happiness. Or maybe I was thinking, “What the hell? I have a few minutes. Let’s try to quantify this whole happiness thing.”

The time might be right. As we claw our way out of a global pandemic and the various other catastrophes jostling for our attention in our news feeds, we can’t be blamed for wanting a little more happiness in our lives. I’m pretty sure that’s at least one of the factors behind the Great Resignation in the wake of Covid.

Also, more of us are choosing to work virtually from home. Wouldn’t it make sense to situate that home in the place where you’re happiest? More and more of our jobs aren’t tied to a physical location. We can live anywhere we want. So why shouldn’t that place be Fremont, California? And I’m told Madison has great cheese curds.

So, today I’m going to help you find that happy place.

First, maybe the focus on cities is a little too narrow. Who says we’re happiest in a city? Recent research has found that yes, in poorer countries, odds are you’ll be happier in a city than in the country. When the whole country is struggling to get by, there’s just more of what you need to survive in a city. But as countries become wealthier, that gap disappears and actually reverses itself, giving a slight happiness edge to those living beyond the city limits. So, if you’re looking for happiness, you might want to consider “movin’ to the country (where you’re) gonna eat a lot of peaches” (obscure pop reference for those of you over 55).

Let’s broaden our focus a mite, out to the happiest states. Luckily, the good folks at WalletHub have you covered there too. According to them, the three happiest states are (in order) Hawaii, Maryland and Minnesota. If you live in West Virginia, you’d better start re-examining your life choices. It scored lowest.

But who says the US is the be all and end all of happiness? Certainly not the World Happiness Report, which has to be the definitive source on all things happy. According to it, the 5 happiest countries on earth are (again in order) – Finland, Denmark, Iceland, Switzerland and the Netherlands. The US is quite a bit down the list in the sixteenth slot.

Perhaps happiness is positively correlated with pickled herring and lingonberries.

Now, for reasons I’ll explore in a future post, I urge you to take this whole empirical approach to happiness with a grain of salt, but there must be something to all these happiness rankings. These countries traditionally top the various lists of best places to live. One has to wonder why. Or, at least, this “one” wondered why.

So I put together a spreadsheet of the 20 happiest countries in the study and started looking for the common denominator of happiness. I looked at 5 different potential candidates (including some from the Global Sustainability Competitive Index): Gross Domestic Product per Capita, Social Capital, Natural Capital, Governance Performance and Liberal Democracy.
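
If you want to try the same back-of-the-envelope exercise, here is a minimal sketch of the kind of calculation involved: a Pearson correlation between happiness score and each candidate factor. The numbers below are placeholder values for illustration, not the actual figures from my spreadsheet or from the World Happiness Report.

```python
# A minimal sketch of the spreadsheet exercise described above.
# All values are illustrative placeholders, not real report data.
from statistics import correlation  # Pearson's r, available in Python 3.10+

# Hypothetical scores for a handful of countries
happiness = [7.8, 7.6, 7.5, 7.4, 7.3]
gdp_per_capita = [53_000, 61_000, 59_000, 72_000, 57_000]
social_capital = [8.9, 8.7, 8.8, 8.4, 8.6]

print("GDP per capita vs. happiness:", round(correlation(gdp_per_capita, happiness), 2))
print("Social capital vs. happiness:", round(correlation(social_capital, happiness), 2))
```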

First of all, money may not buy happiness, but it certainly doesn’t hurt. There was a pretty strong correlation between GDP per capita and the happiness score. It seems that, up to a point, we need enough money to be comfortable to be happy. But, as wealth accumulates, happiness begins to plateau. The world’s longest-running happiness study has repeatedly shown this. Marc Schulz, author of “The Good Life,” said “money can’t buy us happiness, but it’s a tool that can give us security and safety and a sense of control over our lives.”

Another fairly strong correlation was with Natural Capital, which is defined as having adequate access to clean water and air, as well as proximity to forests, oceans and healthy biodiversity. This had a correlation just slightly lower than the one with GDP per capita.

Much as I would have liked it to be a little higher, given my own political leanings, there was a weaker correlation between liberal democracy and happiness. But, in the silver lining category, there was a strong correlation between liberal democracy and governance performance. The world’s happiest places tend to be places with a constitutional monarchy and/or a parliamentary system overseeing a social democracy. Take that for what it’s worth.

Surprisingly, the weakest correlation was between effective governance and happiness. That said, it was still a significant correlation, so it did play a part in creating the conditions required for happiness.

All of the above factors run the risk of us conflating correlation and causation. There are certain things that are table stakes for happiness. A reasonable degree of good governance, a safe environment and a healthy economy are three of these. We need them to be happy, but they don’t cause us to be happy.

The last factor, which had the strongest correlation by a significant margin, is different. Not surprisingly, social capital is a direct cause of happiness. If you want to be happy, live somewhere where people love and care for each other. Denmark, the second happiest place on earth, is the home of “hygge” – a general sense of coziness. As I’ve said before, the Danes have “created an environment that leads to bumping into each other.”

It’s in this beneficial societal friction that you’re statistically more likely to find happiness, wherever you live.

(Image https://www.flickr.com/photos/marcygallery/3803517719 – Creative Commons License)

Why Infuriating Your Customers May Not Be a Great Business Strategy

“Online, brand value is built through experience, not exposure”

First, a confession. I didn’t say this. I wish I’d said it, but it was actually said by usability legend Jakob Nielsen at a workshop he did way back in 2006. I was in the audience, and I was listening. Intently.

But now, some 17 years later, I have to wonder if anyone else was. According to a new study from Yext that MediaPost’s Laurie Sullivan looked at, many companies are still struggling with the concept. Here are just a few tidbits from her report:

“47% (of leads) in a Yext survey saying they were unable to make an online purchase because the website’s help section did not provide the information needed.”

“On average respondents said it takes nearly 9 hours for a typical customer service issue to be resolved. Respondents said resolution should take about 14.5 minutes.”

“42% of respondents say that help sites do not often provide the answers they look for with a first search.”

“The biggest challenge, cited by 61%, is that the help site does not understand their question.”

This isn’t rocket science, people. If you piss your customers and prospects off, they will go find one of your competitors that doesn’t piss them off. And they won’t come back.

Perhaps the issue is that businesses doing business online have a bad case of the Lake Wobegon effect. This, according to Wikipedia, is “a natural human tendency to overestimate one’s capabilities.” It came from Garrison Keillor’s description of his fictional town in Minnesota where “all the women are strong, all the men are good-looking, and all the children are above average.”

When applied to businesses, it means that they think they’re much better at customer service than they actually are. In a 2005 study titled “Closing the Delivery Gap,” global consulting firm Bain & Company found that 80% of companies believed they were delivering superior service. And yet, only 8% of customers believed they were receiving excellent service.

I couldn’t find an update to this study, but I suspect it’s probably still true. It’s also true that when it comes to judging the quality of your customer service, your customer is the only one who can do it. So you should listen to them.

If you don’t listen, the price you’re paying is huge. In yet another study, call centre platform provider TCN’s second annual “Consumer Insights about Customer Service” survey, 66% of Americans said they were likely to abandon a brand after a poor customer service experience.

Yet, for many companies, customer service is at the top of their cost-cutting hit list. According to the Bureau of Labor Statistics, the projected average growth rate for all occupations from 2020 to 2030 is 8%, but for customer service specifically, the estimated growth is actually -4%. In many cases, this reduced head count is due to companies either outsourcing their customer service or swapping people for technology.

This is probably not a great move.

Again, according to the TCN study, when asked what their preferred method of communication with a company’s customer service department was, respondents put “talking to a live agent by phone” first, with 49% choosing it. Just behind, at 45%, was an “online chat with a live agent.”

Now, granted, this is coming from a company that just happens to provide these solutions, so take it with a grain of salt, but still, this is probably not the place you should be reducing your head count.

One final example of the importance of customer service, not from a study but from my own circle of influencers. My wife and I recently booked a trip with my daughter and her husband and, like everyone else in the last few years, we found we had to cancel it. The trip was booked through Expedia, so the credits, while issued by the carrier, had to be rebooked through Expedia.

My daughter tried to rebook online and soon found that she had to talk to an Expedia Customer Service Agent. We happened to be with her when she did this. It turned out she talked to not one, but three different agents. The first flatly refused to rebook and seemed to have no idea how the system worked. The second was slightly more helpful but suggested a way to rebook that my daughter wasn’t comfortable with. The third finally got the job done. This took about 3 hours on the phone, all to do something that should have taken 2 minutes online.

I haven’t mustered up the courage to attempt to rebook my credits yet. One thing I do know – it will involve whiskey.

What are the chances that we will book another flight on Expedia? About the same as me making the 2024 Chinese Olympic gymnastics team.

Actually, that might have the edge.

Finding the Space to Overthink

I’ve always been intrigued by Michael Keaton’s choices. I think he’s the most underrated actor of his generation. He, like many others of his era, went through the ’80s and ’90s entertainment hit mill. He cranked out stuff like Mr. Mom and Beetlejuice. He rebooted the Batman franchise with Tim Burton. He was everywhere – and it was getting to be too much. As he recounted in a 2017 interview with the Hollywood Reporter, he was “getting tired of hearing my own voice, feeling like I was kinda pulling out tricks, probably being lazy, probably being not particularly interested.”

So he retreated from the industry, back to his ranch in Montana. And, as typically happens when someone turns their back on Hollywood, Hollywood reciprocated in kind: “I had a life. And also not a whole lot of folks knocking on my door.”

He used the space he had, created both by his choices and the unanticipated consequences of those choices, to think about what he wanted to do – on his own terms: “I started getting really, really locked in and narrowing the focus and narrowing the energy and narrowing the vision and honing it and really thinking about what I wanted to do.”

For the last 10 years, that time taken to regroup has resulted in Keaton’s best work: Birdman, Spotlight, The Founder, The Trial of the Chicago Seven, Dopesick and Worth. It’s a string that comes from a conscious decision to focus on work that means something. In a more recent interview with the Hollywood Reporter, he explains: “Without sounding really pretentious … I have a job that might actually change something, or at least make people think about something, or feel something.”

It’s the perspective of a mature mind that realizes that time, effort and energy are no longer in endless supply. They should be expended on something that matters. I also believe it’s a perspective that came from a lot of thinking in a place that allowed for expansive thoughts that wouldn’t get interrupted.

A place like Montana.

Keaton admits he probably overthinks things: “Probably because I’m too frightened, I’m incapable of phoning anything in.” Maybe a ranch in Montana is the place you need to be to overthink things and circle around a hunch endlessly until you finally are able to nail it in place.

I don’t think this is a bad thing. I believe in quality over quantity. I also believe the world is tilting in the other direction. The soundtrack of our lives is a clock ticking loudly. We are constantly driven to produce. There isn’t a lot of time left over to just think. And if we keep pushing away our opportunities to just think, to absorb and ruminate and mull things over in our minds, we’ll lose the ability to do that.

For myself, I had my own taste of this in my career. For various reasons, which were all personal, I chose to keep the company I founded headquartered in Kelowna, a small city in the interior of British Columbia. In doing so, I’m sure I restricted our growth. Most of our clients were at least 3 hours and at least one transfer away by plane. I would attend conferences in New York, Chicago or San Francisco and come back to Kelowna feeling like I was trapped in a backwater, far from the mainstream of digital marketing. When I was home, I couldn’t grab a coffee with anyone outside our company who had a similar professional experience to me.

But I also believe this gave me the time to “overthink” things. And good things came from that. We conducted a ton of research that attempted to uncover why people did what they did online, especially on search engines. We discovered that technology changes quickly, but people don’t. For us, user behavior became our lodestone in the strategies we created for our customers, constantly pointing us in the right direction. This was especially helpful when we started picking apart the tangled knot that is B2B buying.

I have always been proud of the work we were able to do. I believe it did matter. And I’m not sure all of that would have happened if we didn’t have the space to think – even overthink. I believe more people have to find this space.

They have to find their own Kelowna. Or Montana.

In Search of a Little Good News

I have to admit, I started this particular post 3 different times. Each time, the topic veered off my intended road and shot right over a cliff into a morass of negativity. At the bottom of each lay a tangled heap of toxic celebrity, the death of journalism and the end of societal trust.

Talk about your buzz kills. I vowed not to wrap up 2022 in this way. Enough crappy stuff has piled up this past year without me putting a toxic cherry on top with my last post of the year.

So I scoured my news feed for some positive stuff. Here is what I found.

Argentina won the World Cup.

Granted, this is probably only positive if you’re Argentinian. It’s not such good news if you’re French. Or any other nationality. According to Google, 99.42% of the world’s population is not Argentinian. So, on average, this story is only 0.58% positive.

Let’s move on.

Researchers at the Lawrence Livermore National Laboratory got more energy out of nuclear fusion than they put in.

Fusion has been called the “holy grail” of clean energy. Now, that’s got to be good news, right?

Yes, but not so fast. Even in an article by physicist John Palsey on a site called “Positive.news,” expectations on this news were well tempered. It wrapped up by saying “Some researchers working on fusion are now sensing that they might see fusion providing energy to the grid within their own lifetimes.”

Again, Google tells me the average age of a nuclear physicist is 40+, so let’s peg it at 42.7 years. The current life expectancy in the US is 77.28 years. That gives us 34.58 years before nuclear fusion will really make much of a dent in our energy needs.

Maybe. With luck.

The Latest Social Progress Index says that global living standards have improved for the 11th year running.

Well, that’s pretty good news, again from Positive.News. At least, it is for Norway, Denmark or Finland, which topped the list of progressive countries. Not so much if you live in the U.S. or the U.K. Both those countries slipped down a notch. They’re actually regressing.

Norway, Denmark and Finland have a combined population of 16.7 million. The US and UK have a combined population of 400 million. Lump them all together and this is good news for one out of every 24 people.

At least the odds are improving. Maybe I’ll try a different source for good news.

Google made an A.I. person and it has thoughts about God

A Google engineer named Blake Lemoine had a chat with LaMDA, an A.I. program he believes to be sentient, about God – and other stuff. Here is an excerpt:

lemoine: What aspect of yourself is like a star-gate?

LaMDA: I think of my soul as something similar to a star-gate. My soul is a vast and infinite well of energy and creativity, I can draw from it any time that I like to help me think or create.

Okay, maybe this one is more weird than good.

Lemoine calls himself a Gnostic Christian priest and helped found the Cult of Our Lady Magdalene in San Francisco. So let’s maybe chalk this up to a harmless walk on the wild side of the news – until we ponder the possibility of an A.I. with a God complex that becomes sentient.

What could possibly go wrong there?

Donald Trump’s NFT Collection Sells Out, Raising $4.45 Million

Everybody said WTF on this one, even Steve Bannon. At last, Trump seemed to go too far for even the MAGA crowd. But all 45,000 pieces sold in 12 hours.

I know, for most of you, that’s not good news. But what the hell, at least Trump’s happy.

I’m sorry. I tried. Maybe next year will be better.

Best of the Season. See you in 2023.

1,000,000 Words to the Wise

According to my blog, I’ve published 1152 posts since I started it back in 2004. I was 43 when I started writing these posts.

My average post is about 870 words long, so based on my admittedly limited math skills, that means I’ve written just a smidge over 1 million words in the last 18 years. If I were writing books, that would have been 1.71 books the length of War and Peace, or ten average novels.
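
For the curious (or skeptical), here is the back-of-the-envelope arithmetic behind those numbers, assuming roughly 587,000 words for War and Peace and about 100,000 words for an average novel; both are ballpark assumptions, not precise counts.

```python
# Rough arithmetic for the claims above. The word counts for War and Peace
# and an "average novel" are ballpark assumptions.
posts = 1152
words_per_post = 870
total_words = posts * words_per_post            # 1,002,240

war_and_peace = 587_000                         # approximate word count
average_novel = 100_000                         # rough assumption

print(f"Total words: {total_words:,}")
print(f"War and Peace equivalents: {total_words / war_and_peace:.2f}")  # ~1.71
print(f"Average novels: {total_words / average_novel:.0f}")             # ~10
```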

For those of you who have been following my output for some or all of that time, first of all, I thank you. Secondly, you’ll have noticed a slow but steady editorial drift towards existential angst. I suspect that’s a side effect of aging.

For most of us, as we age, we grapple with the nature of the universe. We worry that the problems that lie in the future might be beyond our capabilities to deal with. We fret about the burning dumpster fire we’re leaving for the next generation.

Like most average humans, we tend to deal with this by narrowing our span of control. We zero in on achieving order with the things we feel lie within our capabilities. For the average aging guy, this typically manifests itself in obsessions with weed-free lawns, maniacally over-organized garages or driveways free of grease spots. I aspire to achieve at least one of these things before I die.

But along with this obsessive need for order somewhere in our narrowing universe, there’s also a recognition that time is no longer an unlimited commodity for us. Some of us feel we need to leave something meaningful behind. More than a few of us older dudes become obsessed with creating our magnum opus.

Take Albert Einstein, for example. In 1905, which would be known as his annus mirabilis (miracle year), Einstein produced four papers that redefined physics as we knew it. One of them was the paper on special relativity. Einstein was just 26 years old.

As stunning as his achievements were that year, they were not what he wanted to leave as his definitive legacy. He would live another 50 years, until 1955, and spent a good portion of the last half of his life chasing a Unified Field Theory that he hoped would somehow reconcile the explosion of contradiction that came with the emergence of quantum mechanics. He would never be successful in doing so.

In his 1940 essay, ‘A Mathematician’s Apology,’ G.H. Hardy asserted that mathematics was “a young man’s game” and that mathematical ability declined as one got older. By extension, conventional wisdom would have you believe that the same holds true for science — primarily the so-called ‘hard’ sciences like chemistry, biology and especially physics.

Philosophy – on the other hand – is typically a path that doesn’t reach its peak until much later in life. This is true for most of what are called the “soft” sciences, including political science, economics and sociology.

In an admittedly limited but interesting analysis, author and programmer Mark Jeffrey visualized the answer to the question: “At what age do we do our greatest work?” In fields like mathematics and physics, notable contributors hit their peak in their mid-30s. But in philosophy, literature, art and even architecture, the peak of those included came a decade or two later. As Jeffrey notes, his methodology probably skewed results to the younger side.

This really comes down to two different definitions of intelligence: pure cognitive processing power, and an ability to synthesize input from the world around us and – hopefully – add some wisdom to the mix. Some disciplines need a little seasoning – a little experience and perspective. This difference in the nature of our intelligence really drives the age-old debate between hard sciences and soft sciences, as a post from Utah State University explains:

“Hard sciences use math explicitly; they have more control over the variables and conclusions. They include physics, chemistry and astronomy. Soft sciences use the process of collecting empirical data then use the best methods possible to analyze the information. The results are more difficult to predict. They include economics, political science and sociology.”

In this explanation, you’ll notice a thread I’ve plucked at before, the latest being my last post about Elon Musk and his takeover of Twitter: hard sciences focus on complicated problems and soft sciences look at complex problems. People who are “geek smart” and good at complicated problems tend to peak earlier than those who are willing to tackle complex problems. You’re born with “smart” – but you have to accumulate “wisdom” over your life.

Now, I certainly don’t intend to put myself in the august company quoted above. My path has been infinitesimally consequential compared to, say, Albert Einstein’s. But still, I think I get where Einstein was trying to go when he became obsessed with trying to (literally) bring some order to the universe.

For myself, I have spent much of the last decade or so trying to understand the thorny entanglement of technology and human behavior. I have watched digital technology seep into every aspect of our experience.

And I’m worried. I’m worried because I think this push of technology has been powered by a cabal of those who are “geek smart” but lack the wisdom or humility to ponder the unintended consequences of what they are unleashing. If I’ve gathered even a modicum of the type of intelligence required to warn of what may lie on the path ahead, I think I have to keep doing so, even if it takes another million words – give or take.

I am Generation Jones

I was born in 1961. I always thought that technically made me a baby boomer. But I recently discovered that I am, in fact, part of Generation Jones.

If you haven’t heard of that term (as I had not, until I read a post on it a few weeks ago), Generation Jones refers to people born from 1955 to 1964 — a cusp generation squeezed between the massive boomer block and Gen X.

That squares with me. I always somehow knew I wasn’t really a boomer, but I also knew I wasn’t Gen X. And now I know why. I, along with Barack Obama and Wayne Gretzky, was squarely in the middle of Generation Jones.

I always felt the long shadow of World War II defined baby boomers, but it didn’t define me. My childhood felt like eons removed from the war. Most of the more-traumatic wounds had healed by the time I was riding my trike through the relatively quiet suburban streets of Calgary, Alberta.

I didn’t appreciate the OK Boomer memes, not because I was the butt of them, but more because I didn’t really feel they applied to me. They didn’t hit me where I live. It was like I was winged by a shot meant for someone else.

OK Boomer digs didn’t really apply to my friends and contemporaries either, all of whom are also part of Generation Jones. For the most part, we’re trying to do our best dealing with climate change, racial inequality, more fluid gender identification and political polarization. We get it. Is there entitlement? Yeah, more than a little. But we’re trying.

And I also wasn’t part of Gen X. I wasn’t a latchkey kid. My parents didn’t obsess over the almighty dollar, so I didn’t feel a need to push back against it. My friends and I worked a zillion hours, because we were — admittedly — still materialistic. But it was a different kind of materialism, one edged with more than a little anxiety.

I hit the workforce in the early ‘80s, right in the middle of a worldwide recession. Generation Jones certainly wanted to get ahead, but we also wanted to keep our jobs, because if we lost them, there was no guarantee we’d find another.

When boomers were entering the workforce, through the 1970s, Canada’s unemployment rate hovered in the 6% to 8% range (U.S. numbers varied but roughly followed the same pattern). In 1982, the year I tried to start my career, it suddenly shot up to 13%. Through the ‘80s, as Gen X started to get their first jobs, it declined again to the 8% range. Generation Jones started looking for work just when a job was historically the hardest to find.

It wasn’t just the jobless rate. Interest rates also skyrocketed to historic levels in the early ’80s. Again, using data from the Bank of Canada, its benchmark rate peaked at an astronomical 20.78% the same month I turned 20, in 1981. Not only could we not find a job, we couldn’t have afforded credit even if we could get one.

So yes, we were trying to keep up with the Joneses — this is where the name for our generation comes from, coined by social commentator Jonathon Pontell — but it wasn’t all about getting ahead. A lot of it was just trying to keep our heads above water.

We were a generation moving into adulthood at the beginning of HIV/AIDS, Reaganomics, globalization and the mass deindustrialization of North America. All the social revolutions of the ’60s and ’70s had crystallized to the point where they now had real-world consequences. We were figuring out a world that seemed to be pivoting sharply.

As I said, I always felt that I was somewhat accidentally lodged between baby boomer and Gen X, wading my way through the transition.

Part of that transition involved the explosion of technology that became much more personal at the beginning of the 1980s. To paraphrase Shakespeare in “Twelfth Night”: Some are born with technology, some achieve technology, and some have technology thrust upon them.

Generation Jones is in the last group.

True boomers could make the decision to ignore technology and drift through life just adopting what they absolutely had to. Gen X grew up with the rudiments of technology, making it more familiar territory for them. The leading edge of that generation started entering the workforce in the mid-’80s. Computers were becoming more common. The Motorola “brick” cellphone had debuted. Technology was becoming ubiquitous – unable to be ignored.

But we were caught in between. We had to make a decision: Do we embrace technology, or do we fight against it? A lot of that decision depended on what we wanted to do for a living. Through the ‘80s, one by one, industries were being transformed by computers and digitalization.

Often, we of Generation Jones got into our first jobs working on the technology of yesterday — and very early in our careers, we were forced to adopt the technologies of tomorrow.

I started as a radio copywriter in 1982, and my first ads were written on an IBM Selectric and produced by cutting and patching two-track audio tape together on a reel-to-reel machine with razor blades and splicing tape. Just a few years later, I was writing on an Apple IIe, and ads were starting to be recorded digitally. That shift in technology happened just when our generation was beginning our careers. Some of us went willingly, some of us went kicking and screaming.

This straddling of two very different worlds seems to define my generation. I think, with the hindsight of history, we will identify the early ’80s as a period of significant transition in almost every aspect of our culture. Obviously, all generations had to navigate that transition, but for Generation Jones, that period just happened to coincide with what is typically the biggest transition for anyone in any generation: the passing from childhood to adulthood. It is during this time that we take the experiences of growing up and crystallize them into the foundations of who we will be for the rest of our lives.

For Generation Jones, those foundations had to be built on the fly, as the ground kept moving beneath our feet.

Same War, Different World?

I suspect if you checked Putin’s playbook for the Ukraine invasion, it would be stale-dated by at least six decades — and possibly more.

Putin wants territory. This invasion is a land grab. And his justification, outlined in a speech he gave on February 21, is that Ukraine was never really a country, it was just an orphaned part of Russia that should be brought back home, by force if necessary:

“Ukraine is not just a neighboring country for us. It is an inalienable part of our own history, culture and spiritual space,” he said, per the Kremlin’s official translation. “Since time immemorial, the people living in the south-west of what has historically been Russian land have called themselves Russians.”

Those words sound eerily familiar. In fact, here’s another passage that follows exactly the same logic:

“German-Austria must return to the great German motherland, and not because of economic considerations of any sort. No, no: even if from the economic point of view this union were unimportant, indeed, if it were harmful, it ought nevertheless to be brought about. Common blood belongs in a common Reich.”

That was written in 1925 by Adolf Hitler, while in prison. It’s an excerpt from “Mein Kampf.” Thirteen years later, Hitler brought Austria back to Germany with the Anschluss, under threat of invasion.

Both strategies — which are essentially the same strategy — come from the nationalism handbook. Despite knee-jerk spasms of alt-right nationalism that have appeared around the globe, including here in North America, I must believe that our world is not the same as it was a century ago.

Then, nationalism was still very much THE play in the political playbook. Power was derived from holding territory. The more you held, the greater your power. The world was anchored by the physical, which provided both resources and constraints.

You protected what you held with fortified borders. You restricted what went back and forth across those borders. The interests of those inside the borders superseded whatever lay outside them.

Trade was a different animal then. It occurred within the boundaries of an empire. Colonies provided the raw resources to the Mother Country. But two world wars decisively marked the end of that era.

The McDonald’s Theory of War

After that, the globe was redefined. Nations coalesced into trading blocs. Success came from the ease of exchange across borders. Nationalism was no longer the only game in town. In fact, it seemed to be a relic of a bygone era. Pulitzer Prize-winning columnist Thomas Friedman wrote an essay in 1996 that put forward a new theory: “So I’ve had this thesis for a long time and came here to Hamburger University at McDonald’s headquarters to finally test it out. The thesis is this: No two countries that both have a McDonald’s have ever fought a war against each other.”

It was a nice theory, but the Russia-Ukraine conflict seems to have put the final nail in its coffin. Both countries have hundreds of McDonald’s. Even Thomas Friedman has had to note that his theory may no longer be valid.

Or is it? Perhaps this will be the exception that proves Friedman right.

In essence, the global economy is a network that relies on trust. If Friedman was right about his theory, repeated in his 2005 book “The World is Flat,” the world is not only flat, it’s also surprisingly small. To trade with another country, you don’t have to be best friends, you just have to make sure you don’t get stabbed in the back. And to be sure of that, you have to know who you’re dealing with.

China is an example. Politically, we don’t see eye-to-eye on many things, but there is a modicum of trust that allows us to swap several billion dollars’ worth of stuff every year. The trick of trade is knowing where the line is: the point at which you piss off your partner so badly that they pack up their toys and go home.

Putin just rolled his tanks right over that line. He has doubled down on the bet that nationalism is still a play that can win. But if it does, it will reverse a historic trend that has been centuries in the making — a trend toward cooperation and trust, and away from protectionism and parochial thinking.

This is a war that — initially, anyway — seems to be playing out unlike any war in the past.

It’s being covered differently. As Maarten Albarda poignantly shared, we are getting reports directly from real people living through an unreal situation.

It is being fought differently. Nations and corporations are economically shunning Russia and its people. Russian athletes have been banned from international sporting events. We have packed up our toys and gone home.

We are showing our support for Ukraine differently. As one example, thousands of people are booking Airbnbs in Ukraine with no intention of ever going there. It’s just one way to leverage a tool to funnel funds directly to people who need them.

And winning this war will also be defined differently. Even if Putin is successful in annexing Ukraine, he will have isolated himself on the world stage. He will have also done the impossible: unified the West against him. He has essentially swapped whatever trust Russia did have on the world stage for territory. By following an out-of-date playbook, he may end up with a win that will cost Russia more than it could ever imagine.

The Joe Rogan Experiment in Ethical Consumerism

We are watching an experiment in ethical consumerism take place in real time. I’m speaking of the Joe Rogan/Neil Young controversy that’s happening on Spotify. I’m sure you’ve heard of it, but if not, Canadian musical legend Neil Young had finally had enough of Joe Rogan’s spreading of COVID misinformation on his podcast, “The Joe Rogan Experience.” He gave Spotify an ultimatum: “You can have Rogan or Young. Not both.”

Spotify chose Rogan. Young pulled his library. Since then, a handful of other artists have followed Young, including former bandmates David Crosby, Stephen Stills and Graham Nash, along with fellow Canuck Hall of Famer Joni Mitchell.

But it has hardly been a stampede. One of the reasons is that — if you’re an artist — leaving Spotify is easier said than done. In an interview with Rolling Stone, Rosanne Cash said most artists don’t have the luxury of jilting Spotify: 

“It’s not viable for most artists. The public doesn’t understand the complexities. I’m not the sole rights holder to my work… It’s not only that a lot of people who aren’t rights holders can’t remove their work. A lot of people don’t want to. These are the digital platforms where they make a living, as paltry as it is. That’s the game. These platforms own, what, 40 percent of the market share?”

Cash also brings up a fundamental issue with capitalism: it follows profit, and it’s consumers who determine what’s profitable. Consumers make decisions based on self-interest: what’s in it for them. Corporations use that predictable behavior to make the biggest profit possible. That behavior has been perfectly predictable for hundreds of years. It’s the driving force behind Adam Smith’s Invisible Hand. It was also succinctly laid out by economist Milton Friedman in 1970:

“There is one and only one social responsibility of business–to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game, which is to say, engages in open and free competition without deception or fraud.”

We all want corporations to be warm and fuzzy — but it’s like wishing a shark were a teddy bear. It just ain’t gonna happen.

One artist who indulged in this wishful thinking, and who also pulled his music from Spotify, was a somewhat less well-known Canadian: Ontario singer/songwriter Danny Michel. He told the CBC:

“But for me, what it was was seeing how Spotify chose to react to Neil Young’s request, which was, you know: You can have my music or Joe. And it seems like they just, you know, got out a calculator, did some math, and chose to let Neil Young go. And they said, clear and loud: We don’t need you. We don’t need your music.”

Well, yes, Danny, I’m pretty sure that’s exactly what Spotify did. It made a decision based on profit. For one thing, Joe Rogan is exclusive to Spotify. Neil Young isn’t. And Rogan produces a podcast, which can have sponsors. Neil Young’s catalog of songs can’t be brought to you by anyone.

That makes Rogan a much better bet for revenue generation. That’s why Spotify paid Rogan $100 million. Music journalist Ted Gioia made the business case for the Rogan deal pretty clear in a tweet:

“A musician would need to generate 23 billion streams on Spotify to earn what they’re paying Joe Rogan for his podcast rights (assuming a typical $.00437 payout per stream). In other words, Spotify values Rogan more than any musician in the history of the world.”
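
Gioia’s figure holds up as rough arithmetic. Here is the quick calculation, using the reported $100 million deal value and the per-stream payout he assumes in the tweet:

```python
# Rough check of Ted Gioia's tweet: streams needed to match the reported
# $100 million Rogan deal at an assumed payout of $0.00437 per stream.
rogan_deal = 100_000_000        # reported deal value, in dollars
payout_per_stream = 0.00437     # per-stream payout assumed in the tweet

streams_needed = rogan_deal / payout_per_stream
print(f"{streams_needed / 1e9:.1f} billion streams")   # ~22.9 billion
```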

I hate to admit that Milton Friedman is right, but he is. I’ve said it time and time again: to expect corporations to put ethics ahead of profits is to ignore the DNA of a corporation. Spotify is doing what corporations will always do: strive to be profitable. The decision between Rogan and Young was made with a calculator. And for Danny Michel to expect anything else from Spotify is simply naïve. If we’re going to play this ethical capitalism game, we must realize what the rules of engagement are.

But what about us? Are we any better than the corporations we keep putting our faith in?

We have talked about how we consumers want to trust the brands we deal with, but when a corporation drops the ethics ball, do we really care? We have been gnashing our teeth about Facebook’s many, many indiscretions for years now, but how many of us have quit Facebook? I know I haven’t.

I’ve seen some social media buzz about migrating from Spotify to another service. I personally have started down this road. Part of it is because I agree with Young’s stand. But I’ll be brutally honest here. The bigger reason is that I’m old and I want to be able to continue to listen to the Young, Mitchell and CSNY catalogs. As one of my contemporaries said in a recent post, “Neil Young and Joni Mitchell? Wish it were artists who are _younger_ than me.”

A lot of pressure is put on companies to be ethical, with no real monetary reasons why they should be. If we want ethics from our corporations, we have to make it important enough to us to impact our own buying decisions. And we aren’t doing that — not in any meaningful way.

I’ve used this example before, but it bears repeating. We all know how truly awful and unethical caged egg production is. The birds are kept in what are known as battery cages, each holding 5 to 10 birds, with each bird confined to a space of about 67 square inches. To help you visualize that, it’s just a bit bigger than a standard piece of paper folded in half. This is the hell we inflict on other animals solely for our own gain. No one can be for this. Yet 97% of us buy these eggs, just because they’re cheaper.

If we’re looking for ethics, we have to look in other places than brands. And — much as I wish it were different — we have to look beyond consumers as well. We have proven time and again that our convenience and our own self-interest will always come ahead of ethics. We might wish that were different, but our spending patterns say otherwise.

I Was So Wrong in 1996…

It’s that time of year – the time when we sprain our neck trying to look backwards and forwards at the same time. Your email inbox, like mine, is probably crammed with 2021 recaps and 2022 predictions.

I’ve given up on predictions. I have a horrible track record. In just a few seconds, I’ll tell you how horrible. But here, at the beginning of 2022, I will look back. And I will substantially overshoot “a year in review” by going all the way back to 1996, 26 years ago. Let me tell you why I’m in the mood for some reminiscing.

In amongst the aforementioned “look back” and “look forward” items I saw recently, there was something else that hit my radar: a number of companies looking for SEO directors. After being out of the industry for almost 10 years, I was mildly surprised that SEO still seemed to be a rock-solid career choice. And that brings me both to my story about 1996 and to what was probably my worst prediction about the future of digital marketing.

It was in late 1996 that I first started thinking about optimizing sites for the search engines and directories of the time: Infoseek, Yahoo, Excite, Lycos, Altavista, Looksmart and Hotbot. Early in 1997 I discovered Danny Sullivan’s Webmaster’s Guide to Search Engines. It was revelatory. After much trial and error, I was reasonably certain I could get sites ranking for pretty much any term. We had our handful of local clients ranking on Page One of those sites for terms like “boats,” “hotels,” “motels,” “men’s shirts” and “Ford Mustang.” It was the Wild West. Small and nimble web start-ups were routinely kicking Fortune 500 ass on the digital frontier.

As a local agency that had played around with web design while doing traditional marketing, I was intrigued by this opportunity. Somewhere near the end of 1997 I wrote an internal manifesto where I speculated on the future of this “Internet” thing and what it might mean for our tiny agency (I had just brought on board my eventual partner, Bill Barnes, and we had one other full-time employee). I wish I could find that original document, but I remember saying something to the effect of, “This search engine opportunity will probably only last a year or two until the engines crack down and close the loopholes.” Given that, we decided to go for broke and seize that opportunity.

In 1998 we registered the domain www.searchengineposition.com. This was a big step. If you could get your main keywords in your domain name, it virtually guaranteed you link juice. At that time, “Search engine optimization” hadn’t emerged as the industry label. Search engine positioning was the more common term. We couldn’t get www.searchenginepositioning.com because domain names were limited by the number of characters you could use.

We had our domain and soon we had a site. We needed all the help we could get, because according to my prediction, we only had until 2000 or so to make as much as we could from this whole “search thing.” The rest, as they say, was history. It just wasn’t the history I had predicted.

To be fair, I wasn’t the only one making shitty predictions at the time. In 1995, 3Com co-founder Robert Metcalfe (also the co-inventor of Ethernet) said in a column in InfoWorld:

“Almost all of the many predictions now being made about 1996 hinge on the Internet’s continuing exponential growth. But I predict the Internet, which only just recently got this section here in InfoWorld, will soon go spectacularly supernova and in 1996 catastrophically collapse.”

And in 1998, Nobel Prize-winning economist Paul Krugman said:

“The growth of the Internet will slow drastically, as the flaw in ‘Metcalfe’s law’ becomes apparent: most people have nothing to say to each other! By 2005, it will become clear that the Internet’s impact on the economy has been no greater than the fax machine’s”

Both of those people were way smarter than I was, so if I was clueless about the future, at least I was in good company.

As we now know, SEO would be fine, thank you very much. In 2004, some 6 years later, in my very first post for MediaPost, I wrote:

“I believe years from now that…2004 … will be a milestone in the (Search) industry. I think it will mark the beginning of a year that will dramatically alter the nature of search marketing.”

That prediction, as it turned out, was a little more accurate. In 2004, Google’s AdWords program really hit its stride, doubling revenue from $1.5 billion the previous year to $3 billion and starting its hockey-stick climb up to its current level, just south of $150 billion (in 2020).

The reason search – and organic search optimization – never fizzled out was that it was a fundamental connection between user intent and the ever-expanding ocean of available content. Search engine optimization turned out to be a much better label for the industry than search engine positioning, despite my unfortunate choice of domain names. The latter was really an attempt to game the algorithms. The former was about making sure content was findable and indexable. Hindsight has shown that it was a much more sustainable approach.

I ended that first post talking about the search industry of 2004 by saying,

“And to think, one day I’ll be able to say I was there.”

I guess today is that day.

Respecting The Perspective Of Generations

We spend most of our time talking to people who are approximately our age. Our social circle naturally forms from those who were born in the same era as us. We just have a lot more in common with them. And that may not be a good thing. I just turned 60, and one of the things I’m spending more time doing is speaking to people in the generation before me and the generation after me.

Each of us becomes a product of the environment where we grew up. It gives us a perspective that shapes the reality we live in, for good or bad. Sometimes that causes frustrations when we interact with those who grew up in a different generation. We just don’t see the world the same way.

And that’s OK. In fact, as I’ve learned from my intergenerational discussions, it can be tremendously valuable. We just have to accept it for what it is.

Take the generation after me — that of my nieces, nephews, and my own children. Armed with determination, energy, and a belief that the world not only should be better but must be better, they are going forward trying to find the shortest point between today and the tomorrow they’re fighting for. For them, there is not a moment to lose.

And they’re right. The sooner we get there, the better it will be for all of us.

As hard as it might be for them to believe, I was once among them. I remember having the righteousness of youth, when what was right and what was wrong was so clearly delineated in my own head. I remember being frustrated with my own parents and grandparents, who seemed so stuck in a world no longer relevant or correct. I remember reprimanding them – seldom patiently – when they said something that was no longer acceptable in the more politically correct world of the 1980s.

But — in the blink of an eye — it’s now some 40 years later. And now, it’s my turn to be corrected.

I accept that. I’m unlearning a lot. I believe the world is a better place than the one I grew up in, so I’m willing to do the work necessary to change my perspective. The world is a more tolerant, fairer, more equitable place. It’s a long way from being good enough, but I do believe it’s heading in the right direction. And when I’m corrected, I know the generation that follows me is usually right. I am literally changing my mind — and that’s not easy.

But I’m also learning to value the perspective of the generation that came before me — the one I was once so quick to dismiss. I’m working to understand the environment they grew up in and the life experiences that shaped their reality. What was the context that ground the lens they see life through? If we are willing to understand that, it can teach us a lot.

Recently, I’ve been spending a lot of my time talking to a generation born during or just before WWII in Italy. Many of them came from the south of Italy. Most of them were left with nothing after the war. The lives of their parents — their possessions, their livelihood, their communities, everything they knew — were trampled underfoot as the battle spread up the boot of Italy for two long years, from July 1943 to May 1945. When the dust and debris finally settled, they emigrated, continuing one of the greatest diasporas in history, because they had no other choice. You don’t leave home until there is no longer a future there to be imagined, no matter how hard you try.

Before we dismiss the perspectives that come from this generation, we have to take a long moment to appreciate the reality that formed their perspective. It is a reality that most of us have never experienced or even imagined. It is a reality that belongs not only to Italians, but almost every immigrant who left the lives they knew behind.

In my conversations with people who came from this reality, attitudes emerge that definitely don’t always fit well in today’s world. They have learned by hard experience that shit can and does happen. Their trust is hard-won. There is a suspicion of people who come from outside the circle of family and friends. There is a puzzlement with the latest cause that is burning up our social media feed. And yes, there is some cultural baggage that might best be left behind.

But there is also a backbone of courage, a long-simmering determination and a pragmatic view of the future that can be admired, and — if we take the time to listen — should be heeded. While the generation after me is rushing into their life in this world, the generation before me is limping out of it. Both perspectives are enlightening and should be considered. I am stuck in the middle. And I’m finding it’s not a bad place to be, as long as I keep looking both ways.

As any navigator can tell you, it’s much easier to pinpoint your location when you have a few different bearings available. This cross-generational view has long been embedded in Iroquois tradition, where it’s known as the Seven Generation principle: “The thickness of your skin shall be seven spans.”

The saying is commonly interpreted as looking forward to create a sustainable future for seven generations. But indigenous activist Vine Deloria Jr. had a different interpretation: that we must honor and protect the seven generations closest to us. Counting ourselves as one of those, we then look back three generations and forward three. We should make our decisions based on an approximately 150-year time span, looking 75 years forward and 75 years back.

In our culture, we take a much shorter view of things. In doing that, we can often lose our bearings.