I am Generation Jones

I was born in 1961. I always thought that technically made me a baby boomer. But I recently discovered that I am, in fact, part of Generation Jones.

If you haven’t heard of that term (as I had not, until I read a post on it a few weeks ago), Generation Jones refers to people born from 1955 to 1964 — a cusp generation squeezed between the massive boomer block and Gen X.

That squares with me. I always somehow knew I wasn’t really a boomer, but I also knew I wasn’t Gen X. And now I know why. I, along with Barack Obama and Wayne Gretzky, was squarely in the middle of Generation Jones.

I always felt the long shadow of World War II defined baby boomers, but it didn’t define me. My childhood felt like eons removed from the war. Most of the more-traumatic wounds had healed by the time I was riding my trike through the relatively quiet suburban streets of Calgary, Alberta.

I didn’t appreciate the OK Boomer memes, not because I was the butt of them, but more because I didn’t really feel they applied to me. They didn’t hit me where I live. It was like I was winged by a shot meant for someone else.

OK Boomer digs didn’t really apply to my friends and contemporaries either, all of whom are also part of Generation Jones. For the most part, we’re trying to do our best dealing with climate change, racial inequality, more fluid gender identification and political polarization. We get it. Is there entitlement? Yeah, more than a little. But we’re trying.

And I also wasn’t part of Gen X. I wasn’t a latchkey kid. My parents didn’t obsess over the almighty dollar, so I didn’t feel a need to push back against it. My friends and I worked a zillion hours, because we were — admittedly — still materialistic. But it was a different kind of materialism, one edged with more than a little anxiety.

I hit the workforce in the early ‘80s, right in the middle of a worldwide recession. Generation Jones certainly wanted to get ahead, but we also wanted to keep our jobs, because if we lost them, there was no guarantee we’d find another.

When boomers were entering the workforce, through the 1970s, Canada’s unemployment rate hovered in the 6% to 8% range (U.S. numbers varied but roughly followed the same pattern). In 1982, the year I tried to start my career, it suddenly shot up to 13%. Through the ‘80s, as Gen X started to get their first jobs, it declined again to the 8% range. Generation Jones started looking for work just when a job was historically the hardest to find.

It wasn’t just the jobless rate. Interest rates also skyrocketed to historic levels in the early ‘80s. Again, using data from the Bank of Canada: its benchmark rate peaked at an astronomical 20.78% the same month I turned 20, in 1981. Not only could we not find jobs, we couldn’t have afforded credit even if we could.
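
To put a 20.78% benchmark in perspective, here is a rough sketch of what rates like that do to a loan payment. The $100,000 principal and 25-year amortization are hypothetical round numbers for illustration, and the Bank of Canada benchmark was not itself a consumer lending rate, though consumer rates broadly tracked it:

```python
# Illustrative only: the standard amortization formula, applied to a
# hypothetical $100,000 loan over 25 years at two interest rates.

def monthly_payment(principal: float, annual_rate: float, years: int) -> float:
    """Monthly payment: P * r / (1 - (1 + r)^-n), with monthly rate r and n payments."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

print(f"At 20.78%: ${monthly_payment(100_000, 0.2078, 25):,.0f}/month")  # ~$1,742
print(f"At  5.00%: ${monthly_payment(100_000, 0.05, 25):,.0f}/month")    # ~$585
```

Roughly triple the payment on the same loan: that is the difference between credit you can afford and credit you can’t.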

So yes, we were trying to keep up with the Joneses — this is where the name for our generation comes from, coined by social commentator Jonathan Pontell — but it wasn’t all about getting ahead. A lot of it was just trying to keep our heads above water.

We were a generation moving into adulthood at the beginning of HIV/AIDS, Reaganomics, globalization and the mass deindustrialization of North America. All the social revolutions of the ‘60s and ‘70s had crystallized to the point where they now had real-world consequences. We were figuring out a world that seemed to be pivoting sharply.

As I said, I always felt that I was somewhat accidentally lodged between baby boomer and Gen X, wading my way through the transition.

Part of that transition involved the explosion of technology that became much more personal at the beginning of the 1980s. To paraphrase Shakespeare in “Twelfth Night”: Some are born with technology, some achieve technology, and some have technology thrust upon them.

Generation Jones is in the last group.

True boomers could make the decision to ignore technology and drift through life, adopting only what they absolutely had to. Gen X grew up with the rudiments of technology, making it more familiar territory for them. The leading edge of that generation started entering the workforce in the mid-‘80s. Computers were becoming more common. The Motorola “brick” cellphone had debuted. Technology was becoming ubiquitous – impossible to ignore.

But we were caught in between. We had to make a decision: Do we embrace technology, or do we fight against it? A lot of that decision depended on what we wanted to do for a living. Through the ‘80s, one by one, industries were being transformed by computers and digitalization.

Often, we of Generation Jones got into our first jobs working on the technology of yesterday — and very early in our careers, we were forced to adopt the technologies of tomorrow.

I started as a radio copywriter in 1982, and my first ads were written on an IBM Selectric and produced by cutting and patching two-track audio tape together on a reel-to-reel machine with razor blades and splicing tape. Just a few years later, I was writing on an Apple IIe, and ads were starting to be recorded digitally. That shift in technology happened just when our generation was beginning our careers. Some of us went willingly, some of us went kicking and screaming.

This straddling of two very different worlds seems to define my generation. I think, with the hindsight of history, we will identify the early ‘80s as a period of significant transition in almost every aspect of our culture. Obviously, all generations had to navigate that transition, but for Generation Jones, that period just happened to coincide with what is typically the biggest transition for anyone in any generation: the passing from childhood to adulthood. It is during this time that we take the experiences of growing up and crystallize them into the foundations of who we will be for the rest of our lives.

For Generation Jones, those foundations had to be built on the fly, as the ground kept moving beneath our feet.

Same War, Different World?

I suspect if you checked Putin’s playbook for the Ukraine invasion, it would be stale-dated by at least six decades — and possibly more.

Putin wants territory. This invasion is a land grab. And his justification, outlined in a speech he gave on February 21, is that Ukraine was never really a country; it was just an orphaned part of Russia that should be brought back home, by force if necessary:

“Ukraine is not just a neighboring country for us. It is an inalienable part of our own history, culture and spiritual space,” he said, per the Kremlin’s official translation. “Since time immemorial, the people living in the south-west of what has historically been Russian land have called themselves Russians.”

Those words sound eerily familiar. In fact, here’s another passage that follows exactly the same logic:

“German-Austria must return to the great German motherland, and not because of economic considerations of any sort. No, no: even if from the economic point of view this union were unimportant, indeed, if it were harmful, it ought nevertheless to be brought about. Common blood belongs in a common Reich.”

That was written in 1925 by Adolf Hitler, while in prison. It’s an excerpt from “Mein Kampf.” Thirteen years later, Hitler brought Austria back to Germany with the Anschluss, under threat of invasion.

Both strategies — which are essentially the same strategy — come from the nationalism handbook. Despite knee-jerk spasms of alt-right nationalism that have appeared around the globe, including here in North America, I must believe that our world is not the same as it was a century ago.

Then, nationalism was still very much THE play in the political playbook. Power was derived from holding territory. The more you held, the greater your power. The world was anchored by the physical, which provided both resources and constraints.

You protected what you held with fortified borders. You restricted what went back and forth across those borders. The interests of those inside the borders superseded whatever lay outside them.

Trade was a different animal then. It occurred within the boundaries of an empire. Colonies provided the raw resources to the Mother Country. But two world wars decisively marked the end of that era.

The McDonald’s Theory of War

After that, the globe was redefined. Nations coalesced into trading blocs. Success came from the ease of exchange across borders. Nationalism was no longer the only game in town. In fact, it seemed to be a relic of a bygone era. Pulitzer Prize-winning columnist Thomas Friedman wrote an essay in 1996 that put forward a new theory: “So I’ve had this thesis for a long time and came here to Hamburger University at McDonald’s headquarters to finally test it out. The thesis is this: No two countries that both have a McDonald’s have ever fought a war against each other.”

It was a nice theory, but the Russia-Ukraine conflict seems to have put the final nail in its coffin. Both countries have hundreds of McDonald’s. Even Thomas Friedman has had to note that his theory may no longer be valid.

Or is it? Perhaps this will be the exception that proves Friedman right.

In essence, the global economy is a network that relies on trust. If Friedman was right about his theory, repeated in his 2005 book “The World Is Flat,” the world is not only flat, it’s also surprisingly small. To trade with another country, you don’t have to be best friends, you just have to make sure you don’t get stabbed in the back. And to be sure of that, you have to know who you’re dealing with.

China is an example. Politically, we don’t see eye-to-eye on many things, but there is a modicum of trust that allows us to swap several billion dollars’ worth of stuff every year. The trick of trade is knowing where the line is: the point at which you piss off your partner so much that they pack up their toys and go home.

Putin just rolled his tanks right over that line. He has doubled down on the bet that nationalism is still a play that can win. But if it does, it will reverse a historic trend that has been centuries in the making — a trend toward cooperation and trust, and away from protectionism and parochial thinking.

This is a war that — initially, anyway — seems to be playing out unlike any war in the past.

It’s being covered differently. As Maarten Albarda poignantly shared, we are getting reports directly from real people living through an unreal situation.

It is being fought differently. Nations and corporations are economically shunning Russia and its people. Russian athletes have been banned from international sporting events. We have packed up our toys and gone home.

We are showing our support for Ukraine differently. As one example, thousands of people are booking Airbnbs in Ukraine with no intention of ever going there. It’s just one way to leverage a tool to funnel funds directly to the people who need them.

And winning this war will also be defined differently. Even if Putin is successful in annexing Ukraine, he will have isolated himself on the world stage. He will have also done the impossible: unified the West against him. He has essentially swapped whatever trust Russia did have on the world stage for territory. By following an out-of-date playbook, he may end up with a win that will cost Russia more than it could ever imagine.

The Joe Rogan Experiment in Ethical Consumerism

We are watching an experiment in ethical consumerism take place in real time. I’m speaking of the Joe Rogan/Neil Young controversy that’s happening on Spotify. I’m sure you’ve heard of it, but if not, Canadian musical legend Neil Young had finally had enough of Joe Rogan’s spreading of COVID misinformation on his podcast, “The Joe Rogan Experience.” He gave Spotify an ultimatum: “You can have Rogan or Young. Not both.”

Spotify chose Rogan. Young pulled his library. Since then, a handful of other artists have followed Young, including former bandmates David Crosby, Stephen Stills and Graham Nash, along with fellow Canuck Hall of Famer Joni Mitchell.

But it has hardly been a stampede. One of the reasons is that — if you’re an artist — leaving Spotify is easier said than done. In an interview with Rolling Stone, Rosanne Cash said most artists don’t have the luxury of jilting Spotify: 

“It’s not viable for most artists. The public doesn’t understand the complexities. I’m not the sole rights holder to my work… It’s not only that a lot of people who aren’t rights holders can’t remove their work. A lot of people don’t want to. These are the digital platforms where they make a living, as paltry as it is. That’s the game. These platforms own, what, 40 percent of the market share?”

Cash also brings up a fundamental issue with capitalism: it follows profit, and it’s consumers who determine what’s profitable. Consumers make decisions based on self-interest: what’s in it for them. Corporations use that predictable behavior to make the biggest profit possible. That behavior has been perfectly predictable for hundreds of years. It’s the driving force behind Adam Smith’s Invisible Hand. It was also succinctly laid out by economist Milton Friedman in 1970:

“There is one and only one social responsibility of business–to use its resources and engage in activities designed to increase its profits so long as it stays within the rules of the game, which is to say, engages in open and free competition without deception or fraud.”

We all want corporations to be warm and fuzzy — but it’s like wishing a shark were a teddy bear. It just ain’t gonna happen.

One who indulged in this wishful thinking was a somewhat less well-known Canadian artist who also pulled his music from Spotify: Ontario singer/songwriter Danny Michel. He told the CBC:

“But for me, what it was was seeing how Spotify chose to react to Neil Young’s request, which was, you know: You can have my music or Joe. And it seems like they just, you know, got out a calculator, did some math, and chose to let Neil Young go. And they said, clear and loud: We don’t need you. We don’t need your music.”

Well, yes, Danny, I’m pretty sure that’s exactly what Spotify did. It made a decision based on profit. For one thing, Joe Rogan is exclusive to Spotify. Neil Young isn’t. And Rogan produces a podcast, which can have sponsors. Neil Young’s catalog of songs can’t be brought to you by anyone.

That makes Rogan a much better bet for revenue generation. That’s why Spotify paid Rogan $100 million. Music journalist Ted Gioia made the business case for the Rogan deal pretty clear in a tweet:

“A musician would need to generate 23 billion streams on Spotify to earn what they’re paying Joe Rogan for his podcast rights (assuming a typical $.00437 payout per stream). In other words, Spotify values Rogan more than any musician in the history of the world.”
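
His arithmetic holds up. Here is a quick back-of-the-envelope check, using only the figures Gioia himself cites (the reported $100 million deal and an assumed $.00437 payout per stream):

```python
# Checking Ted Gioia's tweet against his own assumed figures.
rogan_deal = 100_000_000       # reported Spotify payment to Rogan, in dollars
payout_per_stream = 0.00437    # Gioia's assumed typical payout per stream

streams_needed = rogan_deal / payout_per_stream
print(f"{streams_needed / 1e9:.1f} billion streams")  # ~22.9 billion, i.e., his "23 billion"
```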

I hate to admit that Milton Friedman is right, but he is. I’ve said it time and time again: to expect corporations to put ethics ahead of profits is to ignore the DNA of a corporation. Spotify is doing what corporations will always do: strive to be profitable. The decision between Rogan and Young was made with a calculator. And for Danny Michel to expect anything else from Spotify is simply naïve. If we’re going to play this ethical capitalism game, we must realize what the rules of engagement are.

But what about us? Are we any better than the corporations we keep putting our faith in?

We have talked about how we consumers want to trust the brands we deal with, but when a corporation drops the ethics ball, do we really care? We have been gnashing our teeth about Facebook’s many, many indiscretions for years now, but how many of us have quit Facebook? I know I haven’t.

I’ve seen some social media buzz about migrating from Spotify to another service. I personally have started down this road. Part of it is because I agree with Young’s stand. But I’ll be brutally honest here. The bigger reason is that I’m old and I want to be able to continue to listen to the Young, Mitchell and CSNY catalogs. As one of my contemporaries said in a recent post, “Neil Young and Joni Mitchell? Wish it were artists who are _younger_ than me.”

A lot of pressure is put on companies to be ethical, with no real monetary reasons why they should be. If we want ethics from our corporations, we have to make it important enough to us to impact our own buying decisions. And we aren’t doing that — not in any meaningful way.

I’ve used this example before, but it bears repeating. We all know how truly awful and unethical caged egg production is. The birds are kept in what is known as a battery cage, holding five to 10 birds, with each confined to a space of about 67 square inches. To help you visualize that, it’s just a bit bigger than a standard piece of paper folded in half. This is the hell we inflict on other animals solely for our own gain. No one can be for this. Yet 97% of us buy these eggs, just because they’re cheaper.
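
If you want to check that visualization, the arithmetic is simple. A quick sketch, using standard US letter paper (8.5 by 11 inches) as the reference:

```python
# Comparing battery-cage space to a standard US letter sheet.
cage_space = 67               # square inches per hen, from the figure above
full_sheet = 8.5 * 11         # 93.5 square inches
half_sheet = full_sheet / 2   # 46.75 square inches

print(f"Full sheet: {full_sheet} sq in")  # 93.5
print(f"Half sheet: {half_sheet} sq in")  # 46.75
print(f"Cage space: {cage_space} sq in")  # bigger than half a sheet, smaller than a full one
```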

If we’re looking for ethics, we have to look in other places than brands. And — much as I wish it were different — we have to look beyond consumers as well. We have proven time and again that our convenience and our own self-interest will always come ahead of ethics. We might wish that were different, but our spending patterns say otherwise.

I Was So Wrong in 1996…

It’s that time of year – the time when we sprain our necks trying to look backwards and forwards at the same time. Your email inbox, like mine, is probably crammed with 2021 recaps and 2022 predictions.

I’ve given up on predictions. I have a horrible track record. In just a few seconds, I’ll tell you how horrible. But here, at the beginning of 2022, I will look back. And I will substantially overshoot “a year in review” by going all the way back to 1996, 26 years ago. Let me tell you why I’m in the mood for some reminiscing.

Amongst the aforementioned “look back” and “look forward” items I saw recently, something else hit my radar: a number of companies looking for SEO directors. After being out of the industry for almost 10 years, I was mildly surprised that SEO still seemed to be a rock-solid career choice. And that brings me both to my story about 1996 and to what was probably my worst prediction about the future of digital marketing.

It was in late 1996 that I first started thinking about optimizing sites for the search engines and directories of the time: Infoseek, Yahoo, Excite, Lycos, AltaVista, LookSmart and HotBot. Early in 1997, I discovered Danny Sullivan’s Webmaster’s Guide to Search Engines. It was revelatory. After much trial and error, I was reasonably certain I could get sites ranking for pretty much any term. We had our handful of local clients ranking on page one of those engines for terms like “boats,” “hotels,” “motels,” “men’s shirts” and “Ford Mustang.” It was the Wild West. Small and nimble web start-ups were routinely kicking Fortune 500 ass on the digital frontier.

Working at a local agency that had played around with web design while doing traditional marketing, I was intrigued by this opportunity. Somewhere near the end of 1997, I wrote an internal manifesto in which I speculated on the future of this “Internet” thing and what it might mean for our tiny agency (I had just brought on board my eventual partner, Bill Barnes, and we had one other full-time employee). I wish I could find that original document, but I remember saying something to the effect of: “This search engine opportunity will probably only last a year or two until the engines crack down and close the loopholes.” Given that, we decided to go for broke and seize that opportunity.

In 1998 we registered the domain www.searchengineposition.com. This was a big step. If you could get your main keywords into your domain name, it virtually guaranteed you link juice. At that time, “search engine optimization” hadn’t emerged as the industry label. Search engine positioning was the more common term. We couldn’t get www.searchenginepositioning.com because domain names were still limited in how many characters you could use.

We had our domain and soon we had a site. We needed all the help we could get, because according to my prediction, we only had until 2000 or so to make as much as we could from this whole “search thing.” The rest, as they say, was history. It just wasn’t the history I had predicted.

To be fair, I wasn’t the only one making shitty predictions at the time. In 1995, 3Com co-founder Robert Metcalfe (also the co-inventor of Ethernet) said in a column in InfoWorld:

“Almost all of the many predictions now being made about 1996 hinge on the Internet’s continuing exponential growth. But I predict the Internet, which only just recently got this section here in InfoWorld, will soon go spectacularly supernova and in 1996 catastrophically collapse.”

And in 1998, Nobel Prize-winning economist Paul Krugman said:

“The growth of the Internet will slow drastically, as the flaw in ‘Metcalfe’s law’ becomes apparent: most people have nothing to say to each other! By 2005, it will become clear that the Internet’s impact on the economy has been no greater than the fax machine’s.”

Both of those people were way smarter than I was, so if I was clueless about the future, at least I was in good company.

As we now know, SEO would be fine, thank you very much. In 2004, some six years later, in my very first post for MediaPost, I wrote:

“I believe years from now that…2004 … will be a milestone in the (Search) industry. I think it will mark the beginning of a year that will dramatically alter the nature of search marketing.”

That prediction, as it turned out, was a little more accurate. In 2004, Google’s AdWords program really hit its stride, doubling revenue from $1.5 billion the previous year to $3 billion and starting its hockey-stick climb up to its current level, just south of $150 billion (in 2020).

The reason search – and organic search optimization – never fizzled out was that it formed a fundamental connection between user intent and the ever-expanding ocean of available content. Search engine optimization turned out to be a much better label for the industry than search engine positioning, despite my unfortunate choice of domain names. The latter was really an attempt to game the algorithms. The former was about making sure content was findable and indexable. Hindsight has shown that to be a much more sustainable approach.

I ended that first post talking about the search industry of 2004 by saying,

“And to think, one day I’ll be able to say I was there.”

I guess today is that day.

Respecting The Perspective Of Generations

We spend most of our time talking to people who are approximately our age. Our social circle naturally forms from those who were born in the same era as us. We just have a lot more in common with them. And that may not be a good thing. I just turned 60, and one of the things I’m spending more time doing is speaking to people in the generation before me and the generation after me.

Each of us becomes a product of the environment where we grew up. It gives us a perspective that shapes the reality we live in, for good or bad. Sometimes that causes frustration when we interact with those who grew up in a different generation. We just don’t see the world the same way.

And that’s OK. In fact, as I’ve learned from my intergenerational discussions, it can be tremendously valuable. We just have to accept it for what it is.

Take the generation after me — that of my nieces, nephews, and my own children. Armed with determination, energy, and a belief that the world not only should be better but must be better, they are going forward trying to find the shortest point between today and the tomorrow they’re fighting for. For them, there is not a moment to lose.

And they’re right. The sooner we get there, the better it will be for all of us.

As hard as it might be for them to believe, I was once among them. I remember having the righteousness of youth, when what was right and what was wrong were so clearly delineated in my own head. I remember being frustrated with my own parents and grandparents, who seemed so stuck in a world no longer relevant or correct. I remember reprimanding them — seldom patiently — when they said something that was no longer acceptable in the more politically correct world of the 1980s.

But — in the blink of an eye — it’s now some 40 years later. And now, it’s my turn to be corrected.

I accept that. I’m unlearning a lot. I believe the world is a better place than the one I grew up in, so I’m willing to do the work necessary to change my perspective. The world is a more tolerant, fairer, more equitable place. It’s a long way from being good enough, but I do believe it’s heading in the right direction. And when I’m corrected, I know the generation that follows me is usually right. I am literally changing my mind — and that’s not easy.

But I’m also learning to value the perspective of the generation that came before me — the one I was once so quick to dismiss. I’m working to understand the environment they grew up in and the life experiences that shaped their reality. What was the context that ground the lens they see life through? If we are willing to understand that, it can teach us a lot.

Recently, I’ve been spending a lot of my time talking to a generation born during or just before WWII in Italy. Many of them came from the south of Italy. Most of them were left with nothing after the war. The lives of their parents — their possessions, their livelihood, their communities, everything they knew — were trampled underfoot as the battle spread up the boot of Italy for two long years, from July 1943 to May 1945. When the dust and debris finally settled, they emigrated, continuing one of the greatest diasporas in history, because they had no other choice. You don’t leave home until there is no longer a future there to be imagined, no matter how hard you try.

Before we dismiss the perspectives that come from this generation, we have to take a long moment to appreciate the reality that formed their perspective. It is a reality that most of us have never experienced or even imagined. It is a reality that belongs not only to Italians, but almost every immigrant who left the lives they knew behind.

In my conversations with people who came from this reality, attitudes emerge that definitely don’t always fit well in today’s world. They have learned by hard experience that shit can and does happen. Their trust is hard-won. There is a suspicion of people who come from outside the circle of family and friends. There is a puzzlement with the latest cause that is burning up our social media feed. And yes, there is some cultural baggage that might best be left behind.

But there is also a backbone of courage, a long-simmering determination and a pragmatic view of the future that can be admired, and — if we take the time to listen — should be heeded. While the generation after me is rushing into their life in this world, the generation before me is limping out of it. Both perspectives are enlightening and should be considered. I am stuck in the middle. And I’m finding it’s not a bad place to be, as long as I keep looking both ways.

As any navigator can tell you, it’s much easier to pinpoint your location when you have a few different bearings available. This cross-generational view has long been embedded in Iroquois tradition, where it’s known as the Seven Generation principle: “The thickness of your skin shall be seven spans.”

The saying is commonly interpreted as looking forward to create a sustainable future for seven generations. But Indigenous activist Vine Deloria Jr. had a different interpretation: that we must honor and protect the seven generations closest to us. Counting ourselves as one of those, we then look back three generations and forward three. At roughly 25 years to a generation, that means making our decisions based on an approximately 150-year time span, looking 75 years forward and 75 years back.

In our culture, we take a much shorter view of things. In doing that, we can often lose our bearings.

I’m a Fan of Friction

Here in North America, we are waging a war on friction. We use technology like a universal WD-40, spraying it on everything that rubs, squeaks or grinds. We want to move faster, more efficiently, rushing through our to-do list to get to whatever lies beyond it.

We are the culture of “one-click” ordering. We are the people who devour fast food. We relentlessly use apps to make our lives easier — which is our euphemistic way of saying that we want a life with less friction.

Pre-pandemic, I was definitely on board this bandwagon. I, like many of you, always thought friction was a bad thing. I relentlessly hunted efficiency.

This was especially true when I was still in the working world. I started every day with an impossibly long to-do list, and I was always looking for ways to help me work my way through it faster. I believed the secret of life lay at the end of my to-do list.

But in the past 14 months, I’ve discovered that it’s friction that might be the secret of life.

There are bushels of newly budding life coaches telling us to be “mindful” and “live in the moment.” But we somehow believe those moments have to all be idyllic walks through a flower garden with those we love most, as the sun filters softly through the trees overhead.

Sometimes “in the moment” is looking for sandpaper at Home Depot. Sometimes it’s dropping our coffee as we rush to catch the bus. And sometimes it’s realizing that you’re sitting next to someone you really don’t like on that five-hour flight to Boston.

All those things are “in the moment,” and maybe — just maybe — that’s what life is all about. Call it friction if you wish, but it’s all those little things we think are annoying until they’re gone.

Friction has some unique physical properties that we tend to overlook as we try to eliminate it. It is, according to one site, “resistance to motion of one object moving relative to another.” It forces us to slow down our motion, whatever direction that motion may be taking us in. And — according to the same site — scientists believe it “is the result of the electromagnetic attraction between charged particles in two touching surfaces.”

Aha, so friction is about attraction and our attempts to overcome that attraction! It is about us fighting our social instincts to bond with each other so we can keep moving to accomplish … what, exactly? Free up time to spend on Facebook? Spend more time playing a game on our phones? Will those things make us happier?

Here’s the other thing about friction. It generates heat. It warms things up. Here in North America, we call it friction. In Denmark, they call it “hygge.”

Denmark is a pretty happy place. In fact, last year it was the second happiest place on earth, according to the United Nations. And a lot of that can be attributed to what the Danish call “hygge,” which roughly translates as “cozy.”

The Danish live for coziness. And yes, the idyllic picture of hygge is spending time in front of the fire in a candlelit cabin, playing a board game with your closest friends. But hygge comes in many forms.

I personally believe that Denmark is an environment that leads to hygge because Denmark is a place that is not afraid of friction. Allow me to explain.

The ultimate way to avoid friction is to be alone. You can’t have “resistance to motion of one object moving relative to another” when there is no other object.

As we emerge from a pandemic that has necessitated removing the objects around us (people) and replacing them with more efficient, less friction-prone substitutes (technology) — whether it’s in our jobs, our daily routines, our shopping trips or our community obligations — we seem to be finding ways to continue to make the world a more efficient place for ourselves.

This is putting us at the center of an optimized universe and ruthlessly eliminating any points of resistance — a life designed by a Silicon Valley engineer. And, more and more often, we find ourselves alone at the center of that universe.

But that’s not how the Danes do it. They have created an environment that leads to bumping into each other. And hygge — with all its warm fuzziness — might just be a product of that environment. I suspect that might not be by intention. It just worked out that way. But it does seem to work.

For example, Danes spend a lot of time riding the bus. Or riding a bike. Life in Copenhagen is full of bumping along in a meandering trip together to a destination somewhere in the future. The joy is found in the journey, as noted in this Medium post.

It seems to me that life in Denmark, or other perpetually happy countries like Finland, Switzerland, Iceland and Norway, has a lot to do with slowing down and actually embracing societal friction.

We just have to realize that we as a species evolved in an environment filled with friction. And evolution, in its blind wisdom, has made that friction a key part of how we find meaning and happiness. We find hygge when we slow down enough to notice it.

COVID And The Chasm Crossing

For most of us, it’s been a year living with the pandemic. I was curious what my topic was a year ago this week: the brand crisis at a certain Mexican brewing giant, whose flagship brand was suddenly and unceremoniously linked with a global pandemic. Of course, we didn’t know back then just how “global” it would be.

Ahhh — the innocence of early 2020.

The past year will likely be an historic inflection point in many societal trend lines. We’re not sure at this point how things will change, but we’re pretty sure they will change. You can’t take what has essentially been a 12-month anomaly in everything we know as normal, plunk it down on every corner of the globe and expect everything just to bounce back to where it was.

If I could vault 10 years in the future and then look back at today, I suspect I would be talking about how our relationship with technology changed due to the pandemic. Yes, we’re all sick of Zoom. We long for the old days of actually seeing another face in the staff lunchroom. And we realize that bingeing “Emily in Paris” on Netflix comes up abysmally short of the actual experience of stepping in dog shit as we stroll along the Seine.

C’est la vie.

But that’s my point. For the past 12 months, these watered-down digital substitutes have been our lives. We were given no choice. And some of it hasn’t sucked. As I wrote last week, there are times when a digital connection may actually be preferable to a physical one.

There is now a whole generation of employees who are considering their work-life balance in the light of being able to work from home for at least part of the time. Meetings the world over are being reimagined, thanks to the attractive cost/benefit ratio of being able to attend virtually. And, for me, I may have permanently swapped spin classes in the gym for riding my bike trainer in my basement. It took me a while to get used to it, but now that I have, I think it will stick.

Getting people to try something new — especially when it’s technology — is a tricky process. There are a zillion places on the uphill slope of the adoption curve where we can get mired and give up. But, as I said, that hasn’t been an option for us in the past 12 months. We had to stick it out. And now that we have, we realize we like much of what we were forced to adopt. All we’re asking for is the freedom to pick and choose what we keep and what we toss away.

I suspect many of us will be a lot more open to using technology now that we have experienced the tradeoffs it entails between effectiveness and efficiency. We will make more room in our lives for a purely utilitarian use of technology, stripped of the pros and cons of “bright shiny object” syndrome.

Technology typically gets trapped at both the dread and pseudo-religious devotion ends of the Everett Rogers Adoption Curve. Either you love it, or you hate it. Those who love it form the market that drives the development of our technology, leaving those who hate it further and further behind.

As such, the market for technology tends to skew to the “gee whiz” end of the market, catering to those who buy new technology just because it’s new and cool. This bias has embedded an acceptance of planned obsolescence that just seems to go hand-in-hand with the marketing of technology. 

My previous post about technology leaving seniors behind is an example of this. Even if seniors start out as early adopters, the perpetual chase of the bright shiny object that typifies the tech market can leave them behind.

But COVID-19 changed all that. It suddenly forced all of us toward the hump that lies in the middle of the adoption curve. It has left the world no choice but to cross the “chasm” that Geoffrey Moore wrote about 30 years ago in his book “Crossing the Chasm: Marketing and Selling High-Tech Products to Mainstream Customers.” He explained that the chasm was between “visionaries (early adopters) and pragmatists (early majority),” according to Wikipedia.

This has some interesting market implications. After I wrote my post, a few readers reached out saying they were working on solutions that addressed the needs of seniors to stay connected with a device that is easier for them to use and is not subject to the need for constant updating and relearning. Granted, neither of them was from Apple or Google, but at least someone was thinking about it.

As the pandemic forced the practical market for technology to expand, bringing customers who had everyday needs for their technology, it created more market opportunities. Those opportunities create pockets of profit that allow for the development of tools for segments of the market that used to be ignored.

It remains to be seen if this market expansion continues after the world returns to a more physically based definition of normal. I suspect it will.

This market evolution may also open up new business model opportunities — where we’re actually willing to pay for online services and platforms that used to be propped up by selling advertising. This move alone would take technology a massive step forward in ethical terms. We wouldn’t have this weird moral dichotomy where marketers are grieving the loss of data (as fellow Media Insider Ted McConnell does in this post) because tech is finally stepping up and protecting our personal privacy.

Perhaps — I hope — the silver lining in the past year is that we will look at technology more as it should be: a tool that’s used to make our lives more fulfilling.

Connected Technologies are Leaving Our Seniors Behind

One of my pandemic projects has been editing a video series of oral history interviews we did with local seniors in my community. Last week, I finished the first video in the series. The original plan, pre-pandemic, was to unveil the video as a special event at a local theater, with the participants attending. Obviously, given our current reality, we had to change our plans.

We, like the rest of the world, moved our event online. As I started working through the logistics of this, I quickly realized something: Our seniors are on the other side of a wide and rapidly growing chasm. Yes, our society is digitally connected in ways we never were before, but those connections are not designed for the elderly. In fact, if you were looking for something that seems to be deliberately designed to disadvantage a segment of our population, it would be hard to find a better example than Internet connection and the elderly.

I have to admit, for much of the past year, I have been pretty focused on what I have sacrificed because of the pandemic. But I am still a pretty connected person. I can Zoom and have a virtual visit with my friends. If I wonder how my daughters are doing, I can instantly text them. If I miss their faces, I can FaceTime them. 

I have taken on the projects I’ve been able to do thanks to the privilege of being wired into the virtual world. I can even go on a virtual bike ride with my friends through the streets of London, courtesy of Zwift.

Yes, I have given up things, but I have also been able to find digital substitutes for many of those things. I’m not going to say it’s been perfect, but it’s certainly been passable.

My stepdad, who is turning 86, has been able to do none of those things. He is in a long-term care home in Alberta, Canada. His only daily social connections consist of brief interactions with staff during mealtime and when they check his blood sugar levels and give him his medication. All the activities that used to give him a chance to socialize are gone. Imagine life for him, where his sum total of connection is probably less than 30 minutes a day. And, on most days, none of that connecting is done with the people he loves.

Up until last week, family couldn’t even visit him. He was locked down due to an outbreak at his home. For my dad, there were no virtual substitutes available. He is not wired in any way for digital connection. If anyone has paid the social price of this pandemic, it’s been my dad and people like the seniors I interviewed, for whom I was desperately trying to find a way just to watch a 13-minute video that they had starred in.

A recent study by mobile technology manufacturer Ericsson looked specifically at the relationship between technology and seniors during the pandemic. The study focused on what the company termed the “young-old” seniors, those aged 65-74. They didn’t deal with “middle-old” (aged 75-85) or “oldest-old” (86 plus) because — well, probably because Ericsson couldn’t find enough who were connected to act as a representative sample.

But they did find that even the “young old” were falling behind in their ability to stay connected thanks to COVID-19. These are people who have owned smartphones for at least a decade, many of whom had to use computers and technology in their jobs. Up until a year ago, they were closing the technology gap with younger generations. Then, last March, they started to fall behind.

They were still using the internet, but younger people were using it even more. And, as they got older, they were finding it increasingly daunting to adopt new platforms and technology. They didn’t have the same access to “family tech support” of children or grandchildren to help get them over the learning curve. They were sticking to the things they knew how to do as the rest of the world surged forward and started living their lives in a digital landscape.

But this was not the group that was part of my video project. My experience had been with the “middle old” and “oldest old”: half fell into the first group and half into the second. Of the eight seniors I was dealing with, only two had email addresses. If the “young old” are being left behind by technology, these people were never in the race to begin with. As the world was forced to reset to an online reality, these people were never given the option. They were stranded in a world suddenly disconnected from everything they knew and loved.

Predictably, the Ericsson study proposes smartphones as the solution for many of the problems of the pandemic, giving seniors more connection, more confidence and more capabilities. If only they got connected, the study says, life would be better.

But that’s not a solution with legs. It won’t go the distance. And to understand why, we just have to look at the two age cohorts the study didn’t focus on, the “middle old” and the “oldest old.”

Perhaps the hardest hit have been the “oldest old,” who have sacrificed both physical and digital connection, as this Journals of Gerontology article notes. Four from my group lived in long-term care facilities. Many of these were locked down at some point due to local outbreaks within the facility. Suddenly, the family support they required to connect with their family and friends was no longer available. The technological tools that we take for granted — which we were able to slot in to take the place of things we were losing — were unimaginable to them. They were literally sentenced to solitary confinement.

A recent study from Germany found that only 3% of those living in long-term care facilities used an internet-connected device. A lot of the time, cognitive declines, even when they’re mild, can make trying to use technology an exercise in frustration.

When my dad went into his long-term care home, my sister and I gave him one of our old phones so he could stay connected. We set everything up and did receive a few experimental texts from him. But soon, it just became too confusing and frustrating for him to use without our constant help. He played solitaire on it for a while, then it ended up in a drawer somewhere. We didn’t push the issue. It just wasn’t the right fit.

But it’s not just my dad who struggled with technology. Even if an aging population starts out as reasonably proficient users, it can be overwhelming to keep up with new hardware, new operating systems and new security requirements. I’m not even “young old” yet, and I’ve worked with technology all my life. I owned a digital marketing company, for heaven’s sake. And even for me, it sometimes seems like a full-time job staying on top of the constant stream of updates and new things to learn and troubleshoot. As connected technology leaps forward, it does not seem unduly concerned that it’s leaving the most vulnerable segment of our population behind.

COVID-19 has pushed us into a virtual world where connection is not just a luxury, but a condition of survival. We need to connect to live. That is especially true for our seniors, who have had all the connections they relied on taken from them. We can’t leave them behind. Connected technology can no longer ignore them.

This is one gap we need to build a bridge over.

The Academics of Bullsh*t

“One of the most salient features of our culture is that there is so much bullshit. Everyone knows this. Each of us contributes his share. But we tend to take the situation for granted.”—

from “On Bullshit,” an essay by philosopher Harry Frankfurt.

Would it surprise you to know that I have found not one, but two academic studies on organizational bullshit? And I mean that non-euphemistically. The word “bullshit” is actually in the title of both studies. I B.S. you not.

In fact, organizational bullshit has become a legitimate field of study. Academics are being paid to dig into it — so to speak. There are likely bullshit grants, bullshit labs, bullshit theories, bullshit paradigms and bullshit courses. There are definitely bullshit professors. There is even an OBPS — the Organizational Bullshit Perception Scale — a way to academically measure bullshit in a company.

Many years ago, when I was in the twilight of my time with the search agency I had founded, I had had enough of the bullshit I was being buried under, shoveled there by the company that had acquired us. I was drowning in it. So I vented right here, on MediaPost. I dared you to imagine what it would be like to actually do business without bullshit getting in the way.

My words fell on deaf ears. Bullshit has proliferated since that time. It has been enshrined up and down our social, business and governmental hierarchies, becoming part of our “new” organizational normal. It has picked up new labels, like “fake news” and “alternative facts.” It has proven more dangerous than I could have ever imagined. And it has become this dangerous because we ignore it, and by ignoring it, we legitimize it.

Harry Frankfurt defined the concept and set it apart from lying. Liars know the truth and are trying to hide it. Bullshitters don’t care if what they say is true or false. They only care if their listener is persuaded. That’s as good a working definition of the last four years as any I’ve heard.

But at least one study indicates bullshit may have a social modality — acceptable in some contexts, but corrosive in others. Marketing, for example, is highlighted by the authors as an industry built on a foundation of bullshit:

“advertising and public relations agencies and consultants are likely to be ‘full of It,’ and in some cases even make the production of bullshit an important pillar of their business.”

In these studies, researchers speculate that bullshit might actually serve a purpose in organizations. It may allow for strategic motivation before there is an actual strategy in place. This brand of bullshit is otherwise known as “blue-sky thinking” or “out-of-the-box thinking.”

But if this is true, there is a very narrow window indeed where this type of bullshit could be considered beneficial. The minute there are facts to deal with, they should be dealt with. But the problem is that the facts never quite measure up to the vision of the bullshit. Once you open the door to allowing bullshit, it becomes self-perpetuating.

I grew up in the country. I know how hard it is to get rid of bullshit.

The previous example is what I would call strategic bullshit — a way to “grease the wheels” and get the corporate machine moving. But it often leads directly to operational bullshit — which is toxic to an organization, serving to “gum up the gears” and prevent anything real and meaningful from happening. This was the type of bullshit that was burying me back in 2013 when I wrote that first column. It’s also the type of bullshit that is paralyzing us today.

According to the academic research into bullshit, when we’re faced with it, we have four ways to respond: exit, voice, loyalty or neglect. Exit means we try to escape from the bullshit. Loyalty means we wallow in it, spreading it wider and thicker. Neglect means we just ignore it. And voice means we stand up to the bullshit and confront it. I’m guessing you’ve already found yourself in one of those four categories.

Here’s the thing. As marketers and communicators, we have to face the cold, ugly truth of our ongoing relationship with bullshit. We all have to deal with it. It’s the nature of our industry.

But how do we deal with it? Most times, in most situations, it’s just easier to escape or ignore it. Sometimes it may serve our purpose to jump on the bullshit bandwagon and spread it. But given the overwhelming evidence of where bullshit has led us in the recent past, we all should be finding our voice to call bullshit on bullshit.

Missing the Mundane

I realize something: I miss the mundane.

Somewhere along the line, mundanity got a bad rap. It became a synonym for boring. But it actually means worldly. It refers to the things you experience when you’re out in the world.

And I miss that — a lot.

There is a lot of stuff that happens when we’re living our lives that we don’t give enough credit to: Petting a dog being taken for a walk. A little flirting with another human we find attractive. Doing some people-watching while we eat our bagel in a mall’s food court. Random situational humor that plays itself out on the sidewalk in front of us. Discovering that the person cutting your hair is also a Monty Python fan. Snippets of conversation — either ones we’re participating in, or ones we overhear while we wait for the bus. Running into an old acquaintance. Even being able to smile at a stranger and have them smile back at you.

The mundane is built of all those hundreds of little, inconsequential social exchanges that happen daily in a normal world that we ordinarily wouldn’t give a second thought to.

And sometimes, serendipitously, we luck upon the holy grail of mundanity — that random “thing” that makes our day.

These are the things we live for. And now, almost all of these things have been stripped from our lives.

I didn’t realize I missed them because I never assigned any importance to them. If I did a signal-to-noise ratio analysis of my life, all these things would fall in the latter category. Most of the time, I wasn’t even fully aware that they were occurring. But I now realize when you add them all up, they’re actually a big part of what I’m missing the most. And I’ve realized that because I’ve been forced to subtract them — one by one — from my life.

I have found that the mundane isn’t boring. It’s the opposite — the seasoning that adds a little flavor to my day-to-day existence.

For the past 10 months, I thought the problem was that I was missing the big things: travel, visiting loved ones, big social gatherings. And I do miss those things. But those things are the tentpoles – the infrequent yet consequential things we tend to hang our happiness on. We failed to realize that in between those tentpoles is the fabric of everyday life, which has also been eliminated.

It’s not just that we don’t have them. It’s also that we’ve tried to substitute other things for them. And those other things may be making it worse. Things like social media and way too much time spent looking at the news. Bingeing on Netflix. Forcing ourselves into awkward online Zoom encounters just because it seems like the thing to do. A suddenly developed desire to learn Portuguese, or how to bake sourdough bread.

It’s not that all these things are bad. It’s just that they’re different from what we used to consider normal — and doing them reinforces the gap that lies between then and now. They add to that gnawing discontent we have with our new forced coping mechanisms.

The mundane has always leavened our lives. But now, we’ve swapped the living of our lives for being entertained — and whether it’s the news or the new show we’re bingeing, entertainment has to be overplayed. It is nothing but peaks and valleys, with no middle ground. When we actually do the living, rather than the watching, we spend the vast majority of our time in that middle ground — the mundane, which is our emotional reprieve.

I’ve also noticed my social muscles have atrophied over the past several months due to lack of exercise. It’s been ages since I’ve had to make small talk. Every encounter now — and they are infrequent — seems awkward. Either I’m overeager, like a puppy that’s been left alone in a house all day, or I’m just not in any mood to palaver.

Finally, it’s these everyday mundane encounters that used to give me anecdotal evidence that not all people were awful. Every day I used to see examples of small kindnesses, unexpected generosity and just plain common courtesy. Yes, there were also counterpoints to all of these, but it almost always netted out to the good. It used to reaffirm my faith in people on a daily basis.

With that source of reaffirmation gone, I have to rely on the news and social media. And — given what those two things are — I know I will only see the extremes of human nature. It’s my “angel and asshole” theory: that we all lie on a bell curve somewhere between the two, and our current situation will push us from the center closer to those two extremes. You also know that the news and social media are going to be biased towards the “asshole” end of the spectrum.

There’s a lot to be said for the mundane — and I have. So I’ll just wrap up with my hope that my life — and yours — will become a little more mundane in the not-too-distant future.