The World vs Big Tech

Around the world, governments have their legislative crosshairs trained on Big Tech. It’s happening in the US, the EU and here in my country, Canada. The majority of these actions are antitrust suits. But Australia has just introduced a different type of legislation: a social media ban for those under 16. And that could change the game – and the conversation – completely for Big Tech.

There are more antitrust actions in the queue in the US than at any time in the previous five decades. The fast and loose interpretation of antitrust enforcement in the US is that monopolies are only attacked when they may cause significant harm to customers through lack of competition. The US approach to antitrust since the 1970s has typically followed the Chicago School of neoclassical economic theory, which places all trust in the efficiency of markets and tells government to keep its damned hands off the economy. Given this, and given the pro-business slant of all US administrations, both Republican and Democratic, since Reagan, it’s not surprising that we’ve seen relatively few antitrust suits in the past 50 years.

But the rapid rise of monolithic Big Tech platforms has generated more discussion about antitrust in the past decade than in the previous five. These platforms drag the industries they spawn along in their wake, leaving little room for upstart competitors to survive long enough to gain significant market share.

Case in point: Google. 

The recent Canadian lawsuit has the Competition Bureau (our antitrust watchdog) suing Google for anti-competitive practices in selling its online advertising services north of the 49th parallel. The Bureau is asking that Google sell off two of its ad-tech tools, pay penalties worth up to 3% of its global gross revenues and be prohibited from engaging in anti-competitive practices in the future.

According to the Bureau’s three-year inquiry into Google’s Canadian business practices, Google controls 90% of all ad servers and 70% of advertising networks operating in the country. Mind you, Google started the online advertising industry in the relatively green fields of Canada back when I was still railing about the ignorance of Canadian advertisers when it came to digital marketing. No one else really had a chance. But Google made sure they never got one by wrapping its gigantic arms around the industry in an anti-competitive bear hug.

The recent Australian legislation is of a different category, however. Antitrust suits are – by nature – not personal. They are all about business. But the Australian ban puts Big Tech in the same category as Big Tobacco, Big Alcohol and Big Pharma – alleging that they are selling an addictive product that causes physical or emotional harm to individuals. And the rest of the world is closely watching what Australia does. Canada is no exception.

The most pertinent question is: how will Australia enforce the ban? Restricting social media access for those under 16 is not something to be undertaken lightly. It’s a huge technical, legal and logistical hurdle to get over. But if Australia can figure it out, it’s certain that other jurisdictions around the world will follow in its footsteps.

This legislation opens the door to more vigorous public discourse about the impact of social media on our society. Politicians don’t introduce legislation unless they feel that – by doing so – they will continue to get elected. And the key to being elected is one of two things: give the electorate what they want or protect them against what they fear. In Australia, recent polling indicates the ban is supported by 77% of the population. Even those opposing the ban aren’t doing so in defense of social media. They’re worried that the devil might be in the details and that the legislation is being pushed through too quickly.

These types of things tend to follow a similar narrative arc: fads and trends drive widespread adoption – evidence mounts about the negative impacts – industries either ignore or actively sabotage the sources of the evidence – and, with enough critical mass, government finally gets into the act by introducing protective legislation.

With tobacco in the US, that arc took a couple of decades, from the explosion of smoking after World War II to the U.S. Surgeon General’s 1964 report linking smoking and cancer. The first warning labels on cigarette packages appeared two years later, in 1966.

We may be on the cusp of a similar movement with social media. And, once again, it’s taken 20 years. Facebook was founded in 2004.

Time will tell. In the meantime, keep an eye on what’s happening Down Under.

Can OpenAI Make Searching More Useful?

As you may have heard, OpenAI is testing a prototype of a new search engine called SearchGPT. A press release from July 25 notes: “Getting answers on the web can take a lot of effort, often requiring multiple attempts to get relevant results. We believe that by enhancing the conversational capabilities of our models with real-time information from the web, finding what you’re looking for can be faster and easier.”

I’ve been waiting for this for a long time: search that moves beyond relevance to usefulness. It was 14 years ago that I said this in an interview with Aaron Goldman regarding his book “Everything I Know About Marketing I Learned from Google”: “Search providers have to replace relevancy with usefulness. Relevancy is a great measure if we’re judging information, but not so great if we’re measuring usefulness. That’s why I believe apps are the next flavor of search, little dedicated helpers that allow us to do something with the information. The information itself will become less and less important and the app that allows utilization of the information will become more and more important.”

I’ve felt for almost two decades that the days of search as a destination were numbered. For over 30 years now (Archie, the first internet search engine, was created in 1990), when we’re looking for something online, we search, and then we have to do something with what we find on the results page. Sometimes, a single search is enough — but often, it isn’t. For many of our intended end goals, we still have to do a lot of wading through the Internet’s deep end, filtering out the garbage, picking up the nuggets we need and then assembling those into something useful.

I’ve spent much of those past two decades pondering what the future of search might be. In fact, my previous company wrote a paper on it back in 2007. We were looking forward to what we thought might be the future of search, but we didn’t look too far forward. We set 2010 as our crystal ball horizon. Then we assembled an all-star panel of search design and usability experts, including Marissa Mayer, who was then Google’s vice president of search user experience and interface design, and Jakob Nielsen, principal of the Nielsen Norman Group and the web’s best-known usability expert. We asked them what they thought search would look like in three years’ time.

Even back then, almost 20 years ago, I felt the linear presentation of a results page — the 10 blue links concept that started search — was limiting. Since then, we have moved beyond the 10 blue links. A Google search today for the latest iPhone model (one of our test queries in the white paper) actually looks eerily similar to the mock-up we did for what a Google search might look like in the year 2010. It just took Google 14 extra years to get there.

But the basic original premise of search is still there: do a query, and Google will try to return the most relevant results. If you’re looking to buy an iPhone, today’s page is probably more useful, mainly due to sponsored content. But it’s still well short of the usefulness I was hoping for.

It’s also interesting to see what directions search has (and hasn’t) taken since then. Mayer talked a lot about interacting with search results. She envisioned an interface where you could annotate and filter your results: “I think that people will be annotating search results pages and web pages a lot. They’re going to be rating them, they’re going to be reviewing them. They’re going to be marking them up, saying ‘I want to come back to this one later.’”

That never really happened. The idea of search as a sticky and interactive interface for the web sort of materialized, but never to the extent that Mayer envisioned.

From our panel, it was Nielsen’s crystal ball that seemed to offer the clearest view of the future: “I think if you look very far ahead, you know 10, 20, 30 years or whatever, then I think there can be a lot of things happening in terms of natural language understanding and making the computer more clever than it is now. If we get to that level then it may be possible to have the computer better guess at what each person needs without the person having to say anything, but I think right now, it is very difficult.”

Nielsen was spot-on in 2007. It’s exactly those advances in natural language processing and artificial intelligence that could allow ChatGPT to now move beyond the paradigm of the search results page and move searching the web into something more useful.

A decade and a half ago, I envisioned an ecosystem of apps that could bridge the gap between what we intended to do and the information and functionality that could be found online.  That’s exactly what’s happening at OpenAI — a number of functional engines powered by AI, all beneath a natural language “chat” interface.

At this point, we still have to “say” what we want in the form of a prompt, but the more we use ChatGPT (or any AI interface) the better it will get to know us. In 2007, when we wrote our white paper on the future of search, personalization was what we were all talking about. Now, with ChatGPT, personalization could come back to the fore, helping AI know what we want even if we can’t put it into words.

As I mentioned in a previous post, we’ll have to wait to see if SearchGPT can make search more useful, especially for complex tasks like planning a vacation, making a major purchase or planning a big event.

But I think all the pieces are there. The monetization silos that dominate the online landscape will still prove a challenge to getting all the way to our final destination, but SearchGPT could make the journey faster and a little less taxing.

Note: I still have a copy of our 2007 white paper if anyone is interested. Just email me (my address is on the Contact Us page) and I’ll send you a copy.

Google Leak? What Google Leak?

If this were 15 years ago, I might have cared about the supposed Google Leak that broke in late May.

But it’s not, and I don’t. And I’m guessing you don’t either. In fact, you could well be saying “what Google leak?” Unless you’re an SEO, there is nothing of interest here. Even if you are an SEO, that might be true.

I happen to know Rand Fishkin, the person who publicly broke the leak last week. Neither Rand nor I are in the SEO biz anymore, but obviously his level of interest in the leak far exceeded mine. He devoted almost 6,000 words to it in the post where he first unveiled the leaked documents, passed on to him by Erfan Azimi, CEO and director of SEO at EA Eagle Digital.

Rand and I spoke at many of the same conferences before I left the industry in 2012. Even at that time, our interests were diverging. He was developing what would become the Moz SEO tool suite, so he was definitely more versed in the technical side of SEO. I had already focused my attention on the user side of search, looking at how people interacted with a search engine page. Still, I always enjoyed my chats with Rand.

Back then, SEO was an intensely tactical industry. Conference sessions that delved into the nitty-gritty of ranking factors and shared ways to tweak sites up the SERP were the ones booked into the biggest conference rooms, because organizers knew they’d be jammed to the rafters.

I always felt a bit like a fish out of water at these conferences. I tried to take a more holistic view, looking at search as just one touchpoint in the entire online journey. To me, what was most interesting was what happened both before the search click and after it. That was far more intriguing to me than what Google might be hiding under their algorithmic hood.

Over time, my sessions developed their own audience. Thanks to mentors like Danny Sullivan, Chris Sherman and Brett Tabke, conference organizers carved out space for me on their agendas. Ken Fadner and the MediaPost team even let me build a conference that did its best to deal with search at a more holistic level, the Search Insider Summit. We broadened the search conversation to include more strategic topics like multipoint branding, user experience and customer journeys.

So, when the Google leak story blipped on my radar, I was immediately taken back to the old days of SEO. Here, again, there was what appeared to be a dump of documents that might give some insights into the nuts and bolts of Google’s ranking factors. MediaPost’s own post said that “leaked Google documents has given the search industry proprietary insight into Google Search, revealing very important elements that the company uses to rank content.” Predictably, SEOs swarmed over it like a flock of seagulls attacking a half-eaten hot dog on a beach. They were still looking for some magic bullet that might move them higher in the organic results.

They didn’t come up with much. Brett Tabke, whom I consider one of the founders of SEO (he coined the term SERP), spent five hours combing through the documents and said it wasn’t a leak and the documents contained no algorithm-related information. To mash up my metaphors, the half-eaten hot dog was actually a nothingburger.

But oh, my SEOs – you still love diving into the nitty-gritty, don’t you?

What is more interesting to me is how the actual search experience has changed in the past decade or two. In doing the research for this, I happened to run into a great clip about tech monopolies from Last Week Tonight with John Oliver. He shows how much of the top of the Google SERP is now dominated by information and links from Google. Again, quoting a study from Rand Fishkin’s new company, SparkToro, Oliver showed that “64.82% of searches on Google…ended…without clicking to another web property.”

That little tidbit has some massive implications for marketers. The days of relying on a high organic ranking are long gone, because even if you achieve it, you’ll be pushed well down the page.

And on that, Rand Fishkin and I seem to agree. In his post, he does say, “If there was one universal piece of advice I had for marketers seeking to broadly improve their organic search rankings and traffic, it would be: ‘Build a notable, popular, well-recognized brand in your space, outside of Google search.’”

Amen.

A Decade with the Database of Intentions

First published September 27, 2012 in Mediapost’s Search Insider

It’s been over 10 years since John Battelle first started considering what he called the “Database of Intentions.” It was, and is:

The aggregate results of every search ever entered, every result list ever tendered, and every path taken as a result. It lives in many places, but three or four places in particular hold a massive amount of this data (i.e., MSN, Google, and Yahoo). This information represents, in aggregate form, a place holder for the intentions of humankind – a massive database of desires, needs, wants, and likes that can be discovered, subpoenaed, archived, tracked, and exploited to all sorts of ends. Such a beast has never before existed in the history of culture, but is almost guaranteed to grow exponentially from this day forward. This artifact can tell us extraordinary things about who we are and what we want as a culture. And it has the potential to be abused in equally extraordinary fashion.

When Battelle considered the implications, it overwhelmed him. “Once I grokked this idea (late 2001/early 2002), my head began to hurt.” Yet, for all its promise, marketers have only marginally leveraged the Database of Intentions.

In the intervening time, the possibilities of the Database of Intentions have not diminished. In fact, they have grown exponentially:

My mistake in 2003 was to assume that the entire Database of Intentions was created through our interactions with traditional web search. I no longer believe this to be true. In the past five or so years, we’ve seen “eruptions” of entirely new fields, each of which, I believe, represent equally powerful signals – oxygen flows around which massive ecosystems are already developing. In fact, the interplay of all of these signals (plus future ones) represents no less than the sum of our economic and cultural potential.

Sharing Battelle’s predilection for “Holy Sh*t” moments, a post by MediaPost’s Laurie Sullivan this Tuesday got me thinking again about Battelle’s “DBoI.” A recent study by Google and EA showed that search data can predict video game sales with 84% accuracy. But the data used in the prediction only scratches the surface of what’s possible. Adam Stewart from Google hints at what might be possible: “Aside from searches, Google plans to build in game quality, TV investment, online display investment, and social buzz to create a multivariate model for future analysis.”
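To make that concrete, here’s a minimal sketch of the kind of multivariate model Stewart describes: ordinary least squares over a handful of signals. The feature names, coefficients and data are entirely synthetic placeholders of mine, not Google’s or EA’s actual inputs.

```python
# A minimal sketch of a multivariate sales-prediction model.
# All signals and data are synthetic illustrations, not real inputs.
import numpy as np

rng = np.random.default_rng(42)
n_titles = 200

# Hypothetical pre-launch signals for each game title
search_volume = rng.lognormal(10, 1, n_titles)   # search query volume
review_score = rng.uniform(50, 95, n_titles)     # "game quality"
tv_spend = rng.lognormal(12, 1, n_titles)        # TV investment
display_spend = rng.lognormal(11, 1, n_titles)   # online display investment
social_buzz = rng.lognormal(8, 1, n_titles)      # social mentions

# Synthetic ground truth: sales driven mostly by search volume, plus noise
sales = (5 * search_volume + 0.5 * tv_spend + 0.3 * display_spend
         + 2 * social_buzz + 1000 * review_score
         + rng.normal(0, 1e5, n_titles))

# Fit an ordinary least-squares multivariate model
X = np.column_stack([np.ones(n_titles), search_volume, review_score,
                     tv_spend, display_spend, social_buzz])
coef, *_ = np.linalg.lstsq(X, sales, rcond=None)

predicted = X @ coef
r2 = 1 - np.sum((sales - predicted) ** 2) / np.sum((sales - sales.mean()) ** 2)
print(f"R^2 on the synthetic data: {r2:.2f}")
```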

This is very doable stuff. We already have everything we need to create predictive models that match (and probably far exceed) that degree of accuracy. The data is just sitting there, waiting to be interpreted. The implications for marketing are staggering, but to Battelle’s point, let’s not be too quick to corral this simply for the use of marketers. The DBoI has implications that reach into every aspect of our society and lives. This is big — really big! If that sounds unduly ominous to you, let me give you a few reasons why you should be more worried than you are.

Typically, if we were to predict patterns in human behavior, there would be two sources of signals. One comes from an understanding of how humans act. As we speak, this is being attacked on multiple fronts. Neuroscience, behavioral economics, evolutionary psychology and a number of other disciplines are rapidly converging on a vastly improved understanding of what makes us tick. From this base understanding, we can then derive hypotheses of predicted behaviors in any number of circumstances.

This brings us to the other source of behavior signals. If we have a hypothesis, we need some way to scientifically test it. Large-scale collections of human behavioral data allow us to search for patterns and identify underlying causes, which can then serve as predictive signals for future scenarios. The Database of Intentions gives us a massive source of behavior signals that capture every dimension of societal activity. We can test our hypotheses quickly and accurately against the tableau of all online activity, looking for the underlying influences that drive behaviors.

At the intersection of these two is something of tremendous import. We can start predicting human behavior on a massive scale, with unprecedented accuracy. With each prediction, the feedback loop between qualitative prediction and quantitative verification becomes faster and more efficient. Throw a little processing power at it and we suddenly have an artificially intelligent, self-improving predictive model that will tell us, with startling accuracy, what we’re likely to do in the future.
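To make that feedback loop concrete, here’s a toy sketch: an online model that predicts a behavior, checks itself against what actually happened, and nudges its hypothesis after every observation. Everything in it is synthetic; it illustrates the loop, not any real system.

```python
# A toy predict-verify-refine loop: online logistic regression that
# improves with every observed behavior. All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)
true_w = np.array([1.5, -2.0, 0.7])   # hidden "drivers" of behavior
w = np.zeros(3)                       # the model's current hypothesis
lr = 0.1

correct = 0
for step in range(1, 5001):
    x = rng.normal(size=3)                                    # context signals
    behavior = rng.random() < 1 / (1 + np.exp(-true_w @ x))   # what actually happens
    prediction = 1 / (1 + np.exp(-w @ x))                     # qualitative prediction
    correct += (prediction > 0.5) == behavior                 # quantitative verification
    w += lr * (behavior - prediction) * x                     # refine the hypothesis
    if step % 1000 == 0:
        print(f"after {step} observations: {correct / step:.1%} accurate")
```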

This ain’t just about selling video games, people. This is a much, much, much bigger deal.

Climbing the Slippery Slopes of Mount White Hat

First published August 30, 2012 in Mediapost’s Search Insider

On Monday of this week, fellow Search Insider Ryan DeShazer bravely threw his hat back in the ring regarding this question: Is Google better or worse off because of SEO?

DeShazer confessed to being vilified after a previous column indicated that Google owed us something. I admit I have a column penned but never submitted that Ryan could have added to the “vilify” side of that particular tally. But in his Monday column, Ryan touches on a very relevant point: “What is the thin line between White Hat and Black Hat SEO?” For as long as I’ve been in this industry (which is pushing 17 years now), I’ve heard that same debate. I’ve been at conference sessions where white hats and black hats went head to head on the question. It’s one of those discussions that most sane people in the world couldn’t care less about, but we in the search biz can’t seem to let go.

Ryan stirs the pot again by indicating that Google may be working on an SEO “Penalty Box”: a temporary holding pen for sites flagged as using rank-modifying spam, where results will fluctuate more than in the standard index. The high degree of flux should lead to further modifications by the “spammers” that will help Google identify them and theoretically penalize them. DeShazer’s concern is the use of the word “spammers” in the wording of the patent application, which seems to include any “webmasters who attempt to modify their search engine ranking.”
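For what it’s worth, here’s how I’d sketch that mechanism as I read the coverage: inject extra rank fluctuation for suspect sites and flag the ones whose owners react to every wiggle. The names and thresholds below are my own invention, not anything taken from the actual patent.

```python
# A toy reading of the "penalty box" idea: add extra rank fluctuation to
# suspect sites, then flag the ones that respond to every wiggle.
# Entirely hypothetical; not Google's actual system.
import random

random.seed(1)

class Site:
    def __init__(self, name, reactive):
        self.name = name
        self.reactive = reactive   # does the owner tweak after every rank change?
        self.tweaks = 0

sites = [Site("organic.example", reactive=False),
         Site("spammy.example", reactive=True)]

for week in range(10):
    for site in sites:
        rank_delta = random.randint(-5, 5)   # injected fluctuation
        if site.reactive and rank_delta != 0:
            site.tweaks += 1                 # rank-chasing modification

for site in sites:
    flagged = site.tweaks >= 8               # arbitrary threshold for the sketch
    print(f"{site.name}: {site.tweaks} reactive tweaks -> flagged={flagged}")
```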

I personally think it’s dangerous to try to apply wording used in a patent application (the source for this speculation) arbitrarily against what will become a business practice. Wording in a patent is intended to help convey the concept of the intellectual property as quickly and concisely as possible to a patent review bureaucrat. The wording deals in concepts that are (ironically) pretty black and white. It has little to no relationship to how that IP will be used in the real world, which tends to be colored in various shades of gray. But let’s put that aside for a moment.

Alan Perkins, an SEO I would call vociferously “white hat,” came up some years ago with what I believe is the quintessential difference here: black hats optimize for a search engine; white hats optimize for humans. When I make site recommendations, they are meant to help people find better content faster and act on it. I believe, along with Perkins, that this approach will also do good things for your search visibility.

But that also runs the danger of being an oversimplification. The picture is muddied by clients who measure our success as SEO agencies by their position relative to their competitors on a keyword-by-keyword level. This is the bed the SEO industry has built for itself, and now we’re forced to sleep in it. I’m as guilty as the next guy of cranking out competitive ranking reports, which have conditioned this behavior over the past decade and a half.

The big problem, and one continually pointed out by vocal grey/black hats, is that you can’t keep up with competitors who are using methods more black than white by staying with white-hat tactics alone. The fact is, black hat works, for a while. And if I’m the snow-white SEO practitioner whose clients are repeatedly trounced by those using a black-hat consultant, I’d better expect some client churn. Ethics and profitability don’t always go together in this industry.

To be honest, over the past five years, I’ve largely stopped worrying about the whole white hat/black hat thing. We’ve lost some clients because we weren’t aggressive enough, but the ones who stayed were largely untouched by the string of recent Google updates targeting spammers. Most benefited from the house cleaning of the index. I’ve also spent the last five years focused a lot more on people and good experiences than on algorithms and link juice, or whatever the SEO flavor du jour is.

I think Alan Perkins nailed it way back in 2007. Optimize for humans. Aim for the long haul. And try to be ethical. Follow those principles, and I find it hard to imagine that Google would ever tag you with the label of “spammer.”

Living a B-Rated Life

First published August 16, 2012 in Mediapost’s Search Insider

I love ratings and reviews — and I’m not alone. 4.7 out of 5 people love reviews. We give them two thumbs up. They rate 96.5% on the Tomatometer. I find it hard to imagine what my life would be without those ubiquitous 5 stars to guide me.

This past weekend, I was in Banff, Alberta for my sister’s wedding. My family decided to find a place to go for breakfast. The first thing I did was check with Yelp, and soon we were stacking up the Eggs Benny at a passable breakfast buffet less than two miles from our hotel. I never knew said buffet existed before checking the reviews — but once I found it, I trusted the wisdom of crowds. It seldom steers me wrong.

Now, you do have to learn how to read between the lines of a typical review site. Just before heading to my sister’s wedding, I spent the day in Seattle at the Bazaarvoice user event and was fascinated to learn that their user research shows people typically scan about seven reviews. Once people hit seven reviews, they feel they have a good handle on the overall tone, even if there are 1,000 reviews in total. This seems right to me. It’s about the number of reviews I scan when possible.

But we also rely on the average rating summaries that typically show above the individual reviews and comments. When I read a review, I tend to follow these rules of thumb (sketched in code after the list):

  • Look for the entry with the most reviews.
  • Find one that has a high average, but be suspicious of ones that have absolutely no negative reviews (unusual if you follow Rule One).
  • Scan the top six or seven reviews to get an overall sense of what people like and dislike.
  • Sort by the most negative reviews and read at least one to see what people hate.
  • Decide whether the negative reviews are the result of a one-off bad experience, or possibly an impossible-to-please customer (you can usually pick them out by their comments).
  • Do the “sniff test” to see if there are planted reviews (again, they’re not that hard to pick out).
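For the programmatically inclined, here’s a rough encoding of those rules. The Review fields and thresholds are placeholders of mine, not any review site’s actual data model, and Rules Five and Six stay manual, since they call for human judgment.

```python
# A rough encoding of the review-reading rules of thumb above.
# Fields and thresholds are illustrative placeholders only.
from dataclasses import dataclass

@dataclass
class Review:
    rating: int   # 1-5 stars
    text: str

def assess(listings):
    """Pick the listing with the most reviews, then sanity-check it."""
    best = max(listings, key=lambda item: len(item["reviews"]))   # Rule 1
    reviews = best["reviews"]
    avg = sum(r.rating for r in reviews) / len(reviews)
    if not any(r.rating <= 2 for r in reviews):                   # Rule 2
        print("Suspicious: no negative reviews at all")
    for r in reviews[:7]:                                         # Rule 3
        print(f"{r.rating} stars: {r.text}")
    worst = min(reviews, key=lambda r: r.rating)                  # Rule 4
    print(f"Worst review ({worst.rating} stars): {worst.text}")
    # Rules 5 and 6 (one-off bad experiences, planted reviews) need a human.
    return best["name"], round(avg, 1)

listings = [
    {"name": "Breakfast Buffet", "reviews": [Review(5, "Great Eggs Benny"),
                                             Review(4, "Solid spread"),
                                             Review(2, "Slow service")]},
    {"name": "Roadside Diner", "reviews": [Review(5, "Perfect")]},
]
print(assess(listings))
```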

I’ve used the same approach for restaurants, hotels, consumer electronics, cars, movies, books, hot tubs – pretty much anything I’ve had to open my wallet for in the past five or six years. It’s made buying so much easier. Ratings and reviews are like the Coles Notes of word of mouth. They condense the opinions of the marketplace down to the bare essentials.

It’s little wonder that Google is starting to invest heavily in this area, with recent acquisitions of Zagat and Frommer’s. These are companies that built entire businesses on eliminating risk through reviews. The aggregation and organization of opinion is a natural extension for search engines. Of course, we should give it a fancy name, like “social graph,” so we can sound really smart at industry conferences, but the foundations are built on plain common sense. Our attraction to reviews is hardwired into our noggins. We are social animals and like to travel in packs. Language evolved so we could point each other to the best cassava root patch and pass along the finer points of mastodon hunting.

As Google acquires more and more socially informed content, it will be integrated into Google’s algorithms. This is why Google had to launch its own social network. Unfortunately, Google+ hasn’t gained the critical mass needed to provide the signals Google is looking for. I personally haven’t had a Google+ invite in months. Despite Larry Page’s insistence that it’s a roaring success, others have pointed out that Google+ seems to be a network of tire kickers, with little in the way of ongoing engagement. Contrast that with Pinterest, which is all the various women in my life seem to talk about — and is outperforming even Twitter when it comes to driving referrals.

I personally love the proliferation of structured word-of-mouth. Some say it negates serendipity, but I actually believe I will be more apt to explore if there is some reassurance I won’t have a horrible experience. Otherwise, this weekend my family and I would have been having Egg McMuffins at the Banff McDonald’s — and really, is that the life you want?

Marissa Mayer and Yahoo’s Regression to the Mean

First published July 26, 2012 in Mediapost’s Search Insider

There is not a lot of overlap between the universes of Gord Hotchkiss and Marissa Mayer, but our orbits have intersected on a few occasions in the past. I’ve had the opportunity to talk to Mayer about various aspects of search on a handful of occasions, so it was with some interest that I watched the announcement and subsequent buzz about her appointment as Yahoo CEO.

Much has been said about Mayer’s personal qualifications for the job, and the general consensus is that this is a good thing for Yahoo. If this were a movie, I’m thinking she would score an 82% on the Tomatometer, handily qualifying as “fresh.” Personally, I would agree. Mayer has a razor-sharp (and somewhat intimidating) intellect, a core love for search and an innate sense of what’s right for the user. All of these things will be big pluses for Yahoo. What she hasn’t been tested on is her ability to run a big company. And that’s where things could get interesting.

No doubt Google still imparts its own “halo” effect on anyone who has spent time at the “Plex” in a leadership position. And few have spent as much time there as Mayer, who, as hire number 20, was Google’s first female engineer, logging 13 years with bosses (and hopefully still friends) Page and Brin.  These three tied a tight little knot in the early days of Google, but from the outside, that knot seems to have frayed just a little in the past few years. Mayer’s recent moves in the company have been more lateral than vertical, as later additions to the Google team were promoted above her. Undoubtedly, this was a contributing factor to the parting of the ways with Google.

But how much value does Mayer’s vast inside knowledge of Google and its past successes bring to Yahoo? It must have played a major role in her selection as the new chief Yahooligan. But was she instrumental in the streak of seemingly picture-perfect management calls in the early days of the Internet’s Golden Child? And, even if she were, does it really matter?

Earlier this year, I took part in an open forum on search at an industry conference. Our moderator tossed a ticking time bomb at the panel, in the form of this delicately stated question: “What the #%^&$ is Google doing lately? Have they gone insane?” We each offered our opinions, which ranged in the degree of madness ascribed to Google’s executives. I started my response with this: “I think we tend to downplay the role luck played in the early days of Google. Maybe their luck is just running out.”

There is a much fancier name for the hypothetical situation I described: “regression to the mean.” In his recent book “Thinking, Fast and Slow” (a HIGHLY recommended read), psychologist and Nobel laureate Daniel Kahneman explores how this can lead us to overvalue executive talent when it’s combined with the halo effect. Kahneman even uses Google as an example: “Of course there was a great deal of skill in the Google story, but luck played a more important role in the actual event than it does in the telling of it. And the more luck was involved, the less there is to be learned.”

Regression to the mean simply means that when you take a snapshot in time that represents either exceptionally good or bad performance, subsequent snapshots tend to move closer to the average. And those highs and lows generally involve luck to some extent. So you can poach talent from a company on a hot streak, only to find that it wasn’t the executives responsible for the performance, but simply the planets aligning in a favorable way.
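A quick simulation makes the point. If each period’s performance is stable skill plus fresh luck, the top performers in one snapshot reliably slip back toward the average in the next. The numbers here are synthetic, of course.

```python
# Regression to the mean in a few lines: performance = skill + luck,
# so this period's luckiest performers tend to fall back next period.
import numpy as np

rng = np.random.default_rng(7)
n = 1000
skill = rng.normal(0, 1, n)                  # stable component
period1 = skill + rng.normal(0, 1, n)        # skill + luck, snapshot one
period2 = skill + rng.normal(0, 1, n)        # fresh luck, snapshot two

top = period1 > np.quantile(period1, 0.95)   # this period's "hot streak" firms
print(f"top 5% in period 1 averaged {period1[top].mean():.2f}")
print(f"the same firms in period 2 averaged {period2[top].mean():.2f}")
```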

As an ex-CEO of a company, albeit a tiny one, I find it hard to swallow that leadership might not be as important as we think in the fortunes of a company. But I generally find Kahneman to be an incredibly astute observer of human errors in judgment, so I have to resist the urge to go with my own cognitive biases here and trust Kahneman’s research.  He doesn’t say leadership is inconsequential, but he does caution against ignoring the role of timing and sheer luck.

This is also not to downplay the role Marissa Mayer will play in the future of Yahoo.  Somebody has to lead the company, and Mayer is at least as good a choice as anyone else I can think of.

Who knows? Maybe Yahoo’s luck is due to change. In their case, “regression to the mean” means there’s no place to go but up.

Will Google X Get Google’s Mojo Back?

First published May 31, 2012 in Mediapost’s Search Insider

What do you do when the search engine you started up with your fellow uber-geek partner makes you fabulously wealthy, but somehow all the billions it’s raking in leaves you feeling rather empty?

What do you do when you’re no longer the darling of the mainstream press, who once enthused that no challenge was too daunting for you and your company full of exceptionally gifted and only slightly less egotistical baby geniuses?

Well, if you’re Sergey Brin, you find a new toy. You leave the mind-numbingly mundane business of running a multibillion-dollar mega-corporation to your power-tripping co-founder, and you lock yourself away in an undisclosed office somewhere in Silicon Valley, spending your day playing with robots, space elevators, virtual reality glasses and self-driving cars.

You go back to what you wanted to do in the first place, which was to “put a dent in the universe.” And it’s probably no coincidence that you’re following in the footsteps of your “love me or hate me” mentor, the late Steve Jobs.

Say what you want about Google, I don’t think there’s any doubt that Brin and Page wanted to change the world in substantial (and hopefully non-evil) ways when they started. But the business of running a business tends to make one put ideals on hold and focus on the bottom line. Taking your company public doesn’t help. Shareholders typically value revenue over revolution, profits over prophecy. “Sure, robots and space elevators are cool, but tell me how that’s going to contribute to our quarterly earnings?” Public companies, by necessity, tend to focus on the short term rather than the long.

But Brin has never been a short-term guy. Neither has Page, for that matter. They both love to take something and spin it into a grandiose vision. Page felt he could best realize that vision by taking over the leadership of Google. But for all the power that comes with that role, there’s also a heaping helping of compromise. Brin apparently felt more comfortable in the more idealistic environs of the Google X Lab.

If you’re not familiar with Google X, it’s a super-secret hidden laboratory where an ultra-powerful supercomputer and high-tech gadgets allow the billionaire to fight crime… no, wait, that’s the Bat Cave. Google X is a secret laboratory where Brin has been spending a lot of time lately. In a New York Times article from last November, it’s described as a “clandestine lab where Google is tackling a list of 100 shoot-for-the-stars ideas. Google is so secretive about the effort that many employees do not even know the lab exists.”

What are some of these “shoot-for-the-stars” ideas? There is no definitive list, given the “hush hush” nature of Google X, but third-party reports commonly mention space elevators, driverless cars, connected household appliances, and one project that is starting to see the light of day: Google Glass, wearable technology that someday could bring a Google interface to the world around us (more about this in a future column).

Google X certainly doesn’t suffer from a lack of ambition. It’s the type of thing we used to routinely expect from the Google we knew and loved. And it’s got oodles of “cool”: robots and space elevators and driverless cars, oh my! But these types of skunkworks projects are often just a way to pacify a few highly placed egos and keep them out of the way while the real work of the company gets done by those who are a little less grandiose in their ambitions.

And Google X does suffer from Google’s long-term problem of trying to do everything at once. The company has always had a problem with focus. Unlike Google X, Jobs’ lofty ambitions and breakaway projects at Apple were tied to a product that would ship sometime in the next decade. Don’t expect to see a space elevator coming to your neighborhood anytime soon.

So the question remains: Will Google X define the future of Google, or is it just a plaything to keep Sergey happy? Only time will tell.

Living Beyond Our Expectations

First published May 25, 2012 in Mediapost’s Search Insider

To my father-in-law, the Internet is a big black box that he doesn’t understand, but inside of which, all is possible. This became clear to me after the following conversation:

F-I-L: Gord?

Me: Yes?

F-I-L: Can you go on your computer and find the combination for my safe?

Me: Huh?

F-I-L: I have an old safe that I locked years ago and I can’t remember the combination. I thought you could probably find it on your computer.

Of course, by “computer,” he meant the Internet. To him, the Internet is the sum collection of all information, and in that, he’s not far wrong. Chances are, in some archive of manufacturer’s data somewhere, the lost combination probably exists. If it does, it’s just one database call away from being public. One would hope that this information would always remain private, but my point is, as naïve as my father-in-law’s question seems to be, it’s probably not that far removed from reality.

Technology and our expectations of what’s possible also seem to play a game of cat and mouse. No matter what we dream up, it seems to become reality in the blink of an eye. In fact, I suspect that technology now regularly outpaces our wildest dreams. Almost anything is possible, at least in theory. If it doesn’t exist, it’s probably just that it’s not practical. Nobody has bothered to put in the effort to make it happen.

Consider marketing intelligence, for instance. Remember the first time you encountered what John Battelle dubbed the “database of intentions”? It was Google’s query data, and Battelle had what he called a “Holy Sh*t” moment when he realized:

This information represents, in aggregate form, a place holder for the intentions of humankind – a massive database of desires, needs, wants, and likes that can be discovered, subpoenaed, archived, tracked, and exploited to all sorts of ends. Such a beast has never before existed in the history of culture, but is almost guaranteed to grow exponentially from this day forward. This artifact can tell us extraordinary things about who we are and what we want as a culture. And it has the potential to be abused in equally extraordinary fashion.

For marketers, Google had provided us with the biggest source of marketing intelligence ever compiled. It was the crystallization of consumer intent, in searchable form. We collectively salivated over it.

But that was a decade ago. Now, as marketers, we routinely curse the gaps in and shortcomings of Google’s query data. As powerful as it once seemed, our expectations have leapfrogged ahead of it.

Battelle has recently updated his definition of the database of intent, adding four new “fields” to it. Originally there was the search “query,” signaling “what I want.” Now, the “social graph” indicates “who I am” and “who I know.” The “status update” signals “what I’m doing” and “what’s happening.” The “check-in” signals “where I am.” And the “purchase” signals “what I’m buying.”
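If you think of it as a record, Battelle’s expanded database looks something like the sketch below. The field names follow his five signals; the types and example values are purely illustrative.

```python
# A sketch of Battelle's expanded Database of Intentions as a record type.
# Field names follow his five signals; everything else is illustrative.
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class IntentSignal:
    query: Optional[str] = None                             # "what I want"
    social_graph: list[str] = field(default_factory=list)   # "who I am" / "who I know"
    status_update: Optional[str] = None                     # "what I'm doing"
    check_in: Optional[str] = None                          # "where I am"
    purchase: Optional[str] = None                          # "what I'm buying"

signal = IntentSignal(query="hotels in Banff",
                      social_graph=["sister", "brother-in-law"],
                      status_update="heading to a wedding",
                      check_in="Banff, AB",
                      purchase="wedding gift")
print(signal)
```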

For a marketer, this is mind-blowing stuff. The trick, of course, is to bring this all together in a meaningful way. To do so, we have multiple technology, intellectual property and privacy hurdles to get over. But it’s all very doable. It’s administration, not technology, that’s holding us back. A big part of Facebook’s IPO valuation was based on successfully pulling this off.

Again, technology has dangled a possibility at the leading edge of our expectations. But it will happen. And when it does, it will suddenly seem ho-hum to us. Our expectations will rocket forward to another possibility.

But even as fast as our expectations move, I guarantee, somewhere, someone is already working on something that lies beyond anything we ever dreamed of. Thank goodness our expectations are as elastic as they seem to be.

Search and the Age of “Usefulness”

First published April 19, 2012 in Mediapost’s Search Insider

There has been a lot of digital ink spilled over the recent changes to Google’s algorithm and what it means for the SEO industry. This is not the first time the death knell has been rung for SEO. It seems to have more lives than your average barnyard cat. But there’s no doubt that Google’s recent changes throw a rather large wrench into the industry as a whole. In my view, that’s a good thing.

First of all, from the user’s perspective, Google’s changes mark an evolution of search beyond a tool for finding information into one that helps us do the things we want to do. It’s moving from relevance as the sole measure of success to incorporating usefulness.

The algorithm is changing to keep pace with the changes in the Web as a whole. No longer is it just the world’s biggest repository of text-based information; it’s now a living, interactive, functional network of apps, data and information, extending our capabilities through a variety of connected devices.

Google had to introduce these back-end changes. Not to do so would have guaranteed that the company would soon become irrelevant in the online world.

As Google succeeds in consistently interpreting more and more signals of user intent, it can become more confident in presenting a differentiated user experience. It can serve a different type of results set to a query that’s obviously initiated by someone looking for information than it does to the user who’s looking to do something online.
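As a toy illustration of that differentiation, imagine classifying the likely intent behind a query and serving a different results layout for each class. The keyword lists and layouts below are placeholders of my own invention, nothing like Google’s actual signals.

```python
# A toy intent classifier that routes queries to different result layouts.
# Keyword lists and layouts are invented placeholders, not real signals.
TRANSACTIONAL = {"buy", "price", "deal", "order", "book"}
NAVIGATIONAL = {"login", "homepage", "www"}

def classify(query: str) -> str:
    words = set(query.lower().split())
    if words & TRANSACTIONAL:
        return "transactional"
    if words & NAVIGATIONAL:
        return "navigational"
    return "informational"

LAYOUTS = {
    "transactional": ["shopping units", "local listings", "reviews"],
    "navigational": ["single best match", "site links"],
    "informational": ["answer box", "ten blue links"],
}

for q in ["buy a new laptop", "gmail login", "history of banff national park"]:
    print(q, "->", LAYOUTS[classify(q)])
```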

We’ve been talking about the death of the monolithic set of search results for years now. In truth, it never died; it just faded away, pixel by pixel. The change has been gradual, but for the first time in several years of observing search, I can truthfully say that my search experience (whether on Google, Bing or the other competitors) looks significantly different today than it did three years ago.

As search changes, so do the expectations of users. And that affects the “use case” of search. In its previous incarnation, we accepted that search was one of a number of necessary intermediate steps between our intent and our ultimate action. If we wanted to do something, we accepted the fact that we would search for information, find the information, evaluate the information and then, eventually, take the information and do something with it. The limitations of the Web forced us to take several steps to get us where we wanted to go.

But now, as we can do more of what we want to online, the steps are being eliminated. Information and functionality are often seamlessly integrated in a single destination. So we have less patience with seemingly superfluous steps between us and our destination. That includes search.

Soon, we will no longer be content with considering the search results page as a sort of index to online content. We will want the functionality we know exists served to us via the shortest possible path. We see this beginning as answers to common information requests are pushed to the top of the search results page.

What this does, in terms of user experience, is make the transition from search page to destination more critical than ever. As long as search was a reference index, the user expected to bounce back and forth between potential destinations, deciding which was the best match. But as search gets better at unearthing useful destinations, our “post-click” expectations will rise accordingly.  Whatever lies on the other side of that search click better be good. The changes in Google’s algorithm are the first step (of several yet to come) to ensure that it is.

What this does for SEO specialists is to suddenly push them toward considering a much bigger picture than they previously had to worry about. They have to think in terms of a search user’s unique intent and expectations. They have to understand the importance of the transition from a search page to a landing page and the functionality that has to offer. And, most of all, they have to counsel their clients on the increasing importance of “usefulness” — and how potential customers will use online channels to seek and connect to that usefulness. If the SEO community can transition to that role, there will always be a need for them.

The SEO industry and the Google search quality team have been playing a game of cat and mouse for several years now. It’s been more “hacking” than “marketing” as SEO practitioners prod for loopholes in the Google algorithm. All too often, a top ranking was the end goal, with no thought to what that actually meant for true connections with prospects.

In my mind, if that changes, it’s perhaps the best thing to ever happen in the SEO business.