Don’t Think Click Fraud, Think Negative ROI

First published March 1, 2007 in Mediapost’s Search Insider

The search engines have a dilemma on their hands when it comes to click fraud. We’re all clamoring for more information on the issue. We all want solid numbers to help us define the scope of click fraud. The very fact that we refer to it as click fraud is confusing. A lot of things get thrown in the click fraud “basket” that are in no way fraudulent. Thanks to sensationalist reporting by publications like BusinessWeek, click fraud is portrayed as the biggest scourge to threaten the Nirvana that is search marketing. The engines themselves have dedicated tremendous resources to fighting click fraud, in response to advertisers’ demands that the problem be stamped out.

But when you do an honest appraisal of the issue, the search engines would rather we get over our preoccupation with click fraud and start thinking of it as part of a much bigger whole, the return we get on our search marketing investment. This in no way negates the importance of click fraud as an issue. I don’t think there’s anyone more aware of click fraud than Shuman Ghosemajumder (Google), John Slade (Yahoo), and Brendan Kitts (Microsoft). They’re the first to say that click fraud does exist and that they’re each, in their own ways, actively policing it.

It’s more a question of proportional response, an appropriate amount of attention given the actual scope of the issue. And today, for the first time, Google is giving us concrete numbers on what that scope might be, at least for its network. Google is announcing a multiphase approach and product road map to handle the click fraud question. Accompanying the announcement are hard numbers, for the first time, about how much of Google’s traffic could actually be considered fraudulent. I’ll talk more about the numbers in a moment, but first, let’s explore the dilemma that presents itself to the engines.

Caught Between an Over-Hyped Threat and an Ignored Danger

The engines know that, as a factor that negatively impacts return on search marketing investment, click fraud represents a tiny percentage. There are far bigger drains on the performance of a campaign that advertisers should be paying significantly more attention to, but thanks to doom and gloom “exposés,” there’s a disproportionate amount of attention focused on click fraud. So, although the engines would rather advertisers focus more on the big picture and consider all the factors, including fraudulent traffic, that are negatively impacting their return on that investment, they’re playing the game they have to and are keeping the focus on click fraud. Google’s announcement today may allay some of the “sky is falling” concerns that are being whipped up by journalists, but in the long run it may do advertisers a disservice by diverting attention from more pressing campaign optimization issues.

I’ve talked about some of this before, but here are some of the issues I have with the current click fraud situation:

Just Because We Call It Fraud Doesn’t Make It Fraud

Click fraud seems to be the label that has stuck with this particular issue. There have been calls to try to put numbers around the occurrence of click fraud in search marketing. In reality, it’s not that cut-and-dried. First of all, fraud implies that someone loses money through the deliberate actions of someone else. For a click to be fraudulent, at least in the way that BusinessWeek tried to define it, advertisers have to lose money. They have to be paying for traffic that has no value.

Less Than 10% Are Invalid Clicks

The fact is, there are a number of factors that may result in traffic that the advertiser would probably prefer not to pay for. Fraudulent traffic is just one of them. Google puts all this traffic into a basket they call invalid clicks. This includes double clicks on ads, questionable activity from a single IP address, automated clicks, and yes, clicks from the nefarious click fraud perpetrator. In today’s release, Google said invalid clicks accounted for less than 10% of its total network traffic. The company didn’t want to get more specific than this, because the actual percentage can rise and fall with a fair amount of volatility, based on spikes in clickbot attacks and other factors. Google works to filter this traffic out proactively, so it’s as if the clicks never happened. The advertiser is never charged for this traffic. In most cases, the publisher of the site from which the traffic is generated is never paid for the traffic. No money changes hands, so no fraud has been committed. If anyone is out of pocket, it’s Google, not the advertiser.
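The categories of invalid clicks described above (double clicks, suspicious volume from a single IP, automated clicks) can be pictured with a small sketch. This is purely illustrative logic, not Google’s actual filtering, which is proprietary and far more sophisticated; the field names and the threshold are made up for the example.

```python
# Hypothetical sketch of proactive invalid-click filtering: clicks matching
# simple invalidity heuristics are discarded before the advertiser is billed.
from collections import Counter

def filter_invalid(clicks, ip_threshold=20):
    """clicks: list of dicts with 'ip', 'is_double_click', 'is_automated' keys.
    Returns only the clicks that would actually be billable."""
    per_ip = Counter(c["ip"] for c in clicks)
    valid = []
    for c in clicks:
        if c["is_double_click"] or c["is_automated"]:
            continue                        # discarded: the advertiser is never charged
        if per_ip[c["ip"]] > ip_threshold:
            continue                        # questionable volume from a single IP
        valid.append(c)
    return valid

sample = [
    {"ip": "203.0.113.5", "is_double_click": False, "is_automated": False},
    {"ip": "203.0.113.5", "is_double_click": True,  "is_automated": False},
    {"ip": "198.51.100.7", "is_double_click": False, "is_automated": True},
]
billable = filter_invalid(sample)   # only the first click survives
```

Because the filtered clicks never reach billing, no money changes hands on them, which is exactly why Google argues they shouldn’t be counted as fraud.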

The Bottom Line for Advertisers? .02%!

The traffic that the advertiser should be concerned about is the fraudulent traffic that slips through the cracks. This is truly click fraud. It’s not caught by the Google filters and it’s up to the advertiser to come back and report it and request a refund. In this case, money has changed hands and fraud has been perpetrated. Today, Google announced that this represents .02% of its total traffic. Some time ago I did a column after a talk with Shuman at Google, and after making some assumptions and extrapolating the number, I came out with a “worst case” estimate of .2%. It appears that my worst case was much higher than reality, by a factor of 10X.

I don’t know about you, but frankly, if something is only making a .02% impact on my advertising campaign, I’ve probably got better places to be spending my time. One place you might want to look? The conversion rates of your landing page. If you can bump your conversion rates by .5%, you’ve just made 25 times more impact on your overall campaign performance than by continuing to fret about click fraud on Google.
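The arithmetic behind that “25 times” comparison is simple enough to check directly; the percentages are the ones quoted above, and nothing else is assumed:

```python
# Comparing the two effects: the 0.02% of traffic Google reports as
# fraudulent versus a half-point lift in landing-page conversion rate.
fraud_pct = 0.02      # fraudulent clicks that slip past Google's filters
conv_lift_pct = 0.5   # hypothetical conversion-rate improvement

ratio = conv_lift_pct / fraud_pct
print(round(ratio))   # 25
```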

Google’s announcement today was about more than just releasing numbers on the occurrence of click fraud. The company also announced the creation of a Click Fraud Resource Center, a streamlined reporting process, the ability for advertisers to filter out questionable IPs, more detail in its invalid click reporting, and some other initiatives. I believe all these things are good and are needed by advertisers, if only to put to bed the perception of click fraud as a major issue. But do me a favor, will you? Take some of the time you may be spending worrying about click fraud, and start looking at all the other places where your return on investment may be slipping through the cracks. My guess is there are a lot bigger cracks you should be looking at than the click fraud one.

Marissa Mayer Interview on Personalization

Below is the full transcript of the interview with Marissa Mayer on personalization of search results. For commentary, see the Just Behave column on Searchengineland.

Gord: It’s been a little more than two weeks since Google made the announcement that personalization would become more of a default standard for more users on Google.  Why did you move towards making that call?

Marissa: We’ve had a very impressive suite of personalized products for a while now: the personalized homepage, search history, personalized search, and we haven’t had them integrated, which I think has made it somewhat confusing for users. A lot of people didn’t know if they had signed up for search history or personalized search, or whether or not it was on.  What we really wanted to do was move to a signed-in version of Google and a signed-out version of Google.  So if you’re signed in you have access to the personalized homepage, the personalized search results and search history.  You know all three of those are working for you when you’re signed in.  And if you’re signed out, meaning that you don’t see an email address in the upper right-hand corner, personalized search isn’t turned on.  If anything, it’s a cleaning up of the user model, to make it clearer to users what services they’re using and when they’re using them.

Gord: But some of the criticism actually runs counter to that.  One of the criticisms is that it used to be clearer, as far as the user went, when you were signed in and when you were signed out.  There were more indicators on the Google results page as to whether you were getting personalized results or not.  Some of those seem to have disappeared, so personalized results have become more of a default now, rather than an option that’s available to the user.

Marissa: If you think about it as default-on when you’re signed in, I think that it’s still as clear on the search results page.  We removed the “turn off personalized search results” link, but you still see very clearly in the upper right-hand corner whether or not you’re signed in: your e-mail address appears, and that’s your clue that Google has personalized your results, and that’s why that e-mail address is there.  I do think, based on our user studies and our own usage at Google, that we’ve made the model clearer.  We actually ended up at a stage with our personalized product earlier this year where, at one point, Eric (Schmidt) asked, “Am I using personalized search?”  And the team’s answer as to whether or not he was currently using it was so complicated that even he couldn’t follow it.  He’d have to go to “my account”, see whether or not he was signed up for personalized search, make sure that his toggle hadn’t been turned off or on, and there was no way to just glance at the search results page and easily tell whether or not it was invoked.  So now it’s very easy: if you see your username and e-mail address up in the upper right-hand corner, you’re getting personalized results, and if you don’t, you’re not.  So effectively there are two parallel universes of Google, per se.  One if you’re signed out, where you see the classic homepage and the classic search results, and one where you’re signed in, where you get the personalized homepage and…you’ll be able to toggle back and forth, of course…and then the personalized search results page, and the search history becomes coupled with all that, because that’s how we personalize your search.

Gord: So, to sum up, it’s fair to say that really the search experience hasn’t changed that dramatically; it’s just a cleaning up of the user experience around whether you’re signed in or signed out, and that’s been the primary change.

Marissa: That’s right.  Before, you could be signed in and be using one of the three products, or two of the three products, but not all three, and, of course, because people like to experiment with a new product, they’d forget whether they had signed up for personalized search.  Had they signed up for search history?  This just makes it cleaner.  If you’re signed in, you’re using and/or have access to all three; if you’re signed out, you’re on the anonymous version of Google that doesn’t have personalization.

Gord: We can say that it cleans up the user experience because it makes it easier to know when you’re signed in or signed out, but having done the eye tracking studies, we know that the e-mail address shows in a location that’s not prominently scanned as part of the page.  Do the changes mean that more people are going to be looking at personalized search results, just because we’ve made it more of a default opt-in and we’ve moved the signals that you’re signed in a little bit out of the scanned area of the page?  Once people fixate on their task they are looking further down the page.  This should mean that a lot more people are looking at personalized search results than previously.

Marissa: Actually, I don’t think it will change the volume of personalized search all that much, not based on what we’ve seen in our logs and usage.  It makes it cleaner to understand whether or not you’re using it, and I do think that over time, what it does is push the envelope of search, such that you expect personalized results by default.  And we think that the search engines of the future will become better for a lot of different reasons, but one of those reasons will be that we understand the user better.  And so when we think about how we can advance towards that search engine of the future that we’re building, part of that will be personalization.  I do think that when we look five years out, 10 years out, users will have an expectation of better results.  One of the reasons they will have that expectation is that search engines will have become more personalized.  I think that in the future, working with a search engine that understands something about you will become the expectation.  But you’re right in that we believe that users who are signed in, and who find value in the personalized search results, will over time know that they are signed in, that their search history is being kept, and that their search results are being personalized, and they won’t need to check on every single search task whether or not they are signed in, because personalized results are what they’re expecting.  So I do think we won’t see a drastic increase in the volume of personalized search use right now, but that it will hopefully change users’ disposition over time, so they become more comfortable that personalization is a benefit for them and something they come to expect.

Gord: There are a number of aspects of that question that I’d like to get into, and leave behind the question of whether you’re signed in or signed out of personalized search, but I have one question before we move on.  We’ve been talking a lot about existing users. The other change was that people creating a new Google account now get personalized search and search history by default.  The opt-out box is tucked into an area where most users would go right past it.  The placement of that opt-out box seems to indicate that Google would much rather have people opting into personalized search.

Marissa: I think that falls in with the philosophy that I just outlined. We believe that the search engines of the future will be personalized and that it will offer users better results.  And the way for us to get that benefit to our users is to try and have as many users signed up for personalized search as possible.  And so certainly we’re offering it to all of our users, and we’re going to be reasonably aggressive about getting them to try it out. Of course, we try to make sure they’re well-educated about how to turn it off if that’s what they prefer to do.

Gord: When this announcement came out I saw it as a pretty significant announcement for Google because it lays the foundation for the future.  I would think from Google’s perspective the challenge would be knowing what personalized search could be 5 to 10 years down the road, what it would mean for the user experience, and how to start adding that incrementally to the user experience in the meantime.  From Google’s side, you have invested in algorithmic work to categorize content online. I would think the challenge would be just as significant to introduce the technology required to disambiguate intent and get to know more about users. You’re not going to hit that out of the park on the first pitch. That’s going to be a continuing trial-and-error process.  How do you maintain a fairly consistent user experience as you start to introduce personalization without negatively impacting that user experience?

Marissa: I will say that there are a lot of challenges there, and a lot of this is something that’s going to be a pragmatic evolution for us.  You have to know that this is not a new development for us. We’ve been working on personalized search now for almost four years. It goes back to the Kaltix acquisition. So we’ve been working on it for a while, and our standards are really high.  We only want to offer personalized search if it offers a huge amount of end-user benefit.  We have to be very comfortable and confident in the relevance delivered by those technologies in order to offer them at all, let alone have them veer more towards the results, as we’re doing today.  We acquired a very talented team in March of 2003 from Kaltix.  It was a group of three Stanford Ph.D. students, headed up by a guy named Sep Kamvar, who is the fellow who co-signed the blog post with me. Sep and his team did a lot of PageRank-style work at Stanford.  Interestingly enough, one of the papers they produced was on how to compute PageRank faster, and it caused a huge media stir around the web, because everyone said there were these students at Stanford who had created an even faster version of Google.  The press obviously doesn’t understand search engines, and thinks that we actually do the PageRank calculation on the fly on each query, as opposed to pre-computing it.  So their advance was significant, but not because it helps you prepare an index faster, which is what the press thought.  The reason they were interested in building a faster version of PageRank was that they wanted to be able to build a PageRank for each user: based on seed data about which pages were important to you, and what pages you seemed to visit often, re-computing PageRank values from that. PageRank as an algorithm is very sensitive to the seed pages.
And so, what they had figured out was a way to sort by host and, as a result, compute PageRank in a much more computationally efficient way, making it feasible to compute a PageRank per user, or as a vector of values that differ from the base PageRank.  The reason we were really interested in them was: one, because they grasped all of Google’s technology really easily; and, two, because we really felt they were on the cutting edge of how personalization would be done on the web. They were capable of looking at things like a searcher’s history, their past clicks, their past searches, the websites that matter to them, and ultimately building a PageRank vector that can be used to enhance the search results.
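The per-user PageRank idea described here, biasing the “seed” (teleport) distribution towards pages a given user cares about, can be sketched in a few lines. This is a generic illustration of personalized PageRank, not Kaltix’s or Google’s implementation; the tiny web graph and seed set are invented for the example.

```python
# Personalized PageRank by power iteration: identical to ordinary PageRank
# except that the "teleport" step jumps back to a user-specific seed set
# instead of uniformly to all pages.
def personalized_pagerank(links, seeds, damping=0.85, iterations=50):
    """links: dict page -> list of pages it links to.
    seeds: pages the user cares about; teleports land only on these."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    teleport = {p: (1.0 / len(seeds) if p in seeds else 0.0) for p in pages}

    for _ in range(iterations):
        new = {p: (1.0 - damping) * teleport[p] for p in pages}
        for p in pages:
            out = links[p]
            if out:   # distribute this page's rank over its outlinks
                share = damping * rank[p] / len(out)
                for q in out:
                    new[q] += share
            else:     # dangling page: return its rank mass to the seeds
                for q in pages:
                    new[q] += damping * rank[p] * teleport[q]
        rank = new
    return rank

# Hypothetical four-page web; the user's history suggests they care about "a".
web = {"a": ["b"], "b": ["a", "c"], "c": ["a"], "d": ["c"]}
ranks = personalized_pagerank(web, seeds={"a"})
# Pages near the seed ("a" and "b") dominate, while "d", which gets no
# teleport mass and has no inlinks, falls to the bottom.
```

Changing `seeds` changes the whole rank vector, which is the sensitivity to seed pages Mayer mentions: two users with different seed sets get genuinely different orderings over the same graph.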

We acquired them in 2003 and we’ve worked for some time since to outfit our production system to be capable of doing that computation and holding a vector for each user in parallel to the base computation.  We’ve been very responsible in the way that we’ve rolled out personalized search on Labs, and we also did what we called Site-Flavored Search on Labs, where you can put a search box on your page that is geared towards a set of interests you’ve selected. So if you have a site about baseball you can say you want to base it on three of your favorite baseball sites, and have a search box with a PageRank that’s veered in that direction for baseball queries.

So, the Kaltix team has been really successful at integrating all these Google technologies and taking this piece of theoretical research and ultimately bringing it to life on the web.  And as it grew stronger and stronger and our confidence in the Kaltix technology grew, we’ve been putting it forward more and more.  We started off on Labs through a sign-up process, then we transitioned it over to Google.com, and now we are in effect leaning towards a model where people who use Google.com and have a Google account get personalized search basically by default.  The Kaltix work has gotten pretty rave reviews.  The users that have noticed it and have been using it for a long time, like Danny (Sullivan), will say that they think it’s one of the biggest advances in relevance that they’ve seen in the past three years.

Gord: So when you have the Kaltix technology working over and above the base algorithm, obviously that’s only going to be as good as the signals you’re picking up on the individual.  And right now the signals are past sites they’ve visited, perhaps what they put on their personalized homepage, and sites that they’ve bookmarked. But obviously the data that you can include to help create that on-the-fly individual index improves as you get more signals to watch.  In our previous interview you said one thing that was really interesting to you was looking at the context of the task you are engaged in, for example, if you’re composing an e-mail in Gmail. So is contextual relevance another factor to look at?  Are those things that could potentially be rolled into this in the future?

Marissa: I think so.  I think that overall, we really feel that personalized search is something that holds a lot of promise, and we’re not exactly sure which signals will yield the best results.  We know that search history, your clicks and your searches together, provides a really rich set of signals, but it’s possible that some of the other data that Google gathers could also be useful. It’s a matter of understanding how.  There’s an interesting trade-off around personalized search for the user, which is, as you point out, that the more signals and the more data you have about the user, the better it gets.  It’s a hard sell sometimes: we’re asking them to sign up for a service where we begin to collect data in the form of search history, yet they don’t see the benefits of that, at least in its fullest form, for some time.  It’s one of those things that we think about and struggle with. And that’s one reason why we’re trying to move to a model where search history and personalized search are, in fact, more expected.  And I should also note that as we look at reading some of the signals across different services, we will obviously abide by the posted privacy policies.  So there are certain services where we’ve made it very clear we won’t cross-correlate data. For example, on Gmail, we’ve made it very clear that we won’t cross-correlate that data with searches without being very, very explicit with the end user.  You don’t have to worry about things like that.

Gord: One of the points of concern seems to be how smart that algorithm will get, and do we lose control?  For example, when we’re exploring new territory online and trying to find answers, we refine our results based on our search experience.  So, at the beginning, we use very generic terms that cast a very wide net, and then we narrow our search queries as we go. Somebody said to me, “Well, if we become better searchers, does that decrease the need for personalization?”  Do we lose some control in that?  Do we lose the ability to say “No, I want to see everything, and I will decide how I narrow or filter that query.  I don’t want Google filtering that query on the front end”?

Marissa: I think it really depends on how forcefully we’re putting forth personalization.  And right now we might be very forceful in getting people to sign up for it, or at least more forceful than we were. The actual implementation of personalized search is that as many as two pieces of content that are personalized to you could be lifted onto the first page, and I believe they never displace the first result, in our current instantiation, because that’s the level of relevance that we feel comfortable with.  So right now, at least eight of the results on your first page will be generic, vanilla Google results for that query, and only up to two of them will be results from the personalized algorithm.  We’re introducing it in a fairly limited form for exactly the reason that you point out.  And I think if we veer towards a model where more results are personalized, we would have ways of making it clearer: “Do you want to explore this topic as a novice or with the personalization in place?” So the user will be able to toggle in a different filter form.  I think the other thing to remember is that, even when personalization happens and lifts those two results onto the page, for most users it happens one out of every five times.  When you think about it, 20% of the queries are made much better by doing that, but for 80% of the queries people are, in fact, exploring topics that are unknown to them, and we can tell from their search history that they haven’t searched for anything in this sphere before. There’s no other search like it. They’ve never clicked on any results that are related to this topic, and, as a result, we actually don’t change their result set at all, because we know that they need the basic Google results.  Search history is valuable not only because it can help personalize the results, but also because it can tell us when not to.
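The blending behavior described here, at most two personalized results lifted onto the first page, the top result never displaced, and no personalization at all when the user’s history carries no signal for the query, might be sketched like this. The function names and the exact placement of lifted results are illustrative assumptions, not Google’s actual logic:

```python
# Hypothetical sketch of blending personalized results into a generic page.
def blend(generic, personalized, page_size=10, max_lift=2):
    """generic: ranked list of result ids for the query; personalized:
    ranked ids from the user's personal model (empty if history has no
    signal for this query)."""
    if not personalized:
        return generic[:page_size]      # novice topic: vanilla results only
    lifted = [r for r in personalized if r not in generic[:1]][:max_lift]
    page = generic[:1] + lifted         # result #1 is never displaced
    for r in generic[1:]:
        if len(page) == page_size:
            break
        if r not in page:               # skip anything already lifted
            page.append(r)
    return page
```

On this sketch, a page is always at least `page_size - max_lift` generic results, matching the “at least eight of ten” figure quoted above.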

Gord: There are two parts to that: one is the intelligence of the algorithm, knowing when to push personalization and when not to; and two, as you said, right now this is only impacting one out of five searches, where you may have a couple of new results introduced into the top 10 as a result of personalization.  But that’s got to be a moving target.  As you become more confident that the technology is adding to the user experience, personalization will creep higher and higher up the fold and increasingly take over more of the search results page, right?

Marissa: Possibly.  I think that’s one of many things that could possibly happen, and I think that’s a pretty aggressive stance.  I look at our evolution and our foray into personalization, where we’re sitting here three or four years in, with some base technology that’s several years old already, and the way we have it interact with the user experience has still been very slight.  Mostly because we think that base Google is pretty good.  If it becomes more aggressive, certainly I would be pushing for the ability of the user to know that these results are, in fact, coming from my personalization and not the base algorithm, and, if I want to filter them out and get back to basics, that that would be possible.  One thing that we’ve struggled with is whether we should actually mark the results that are entering the page as a result of personalization, but because the team is currently and frequently doing experiments, we didn’t want to settle on a particular model or marker at this exact moment.

Gord: The challenge there is that as you roll more personal results into the results page, get feedback from some users that they want more control over what on the page is personalized and the degree of personalization, and introduce more filters or more sophisticated toggles, it complicates the user experience. And as we know, that user experience needs to be very simple. Is it a delicate balance of how much control you give the user versus how much you impact the 95% of searches that are just a few seconds in duration and have to be really simple to do?

Marissa: There are two thoughts there.  One, even if we introduce filtering on the results page, it wouldn’t be any more complicated than what you had two weeks ago, since we already had that filter.  Two, we put the user first, and people have varying opinions about whether the search results page is too complicated, but the same people who designed that user experience will be the people tackling this for Google, so I think you can expect results of a similar style and direction.

Gord: In the last few weeks, Google has introduced some new functionality, related searches and search refinement suggestions, that are appearing at the bottom of the page for a number of searches.  To me that would seem to be a prime area that could be impacted by the personalization opportunities that are coming.  As you make suggestions about other queries users could be trying, you could use that personalization data to refine those suggestions. Is that something you’re considering? And how long before personalization starts impacting the ads that are being presented to you on a search results page?

Marissa: Refinement is an interesting but nascent technology from our perspective.  We are only now beginning to develop some refinement technologies that we believe in enough to use on the search results page.  A lot of people have been doing it for a lot longer. When you look at the overall utility, probably 1 to 5% of people will click those query refinements on any given search, whereas most users, probably more than two thirds of users, end up using one of our results. So in terms of utility and value delivered to the end user, the search results themselves and personalizing those are an order of magnitude more impactful than personalizing a query refinement.  So part of it is a question of: it’s such a new technology that we really haven’t looked at how we can use personalization to make it work more effectively.  But the other thing is that, on a “bang for the buck” basis, personalizing the search results gets us a lot more.

And as to ads, I think there are some easy ways to personalize ads that we’ve known about for some time, but we’ve chosen at this point to focus on personalizing the search results, because we wanted to make sure we delivered the end-user value on that, because that’s our focus, before we look at personalizing ads.

Gord: So, no immediate plans for the personalization of ads?

Marissa: That’s right.

Gord: Thank you so much for your time Marissa.

The Inevitability of Personalized Search

First published February 15, 2007 in Mediapost’s Search Insider

Google’s announcement a little more than a week ago that it would be showing personalized search results to more people through a change in the sign-in/sign-out default signaled perhaps the most significant change in search marketing in the past few years. Fellow Search Insider David Berkowitz dealt with some of the SEO implications in his column on Tuesday. Today I’d like to deal more with the user side of the story. Although Google’s announcement heralds a relatively minor change in terms of user experience, at least for the present time, it represents a step down a path from which there is no return. This path marks a dramatically different direction for search that will have far-reaching implications, both for advertisers and users.

Google Gets Personal

First, a brief recap of Google’s announcement and what it means to users right now. Here are the details: Now, everyone signing up for a Google account gets Search History enabled by default. The opt-out box is positioned so that most people would likely not even notice it during the sign-in process.

Whether or not you have Search History enabled, you get personalized search turned on by default. This means that Google will subtly change your results, based on various “signals,” like what you have on your personalized Google Homepage and what sites you’ve bookmarked as Google favorites. Of course, if you have Search History enabled, this is the main “signal” for personalized search.

Finally, and probably least significantly, everyone gets his or her own Google Home Page when s/he signs up for a Google account.

The End of One Page for All

Let’s leave aside the privacy issues of Search History right now. That’s a topic that deserves a column by itself. It’s the end of the universal search results page that I want to touch on today.

There has been significant dissent voiced about Google’s move to personalized search, and it’s coming primarily from one source: search engine optimizers. In opposing personalized search, they’re saying it degrades the user experience. I responded by saying that it was the wrench that personalized search throws into their SEO plans that was raising their ire. But let me set aside my jaundiced view of the search world for a moment and chronicle their concerns (excluding privacy issues), as near as I can understand them:

  • Taking control away from the user by making personalized search a default and making it more difficult to toggle on and off
  • Fear of anomalous browsing patterns (e.g., visiting a number of humor sites on a whim or at the invitation of a friend) unnaturally biasing search results
  • The “machine learning” algorithms that power personalized search not being smart enough to really provide more relevant results

I’ve come out saying that personalized search is inevitable; the day when all of us see the same page of search results is rapidly coming to a close. To me, this just seems obvious. But still, there are those who protest. Here’s one example from Michael Gray, a well-known SEO blogger: “I’ve never met a business owner who’s said, ‘Man, you know what, I wish the search engines could create anarchy by making sure no two people got the exact same results for the exact same search — that would be the best thing since sliced bread.'”

In fact, Michael’s beef seems to be a consistently recurring theme among the dissenters: that a move to personalization suddenly seems to open the door for chaos on the results page. I believe the opposite is true.

Every Search is an Island

I am an individual, with unique interests, experiences, values and goals. My intent when I search for hybrid vehicles, or New York hotels, or Smart Phones, or any of the hundreds of other things I search for monthly, will be significantly different from that of all the other people who launch those same searches. I want a search engine smart enough to know that. I’ve always said that humans are complex, far too complex for a simple search box to get it right. That’s why personalized search is inevitable. If we want search to move to the next level, to get smarter, more intuitive, more relevant, we need to leave standardized search results behind.

Does this mean Google will get it right out of the box? No. It will take baby steps towards what personalization eventually needs to become (although I believe those steps will come in rapid succession, because Google can hear the competition hard on its heels). Yes, there will be many who find that in the early stages, personalization may be more frustrating than it is useful. But for search to mature, these are growing pains we’ll have to endure.

I’ve been labeled an early proponent of personalization. I’m not sure that’s necessarily the case. To me, it’s not a question of liking or disliking the recent moves by Google. To me, fighting search personalization is as pointless as refusing to accept today’s weather.

The Personalized Results are Coming, The Personalized Results are Coming!

Okay, sometimes the temptation to say I told you so is overwhelming. Danny Sullivan has a nice long post on Search Engine Land about Google’s changes to personalized search, making it more of a default and less of an option for millions of users. Danny details it more than I intend to, so please check it out.

As Danny says, he’s been talking about personalization for years, but up to now, it never materialized. After interviews with head user experience people at all three engines, I felt the time was right for personalized search to roll out (check The Future of SEO in a Personalized Search Interface and The SEO Debate Continues). And it appears my sense of timing was bang on. Much as I’d like to claim to be prescient, it’s really just common sense. You could see all the engines inching towards it. Now, Google has just upped the ante a little.

There are two major implications to this: what it means for search marketers, especially organic optimizers, and what it means to users. I’ll deal with each in turn.

What it Means for Search Marketers

The “Is SEO Dead? Rocket Science? A Scam?” debate has been winding its weary way through several blogs in the past few weeks. My take was that SEO is, and will continue to be, vitally important as long as organic search results continue to be important to the user. Based on what I’m seeing, that continues to be very much the case. But organic optimization now has a completely new rule set, which will irritate the hell out of many organic optimizers. The disgruntlement is already beginning to show. Michael Gray, better known as Graywolf, was the first to post a comment on Danny’s story:

Just because I ordered my coke with extra ice last time doesn’t mean I want it that way this time. I hate personalized SERP’s, I despise it even more that they don’t tell me they are personalized, and I loathe not being able to turn it off. I also have extreme antipathy for not being able to keep my search history on and not be part of personalized search.

Let me have it the way I want, not the way you think I do. I don’t want SERP’s that work like Microsoft programs that try to anticipate what I want to do, because more often than not it’s wrong. Bring back truth, purity, and clarity to the SERP’s.

Graywolf is complaining as a user, but I can’t help thinking that the more significant pain he’s feeling is as an organic optimizer whose world suddenly just became a lot more complicated. “Truth, purity and clarity to the SERP’s”? In whose eyes? Come on. Personalization is being implemented because it enhances the user experience. It doesn’t take a “Rocket Scientist” (sorry, couldn’t resist) to see that one set of search results is not the best way to serve millions of users.

As Danny said, there’s now an explosion of new fronts for the organic optimizer to consider. Right now, Google is only injecting a few personalized results into the search page, but expect that threshold to gradually creep up as Google gains confidence in the targeting of the results to the person. The days of the universal results page are numbered. Which means that the days of the reverse engineering approach to SEO are equally numbered. I’m sure people will try to figure out ways to spam personalized search, but as I’ve said before, reverse engineering requires a fixed constant to test against. Up to now, the results page and the other sites that appeared on it represented that fixed constant. That’s gone now.

So where does that leave SEO? Well, it’s certainly not dead, but it has dramatically changed. You can’t optimize against a results set, but you can optimize against a user. Let’s use an analogy that’s often been used before to describe SEO. Think of it as Public Relations on the Web. If you launch a PR campaign, you don’t target a particular position on the front page of the NY Times, you target a type of audience. You plan your release distribution and messaging accordingly. And you give reporters what you think will catch their attention. Most of all, you have to wrap your campaign around something that’s genuinely interesting. Then, you hope for the best.

Now, SEO becomes the same thing. You don’t target the first page of results on Google for a particular term. You target an end user. You wrap your site messaging in terms that resonate with that user. You write in their language, you give them a reason to seek you out, and you sure as hell don’t disappoint them when they click through to your site. You do all this, and you remove all the technical barriers between your content and the indexes you need to be in. Then, you hope for the best.

The problem with SEO has always been that it’s been treated like some magical voodoo that can be applied after the fact, like some “secret sauce.” And yes, that’s what the infamous Dave Pasternack has been trying to say. He just went several steps too far. The fact is, with universal search results, you could actually do this. Thousands of affiliates have made millions of dollars doing it. Link spamming, cloaking, doorway sites… the fact is, up to now, this bag of tricks has worked. It’s gotten harder, but it’s worked. Site owners looked to SEO to help them hijack traffic that wasn’t rightfully theirs. They hadn’t done the heavy lifting to create a site that justified a place in the top rankings, and they tried to take an easy shortcut.

But now, organic optimization means that you have to do the heavy lifting. It has to be integrated into the entire online presence. What Marshall Simmonds has done with About.com and the NY Times is a perfect example of the new definition of SEO. Get to the front lines, to the people who are churning out the content, and teach them about what search engines are looking for. Make sure SEO best practices are baked right into the overall process flow. Work with the IT team to create a platform that entices the spider to crawl deeper. Work with the marketing team to crawl inside the head of your target audience and figure out the who, the when and the why. Don’t worry so much about the where, because you can’t really control that any more. It’s a tough paradigm to break. We’ve been struggling with our clients for the past year or so. They’re still fixated on “being number one” for a particular term. We’ve been trying to ease them into the new reality, but it’s not easy.

I guarantee this will create an identity crisis for the SEO industry. As recently as a few months ago, I was moderating a panel discussion on analytics, and in the Q&A someone asked the panel, which had a few very well-known SEMs on it, what they used for ranking reporting. The names of various options were thrown out and people started scribbling them down. I saw this and thought I had to comment.

“You know, the whole concept of ranking is quickly becoming irrelevant.”

Nobody lifted their head; they were still busy writing down tool names. Maybe they hadn’t heard.

“As search engines move to personalized results, there will be no such thing as ranking. It will all be relative to the user.”

That should get their attention. Nope, nothing.

One of the search marketers said, “Yes, but knowing how they rank is still important to people.”

Huh? Am I speaking a different language here? I shook my head and gave up.

So, does this mean SEO is dead? Absolutely not. It becomes more vital than ever. A few data points bear this out. Preliminary results from the new SEMPO survey say SEO continues to be the number one tactic in search marketing. Yes, people want to bring it in house, but they recognize its importance.

Why do they think it’s important? Because it kicks ass in ROI. Here are the results from another recent study by Ad:Tech and MarketingSherpa, asking advertisers about the return they get from various marketing channels.

[Chart: perceived ROI by marketing channel, Ad:Tech/MarketingSherpa study]

The biggest jump from year to year? SEO. Now, let’s look at where marketers plan to spend more money in the next year.

[Chart: planned spending increases by marketing channel]

SEO, from flatlined last year to looking to spend 25% more this year. So SEO definitely isn’t dead. But it is moving to a new home. Here are some early results from the SEMPO State of the Market study (by the way, final results should be available next week; look for them):

[Chart: SEMPO State of the Market study, in-house vs. agency SEO]

It’s true that most companies would far rather bring SEO in house, if they could. And when we consider the new definition of SEO, it probably makes sense for SEO to be integrated into the internal work flow. But the problem is that there’s not a lot of SEO expertise out there. If SEO were so easy, why don’t more companies do it, or do it well? Contrary to Pasternack’s argument, it’s not a “set and forget” type of tactic. It requires a champion, buy-in and diligence.

I think the future is bright for SEO as a skill set, but we’re talking a modified set of skills. I talked about this in a recent SearchInsider column and a follow-up online debate with Andrew Goodman. My view of the future for the really good SEOs out there falls into three categories:

Get a (Really Good) Job

As companies bring this in house, there will be a firestorm of demand for skilled SEO Directors, but ideally as employees, not consultants.

Broaden Your View

Become an expert in how consumers navigate online and help your customers with the big picture, including the new reality of SEO.

Adapt and Survive

Find a new online niche where your search honed skills give you an advantage.

User’s View

Okay, this is already a much longer post than I intended, so I should probably talk about personalized search from the user’s perspective now.

Personalized search is a big win for the user. Don’t judge by the first few tentative steps Google is taking. Personalization is a much bigger deal than that. Google is easing us in so the experience isn’t too jarring. By the end of 2007, all three of the major engines’ results pages will look significantly different than they do today. Personalization will be like a breached dam. Right now we’re seeing the first few trickles, but there will be a wave of much deeper personalization options over the next several months. Search will become your personalized assistant, tailored to your tastes. As you search more, your results will drift further and further from the universal default and closer and closer to your unique intent. Immediately after your query, you’ll be dropped into a much richer search experience. Disambiguation will become much more accurate, and you’ll pretty much always find just what you’re looking for right at the top of your page, without having to dig deeper. Here’s how I see it playing out at each of the big three:

Google

Google has a religious devotion to relevance, and as they gain confidence with personalized search and their ability to disambiguate, this will manifest itself with a laser focus on relevance above the fold. They will continue to maintain a good balance of organic results, but these results will not just be the current web search results. They could be local, image, news or a mix of each. And ads. Yes, you won’t escape ads, but Google will be the most judicious in what they show. Expect more stringent quality scoring, down to the landing page level and a high degree of relevancy in the ads that do show. Google will be the most concerned of the three in disambiguating intent.

Yahoo

Yahoo will put their own spin on personalization by wrapping in Social Search. They will continue to leverage their community, as they currently do in Yahoo! Answers, so when you’re logged into Yahoo, you’ll be plugged into their community, and that will impact the search results you see. Relevancy will be determined more by what the community finds interesting than by what you find interesting, although it will be a mix of the two. Yahoo will target two types of searches: serendipitous search, where you’re looking to discover new sites, and what I call “frustrated” search, where your own efforts to unearth the data online have come up empty and you want the help of the community. When it comes to monetization, Yahoo will be the most aggressive, pushing more ads above the fold into Golden Triangle real estate. These ads will trail Google’s in terms of relevance.

Microsoft

Microsoft will use their targeting capabilities and probably tie in some behavioral targeting to personalize their search results. Also expect personalization in the Microsoft product to be integrated at a deeper, more ubiquitous level, into apps and OS. This probably won’t happen in 07, but it will be a long term goal. When it comes to ad presentation, Microsoft will fall somewhere between Google and Yahoo in both the number and relevance of the ads being presented. The heaviest investment will be in building out the platform to manage and model the ad program, rather than in policing the quality of the ads themselves.

It promises to be a very interesting year in the Search Marketing biz!

New Click Fraud Numbers: But Can You Trust Them?

There are new click fraud numbers out from Click Forensics indicating that click fraud is on the rise.  In fact, according to the study, click fraud on high-value keywords could be as high as 20%.  The overall industry average click fraud rate for the fourth quarter was 14.2%.  According to the report, the average click fraud rate on PPC ads on search engine networks was 19.2% for the fourth quarter of 2007.

The report I saw didn’t indicate which search engine networks this number was coming from.  I would have to assume that this includes both first-tier and second-tier search engine networks.  Some further data around this would be helpful, as my suspicion is that a majority of the click fraud being reported by Click Forensics is likely coming from second-tier networks that don’t have the same stringent click fraud filtering mechanisms in place as Google, Yahoo and soon-to-be newcomer to the space, Microsoft.

Andy Beal casts doubt on these numbers on his blog MarketingPilgrim.com, making the salient point that you have to remember they’re coming from a company with a vested interest in the growth of click fraud.  Also, Click Forensics’ sample includes only companies that are concerned enough about click fraud to actually use Click Forensics to monitor fraudulent activity on their sites.  One has to assume that these companies would be especially vulnerable to click fraud and are not an accurate representation of the total universe of advertisers.  As Andy Beal said, it’s a bit like going into a hospital and asking a number of people if they feel sick.
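The hospital analogy is easy to demonstrate with a quick simulation. This is a toy sketch with entirely made-up numbers, not a model of Click Forensics’ actual client base: if only the advertisers hardest hit by fraud buy monitoring, the measured rate will always overstate the average across all advertisers.

```python
# A toy simulation of the sampling bias at work (all numbers invented):
# if only the advertisers most worried about fraud buy monitoring, the
# measured fraud rate overstates the average across all advertisers.
import random

random.seed(42)

# Hypothetical "true" fraud rates for 10,000 advertisers, between 1% and 8%
population = [random.uniform(0.01, 0.08) for _ in range(10_000)]

# Only the hardest-hit advertisers (rate above 5%) sign up for monitoring
monitored = [rate for rate in population if rate > 0.05]

population_avg = sum(population) / len(population)
monitored_avg = sum(monitored) / len(monitored)
print(f"all advertisers: {population_avg:.1%}, monitored sample: {monitored_avg:.1%}")
```

However you tune the invented thresholds, the monitored sample’s average sits above the population’s. That gap is the whole of Beal’s objection in two lines of arithmetic.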

Perhaps it’s coincidence, but at about the same time this report was coming out, Danny Sullivan had a conversation with Shuman Ghosemajumder at Google about their concerns with some of the click fraud reporting coming from companies like Click Forensics.  Apparently one of the main points of contention is around the use of the back button on a browser.  I had previously talked with Shuman about some of the reported numbers around the click fraud issue and Google’s concern about inflation of those numbers.

Regardless, at this point it looks like we’re still going to be grasping at straws when we try to put scope around the click fraud issue.

The Social Fabric of Search

First published February 1, 2007 in Mediapost’s Search Insider

You know the phenomenon of synchronicity, where once you become aware of something, it seems like everyone is talking about the same thing? You can’t turn a corner without seeing some reference to something that just a week ago didn’t even register on your social consciousness. For me, that something was social search, and the time was last week. While I was certainly aware of social search before that, for some reason last week was the week where the knocking got so loud I had to pay more attention.

In looking at the referrer logs for my blog I noticed that Stumbleupon seemed to have emerged as a major traffic source. Also last week, I was on a panel with Danny Sullivan and he mentioned that we have to start watching social engines like Digg and Stumbleupon as emerging trends in the search space. Finally I did an interview with Larry Cornett, one of the key usability people at Yahoo, and when I asked him what the differentiating factor was for Yahoo in the future, he pointed to the emergence of social search and gave me Yahoo! Answers as the current example of that in practice.

There seems to be a lot of buzz around social search, but exactly how is social search shaping our search experience, and why should we be looking at it in the future? When Danny Sullivan mentioned that social search is something to keep your eye on, I made the point that different types of search engines lend themselves to different types of search activity.

Serendipitous Search

When I noticed Stumbleupon show up in my referrer report, I did some investigating into what Stumbleupon is about. Stumbleupon is the embodiment of serendipitous search. Its whole purpose is to help you find new sites that you might find interesting. And here’s where the aspect of social search, or community, comes in. Stumbleupon depends on a network of like-minded people to earmark sites that would be of interest based on your profile. It’s based on the concept that great minds think alike. Apparently, someone in the online universe had pegged my blog as one that might be of interest in some particular niche, and suddenly dozens of other people were stumbling upon it, guided by their online friends.

Stumbleupon is probably the best example of serendipitous search, but Digg is another one, albeit with a slightly different flavor. While Stumbleupon helps you find sites, Digg connects you directly to new content about specific topics. Like Stumbleupon, Digg uses a rating system to allow community members to vote on whether a site or story is noteworthy. Both Stumbleupon and Digg have emerged as significant drivers of traffic in recent months, so as marketers we have to keep these sites on our radar.

From the user’s perspective, the aspect of social search becomes interesting in these two examples because they help guide us to explore undiscovered territory online. We’re going where we haven’t been before and it helps us when people who share our interests can guide the way. In each case, social search lends credibility to new sites with which we have no previous experience.

The Wisdom of Crowds

James Surowiecki wrote a book called the Wisdom of Crowds. The basic premise of the book is that crowds, given the right conditions, can be amazingly intelligent. He cites a number of examples where a large group of people, acting independently with limited amounts of information, collectively came to decisions that were more valid than those of all but the very smartest individuals within the group. The whole became greater than the sum of its parts.
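Surowiecki’s premise can be demonstrated with a toy simulation (my numbers, not his): give a crowd of independent guessers a quantity to estimate, add random noise to each guess, and the crowd’s average lands closer to the truth than almost every individual guesser does.

```python
# A toy demonstration of Surowiecki's premise (invented numbers): average
# many independent, noisy guesses and the aggregate lands closer to the
# truth than the vast majority of individual guessers.
import random

random.seed(0)
truth = 100.0
# 1,000 independent guessers, each off by a random amount
guesses = [truth + random.gauss(0, 20) for _ in range(1_000)]

crowd_estimate = sum(guesses) / len(guesses)
crowd_error = abs(crowd_estimate - truth)
individual_errors = [abs(g - truth) for g in guesses]
beaten = sum(err > crowd_error for err in individual_errors)

print(f"crowd error: {crowd_error:.2f}")
print(f"crowd beats {beaten / len(guesses):.0%} of individual guessers")
```

The catch, as Surowiecki himself stresses, is the “right conditions”: the errors have to be independent. Once guessers start copying each other, the averaging advantage evaporates.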

This is the basis of a new flavor of social search, where the community collectively builds the index of the search engine. Consider Yahoo! Answers. You pose the question and Yahoo’s community kicks into gear to provide the answers. These answers are aggregated and provide the searchable content that makes up Yahoo! Answers. Based on my conversation with Larry Cornett and recent comments by Yahoo CEO Terry Semel, it appears that Yahoo! Answers provides a clue to their strategy for going head to head with Microsoft and Google. This concept of the community building a better search experience is key to Yahoo and a main strategic platform for the future.

Another example of this variation of social search can be found in Search Wikia, the new search initiative that “is going to change everything” according to Wikipedia founder Jimmy Wales. In Search Wikia, it’s a case where the broad concept seems to be in place but the specifics on how it’s going to be executed still seem a little thin.

The biggest challenge with this variation of social search is that it depends on the engagement of individual members of the community. Unless you have volunteers who are willing to spend their time enhancing the search experience, the scalability of the project breaks down. Anything that depends on people taking time to tag results, to contribute or to answer questions is dependent on each person’s motivation to participate, and that motivation is present in only a very small percentage of the population. It’s generally been proven that hardware is rapidly scalable; people are not.

However you define social search, the fact remains that the combination of search and the very notion of an online community are inherently aligned. Communities are all about connections, and nothing can connect faster than online search. It will take us a while to smooth out the wrinkles, but search is fundamentally social and communities are fundamentally connected. These concepts will live together in the online world.

Kevin Lee on the Lee-ching Effect of Search

Kevin runs a great column on a topic I explored a while ago in SearchInsider: are search engines leeching value from the web? Kevin approaches it from a slightly different angle than I did, but the conclusion was similar. Vendors are beginning to resent having to pay for every search-generated touchpoint with a consumer. Kevin’s point, which I share, is: Get Used to It!

Here’s the 10-second summary of the idea, but please take some time to read the column. Consumers continue to turn to search to connect with an online vendor, even after the initial introduction. The vendor has to either maintain a prominent position in the sponsored ads or, in some cases, pay an affiliate who is maintaining a high organic position (this is somewhat ironic, coming from the company that says SEO is simple enough that these affiliate sites should be cut out of the ecosystem). The vendor resents having to pay this recurring toll every time the customer visits them.

Having to invest to maintain share of wallet with a consumer is nothing new. It’s just that the power of online, and search in particular, makes this investment more focused than it’s ever been before. It used to be that maintaining enough top-of-mind awareness to ensure a continuing connection with a customer was spread out over a number of marketing channels. Somehow, advertisers would accept this. But now, with the focused use of search to navigate the web, including return visits to a particular site, the cost is being concentrated in one channel. If anything, this introduces efficiency into the marketplace and could potentially save the marketer money, but all they see is a growing cost they have to pay to one channel to keep customers they thought they had already won. The missing piece here is solid data about the shift of influence from more traditional channels to the new search one. The marketer doesn’t know whether they have to maintain all the previous marketing activity, or whether they can confidently begin moving budget to the new one, namely search.

Read Kevin’s column, and then take a look back at my view. It’s a thoughtful look at an interesting shift in marketing dynamics and is a refreshing change from some of the other opinions currently coming out of Did-It.

The Goodman – Hotchkiss Smack Down

Okay…maybe it’s nowhere near the Pasternack vs. the Rest of the SEO World debate that’s currently going on (which is apparently now even spawning its own T-shirt), but Andrew Goodman took exception to my recent SearchInsider column, where I also ponder the future of SEM/SEO.

To save you a ton of reading, I’ll summarize the salient points of both.

My take:

SEM shops, and in particular SEO shops, have been so tactical and have developed such a specialized set of skills that it will be difficult for us to step back and look at the bigger picture necessary to guide us in the next evolution of search into a more personalized channel. Further, as the current reality of universal search results pages gives way to personalized results, this highly developed skill set, largely based on the current paradigm of optimizing to gain rank, will begin to lose value in the eyes of potential acquirers as rank ceases to have any meaning. This will create a shakeout in the industry as some of the best practitioners become employees of large companies and others can’t keep up with the new evolution of search.

Andrew’s take:

I missed some factors: in particular, that you can’t predict financial fads, that roll-ups and acquisitions are often driven more by buzz from Wall Street than by logic, and that acquisition is often driven by the target’s client list and expertise in a skill set the buyer doesn’t currently have. Further, Goodman feels that, contrary to my point, SEMs are actually pretty strategic in their execution of campaigns and that our front-line approach to customer acquisition is giving us exactly the skills needed to market in the new online reality. He goes on at some length about how the majority of the agency world is drastically out of touch with this new reality.

Of course, both Andrew and I being Canadians, we’re probably both way too polite and pragmatic to create much of a stir. Note the carefully worded way Andrew threw down the gauntlet:

Gord Hotchkiss argues that SEM firms aren’t getting acquired for large sums mainly because they’re too tactical and don’t have skills that help them work on segmenting and customer profiling. I tend to think that the picture is more complex. Or maybe, it’s actually simpler. Either way, Gord’s assessment of current reality is correct, but his analysis is wrong.

It’s kind of like Mac n Tosh, the really polite Warner Brothers gophers, in the WWF. “You hit me first.” “No, I insist, you take the first swing.” “No, no, that would be rude, please put me in a pile driver.”

But I do encourage you to read Andrew’s post, as I think it definitely adds to the perspective. And here, I’d like to dive a little deeper on some of the points Andrew brings up:

I happen to think search marketing is a fine training ground for the strategic mind. Of course, no small consulting firm is given the keys to the entire marketing strategy for a large client, but discounting for size, the influence of the search marketer is impressive. Look at all the data clients already let search marketers work with! While expensive, ponderous segmentation and market research exercises are not the typical MO of the searchie, that’s often because these don’t translate very well to the particular campaigns they’re asked to work on.

Well, in most instances, this contradicts the point Andrew is making. While I agree that we often get access to a lot of data, it’s usually in support of the crushing load of tactical work that has to be done in search. We need to dive into conversion data, site stats and a mound of other data to tweak and optimize the campaign. And it’s this deep dive into the data that often keeps us from seeing the big picture. And yes, segmentation and market profiling are ponderous work, and that work is tactical, but it’s the ability to take the results and apply them strategically that truly sets great marketing apart. I agree with Andrew that there’s a real danger of misuse of profiling, and it’s horrendously abused in the traditional agency world to justify expensive sponsorships that are more about boondoggles and perks for agency execs than about effectively reaching target consumers. But to me, the biggest thing missing in search is the who and the why. We know the where, and we guess at the what, based on a series of tests.

Here’s how it usually works. We test for best position. We come up with messaging and test for effectiveness, often based on a set of metrics that are end of funnel targeted, because that’s the best we can do right now. There’s testing, testing and more testing.  Andrew makes this point as well:

(Testing is) certainly something search marketers are uniquely skilled to do. In paid search, we learned “direct marketing analysis on steroids,” not from any book, but from the ground up. Now we’re busy writing the book.

This brings us to a point that came up during our panel at the recent Microsoft Summit. One member of the audience equated search to direct marketing. This is a common comparison, but it points to the very reason most SEMs are too myopically focused. Here’s one definition I found for direct marketing:

Direct marketing is a type of advertising campaign that seeks to elicit an action (such as an order, a visit to a store or Web site, or a request for further information) from a selected group of consumers in response to a communication from the marketer.

From a marketer’s perspective, this pretty much sums up search. It’s what we all want people to do through search. But, and here’s the thing, it’s often not what the user wants to do. The point at which search is used and the point at which the consumer is ready to take the action could be separated by days, weeks or even months. Here’s another symptom of our shortsightedness. When we measure actions, we show a strong bias towards the purchase end of the funnel. Most marketers either don’t or can’t measure promising activity earlier in the funnel. It’s often not our fault, because the leap from measuring end-of-funnel activity to full-funnel activity is huge, and largely impossible. It requires a sophistication of reporting that’s beyond the ability of any single platform. So we tend to focus on what we can measure. And this leads to an increase in shortsightedness.

What we need is the who and why. We need to look at presentation of our messaging through the eyes of the target. We need to understand intent, and deliver on it. That dramatically reduces testing cycles. When you learn to do that, your strategy defines itself. Contrary to Andrew’s point…

Customer profiling? I think it’s useful, but let’s not get too cute.
(By the way, I think Andrew’s Canadianism is showing here. Most Search Marketers I know would call it BS)

…knowing your customer isn’t cute, it’s critical. Up to now, search has been able to effectively deliver leads without this understanding. That’s a testament to the power of search as a channel. But those days are rapidly ending. My point is that this view is now essential to reach customers in a more personalized search reality, or if it’s not essential now, it will be very, very soon. And knowing your customer means research, profiling and creating paradigm-shifting frameworks, such as personas. And I don’t mean in the way traditionally abused by agencies, but in the highly effective ways pioneered by product designers and employed by Future Now, the firm Andrew refers to.

A few totally awesome data analysts in these firms – and a handful of independent analysts – are doing exactly the right things, while most everyone else in the agency infrastructure is not empowered to act on the power of the data (or put less politely, they’re just pretending). Agency culture is still dominated by the power of “creative,” and subjective judgments of “spots.”

Okay, here’s one point where Andrew and I are in complete agreement. Agencies don’t get it, and although some bright individuals within the agency structure do, they’re hamstrung and stifled by crushing inertia and old thinking. Andrew seems to think I implied agencies wouldn’t buy because they knew how to do it. That wasn’t my point. It was that agencies won’t buy because they think they know how to do it. And the distinction is important. In fact, somewhere on my desktop is a half-finished blog post where I looked at why this happens, but I shelved it because I thought it might be too “impolite.” Maybe it’s time I dug it up and finished it. But in the meantime, everything that Andrew points out as being wrong in the agency space I agree with totally.

From the ashes of all of this rises search. Which, though highly tactical, is a great training ground for strategic minds like Joe Morin, who is now CEO of StoryBids, a startup that offers an auction system for product placement.

Again, in trying to point out a difference in opinion, Andrew actually helps reinforce a point I was making. Joe is a good friend and a great example of the handful of survivors who are evolving into the new reality. I mentioned that the challenge for SEOs in particular (which was Joe’s background, along with being a private investigator) was in being willing to draw back from their current view and skill set and to reinvent themselves to find a niche in the rapidly evolving online ecosystem. Joe is a master of this and acts as a great example for the industry. Andrew also mentioned he has some skin in a new game, and as one of the smarter people in search, he’ll evolve quite nicely as well. And I think in the word “evolve” we come to the crux of this. Perhaps the best way to sum up is with the following comparisons:

Traditional Agencies = Dinosaurs
SEMs/SEOs Unwilling to Change = Mastodons
SEMs/SEOs Embracing Change = Primates

SEM’s Seven Year Itch: Part Three

First published January 25, 2007 in Mediapost’s Search Insider

The 2004 acquisition of SEM firm iProspect by the advertising network Isobar marked a turning point in the search marketing industry. Various reports pegged the total value of the deal at about 50 million dollars, including potential earn-outs. IProspect founder Fredrick Marckini stood to be a very wealthy man.

The iProspect deal wasn’t the first or last acquisition to happen. But the valuation of the deal (together with an earlier Performics/DoubleClick deal) set a new high-water mark for the expectations of the owners of other search shops. Suddenly, it looked like there was a very lucrative exit possible. After years of struggling in the search space, we found that we might just be holding the winning lottery ticket. Up to this point, there was a bit of a taboo about talking of acquisition in SEM circles. We all routinely professed our love for search and how we just couldn’t see ourselves doing anything else. But, hey, a $50 million check can change your thinking somewhat. Suddenly, the SEM community indulged in a little daydreaming and started frequenting the Jaguar Web site and checking out property prices in the Hamptons.

But the flood of acquisitions that was predicted never happened. It was more of a trickle, and when we were privy to details about valuations, they were significantly under the iProspect deal. There are a number of reasons for that (perhaps the topic of a future column). But the fact is, while the owners of search shops have had their appetites whetted, the window for highly profitable acquisitions may have passed by. Here’s why.

Tactically, We’re Awesome

Search marketers are brilliant tacticians, whether they work on the paid or organic side. It’s what we excel at. The biggest show in the SEM industry, ironically titled Search Engine Strategies, is really three or four days jam-packed with tactics, not strategies. We myopically focus on page position, always shooting higher. It’s all about rank. It’s all about being No. 1.

What we’re not particularly good at is stepping back and looking at the big picture — the hows and whys of search, and, most importantly, the whos. While this is true across the search space, it’s most apparent with the organic optimizers. Virtually no one in the SEO world has given a hoot about messaging, user experience or intent. It’s all about crawling your way to the top spot. In the last year, I’ve seen a few SEOs starting to change their thinking, but the vast majority is still obsessed with blowing holes in the ranking algorithms.

Rank Becomes Irrelevant

However, we’re rapidly approaching the day when being No. 1 ceases to have any meaning. That’s a view that is tied to the concept of a universal results page. A user searching for “bass” in Seattle sees pretty much the same results page as someone searching in Salisbury or Saskatoon. In this context, rank not only has meaning, it’s the magic bullet.

Currently, the engines are rolling out personalization of results in a number of flavors. Soon geo-targeting, demographics and personal histories will be bigger determiners of the results, and the order you see them in, than the skill of a search optimizer. An aspiring musician in Seattle searching for “bass” may see the biggest selection of bass guitars in the Pacific Northwest in the No. 1 spot. An angler in Saskatoon will probably see the top bass fishing spots in Western Canada. And the person using their laptop and wireless connection to search for “bass” in a Salisbury pub could well see the official site for Bass Pale Ale.

A New Rulebook

On the organic side, this dramatically changes the rules of search. The hyper-developed technical skill set of SEOs suddenly needs to be rounded out with a deep understanding of the target user. The optimization tactics we felt were going to guarantee us an early retirement, while still valuable, will take a back seat to the ability to segment and understand our target prospects. More important, we have to understand the online paths they’re likely to take, and help our clients intercept them with effective messaging and successful interactive experiences. These are the skills that will be in high demand in the future. There will always be a place for a talented organic optimizer, but it will be as a rather well-paid employee, not a multi-million-dollar acquisition.

Where do these new skills exist? Well, they’re more evident on the sponsored side, as new platform enhancements have allowed the best paid search practitioners to start to segment demographically and geographically. It’s forcing us to do our homework on who our prospects are. And unfortunately, our potential acquirers, the large agencies, believe they have deep bench strength when it comes to segmenting and profiling prospects, certainly deeper than the average SEM shop. I still don’t believe they’ve done a particularly good job of porting traditional market research skills to the new consumer-empowered online reality, but I suspect I’d have a hard time convincing them of that.

You know who’s really honing these skills? The behavioral targeting practitioners. Search marketers should start paying a lot more attention to what’s happening in the BT camps.

A Chronic Itch

So where does that leave the average SEM agency? Is a profitable exit still an option as our seven-year itch demands to be scratched? If your valuation depends largely on tactics that gain higher rankings and concentrates on the “where” (on both the organic and sponsored side) rather than the “who” and “why,” your window has passed. But if you’re up for the change and not only embrace the inevitable reality of personalized and integrated search but pioneer the understanding of it, a new market will emerge. That’s the good news.

The bad news is that there’s a lot more hard work and learning that has to happen to position yourself in that market, and this time we’re not the front-runners. The truly passionate will persevere and adapt. The rest will find themselves with some pretty good job opportunities — but the summer house in the Hamptons and the Jag XKR convertible will be long shots.

Pasternack’s at it Again

David Pasternack of Did-It is decidedly unrepentant in his campaign against SEO. He’s at it again in a Q&A on DM News. At this point, it’s beyond intellectual debate and seems to be all about generating a storm of activity, with himself at the center. As Danny Sullivan said, “It’s all getting pretty tired.” David continues to insist that you would be a fool not to bring SEO in-house, as anybody can do it, but apparently with paid search (Did-It’s business model) the opposite is true, as only a fool would try to manage their paid search in-house.

You know, I was one of the ones who did see some logic (albeit a little convoluted) in David’s original column, and in Kevin Lee’s subsequent columns that tried to clarify it further. There are two sides (or more) to every argument and I usually try to see the view from both sides before commenting. But I have to say Pasternack is going beyond the reasonable here. The fact is, with many of our clients, it’s organic search they seek consulting help with and it’s paid search that they keep in-house. I frankly don’t see a big difference in the level of sophistication required in the two channels. In fact, I would say there are more dimensions, and more potential traps, on the organic side. Here are some other reasons why you’re most definitely not a fool if you’re looking for a partner on the SEO side:

SEO Touches Everything

Some time ago, I wrote a column about why an effective SEO campaign is so difficult to launch within an organization. The biggest reason is that an effective SEO campaign touches many aspects of the business. It’s embedded in content, which generally involves at least marketing and often includes legal, management, product groups and virtually every aspect of the business. And buy-in from the IT department is absolutely essential. SEO doesn’t live in one place. To be effective it has to overarch everything. A partner can help make that happen. Talk to many in-house SEO practitioners and they’ll tell you one of their biggest challenges was selling SEO internally. It’s one of the most common immediate pains we solve when we partner with a company.

Nobody is Helping You With SEO

Right now I’m at a Summit hosted at Microsoft aimed at helping people use AdCenter more effectively. Microsoft is most decidedly not telling us how to rank higher in Live Search’s algorithmic results. Neither is Google or Yahoo! Other than the rather thin Webmaster Guidelines (according to Pasternack, all you need to know), there’s very little effort on the part of the search engines to help you understand algorithmic ranking. Why should they? They don’t make money from it. So you have to cobble your information together from various forums and blogs. There’s no official answer source for algorithmic problems. That’s why Search Engine Strategies attendance continues to grow. It’s also why SEMPO is introducing an organic optimization training program.

Nobody is Investing in Making SEO Easier

According to Pasternack, Did-It has “killer technology”. They do have a proprietary bid management tool, and it’s okay, as are many others. I’m not sure I’d call it “killer technology”, as that implies that it kills the competition. That’s just not true. Maestro is just another flavor of bid management, with some cool features, and some noticeable gaps when you compare it to some of their competitors. But the fact is, there’s a market for building tools to help manage PPC campaigns. Driving this are the engines themselves, who are dedicated to taking the pain out of managing paid search, and are likely the ones who will be introducing “killer technology”, as there’s a strong economic win in it for them and they have substantially more resources than a company like Did-It. The engines are not dedicated in the same way to making SEO easier. The landscape is too messy and the division between white and black is too vague. It’s more open than it used to be, with the efforts of a few individuals (e.g. Matt Cutts, Jeremy Zawodny, Tim Mayer) and the odd tool (e.g. Google’s Webmaster Tools) but it’s nowhere near the scale of development on the paid side, and it never will be.

The Damage Can Be Long Lasting

If you screw up on a PPC campaign, it can be turned off while you figure out what went wrong. It can cost some lost budget, but that’s about it. Screw up on your SEO and it can take months, or even years, to fix the damage. And every day, you’re missing out on traffic that you’ll never get back. It’s all about risk, and there’s substantially more risk on the SEO side. Ask anyone who inadvertently got their URL banned.

SEO is More Difficult to Manage and Control

Paid search keeps you in control. You can measure campaign performance, adjust bids, turn off individual keyphrases or entire campaigns and introduce robust testing frameworks. While this increases the complexity of campaign management, it does leave you in control. With organic optimization, you have to throw your best guess against the algorithm and hope for the best. You surrender control the minute your site is spidered.

The Returns Can Blow PPC Away

Frankly, you’d be a fool not to fully leverage the potential of organic, because if you do it right, your returns will blow anything you’re getting from the paid side out of the water. There’s a lot at stake, and the returns can continue for a long time, whereas the best-managed paid campaign’s benefits end the minute you turn off the tap on the funds. Why wouldn’t you want to make sure you’re exploring every opportunity available to gain better organic visibility?

More of Pasternack’s Wisdom

In the interview, Pasternack made a number of observations that I wanted to deal with individually:

Everybody’s not angry: only a small percentage of readers with an inferiority complex who happen to call themselves SEO experts

Apparently, Pasternack’s brush only paints in black and white, which probably simplifies his world greatly. His one-sided comments rubbed a lot of people the wrong way, myself included. I’d debate whether I’m an SEO expert, and I don’t believe I have an inferiority complex. I do travel in those circles, however, and talk to a wide number of people, and generally find that we’re developing a shared attitude toward Mr. Pasternack’s credibility in the industry.

I wonder what percentage of Danny’s (Sullivan) show attendees are there to find the “magic SEO elixir.” I would guess a very high percentage. I suppose we all have to cater to our audience.

I happened to be sitting next to Danny as he first read the above quote. “Harrumph” would be the diplomatic term for the response. The fact is, SES attendees are not looking for the magic SEO elixir. They’re looking for answers because there are precious few places they can find them. They’re generally not going to get them through the engines, at least through the official channels. Organic optimization can be complicated, depending on the challenges present. Ironically, many of the attendees are the very same in-house tacticians that Pasternack says should be more than adequate to optimize the site. In many cases, they’re lost and desperately looking for guidance. If SEO is so simple, why is there such a demand for answers? There’s a reason why attendance continues to grow at shows like SES and PubCon. There’s a reason why SEO-oriented sites are amongst the most highly trafficked sites on the web (according to Alexa), including Matt Cutts’ blog (#1256), Searchenginewatch (#811), Searchengineland (#3963 and growing) and Webmasterworld (#226). People are looking for answers to complicated questions. Not rocket science perhaps, but not a lead-pipe cinch either. By the way, Did-It and their Frog Blog are not quite at the same level (#63,244) as these sites that deal with the supposedly artificial “mystique” around SEO.

I would do a Google search for the term “search engine optimization” and run away from any company which can’t even get themselves into the top five organic results. Doctor, heal thyself! And don’t believe for a second that these firms are not trying to get themselves to this coveted position. If they did, they’d win every sale. Maybe I would even hire them.

Okay, let’s apply Pasternack’s logic to himself. Do a search for “paid search management” in Google and see if Did-It has a presence. Here, I’ll save you the trouble:

[Screenshot: Google results for “paid search management”]

Hmm, don’t see Did-It there. Should we assume that Did-It isn’t very good at what they do? Is it because the return isn’t worth the investment? It could be a number of reasons. And unlike Pasternack, I would hate to make assumptions about their lack of presence because I don’t have all the facts. Carrying this further, let’s look at some of the top SEO agencies, according to Advertising Age’s recent survey in their Search Fact Pack. Of the top 20, none are currently ranking in the top five for “search engine optimization”. Does that mean you should run from them? No, there are probably other reasons for it.

Obviously, there is no such thing as the final word in an Internet-based debate. But a word of advice to Dave Pasternack here: at some point, stirring the pot turns into flogging a dead horse.