First published May 11, 2006 in Mediapost’s Search Insider
I pity the poor new entry in the search engine space. How do you possibly stake out new territory in the hottest online space there is? How do you avoid being swept away in the tidal wave of momentum that is going to the industry leaders, Google, Yahoo and MSN? How do you attract enough users to gain a critical mass? Well, you have to offer something different.
A new desktop search tool, Quintura, is betting that a new user interface based on the concept of semantic mapping is just the ticket it needs to win the search lottery. And if semantic mapping sounds familiar, it should. I’ve been talking about semantic mapping for almost two years now. It’s a powerful concept in understanding how people search and something we identified in our previous research. But when it comes to building a new user interface around it, I’m not sure Quintura’s implementation will be taken up by the search masses.
A Semantic Map Primer
First of all, if you haven’t heard me speak about the concept previously, let me introduce you to the theory of semantic mapping.
Whenever we use a search engine, we have a concept in mind. The concept is usually fairly complex, consisting of a lot of pre-existing relationships we have made mentally. I’ll stick to the example I usually use to illustrate the concept. Let’s assume we’re beginning our research for an upcoming digital camera purchase. In this case, our concept will likely include connections with brands we’re familiar with (Nikon, Kodak, Canon, Olympus, etc.), features (zoom, number of megapixels, autofocus) and our intent (finding consumer reviews, reading testimonials, finding side-by-side feature comparisons). These connections can be expressed by the words that define them. Together, all the words that define our concept make up our semantic map. It could consist of dozens or even hundreds of words.
But when we go to a search engine, we distill the concept down to the broadest possible phrase, both out of a desire to be inclusive in our search, and out of a reluctance to expend too much effort in constructing our search (which is a diplomatic way of saying we’re lazy). So we search for “digital camera.” When the results are presented to us, that original concept and its accompanying semantic map are still in place. They play a vital role in how we react to the listings. We scan the listings, and if we happen to spot words from our map, that listing registers as a better match to our intent.
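The scanning behavior described above can be sketched in a few lines of code. This is purely illustrative: the word lists for the "digital camera" example and the simple overlap-counting rule are my own assumptions, not any search engine's actual relevance logic.

```python
# A hypothetical semantic map for the "digital camera" concept:
# brands we know, features we care about, and our intent.
semantic_map = {
    "nikon", "kodak", "canon", "olympus",        # familiar brands
    "zoom", "megapixels", "autofocus",           # features
    "reviews", "testimonials", "comparisons",    # intent
}

def map_match_score(listing: str) -> int:
    """Count how many words from our semantic map appear in a listing."""
    words = set(listing.lower().split())
    return len(words & semantic_map)

listings = [
    "Digital camera deals - free shipping on all orders",
    "Canon and Nikon digital camera reviews with zoom comparisons",
]

# The listing that echoes more of our map registers as the better match,
# even though both nominally answer the query "digital camera".
for listing in listings:
    print(map_match_score(listing), listing)
```

Both listings match the two-word query equally well; only the subconscious map tells them apart.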
Quintura’s Take on Semantic Maps
When I first heard about Quintura, it was like somebody had built a search tool around the PowerPoint slide I’ve been using for the last year and a half to illustrate the concept. Just like that slide, your query sits in the middle of an actual word map, surrounded by related words that further define the concept. As you click on the words that define the concept, Quintura adds them to your query, which in turn updates the map and restricts the focus of your search, letting you quickly and graphically structure very specific queries. The theory is that clicking through a semantic map will allow you to spend less time sifting through irrelevant results.
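The click-to-refine loop described above can be approximated with a toy model. To be clear, the tiny "corpus" and the co-occurrence rule below are inventions of mine for illustration; they are not Quintura's actual algorithm, which isn't public in this column.

```python
# A toy document collection standing in for the web.
corpus = [
    "canon digital camera zoom review",
    "nikon digital camera megapixels review",
    "canon camera lens zoom",
    "digital camera comparison megapixels",
]

def related_words(query_terms):
    """Build the word map: terms that co-occur with every current
    query term across the documents that match the whole query."""
    related = set()
    for doc in corpus:
        words = set(doc.split())
        if all(term in words for term in query_terms):
            related |= words - set(query_terms)
    return related

# Start with the broad query, as lazy searchers do.
query = ["camera"]
print(related_words(query))   # the full map of words around "camera"

# "Click" a word on the map: it joins the query, and the map shrinks
# to words that fit the narrower concept.
query.append("canon")
print(related_words(query))
```

Each click both narrows the result set and redraws the map, which is the real-time updating behavior the interface is built around.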
In theory, this should be a huge step forward in the user experience. But, feeling somewhat traitorous to the concept I helped pioneer, I’m not sure it’s the answer for the next big search interface. Here’s the problem…
When All’s Said and Done, We’re Still Lazy…
You’ve been able to structure advanced search queries for ages. But through it all, less than 5 percent of all searches have taken advantage of these capabilities. Quintura’s take is just another way to build the query. The real-time updating is cool, and may help refine the search when you’re not exactly sure of the query you should be using, but I’m not sure this is enough. I think it will make Quintura an interesting footnote in the search “also-ran” category.
There are reasons why we search the way we do, and not being able to think of the words is generally not one of them. We know the words–in fact, we know too many of them. They reside just under the conscious layer, making themselves known when a result happens to match them. But this is hardly an articulated process. It happens in split seconds, through subconscious connections, and it’s almost transparent to us; we hardly notice it’s happening. It’s one of those things where, when you hear it explained, you say, “Yeah, that makes sense. I’m sure I do that,” even though you never realized you were doing it.
My theory? If it takes longer than one second and more than one click to refine your search, you’ve excluded 95 percent-plus of your potential market. Quintura’s approach, although undeniably cool, fails on both counts. I still believe semantic mapping is vitally important, but I think search engines have to get better at creating those maps transparently, through disambiguating our intent by getting to know us better. I believe it’s unrealistic for a search engine to expect the user to go to the trouble of building the map for them.
So what about the other 5 percent? The power users, or the searchers who don’t know what they’re looking for and need the prompting of related words? Again, it comes back to critical mass. This might be a welcome addition to Google’s advanced features, but it can’t attract enough attention as a stand-alone to survive. Then again, perhaps catching Google’s attention isn’t such a bad play.
Sorry, Quintura. I feel like I’m disowning one of my children, but unless you’re looking at being gobbled up by one of the big three, I don’t see this as the winning ticket in the great search lottery.