First published June 18, 2009 in Mediapost’s Search Insider
In the new Bing-enabled world, search is hotter than ever. Your entire Search Insider lineup has been trading quips and forecasts about the future of search. Aaron Goldman thinks Hunch may be the answer to my call for an iPhone of search. Today, I want to talk about why Wolfram|Alpha is very, very important to watch. It’s not an iPhone, but it is changing the rules of search in a very significant way.
Search is more than skin-deep. To most users, a search engine is only skin (or GUI) deep. And anyone who's taken Wolfram for a spin has judged it by the results it returns. In a few cases, Wolfram's abilities are quite impressive. But that's not what makes Wolfram|Alpha important. For that, we look to what Stephen Wolfram has done with the entire concept of interpreting and analyzing information. Wolfram|Alpha doesn't search data; it calculates it. That's a fundamentally important distinction.
Unlike Bing, which is promising a revolution that barely qualifies as evolution, Stephen Wolfram knows this is the first step on a long, long road. He says so right on the home page: “Today’s Wolfram|Alpha is the first step in an ambitious, long-term project to make all systematic knowledge immediately computable by anyone.”
Words are not enough. Wolfram's previous work with Mathematica and NKS (A New Kind of Science) shatters the paradigm every search engine is built on: semantic relationships. As revolutionary as Google's use of the Web's linking structure as a relevance factor was, it was layered on top of a semantic foundation. PageRank is still bound by the limits of words, and words are slippery things to base an algorithm on.
The entire problem with words is that they're ambiguous. The word "core" has 12 different dictionary definitions, and it's very difficult to know which one is meant in any particular circumstance. Google, like every other engine, is limited by its need to guess at the meaning of language, one of the most challenging cognitive tasks we encounter as humans.
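To make the ambiguity problem concrete, here is a deliberately tiny sketch of one classic workaround: guessing a word's sense by how much the surrounding query overlaps with keywords associated with each sense. The senses and keyword sets below are invented for illustration; a real engine weighs far richer signals than this.

```python
# Toy word-sense disambiguation for "core" (invented senses and keywords).
SENSES = {
    "earth's core": {"mantle", "planet", "magma", "crust"},
    "apple core": {"fruit", "seeds", "eat", "peel"},
    "cpu core": {"processor", "thread", "chip", "clock"},
}

def guess_sense(query_terms):
    """Pick the sense whose context keywords overlap the query the most."""
    query = set(query_terms)
    return max(SENSES, key=lambda sense: len(SENSES[sense] & query))

print(guess_sense(["core", "temperature", "of", "the", "planet"]))
# -> earth's core
```

Even this toy version shows the fragility: a query with no overlapping context words forces an arbitrary guess, which is exactly the limitation the column is describing.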
Potential advances in relevance require gathering additional signals to help interpret meaning and reduce ambiguity. Personalization is one way to do this. Hunch, Aaron's nominee for the iPhone of search, requires you to fill out a long and rather bizarre quiz about your personal preferences. All of this is to learn more about you, making educated guesses possible. If you're going to stick with a semantic foundation, personalization is a great way to improve your odds of a successful interpretation.
Another way to interpret meaning is to go with the wisdom of crowds. By overlaying the social graph, you can assume that the meaning people like you are interested in is also the meaning you might be interested in. Again, not a bad educated guess.
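The crowd-based guess can be sketched in a few lines: if users similar to you overwhelmingly picked one sense of an ambiguous query, serve that sense first. The click log below is entirely hypothetical, just to show the mechanics of the majority vote.

```python
from collections import Counter

# Hypothetical log: which sense of "core" users similar to you clicked on.
clicks_from_similar_users = [
    "cpu core", "cpu core", "apple core", "cpu core", "earth's core",
]

# The crowd's most common choice becomes the educated guess for you.
best_guess, votes = Counter(clicks_from_similar_users).most_common(1)[0]
print(best_guess, votes)  # cpu core 3
```

It's still a guess, as the column says: the crowd narrows the odds, but it never removes the ambiguity of the word itself.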
Knowledge as a complex system. But what if you could do away with the messiness of language entirely? What if you could eliminate ambiguity from the equation? That's the big hairy audacious goal Stephen Wolfram has set his sights on. If you look at the entire body of "systematic knowledge," you have a complex system, and in any complex system you have patterns. Patterns are abstractions you can apply math to. In effect, knowledge becomes computable. You don't have to interpret semantic meaning, which is intensive guesswork at best. You can deal with numbers. And unlike language, where "core" has 12 different values, the number "3" always has the same value.
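The difference is easy to sketch. A keyword engine can only retrieve documents that happen to state an answer; a computable-knowledge system stores unambiguous facts and derives answers on demand. The tiny fact table below is illustrative (the figures are approximate), and the structure is a stand-in for whatever curated data store such a system actually uses.

```python
# Toy "computable knowledge": unambiguous facts plus a formula,
# instead of documents plus keyword matching. Figures are approximate.
FACTS = {
    "monaco": {"population": 38_000, "area_km2": 2.02},
    "canada": {"population": 38_000_000, "area_km2": 9_985_000},
}

def population_density(place):
    """Derive an answer no document needs to state explicitly."""
    facts = FACTS[place]
    return facts["population"] / facts["area_km2"]

print(round(population_density("monaco")))  # ~18812 people per km^2
```

No page anywhere has to contain the phrase "population density of Monaco" for the answer to exist; it falls out of the numbers. That, in miniature, is the distinction between searching data and calculating it.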
Wolfram|Alpha is not important because it provides relevant results for stocks, cities or mathematical problems. It’s important because it’s taking an entirely new approach to working with knowledge. It’s not what Wolfram|Alpha can do today; it’s what it may enable us to do tomorrow, next year and in the year 2015.
Wolfram|Alpha could change all the rules of search. Keep your eye on it.