Friday, September 26, 2008, 7:27 AM ET|Posted by Will Richmond
A couple of days ago, Truveo, the big video search engine owned by AOL, released the results of an internal study which concluded that it provides the most comprehensive search results among 5 companies considered. Before you say, "Duh, Will, what else would you have expected Truveo to conclude?!" it's worth spending a few minutes considering the study's methodology, results and implications. Video search is an extremely strategic space, so all credible data has value.
When it comes to search, there are really two key criteria to judge quality - coverage and relevancy. A search engine can return a million results, but if none are relevant, it's pointless. Conversely, just one spot-on result and you'll rejoice, but you still may yearn for additional, relevant options (since video quality can vary, links may be broken, the user experience at certain sites may stink, etc.). So optimizing both coverage and relevancy must be the goal.
Truveo's study focused solely on coverage, having deemed relevancy too subjective to measure credibly. To quantify coverage from a competitive standpoint, it chose 4 other search engines: Blinkx, Microsoft Live Video Search, Google Video and Yahoo Video. This limited pool immediately raises the question of how the many other video search companies not included would have fared. Truveo explained that the testing was very resource-intensive, so it needed to keep the competitive set relatively small.
To measure coverage, Truveo selected 100 top-ranked Alexa sites across 5 categories: news, sports, TV, music and movies. It then found 10 representative videos from each site and ran a query for each video - using the exact title the site used - on each of the 5 search engines. Scoring was binary - a search engine got a 1 for a site if it returned an accurate result for at least 5 of the 10 queries, and a zero if it didn't. The final scores: Truveo 86, Blinkx 20, Microsoft Live Video Search 17, Google Video 3, and Yahoo 2.
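For those who like to see a methodology spelled out concretely, the scoring scheme described above can be sketched in a few lines of Python. This is just an illustrative reconstruction from the description in this post - the actual test data, site list and tooling are Truveo's and not public, and the example inputs below are made up.

```python
def score_site(query_hits):
    """Binary per-site score: 1 if the engine returned an accurate
    result for at least 5 of the 10 representative queries from that
    site, 0 otherwise. query_hits is a list of 10 booleans/ints."""
    return 1 if sum(query_hits) >= 5 else 0

def coverage_score(per_site_hits):
    """Total coverage score: the sum of binary per-site scores across
    all tested sites (100 in Truveo's study, so the max score is 100)."""
    return sum(score_site(hits) for hits in per_site_hits)

# Hypothetical example: an engine that answers 7 of 10 queries on one
# site but only 3 of 10 on another clears the bar on just the first site.
example = [[1]*7 + [0]*3, [1]*3 + [0]*7]
print(coverage_score(example))  # prints 1
```

Under this scheme, Truveo's reported 86 would mean it cleared the 5-of-10 bar on 86 of the 100 sites tested.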
Having reviewed the test's full methodology and spoken to a Truveo representative, I think their approach is, for the most part, pretty fair. An obvious limitation is that lots of video search engines (and web search engines like Google) weren't evaluated, so the study is by no means conclusive. Further, only premium sites were included (i.e. no UGC, and actually very little indie video either), so one wonders how the results would have changed if sites like Break.com, Heavy and others had also been tested. And then there's the small matter of YouTube, the market's 800-pound gorilla, not being included at all. Since for many users video search begins and ends with YouTube, its omission raises a question about just how reflective these results are of real-world user behavior.
Nonetheless, Truveo gets points in my book for shedding further light on a very confusing subject, and for constructing a relatively objective methodology that others can use (in fact, Truveo is encouraging independent third parties to undertake more testing of this kind).
Video search is one of the most intellectually challenging areas of the broadband video ecosystem, yet as Truveo asserts, there is surprisingly little evaluative data out there. From my standpoint, more data means more informed market participants and therefore continually improving user experiences. That benefits everyone in the broadband ecosystem.
What do you think? Post a comment now.
(Note, the complete methodology can be requested by emailing Josh Weinberg at jweinbergATtruveo.com)
Categories: Video Search