• RAMP Enables Automated Contextual Videos With Launch of MetaQ

    One of the main things content providers can do to create rich user experiences is present contextualized content that relates to the underlying article or video. This is why so many content sites have "Read More," "You Might Also Enjoy" and "Also View" type sections. They help content providers increase users' time spent (which drives monetization), build loyalty and create competitive barriers.

    As content providers' libraries of video assets have exploded, it has become virtually impossible for their editors to manually select contextual videos for a newly posted clip and then keep those selections updated. What's required is an automated system that intelligently associates archived videos with new ones based on pre-set, customized rules. Importantly, the system has to scale massively and work not only for on-demand video but for live video as well.

    That is exactly what RAMP, a video technology and search provider, is releasing today in its new "MetaQ" product. As RAMP CEO Tom Wilde demoed for me yesterday, MetaQ presents an extensive range of filters and a point-and-click rules creation process that editors use to associate and trigger related videos, as well as other desired content such as celebrity factoids, stock tickers, ads or transactional prompts. When applied, the rules create a highly robust user experience that feels dynamic because relevant, up-to-date content is constantly being added.

    MetaQ itself is enabled via RAMP's "MediaCloud" platform, which ingests, transcribes and creates metadata for video. With an archive of tagged videos to draw on, MetaQ's rules engine finds the appropriate associations and then presents the content when and how the editor has specified. RAMP's "MetaPlayer" technology, which is integrated with leading video players like Brightcove, YouTube, JW Player and others, enables the related content to be displayed as part of the viewer experience.
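RAMP hasn't published MetaQ's internals, so the following is only an illustrative sketch of the general concept: an editor-defined rule matching a new clip against an archive of tagged videos. All names here (`Video`, `Rule`, `find_related`) are hypothetical, not RAMP's actual API.

```python
# Illustrative sketch of a tag-based "related content" rules engine.
# Hypothetical names throughout; MetaQ's real implementation is proprietary.
from dataclasses import dataclass


@dataclass
class Video:
    title: str
    tags: set  # metadata tags produced at ingest (e.g., via transcription)


@dataclass
class Rule:
    required_tags: set       # editor-chosen tags a match must share
    min_overlap: int = 1     # minimum number of shared tags overall
    max_results: int = 3     # cap on related items shown to the viewer


def find_related(new_video: Video, archive: list, rule: Rule) -> list:
    """Rank archived videos by tag overlap with the new clip,
    filtered by the editor's rule."""
    scored = []
    for vid in archive:
        shared = new_video.tags & vid.tags
        # Skip candidates that don't touch the editor's required tags.
        if rule.required_tags and not (rule.required_tags & shared):
            continue
        if len(shared) >= rule.min_overlap:
            scored.append((len(shared), vid))
    scored.sort(key=lambda pair: -pair[0])  # most overlap first
    return [vid for _, vid in scored[:rule.max_results]]


archive = [
    Video("Red Sox highlights", {"red sox", "baseball", "highlights"}),
    Video("Patriots recap", {"patriots", "football"}),
]
clip = Video("Red Sox interview", {"red sox", "baseball", "interview"})
rule = Rule(required_tags={"baseball"})
related = find_related(clip, archive, rule)
```

In practice a system like this would also weigh recency, popularity and editorial priority, and re-run as new clips are ingested, which is what makes the experience feel continuously fresh.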

    RAMP has three great examples of how MetaQ's capabilities can be used. Tomorrow night's People's Choice Awards broadcast will feature a second screen app that dynamically presents information on actors, TV shows, movies, etc. as they appear (which also underscores how MetaQ can power experiences across multiple screens). Second, Better Homes and Gardens is presenting supplementary recipe and ingredient information. And third, WEEI sports radio is integrating additional audio and video by specific athlete, team and game. Tom said that early users of MetaQ have seen a 70%-200% lift in users' video consumption, which translates directly into new monetization.

    As video has become ever more strategic for content providers, sophisticated tools like MetaQ are essential for optimizing its value. Further, as viewers come to expect rich experiences across platforms, these kinds of tools can reinforce and extend the impact, regardless of which device the user opts for at any particular moment.

    With respect to the TV itself, this is the kind of thing I was alluding to in my post last week, "For Tomorrow's TVs, User Experience is More Important Than Screen Size and Resolution." MetaQ is another perfect example of how content providers are going to be able to deliver vastly enhanced and integrated experiences to TVs. In my view, this is how significant new value will be created in the living room, rather than with absurdly large screens and incrementally higher resolutions.