  • HD Cloud Launches Video Encoding Platform, Capitalizing on "Cloud Computing"

    Three significant trends are behind today's launch of HD Cloud, a new video transcoding service: the proliferation of video file formats and encoding rates, the increase in syndication activity to multiple distributors and the cost and scale benefits of "cloud computing." HD Cloud founder and CEO Nicholas Butterworth (who I have known since he ran MTV's digital operations 10 years ago) walked me through the company's plan yesterday, explaining how it benefits content providers looking to capitalize cost-effectively on broadband video's surging popularity.

    Anyone who spends a little time watching broadband video will notice variations in video formats and quality. Behind the scenes there are diverse encoding specs for how video is prepared from its source file before it is served to users. This encoding work is multiplied significantly for content providers if they also want to distribute through 3rd parties like Hulu, Netflix, Fancast, TV.com, etc., all of which have their own encoding specs. Further, these 3rd parties all have their own ways of accepting video feeds and associated metadata from content partners. Yet another driver of complexity is adaptive bit rate players like Move Networks, which automatically hop between multiple files encoded at different bit rates depending on the user's available bandwidth. Add it all up and encoding has become a labor-intensive, complicated, yet highly necessary process, as the sketch below illustrates.
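
    To make the multiplication concrete, here's a minimal sketch of how a single source file fans out into many encode jobs. The distributor specs and bit rate ladder are hypothetical stand-ins, not the actual requirements of Hulu, Netflix or any particular adaptive player.

    ```python
    # Hypothetical per-distributor encoding specs; real specs differ
    # and also cover resolution, audio, metadata, delivery method, etc.
    DISTRIBUTOR_SPECS = {
        "own_site": {"container": "mp4", "codec": "h264"},
        "hulu":     {"container": "mp4", "codec": "h264"},
        "netflix":  {"container": "wmv", "codec": "vc1"},
    }

    # Adaptive bit rate players need the same content at several rates.
    BITRATE_LADDER_KBPS = [300, 700, 1500]

    def encode_jobs(source_file):
        """Enumerate every (destination, spec, bit rate) combination."""
        for dest, spec in DISTRIBUTOR_SPECS.items():
            for kbps in BITRATE_LADDER_KBPS:
                yield {"source": source_file, "dest": dest,
                       "bitrate_kbps": kbps, **spec}

    jobs = list(encode_jobs("episode_101_master.mov"))
    print(len(jobs))  # 3 distributors x 3 bit rates = 9 encodes of one source
    ```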

    Traditionally, encoding has been done locally by content providers using solutions from enterprise-class companies like Anystream, Telestream, Digital Rapids and others. By offering encoding as a service, HD Cloud gives certain content providers an alternative to spending capex and running their own encoding farms. Content providers choose which source files are to be encoded into which formats and bit rates. They also provide HD Cloud with their credentials for distributing to authorized 3rd party sites. Once a job is configured, HD Cloud performs the encoding and 3rd party distribution. HD Cloud doesn't store or retain copies of the files, mainly for security reasons.
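
    HD Cloud's actual API wasn't shared with me, so the job configuration below is purely illustrative; every field name is an assumption, sketched from the workflow just described.

    ```python
    # A hypothetical HD Cloud job configuration (all keys invented
    # for illustration).
    job = {
        "source_url": "http://cdn.example.com/masters/episode_101.mov",
        "outputs": [
            {"format": "flv", "bitrate_kbps": 700},
            {"format": "mp4", "bitrate_kbps": 1500},
        ],
        # Credentials let HD Cloud push finished files to authorized
        # 3rd party sites on the content provider's behalf.
        "destinations": [
            {"site": "example-syndication-partner",
             "username": "provider_account", "password": "********"},
        ],
        # Per the workflow above: files are processed and passed
        # through, not stored, mainly for security reasons.
        "retain_copy": False,
    }
    ```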

    The key to making all this work is so-called "cloud computing," whereby HD Cloud, like many others, essentially rents computing capacity from services like Amazon's EC2. As new jobs come in, HD Cloud requests capacity, temporarily loads its encoding software (a combination of open source and its own custom code) and runs its jobs. When they're done, HD Cloud releases the capacity back to Amazon. It's all a little analogous to the old days of timesharing on mainframes, except with new efficiencies. HD Cloud's economics work because Amazon buys the computing capacity, operates the facilities and utilizes them at a far higher rate than HD Cloud or any other single customer could on its own.
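
    Here's a minimal sketch of that rent-encode-release pattern, written against Amazon's EC2 API via the boto3 library for illustration. The machine image, instance type and dispatch step are all placeholders, not HD Cloud's actual setup.

    ```python
    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    def dispatch_encode(instance_id, job):
        """Placeholder: ship the source file and encoding profile to
        the instance and wait for the encode to finish."""
        ...

    def run_encode_job(job):
        # 1. Request capacity from Amazon only when a job arrives.
        reservation = ec2.run_instances(
            ImageId="ami-00000000",    # hypothetical image with the
            InstanceType="c1.medium",  # encoding software preloaded
            MinCount=1,
            MaxCount=1,
        )
        instance_id = reservation["Instances"][0]["InstanceId"]
        try:
            dispatch_encode(instance_id, job)
        finally:
            # 2. Release the capacity back to Amazon when done, so
            # you only pay for the minutes actually used.
            ec2.terminate_instances(InstanceIds=[instance_id])
    ```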

    The result is that HD Cloud prices its encoding at $2/gigabyte, which Nicholas thinks will only get cheaper as bandwidth prices continue to fall. A financial model he sent along suggests that, given certain assumptions about the amount of content encoded and streamed, a content provider's ROI could be 3-4 times higher than with traditional local encoding solutions. The model also counts the avoidance of upfront capex on local encoding software and hardware, an important cost savings for many given the economy. HD Cloud is announcing Magnify.net as its first client today. Other players in this space include mPoint, Encoding.com and ON2.
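
    As a back-of-the-envelope illustration of that ROI logic: the $2/gigabyte rate comes from HD Cloud's announced pricing, but the monthly volume and the capex/opex figures for a local encoding farm below are purely hypothetical inputs, not numbers from Nicholas's model.

    ```python
    PRICE_PER_GB = 2.00  # HD Cloud's announced rate

    def cloud_cost(gb_encoded_per_month, months=12):
        # Pay-as-you-go: cost scales linearly with encoded volume.
        return PRICE_PER_GB * gb_encoded_per_month * months

    def local_cost(capex, opex_per_month, months=12):
        # Upfront software/hardware spend plus ongoing operation.
        return capex + opex_per_month * months

    # Hypothetical provider encoding 500 GB/month, vs. a $50k local
    # encoding farm costing $2k/month to run.
    print(cloud_cost(500))            # $12,000 over a year
    print(local_cost(50_000, 2_000))  # $74,000 over a year
    ```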

    Between encoding's growing complexity and syndication's appeal, content providers are going to need more extensive and cost-effective encoding solutions. Cloud computing in general, and HD Cloud and others in particular, seem well-positioned to address these needs.

    What do you think? Post a comment now.
