Web 2.0 Expo Tokyo: Eric Klinker – “Web 2.0 and content delivery”

My last report from the Web 2.0 Expo Tokyo event covers the talk by Eric Klinker, chief technology officer of BitTorrent Inc., on "the power of participation". (I had met Eric and his colleague Vincent Shortino briefly on Thursday evening.)

The market for IP video is huge: a Cisco report called "The Exabyte Era" shows that P2P, which currently accounts for 1,014 PB of traffic each month, will keep rising at a 35% year-over-year growth rate. User-contributed computing is happening right now, and it already delivers over half of all Internet traffic.

A new order of magnitude has arrived: the exabyte (EB). One exabyte is 2^60 bytes, roughly a billion gigabytes. If you wanted to build a website that delivered 1 EB per month, you would need to transfer at roughly 3.5 Tb/s (terabits per second), assuming 100% network utilisation. 1 EB corresponds to 3,507,000 months (292,000 years) of online TV streamed at 1 Mb/s, 64,944 months (5,412 years) of Blu-ray video at its maximum standard bitrate of 54 Mb/s, 351 months (29 years) of online radio traffic, 20 months (1.7 years) of YouTube traffic, and just one month of P2P traffic.
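These back-of-envelope figures are easy to check. A minimal sketch, assuming the binary definition of the exabyte (2^60 bytes) and a 30-day month, as the talk's numbers imply:

```python
# Sanity-check the exabyte arithmetic from the talk.
EXABYTE_BYTES = 2 ** 60             # ~1.15e18 bytes
SECONDS_PER_MONTH = 30 * 24 * 3600  # 2,592,000 s

# Sustained rate needed to push 1 EB in one month at 100% utilisation
rate_bps = EXABYTE_BYTES * 8 / SECONDS_PER_MONTH
print(f"{rate_bps / 1e12:.1f} Tb/s")  # ~3.6 Tb/s

def years_of_stream(bitrate_bps):
    """How many years of continuous streaming 1 EB represents."""
    seconds = EXABYTE_BYTES * 8 / bitrate_bps
    return seconds / (365 * 24 * 3600)

print(f"{years_of_stream(1e6):,.0f} years of 1 Mb/s online TV")   # ~292,000
print(f"{years_of_stream(54e6):,.0f} years of 54 Mb/s Blu-ray")   # ~5,400
```

The results land within rounding distance of the figures quoted above, which also confirms the stream rates were meant in megabits, not megabytes, per second.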

If you ran a central service and wanted to deliver 1 EB per month, you would need about 6.5 Tb/s of peak bandwidth and roughly 70,000 servers drawing 60-70 megawatts in total. At a transit price of $20 per Mb/s, the bandwidth alone would cost about $130 million per month!
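The cost figure follows directly from the quoted transit price; a one-line check, assuming the $20 is per Mb/s per month:

```python
# Bandwidth cost of serving 1 EB/month centrally, using the talk's figures.
PEAK_MBPS = 6.5e6        # 6.5 Tb/s of peak bandwidth, expressed in Mb/s
PRICE_PER_MBPS = 20      # USD per Mb/s per month (transit price from the talk)

bandwidth_cost = PEAK_MBPS * PRICE_PER_MBPS
print(f"${bandwidth_cost / 1e6:.0f}M per month")  # $130M per month
```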

The “Web 2.0” way is to use peers to deliver that exabyte. However, not every business is ready to be governed entirely by its userbase, so there is an opportunity for a hybrid model. BitTorrent offers a content-delivery network that lets Internet-based businesses use “the power of participation”: 55 major studios and 10,000 titles are now available via BitTorrent.com (using BitTorrent DNA). Also, the BitTorrent SDK allows BitTorrent capability to be added to any consumer electronics device.

He then talked about the Web 2.0 nature of distributed computing, and how it can power something that wouldn’t or couldn’t be powered otherwise. For example, Electric Sheep is a distributed-computing application in which each machine renders a single frame of a 30-second screensaver animation, which participants can then use. Social networks also command a lot of machines, but the best example of distributed computing is search. Google runs an estimated 500k to 1M servers, corresponding to $4.5B in cumulative capex (that’s capital expenditure to you and me), or 21% of their Q2 net earnings (according to Morgan Stanley). And yet search is still not a great experience today: you still have a hard time finding what you want. Search engines aren’t contextual, they don’t see the whole Internet (the “dark web”), they aren’t particularly well personalised or localised, and they aren’t dynamic enough (i.e., they cannot keep up with most Web 2.0 applications [although I’ve noticed that Google reflects new posts from my blog quite quickly]).

The best applications involve user participation, with users contributing to all aspects of the application (including infrastructure). Developers need to consider how users can do this (through contributed content, code or computing power). As Eric said, “harness the power of participation, and multiply your ability to deliver a rich and powerful application.”
