There is a widespread belief that the cost of internet bandwidth is dropping rapidly toward zero. As with many widespread beliefs, this one is promoted by those who have the most to gain from its being perceived as true, regardless of the actual facts. There is another widespread belief (fueled by the hypergrowth of YouTube, MySpace, Google Video, Yahoo Video, and the like) that all forms of video will soon be available on the internet.
I call these “belief networks.” They operate rather like startup press releases. Two startups (and not just startups) get together and issue a mutual press release. Both benefit from increased exposure and the credibility of mutual validation. My entrepreneur friend Tom Taylor calls these releases “two drunks propping each other up”. They’re OK until one gets distracted; then they both fall into the gutter.
Full disclosure: I believe that there is demand for online media on a scale that will transform the internet. And my company Itiva Digital Media is working hard to reduce the cost of bandwidth to the point that it is feasible. That said, I think that it would be useful to inject some facts into the belief network to see just how much work remains to be done.
Most commentators point to GooTube as proof of the massive scale of internet video, estimating YouTube’s download volume at 200 terabytes per day. That’s a tidy revenue source for Akamai or Limelight Networks today, but size not only matters; size is relative. Current video flows are neither large compared with the potential scale of mass video nor big enough to drive down data costs through economies of scale.
Let’s look at the real data volumes and associated costs behind the belief network. Some 111 million U.S. households have their TV sets on an average of 8.5 hours a day, according to the Television Bureau of Advertising and Nielsen Media Research. At a standard-definition stream rate of roughly 0.75 Mb/s per set, that viewing represents a data volume of about 318,000 terabytes per day, or some 1,590 times the data volume of YouTube.
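A back-of-envelope check of that figure. The per-set bitrate (about 0.75 Mb/s, roughly a standard-definition stream) is my assumption, chosen because it reproduces the number above; the household and viewing-hour figures are from the article:

```python
# Estimate daily U.S. TV data volume if every set were fed over the internet.
# Bitrate is an assumption (~0.75 Mb/s standard definition); households and
# viewing hours are from TVB / Nielsen as cited in the text.
households = 111e6        # U.S. TV households
hours_per_day = 8.5       # average daily viewing time per household
bitrate_bps = 0.75e6      # assumed per-set stream rate, bits per second

seconds_viewing = hours_per_day * 3600
bytes_per_day = households * seconds_viewing * bitrate_bps / 8
terabytes_per_day = bytes_per_day / 1e12
print(f"{terabytes_per_day:,.0f} TB/day")        # ~318,000 TB/day
print(f"{terabytes_per_day / 200:,.0f}x YouTube")  # ~1,590x the 200 TB/day estimate
```

Change the assumed bitrate and the total scales linearly; even at half this rate, the conclusion below is unchanged.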
Who is going to deliver it? Akamai? No, and here’s why. Akamai has roughly 300 Gb/s of bandwidth in its network today. Building that network cost about $329 million in raw asset and facility purchases, or roughly $1 million per Gb/s of bandwidth. Delivering the volume of data just described during the 8.5 daily viewing hours, rather than trickling it around the clock, would take about 83,000 Gb/s of bandwidth. That’s $83 billion of capital cost in the Akamai model.
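The throughput figure follows from serving the daily volume inside the viewing window. This sketch reproduces it, using the article’s own numbers; the assumption that delivery is concentrated in the 8.5 viewing hours is what pushes the requirement to 83,000 Gb/s rather than about a third of that:

```python
# Required network throughput if 318,000 TB/day must be delivered
# during the 8.5 daily viewing hours (a peak-delivery assumption).
terabytes_per_day = 318_000
bits_per_day = terabytes_per_day * 1e12 * 8
viewing_seconds = 8.5 * 3600

gbps_needed = bits_per_day / viewing_seconds / 1e9
cost_per_gbps = 1e6       # ~$1M per Gb/s, from Akamai's $329M / 300 Gb/s
capital = gbps_needed * cost_per_gbps
print(f"{gbps_needed:,.0f} Gb/s, ${capital / 1e9:,.0f}B of capital")  # ~83,000 Gb/s, ~$83B
```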
Telecom service providers themselves estimate that they will need about $28 billion in capital alone to build out their private Internet TV networks. And aside from cost, it takes a long time to deploy that much capital equipment successfully.
So how does this belief network stack up? Far from representing massive video bandwidth, YouTube and all the others are a mere dribble compared with actual mass video delivery. Overall demand for bandwidth is exploding, and on current models supply won’t keep up over the next few years, which points to higher prices, not lower.
We’ll need new technologies to resolve this. Before you cry out that “peer-to-peer” is the answer, let me note that P2P systems actually increase the infrastructure cost for ISPs to deliver a given bandwidth. The P2P dream rested on the assumption that bandwidth would cost nothing. A genuine solution will take some really complex and difficult optimization of the end-to-end chain, from the video originator to the viewer’s desktop and ultimately the family TV.
At any given time, there are sources of bandwidth available that are not currently used to capacity; and bandwidth unused is lost forever, even though it must be paid for. One example: in North America there is unused proxy capacity about equal to the total of Akamai’s bandwidth. Using it actually reduces ISP network capital and operating costs rather than increasing them. Creating an optimization system that dynamically matches each video request with the optimal set of available resources in the end-to-end network seems like a daunting task, but it can be done. I think that I’d better get back to work.
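The matching idea above can be sketched in miniature. This is an illustrative toy, not a description of Itiva’s system or any real one: the source names, capacities, and cost model are my assumptions. The core point it captures is that idle capacity already paid for (such as an unused ISP proxy) has near-zero marginal cost, so a matcher should drain it first:

```python
# Toy sketch: match a video delivery demand to the cheapest available
# bandwidth sources. All names, capacities, and costs are illustrative
# assumptions, not data about any real network.
from dataclasses import dataclass

@dataclass
class Source:
    name: str
    free_gbps: float    # unused capacity available right now
    cost_per_gb: float  # marginal cost of delivering from this source

def assign(sources, demand_gbps):
    """Greedily fill demand from the cheapest sources first."""
    plan = []
    for s in sorted(sources, key=lambda s: s.cost_per_gb):
        if demand_gbps <= 0:
            break
        take = min(s.free_gbps, demand_gbps)
        if take > 0:
            plan.append((s.name, take))
            demand_gbps -= take
    return plan, demand_gbps  # leftover demand > 0 means a shortfall

sources = [
    Source("idle ISP proxy", free_gbps=5.0, cost_per_gb=0.00),  # already paid for
    Source("CDN edge",       free_gbps=3.0, cost_per_gb=0.05),
    Source("origin server",  free_gbps=10.0, cost_per_gb=0.20),
]
plan, shortfall = assign(sources, demand_gbps=6.0)
print(plan)  # proxy covers 5 Gb/s, CDN edge covers the remaining 1
```

A real optimizer would have to do this continuously, per request, against shifting capacity, latency, and topology constraints, which is exactly why I call the task daunting.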