Solving the Friday Night Deluge



The reactions to Netflix have varied throughout the cable industry. Some operators are irritated by a competitor hijacking their broadband pipes for free. Others see Netflix as augmenting the value of their broadband product, or as a complement to their own video services.

Both can be true. On the one hand, Netflix continues to represent nearly a third of all streaming traffic during primetime (according to Sandvine), clogging pipes and getting to do it for free. At the same time, nearly three-quarters of Netflix subscribers have a pay TV subscription too (according to Cowen & Co.).

This all puts cable companies in a quandary. They’re like Charley Partanna, Jack Nicholson’s character in “Prizzi’s Honor,” who discovers that the woman he loves is a rival contract killer. What are my options, he wonders. “Do I ice her? Do I marry her?”

Surveys suggest that viewers who subscribe to both a pay TV service and Netflix watch more video overall. In other words, Netflix represents supplemental viewing, not replacement viewing. There is some concern that among those dual-subscription viewers, the ones with access to VOD are using Netflix instead, but for the most part, Netflix complements cable.

In light of that, more and more cable operators are figuring out ways to accommodate their subscribers’ desire for access to Netflix.

So: Netflix. Ice it, or marry it? Nuptials appear possible. How might that work?

In the past, any discussion of cooperation between Netflix and cable MSOs (or any Internet service provider) most frequently focused on the MSO identifying Netflix traffic through deep packet inspection (DPI) or some other means, and prioritizing that traffic. MSOs are not going to do that for free, and Netflix has no intention of paying for it. Furthermore, while technologically feasible, that approach is politically dicey, in that some view it as a violation of network neutrality principles.

So the next consideration is transparent caching. The basic idea is to place a server at the edge of service providers’ networks dedicated to handling high-traffic content from outside sources.
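As a rough illustration only – not any vendor’s actual implementation – the core logic of a transparent cache is a lookup: serve popular objects from a local store at the network edge, and fall back to the origin (populating the cache on the way through) when the object isn’t there. The class, the sample titles, and the fill policy below are hypothetical.

```python
# Illustrative sketch of transparent-cache behavior: serve from a local edge
# store on a hit, fetch from the origin (and populate the cache) on a miss.
# The origin_fetch function and the sample titles are hypothetical.

class TransparentCache:
    def __init__(self, capacity_objects=1000):
        self.store = {}                 # object key -> payload bytes
        self.capacity = capacity_objects
        self.hits = 0
        self.misses = 0

    def get(self, key, origin_fetch):
        if key in self.store:
            self.hits += 1              # served locally; no transit used
            return self.store[key]
        self.misses += 1                # must cross the operator's transit links
        payload = origin_fetch(key)
        if len(self.store) < self.capacity:
            self.store[key] = payload   # naive fill policy; real systems use LRU/popularity
        return payload


def origin_fetch(key):
    # Stand-in for a pull from the CDN or content origin.
    return b"video-bytes-for-" + key.encode()


cache = TransparentCache()
for title in ["show-a", "show-b", "show-a", "show-a"]:  # repeat requests hit the cache
    cache.get(title, origin_fetch)
print(cache.hits, cache.misses)         # 2 hits, 2 misses in this toy run
```

The payoff is that hit traffic never touches the backbone; the hard parts in practice are deciding what is cacheable, respecting content rights, and choosing an eviction policy.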

Some MSOs already do this with their own VOD systems.

But what about content the MSO cannot anticipate will be popular, or does not have rights to? Or what if it’s traffic from, say, Netflix? Netflix’s own solution is Open Connect, its own content delivery network (CDN).

Service providers have two options for joining Open Connect.

One is to connect their networks directly by peering with Netflix at common Internet exchanges, which is free. The other is to install a cache appliance that stores 100 TB of data in a 4RU chassis in or near their networks. Netflix says the latter option saves more on transit costs. Netflix declined to comment for this story.
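The transit argument is simple arithmetic: every byte served from a cache inside the operator’s network is a byte that never crosses a paid transit link or backbone. The sketch below works through a hypothetical example; the traffic volume, hit ratio, and transit price are all assumptions, not figures from Netflix or any operator.

```python
# Back-of-envelope transit savings from an embedded cache appliance.
# Every number here is a hypothetical assumption for illustration only.

monthly_netflix_traffic_tb = 500        # assumed Netflix bytes delivered to subscribers
cache_hit_ratio = 0.75                  # assumed fraction servable from the local appliance
transit_cost_per_tb = 10.0              # assumed dollars per TB of transit

offloaded_tb = monthly_netflix_traffic_tb * cache_hit_ratio
savings = offloaded_tb * transit_cost_per_tb
print(f"Offloaded: {offloaded_tb:.0f} TB, transit avoided: ${savings:,.0f}/month")
```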

Cablevision chose to join Open Connect at the beginning of this year. Other companies that have done likewise include British Telecom, Suddenlink, Telus, Bell Canada, Virgin, and Google Fiber.

Right now, with Netflix eating up such a huge chunk of network capacity while other OTT companies barely register, installing a Netflix server is probably no big deal.

But what if other services get popular? Do you install one server for each of them? Several companies are trying to forge a business out of schemes that would cache OTT traffic, as well as anything unanticipated, keeping backbones relatively clear and helping to maintain quality of service (QoS). PeerApp and Qwilt are two of them.

Qwilt has a software-based solution it is pitching directly to network operators as a means of minimizing the cost of expanding network capacity. It would then turn around and bill CDNs for doing the delivery on their behalf. Qwilt believes it has an advantage with MSOs because its software-based solution would add only a minimal amount of equipment to already crowded headends and hubs, and it is also convinced its solution scales better.

PeerApp also has a software-based approach, but it thinks the way to go is straight to the CDNs, which would ideally pay for the solution out of the proceeds of their contracts with their OTT clients and deploy the technology themselves. That deployment still requires equipment installed on the MVPDs’ premises.

Mediacom Communications became the first MSO to acknowledge having installed such a system when it announced in September that it had deployed Qwilt’s online video delivery solution.

Mediacom declined to be interviewed for this article.

Tony Lapolito manages Cisco’s CDN portfolio, labeled the Videoscape Distribution Suite (VDS), one element of which was designed for transparent caching. The crux of the matter for Cisco is content exchange.

“We believe the friction point is between the CDN providers and the MVPDs,” Lapolito said. “The MVPDs are providing access – they want to cache the content because the over the top traffic being pumped into their networks? That’s a cost center for them. Also, if there’s buffering, there’s a perception of a lack of quality. Being swamped with this stuff is a problem. Meanwhile, the people who are pumping bits into their network – an Akamai or a Limelight – can no longer tell who’s being served.”

Once the stream hits the MVPD network, the CDN loses the ability to keep track of it; meanwhile, the MVPD isn’t getting anything for the use of its network.

“Say it’s a penny a gigabyte served by a CDN,” Lapolito continued. “If it’s served by a transparent cache, the relationship should be that the MVPD reports back an accurate account of who was served and when, and maybe the CDN gives the MVPD a portion of the penny a gigabyte. The CDN can then turn around and give their customers an accurate view of the work that’s being done.”
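A minimal sketch of that settlement loop might look like the following. The penny-per-gigabyte rate comes from Lapolito’s example; the 50/50 split, the customer names, and the traffic volumes are hypothetical assumptions for illustration.

```python
# Sketch of the cache-settlement flow Lapolito describes: the MVPD's transparent
# cache logs what it served on the CDN's behalf, the CDN bills its customer at
# its normal rate, and a share flows back to the MVPD. The 50/50 split and the
# gigabyte counts are hypothetical; only the $0.01/GB rate comes from the quote.

CDN_RATE_PER_GB = 0.01      # "a penny a gigabyte served by a CDN"
MVPD_SHARE = 0.5            # assumed split, not from the article

# What the MVPD's cache reports back: (CDN customer, gigabytes served locally)
usage_report = [
    ("streaming-service-x", 40_000),
    ("software-vendor-y", 12_500),
]

for customer, gigabytes in usage_report:
    cdn_revenue = gigabytes * CDN_RATE_PER_GB
    mvpd_rebate = cdn_revenue * MVPD_SHARE
    print(f"{customer}: CDN bills ${cdn_revenue:,.2f}, "
          f"MVPD receives ${mvpd_rebate:,.2f}")
```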

Cisco and PeerApp both say they have been involved in trials of their approaches.

And while all this is going on, the big CDNs are working to make their networks more efficient.

Akamai Technologies announced in September it had deployed FastTCP broadly across its network, the Akamai Intelligent Platform, optimizing the throughput of IP content.

Average throughput measurements collected from sample regional networks for more than a week before and after the FastTCP upgrade showed significant improvements for select customers and end users, ranging from 8 percent in Japan to 105 percent in China, with North America and Europe both seeing increases of 15 percent to 22 percent, the company said.

Frank Childs is responsible for carrier marketing at Akamai. His take on transparent caching is that it is justifiable when transit costs are high, for example when the traffic is moving through a submarine cable. When transit costs aren’t so high, as in typical terrestrial IP networks, it’s a harder case to make.

And even if there were a stronger economic justification for using transparent caching for video traffic, finding someone who might actually pay for it could be tough, in Childs’ estimation. “YouTube and Netflix – their margins are already very thin. There’s no room for paying for quality of service. Netflix might have some interest, but they’ve got their Open Interconnect,” he said.

But you never know, Childs said. “We sell a transparent cache. If it goes, we want to play.”

If transparent caching catches on, it might be because the approach can be appropriate for more than just OTT video.

When new games for the PlayStation and Xbox are introduced, download traffic can escalate rapidly. Apple is notorious for not giving anyone much of a heads-up before it releases new device software updates. Videos on YouTube or Vimeo go viral – unexpectedly, by definition. Transparent cache systems can handle that traffic as well.

Another option

Conviva has a different approach that relies on integrating client software into the viewing devices. The client gathers information on how the video is playing, network conditions, and other data, and reports it all back. The Conviva analytics engine evaluates that information, then makes recommendations for a variety of measures that can affect video quality. It can recommend which bit rates are most appropriate for the requesting device, or which might be most appropriate given both the device and network conditions. “We can even direct traffic down a different path where we know there’s less congestion,” said Conviva VP of marketing Ramin Farassat. “It’s pre-emptive — we’re actually ahead of real-time. And we can do this because we have so many devices out there.”
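Conviva hasn’t published its algorithms, but the general shape of a telemetry-driven system like the one Farassat describes can be sketched: the client reports playback health and network measurements, and an analytics layer maps them to a recommended bit rate (or, potentially, an alternate delivery path). The bit rate ladder, thresholds, and report fields below are illustrative assumptions, not Conviva’s actual interface.

```python
# Illustrative sketch of client-telemetry-driven bit rate selection, in the
# spirit of what Conviva describes. The ladder, thresholds, and report fields
# are assumptions for illustration, not Conviva's actual interface.

from dataclasses import dataclass

BITRATE_LADDER_KBPS = [400, 800, 1500, 3000, 6000]   # hypothetical ABR ladder

@dataclass
class ClientReport:
    device: str
    measured_throughput_kbps: float   # recent download throughput seen by the player
    buffering_ratio: float            # fraction of session spent rebuffering
    device_max_kbps: int              # cap imposed by screen/decoder capability

def recommend_bitrate(report: ClientReport, headroom: float = 0.8) -> int:
    """Pick the highest ladder rung that fits throughput (with headroom),
    backing off further if the client has been rebuffering."""
    budget = report.measured_throughput_kbps * headroom
    if report.buffering_ratio > 0.02:      # arbitrary threshold: >2% rebuffering
        budget *= 0.7                      # be more conservative for unhealthy sessions
    budget = min(budget, report.device_max_kbps)
    candidates = [r for r in BITRATE_LADDER_KBPS if r <= budget]
    return candidates[-1] if candidates else BITRATE_LADDER_KBPS[0]

report = ClientReport("roku", measured_throughput_kbps=4200,
                      buffering_ratio=0.05, device_max_kbps=6000)
print(recommend_bitrate(report))   # conservative pick for a session that has been buffering
```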

The client can be embedded in any downloadable app, whether it’s an over-the-top distributor’s app, or a programmer’s app, or an MSO’s remote player app. The technology can work with just about any end device; it is currently being used in apps on iOS products, Android devices, the Roku box and various game consoles, for example.

Farassat said the technology is currently being used mostly by programmers (it is being used by HBO, Bloomberg, and ESPN, among others), but he said that Conviva is talking to DirecTV and BSkyB, and that several MSOs are beginning to take a look.