Transparent Caching and Its Role in the CDN Market

Includes: AKAM, LLNW, LVLT
by: Dan Rayburn

Everyone reading this blog knows that Internet traffic continues to grow, and that an increasing amount of that traffic is being driven by video. Cisco's (NASDAQ:CSCO) VNI projects Internet traffic to grow fivefold between 2009 and 2013, with video constituting 90% of overall traffic. As a result of all this video traffic, one of the biggest buzzwords being used lately by telcos, as well as by vendors selling CDN platforms for use inside a carrier network, is transparent caching. A lot of people are using the phrase these days, and as a result, a lot of confusion exists as to what it is, how it differs from regular caching, and the role it plays in the CDN industry.

Content caching technology for network optimization has been available for many years, and today there are a couple of different caching approaches in use. Originally, caching technology focused on basic web pages, moving HTML files and web objects closer to a user to improve response time. Basic web caching became less necessary as network operators grew bandwidth capacity over the last decade, and today most people are familiar with caching used to provide application acceleration and scale for CDNs like Akamai (NASDAQ:AKAM). There is also service-specific content caching that addresses particular services, such as Google's (NASDAQ:GOOG) cache.

What may (or may not) come as a surprise to some readers is the impact that all this traffic has on service provider networks. While CDNs make their money from content publishers who typically pay based on volume, network service providers' money comes from their subscribers who pay a fixed amount per month. So while CDNs (theoretically at least) stand to gain from this increase in video traffic, network service providers are stuck between the proverbial rock and hard place. They have to invest in their networks to scale to support this traffic, yet they are receiving little incremental revenue from it. Clearly, investment with no return is not a sustainable model, and service providers recognize this.

However, as video streaming and rich media downloads continue to flood operator networks, with no end in sight, network operators are evaluating and deploying transparent Internet caching inside their networks to address a broader range of Internet content. The intent is two-fold. The first is to reduce the network infrastructure and bandwidth costs associated with over the top (OTT) content and the second is to differentiate their consumer broadband service and deliver better user performance. By eliminating any potential delays associated with the Internet or even the content origin, caching allows the operator to highlight their investment being made in the access network and deliver more content at top speeds.

You may be asking, "don't CDNs help offload the service provider's network too?" The answer is, "only to a certain extent". First, CDNs only get the content so far: they do a good job distributing content regionally, but once the traffic leaves the CDN servers, it still needs to traverse the network service provider's access and edge networks. Also, CDNs don't colocate in every service provider network, so some traffic must be delivered via a peering or transit relationship. Additionally, there is a lot of content that is simply not delivered over CDNs.

Unlike traditional CDNs, which only store content based on business agreements, transparent caching can make intelligent decisions about which content can and should be cached locally to optimize the network. By deploying intelligent caches strategically throughout their networks, operators can cache and deliver popular content close to subscribers, thus reducing the amount of transit traffic across their networks. Hot content would be delivered very close to the edge, and content of decreasing popularity could be cached further back in the network.
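The popularity-based tiering described above can be sketched as a simple request counter that decides where an object belongs. This is a minimal illustration only; the thresholds, tier names, and the `PopularityTracker` class are hypothetical assumptions, not any vendor's actual implementation:

```python
from collections import Counter

# Illustrative thresholds: how many requests before content is
# promoted to a regional cache, and then to the edge.
REGIONAL_THRESHOLD = 10
EDGE_THRESHOLD = 100

class PopularityTracker:
    """Tracks per-object request counts and maps them to cache tiers."""

    def __init__(self):
        self.hits = Counter()

    def record_request(self, content_id: str) -> str:
        """Count one request and return the tier the object now belongs in."""
        self.hits[content_id] += 1
        count = self.hits[content_id]
        if count >= EDGE_THRESHOLD:
            return "edge"        # hottest content, closest to subscribers
        if count >= REGIONAL_THRESHOLD:
            return "regional"    # warm content, further back in the network
        return "origin"          # cold content, not cached at all

# After enough requests, a popular video migrates toward the edge:
tracker = PopularityTracker()
for _ in range(150):
    tier = tracker.record_request("video-123")
print(tier)  # "edge"
```

A real transparent cache would also age counts over time so that yesterday's hot content can fall back toward the origin, but the core idea is the same: placement follows measured popularity, not business agreements.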

But what is transparent caching, and how does it differ from an Akamai or Google caching service and other traditional forms of proxy or content caching? Most traditional content caching sits outside the network in a peering location, data center, or colocation space. It is usually managed by someone other than the network operator (like Akamai or Google). The network operator has little or no control over the cache servers, and as a result has little visibility into the actual productivity of those servers or what is being delivered from them.

In contrast, because transparent caching works across a much broader set of over-the-top content and traffic (as much as 75% of an operator's consumer broadband traffic is video streams and file downloads), it is embedded inside the carrier's network and provides the operator control over what to cache, when to cache, and how fast to accelerate the delivery. A transparent cache has the following characteristics:

  • Multiservice caching - Because a transparent cache needs to address as much Internet content as possible, today's platforms tend to support multiple services and protocols running across the network. The most common applications that consume the bulk of the bandwidth include HTTP Flash video, Netflix (NASDAQ:NFLX), HTTP-based file sharing, Silverlight, RTMP, and BitTorrent.
  • Automatically adapts to popular content - A transparent cache automatically ingests and serves content as it becomes popular and usually does not require operator intervention to continuously modify the network or the caching solution to support a new popular service or device.
  • Transparent to the subscriber and content origin - Transparent caching does not require modification of any system or browser settings. The performance benefits should be automatic as the only evidence of caching should be better end-to-end performance to the user. Likewise, it does not require any special HTML code or DNS redirection from a content source. Benefits should be automatic to the content origin by providing better performance and less load on origin servers.
  • Preserves all application logic - A transparent cache does not impact any application or service logic, meaning critical functions like authorization and click-thru impressions are preserved so as not to impact Internet business models. In addition, a transparent cache must be careful not to serve stale content or content that has been removed from the Internet.
  • Embedded in the network and controlled by the network operator - Because a transparent cache operates across such a broad range of traffic and protocols, it is embedded in the network and controlled by the operator. Control allows an operator to determine how fast the content should be accelerated to the end user so as not to congest other downstream points in the network. It also provides the operator with visibility in terms of caching performance and what types of content are being cached and accelerated.
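The "no stale content" requirement above amounts to the standard HTTP freshness check a cache must perform before serving a stored object. The sketch below follows the general RFC 7234 freshness rules in simplified form; the `is_fresh` function and its argument layout are assumptions for illustration, not a complete implementation (it ignores heuristics, `Age` headers, and revalidation):

```python
import time

def is_fresh(cached_headers: dict, stored_at: float, now=None) -> bool:
    """Return True if a cached HTTP response may still be served.

    cached_headers: response headers saved at cache time
    stored_at:      timestamp (seconds) when the response was cached
    """
    now = time.time() if now is None else now
    cache_control = cached_headers.get("Cache-Control", "")
    directives = [d.strip() for d in cache_control.split(",")]
    if "no-store" in directives or "no-cache" in directives:
        return False  # origin forbids serving without revalidation
    for d in directives:
        if d.startswith("max-age="):
            max_age = int(d.split("=", 1)[1])
            # Fresh only while the object's age is under max-age.
            return (now - stored_at) < max_age
    return False  # no explicit freshness info: revalidate with the origin

# A response cached 30 seconds ago with max-age=60 is still servable:
headers = {"Cache-Control": "max-age=60"}
print(is_fresh(headers, stored_at=time.time() - 30))  # True
```

Because the transparent cache sits in the data path rather than being addressed by the client, checks like this are its only safeguard against serving content the origin has expired or removed.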

It should also be noted that some vendors increase the transparency of the caching solution by not assigning it a public IP address. This has the added benefit of making the cache completely invisible to the user, with the exception of providing faster service, and it makes the transparent cache far harder to hack, bypass, or interfere with.

Many Telcos, MSOs, and Mobile Operators are now looking at transparent Internet caching as a required element in their network to control “over-the-top” content consumption and to provide the best possible end-to-end user experience. It is a unique technology that simultaneously benefits a content owner, a network operator, and most importantly, a broadband or wireless subscriber.

With all the benefits of intelligent caching, one may ask why it is only now beginning to gain traction. The reality is that caching in this manner is a very difficult technical challenge. In order to cache content, the cache must intelligently and dynamically identify and adapt to shifting content access patterns. Of course, caches must also be legal, which is achieved by following the guidelines outlined in the Digital Millennium Copyright Act (DMCA).

CDNs are good at what they do: serving content to subscribers quickly and, in the process, alleviating peering costs for service providers. But there's an enormous amount of traffic that these CDNs do not serve, and that's where the value of intelligent caching comes in. Deployed throughout a carrier network, it will improve subscribers' experience and reduce carriers' peering costs, but only if it delivers the features and intelligence required to adapt to constantly changing user behavior and content patterns and, most importantly, scales economically to tens or hundreds of gigabits per second.

Disclosure: No positions