November 26, 2020
5 min read time

Micro PoPs: Placing your PoPs for content delivery reach


Technology evolves alongside what is demanded of it. This is true for digital content delivery and, more specifically, for video streaming. As we head into the 5G content delivery era, what will content delivery -- and the traditional CDN -- look like? We draw on insights shared by CacheFly’s CTO Matt Levine in conversation with Varnish Software’s Technical Evangelist, Thijs Feryn, to peek into the near future.

 

The evolution of content delivery: The adaptability of expectations

Many of the technologies we use for video streaming have evolved along with streaming itself. We’ve moved beyond the novelty of exclaiming, “Wow! I’m watching video on my computer!” to a completely different user experience, one where buffering isn’t tolerated and various aspects of the quality of the viewing experience matter. If the video doesn’t start within two seconds, we as viewers feel our expectations are not being met. And ultimately, content delivery comes down to end-user expectations.


Yet we’re often trying to deliver content with technology that was never designed for the job, bending what we thought was possible and making the best of it. For example, HTTP was not designed for video streaming. But here we are in an HTTP video streaming world, cobbled together out of forward-looking demand rather than out of technology that existed and was waiting for a use case. The fact is, as CacheFly CTO and founder Matt Levine explained during a recent webinar conversation, there will always be user expectations we cannot meet right now, but the way those expectations keep rising shows how much, and how fast, the streaming experience has improved through incremental growth and innovation.

Most of these technologies are not future-proof; instead, they have to be responsive to the use case at hand. That is, if you’re delivering live sports video, maybe 60 frames per second is the most important factor. Maybe another video use case requires the highest possible resolution. For content delivery professionals, underlying technology choices are made based on these considerations.

 

The traditional CDN: The coming scalability challenge

There’s a lot of talk -- much of it hype -- about content delivery in the 5G era. But some of the hype is warranted. How we deliver content, specifically the traditional CDN, is poised to change beyond recognition in large part because of changing expectations. What do we mean by this? 

If we describe a standard CDN as a collection of distributed caches, then we’re defining a CDN as a caching technology -- and caching is mostly what it does. We want the traditional CDN to cache everything to serve it faster and with the highest quality of experience possible, right?

Sure, but when we consider what the future of content itself looks like, i.e., heavy, high-resolution video, the caching equation doesn’t scale. Let’s say, for example, we want to deliver 8K video from 5G cache nodes. The math behind how much sheer storage it would take to cache all of this massive content in edge nodes doesn’t add up, unless we return to the bad old days of charging a fortune for content delivery. We will not be able to cache everything in a world of exponentially growing numbers of massive files that live in an ever-growing number of places.
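
To make that math concrete, here is a rough back-of-envelope sketch. Every number in it (bitrate, catalog size, rendition count, node count) is an illustrative assumption rather than a figure from the webinar, but even modest values show how quickly “cache everything at the edge” stops adding up.

```python
# Rough back-of-envelope: what it would take to cache a full 8K library at every edge node.
# Every number below is an illustrative assumption, not a figure from the webinar.

BITRATE_MBPS = 80           # assumed average bitrate for an 8K stream
HOURS_OF_CONTENT = 10_000   # assumed size of the catalog we want cached everywhere
RENDITIONS = 5              # assumed ABR ladder, treated here as 5x the top-quality size for simplicity
EDGE_NODES = 500            # assumed number of 5G edge cache locations

bytes_per_hour = BITRATE_MBPS * 1_000_000 / 8 * 3600            # ~36 GB per hour at top quality
library_bytes = bytes_per_hour * HOURS_OF_CONTENT * RENDITIONS  # naive: cache the whole ladder
per_node_tb = library_bytes / 1e12
total_pb = per_node_tb * EDGE_NODES / 1_000

print(f"Storage per edge node: ~{per_node_tb:,.0f} TB")
print(f"Across {EDGE_NODES} edge nodes: ~{total_pb:,.0f} PB")
```

Under these assumptions, each edge node would need on the order of 1,800 TB, and the fleet as a whole close to a petabyte per node times hundreds of nodes, which is exactly the kind of cost that only the “bad old days” of pricing could absorb.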

 

Micro PoPs: Go big by going small


CDNs rely heavily on caching content physically closer to end users to reduce latency. But what will replace this standard? CacheFly -- and the industry as a whole -- sees a future for the micro PoP. A micro PoP is still a kind of point of presence, but instead of caching the actual content right at the edge where users are, we place it closer to end users in intermediary spots that offer good connectivity between the end user and wherever the content actually lives. It’s this middle-mile network that helps deliver what the user wants without having to cache the files themselves all the way at the edge.

The real-world micro PoP: Hits and misses

Right now we live in a world where we want to achieve the maximum number of cache hits, but increasingly we need to focus on misses. We want them to be as good as possible, which sounds strange. That is, as 8K video and 5G come onto the scene, cache hit ratios will inevitably decline. We could see a reduction from, say, 95% cache hits to maybe 60%. What is going to matter in the future of content delivery is how the other 40% -- the cache misses -- is handled. How can we, as Levine asked, improve those misses? The answer, he suggests, is to continue putting things physically closer to end users, even if the “thing” isn’t the content itself and the content isn’t cached at the edge.
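
One way to see why miss quality starts to dominate is a simple expected-latency model: average latency is a weighted blend of hit latency and miss latency. The latency figures below are placeholder assumptions, not CacheFly measurements, but they show that at a 95% hit ratio the miss path barely matters, while at 60% it defines the experience.

```python
# Expected response time as a weighted average of hit and miss latency.
# The latency values are placeholder assumptions, not measurements.

def expected_latency_ms(hit_ratio: float, hit_ms: float, miss_ms: float) -> float:
    """Average latency when a fraction `hit_ratio` of requests is served from cache."""
    return hit_ratio * hit_ms + (1 - hit_ratio) * miss_ms

HIT_MS = 5          # assumed: object served straight from the cache node
SLOW_MISS_MS = 200  # assumed: miss travels all the way back to a distant origin
FAST_MISS_MS = 40   # assumed: miss served via a well-connected middle mile / micro PoP

for hit_ratio in (0.95, 0.60):
    slow = expected_latency_ms(hit_ratio, HIT_MS, SLOW_MISS_MS)
    fast = expected_latency_ms(hit_ratio, HIT_MS, FAST_MISS_MS)
    print(f"hit ratio {hit_ratio:.0%}: ~{slow:.0f} ms with slow misses, "
          f"~{fast:.0f} ms with fast misses")
```

With these numbers, improving the miss path changes the average from roughly 15 ms to 7 ms at a 95% hit ratio, but from roughly 83 ms to 19 ms at 60%: the lower the hit ratio, the more the quality of the miss matters.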

The micro PoP world is more specific and granular than the traditional CDN, which brings us back to the introduction of this post: you will make decisions based on your use case rather than brute-force caching everything, because the kinds of traffic that are about to proliferate aren’t suited to the old cache-and-deliver approach. Instead, you will optimize for the most efficient use of storage and caching, and apply intelligence and logic that evaluates the content being requested to decide where it should be placed, e.g. in a full PoP somewhere or a micro PoP.
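
As a sketch of what that placement logic might look like, here is a hypothetical rule that weighs popularity against object size to pick a tier. The thresholds, field names and tiers are invented for illustration and are not CacheFly’s or Varnish’s actual logic; a production system would use far richer signals (and, as discussed below, possibly ML).

```python
# Hypothetical placement logic: decide where an object should live based on how
# often it is requested and how much space it would consume.
# Thresholds and tiers are invented for illustration.

from dataclasses import dataclass

@dataclass
class ContentStats:
    name: str
    requests_per_hour: int  # observed popularity
    size_gb: float          # storage cost of caching it

def choose_tier(item: ContentStats) -> str:
    if item.requests_per_hour > 1_000 and item.size_gb < 50:
        return "full PoP (cache right at the edge)"
    if item.requests_per_hour > 100:
        return "micro PoP (stage in the middle mile)"
    return "origin (fetch on demand)"

catalog = [
    ContentStats("live-sports-1080p60", requests_per_hour=50_000, size_gb=8),
    ContentStats("8k-documentary",      requests_per_hour=300,    size_gb=400),
    ContentStats("archive-clip",        requests_per_hour=12,     size_gb=2),
]

for item in catalog:
    print(f"{item.name}: {choose_tier(item)}")
```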

 

Stuck in the middle with you

What happens next is much less about the content and much more about connectivity. Moving content closer to end users delivers a performance win because the micro PoP forms a bridge. Levine described it as making the most of a good middle-mile network to serve the content to the micro PoP, while the micro PoP is well connected to the end user, who is 2 or 3 milliseconds away. That ensures optimal performance without having to cache all the content right where end users are. Much of this can be informed by ML and AI, which can learn more about the content and determine where to put it. Sometimes the best place is somewhere in the middle.
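
As a toy illustration of that bridge, the sketch below splits the miss path into an origin-to-micro-PoP leg over the middle mile plus the short last hop, and compares it with pulling the object straight from a distant origin. All latency figures are invented for illustration.

```python
# Toy decomposition of a cache-miss path. All latency figures are invented for illustration.

DIRECT_TO_ORIGIN_MS = 120  # assumed: user pulls the object from a distant origin over the public internet

MIDDLE_MILE_MS = 35        # assumed: origin -> micro PoP over a well-peered middle-mile network
LAST_HOP_MS = 3            # the micro PoP sits 2-3 milliseconds from the end user

via_micro_pop_ms = MIDDLE_MILE_MS + LAST_HOP_MS

print(f"Miss served directly from origin: ~{DIRECT_TO_ORIGIN_MS} ms")
print(f"Miss bridged through a micro PoP: ~{via_micro_pop_ms} ms")
```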

For a deeper dive into how micro PoPs work, watch the clip -- or the whole webinar 👇
