In our recent webinar with STL, From CDN to Edge Content Delivery: Quantifying the Big Opportunity, we received a lot of great questions from our audience about the nature of the telco edge and what edge computing actually means. The edge is still a frontier for many companies and raises plenty of questions. Here we’ll try to dispel some of that mystery by answering the most common questions we get about edge computing and Varnish’s role as edge content delivery software.
How do the biggest video-streaming platforms in the world store so much video and serve it so fast to such vast, global audiences? These questions stump even companies whose business depends on high-performance streaming.
Usually when we talk about the Massive Storage Engine (MSE), we focus on its main features and use cases. After all, storage is a hidden, unsexy necessity that powers many of the conveniences we take for granted: ubiquitous on-demand streaming services, for example. The instant accessibility of these content libraries is not magic, even if it sometimes seems like it. Our MSE discussion normally revolves around the efficiency and speed enabled by a smart storage and caching setup designed for high-performance video distribution, CDNs, and large-cache use cases.
As consumers, we expect to access any content we want on any device, on demand, with millisecond availability. We don’t usually think about what is technically required to make this work. One of the biggest components in this equation is storage, and the process of retrieving selected content from storage to serve it. With massive libraries of both in-demand and long-tail content available, the challenges of fast, available, and consistent high-performance delivery keep growing.