As consumers, we expect to access any content we want on any device, available within milliseconds and playing on demand. We don’t usually think about what is required technically to make this work. One of the biggest components in this equation is storage and the process of accessing selected content from storage to serve it. With massive libraries of both in-demand and long-tail content available, the challenges of high-performance, fast, available and consistent delivery are growing.
A quality user experience comes down to two things: availability and performance. That is, when the user presses “play” on selected content, the reaction should be immediate. Playback should then be smooth, seamless and fast, without buffering or waiting. And this needs to happen across all of the user’s devices.
We’ve discussed streaming challenges before: many factors go into making the streaming experience ideal for the end user. For the purposes of this post, however, we’ll focus on the highlights of caching, storing and retrieving your hot-set content, your long-tail content and everything in between efficiently.
Varnish's Massive Storage Engine (MSE) enables you to strategically manage your storage layer. First things first: you want as few HTTP requests as possible to reach your backend. Given the vast number of VoD requests, you’ll want to cache these responses and serve them from memory. But at that request volume, on top of the video payloads and multiple encodings, that’s a lot of data and metadata. MSE is part of what ensures that you can scale your storage out for redundancy and availability of the requested content. MSE also offers persistent storage for 100TB+ data sets, so across both planned and unplanned restarts your cached content will persist. You don’t have to wait for the cache to fill up again, saving time and ensuring a consistent user experience. This is key for business-critical content delivery that underpins video distribution, CDNs and large-cache use cases.
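As a concrete illustration, persistent MSE storage is defined in its own configuration file as “books” (metadata databases) and the “stores” (bulk object files) attached to them. The sketch below shows the general shape; the identifiers, paths and sizes are illustrative assumptions, not recommendations, and should be adapted to your hardware:

```
env: {
	id = "mse";

	books = ( {
		id = "book1";
		directory = "/var/lib/mse/book1";
		database_size = "2G";

		stores = ( {
			id = "store1";
			filename = "/var/lib/mse/book1/store1.dat";
			size = "100T";
		} );
	} );
};
```

A configuration like this is then passed to Varnish at startup via the storage argument, e.g. `varnishd -s mse,/etc/varnish/mse.conf`. Because the books and stores live on disk, the cached objects they hold survive a restart instead of being rebuilt from an empty cache.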
MSE: What does it do?
At its most basic, MSE stores objects in cache, keeps track of their metadata and the hot data set, and handles object eviction when the cache is full or when objects are forcefully purged. This ensures that the most-accessed content is available in cache and thus ready to serve at the press of a play button. MSE was designed to solve storage and content delivery challenges pertaining to large volumes of content, tackling limitations of the malloc (memory) and file stevedores that are also available in Varnish.
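To sketch how store selection can be steered per object, Varnish Enterprise exposes an `mse` VMOD that lets VCL decide where a fetched object should live. The policy and the 120-second threshold below are hypothetical illustrations, and the store name assumes the `store1` identifier from an MSE configuration file:

```
vcl 4.1;

import mse;

sub vcl_backend_response {
	if (beresp.ttl < 120s) {
		# Hypothetical policy: short-lived objects (e.g. live video
		# segments) stay in memory only, bypassing the disk stores.
		mse.set_stores("none");
	} else {
		# Long-lived objects (e.g. VoD assets) go to the persisted
		# disk store so they survive restarts.
		mse.set_stores("store1");
	}
}
```

Splitting traffic this way keeps churn-heavy live content from writing through to disk while letting the long-tail VoD library benefit from persistence.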
For example, if you are planning to add more live channels, VoD content, extend your video catch-up periods or store high-quality video (e.g., 4K), you will need to increase your cache capacity and add more nodes in the storage tier of your setup. And that’s where MSE comes in.
MSE is a key part of high-performance streaming and plays a major role in streaming architectures that rely on a two-tier (edge and storage) approach to scaling. You can read more about MSE here or watch part 1 of our three-part MSE series on our YouTube channel.
Going to IBC? Book a meeting with us to discuss your streaming and content storage needs.