We’re preparing for our December 12th live webinar, Six secrets to successful live streaming, and hope you will join us to watch and learn more. In the meantime, though, I wanted to tackle a couple of the tricky, sticky questions that pop up when I talk to companies about their streaming needs. (Read part one of this blog series for more.)
Live content streaming via commercial CDN
These days, most companies use commercial CDNs to manage and deliver content globally. Some coordination is needed to make live streaming run smoothly when a CDN is the middleman, because you don't control the CDN. We addressed this question in a previous blog and webinar.
Our upcoming webinar will dive into this topic, but for the moment, I'd like to emphasize that control has become a driving factor behind companies implementing multiple CDNs and hybrid setups. It has become increasingly important for companies serving live and VoD content to decide for themselves how, when and from where they distribute content. So we're seeing a trend toward mixing and matching private, hybrid and multiple CDNs to match each company's unique needs.
With this in mind, there are of course solutions to manage your live content streaming via your commercial CDN, or to mix and match for yourself based on what your live streaming needs actually are.
Prefetching content: Anticipating what comes next
A lot has been said about prefetching content and whether it delivers on its promise of boosting performance and making the streaming experience smoother. We're not debating the point; instead, we're introducing a solution in the form of a Varnish Plus VMOD (VMOD-http) that acts predictively to prefetch content and keep your cache warm.
What exactly does prefetching mean? Essentially, prefetching means loading data into your edge server's cache before it is requested, so it is there and ready when the client does request it. When the predictive prefetch is correct, latency drops because the client's request no longer has to wait for a round trip to and from the backend. This also reduces strain on the origin server. Then again, live video has only a limited number of segments available to prefetch, because the event is happening in real time. Bottom line: you can gain an edge, but you don't want to get too far ahead of yourself. And we'll talk about that in the webinar.
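To make the idea concrete, here is a minimal Python sketch of the prefetch logic for numbered live-video segments. This is not the VMOD-http API; every name here (`SegmentPrefetcher`, `fetch_from_origin`) is hypothetical, and the "cache" is just a dictionary standing in for the edge cache. The point is the pattern: when a client requests one segment, the edge also warms the next segment before anyone asks for it.

```python
import re


def fetch_from_origin(url):
    """Stand-in for a backend fetch; returns dummy segment data."""
    return f"<data for {url}>"


class SegmentPrefetcher:
    """Serves segments from a warm cache and predictively prefetches
    the next numbered segment(s) in the sequence."""

    SEGMENT_RE = re.compile(r"(.*segment_)(\d+)(\.ts)$")

    def __init__(self, lookahead=1):
        self.cache = {}
        self.lookahead = lookahead  # how many segments ahead to warm

    def get(self, url):
        # Cache miss: go to the origin, then remember the result.
        if url not in self.cache:
            self.cache[url] = fetch_from_origin(url)
        # Predict and warm the next segment(s) before they are requested.
        self._prefetch_next(url)
        return self.cache[url]

    def _prefetch_next(self, url):
        m = self.SEGMENT_RE.match(url)
        if not m:
            return  # not a numbered segment; nothing to predict
        prefix, number, suffix = m.groups()
        for step in range(1, self.lookahead + 1):
            # Preserve zero-padding so segment_041 predicts segment_042.
            nxt = f"{prefix}{int(number) + step:0{len(number)}d}{suffix}"
            if nxt not in self.cache:  # don't refetch what is already warm
                self.cache[nxt] = fetch_from_origin(nxt)


prefetcher = SegmentPrefetcher(lookahead=1)
prefetcher.get("/live/segment_041.ts")
# At this point /live/segment_042.ts is already warm in the cache,
# so the client's next request is served without an origin round trip.
```

A small `lookahead` matters for live content: only a few segments beyond the live edge exist at any moment, which is why the sketch warms one segment ahead rather than a long run.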
I recently wrote a blog post specifically on prefetching, where you can learn more. You can also read more about the http VMOD that enables the prefetch function.