As consumers, we expect to access any content we want, on any device, with millisecond availability, playing on demand. We don't usually think about what is required technically to make this work. One of the biggest components in this equation is storage and the process of retrieving selected content from storage to serve it. With massive libraries of both in-demand and long-tail content available, the challenge of delivering it quickly, reliably and consistently keeps growing.
Every year around this time, companies start to tell (or be told) cautionary tales about the e-commerce ghosts of Black Fridays past: recountings of calamitous failures in which major online retailers fell short of a good customer experience, or were completely unable to deliver website content, at the biggest revenue-making moment of their year. And without fail, the pattern repeats annually – both the pleas to take action against such catastrophes and the retailers who somehow decline to take heed.
With video content dominating the internet and companies of all types adopting serious streaming delivery strategies, the time is ripe for a robust, high-performance streaming solution that delivers the outcomes that matter: high volumes of video and other streaming content, at the highest possible quality and performance, for both on-demand and live video.
As you probably know, Varnish has always been a very secure piece of software, but so far that security applied only to Varnish itself: a malicious request could still pass through it and hurt your backend. Yet as a reverse proxy (load balancer, origin shield, etc.), Varnish sees everything the backend receives and sends, so there's a great opportunity here to sanitize the traffic before it ever reaches the backend.
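To make that idea concrete, here is a minimal VCL sketch of what such sanitization might look like in `vcl_recv`. The backend address and the specific rules (a naive path-traversal check and a method allowlist) are purely illustrative assumptions, not a recommended ruleset:

```vcl
vcl 4.1;

# Hypothetical backend for illustration only.
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

sub vcl_recv {
    # Reject requests whose URL contains an obvious traversal attempt.
    if (req.url ~ "\.\./") {
        return (synth(403, "Forbidden"));
    }
    # Only let through methods this (hypothetical) backend supports.
    if (req.method != "GET" && req.method != "HEAD" && req.method != "POST") {
        return (synth(405, "Method Not Allowed"));
    }
}
```

Because these checks run in `vcl_recv`, offending requests are answered directly by Varnish with a synthetic response and never consume backend resources.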