When using a traditional CDN or caching system, creating user-centric security and access policies can be a complex undertaking with real performance costs. Not only do you have to pull user data from a backend, but you then have to apply security policies based on that data to the request. VCL is an excellent fit for the latter problem of applying security policies to requests. That leaves the questions: how do you get user data (JSON) into VCL? How do you do that on a user-by-user basis? And how do you do it while keeping backend communication to an absolute minimum, or, put another way, serving as much data from cache as possible?
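To illustrate why VCL is well suited to the policy-enforcement half of the problem, here is a minimal, hypothetical sketch. It assumes some earlier step has already resolved the user's entitlements (for example, from cached per-user JSON) into a request header; the header name `X-User-Tier` and the `/premium/` path are illustrative placeholders, not part of any real deployment:

```vcl
vcl 4.1;

sub vcl_recv {
    # Assumption: an earlier lookup has populated X-User-Tier from
    # the user's (cached) profile data. Enforcing the access policy
    # in VCL is then a simple, fast comparison at the edge.
    if (req.url ~ "^/premium/" && req.http.X-User-Tier != "premium") {
        return (synth(403, "Forbidden"));
    }
}
```

The hard part, of course, is populating that header efficiently, which is exactly the caching question the rest of this discussion is about.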
In this post, I will explain how to create a highly available, self-routing sharded Varnish Cache cluster. This is similar to a standard sharded cluster with one exception: there is no dedicated routing tier. Each node in the cluster can route a request to the proper destination node by itself.
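As a rough sketch of the idea, every node can run the same VCL and use Varnish's shard director (from `vmod_directors`, available in Varnish 5.x and later) to hash each request onto the ring of nodes. The backend names and addresses below are placeholders, and the exact parameters may differ from the configuration the post develops:

```vcl
vcl 4.1;

import directors;

backend node1 { .host = "192.0.2.1"; .port = "80"; }
backend node2 { .host = "192.0.2.2"; .port = "80"; }
backend node3 { .host = "192.0.2.3"; .port = "80"; }

sub vcl_init {
    # Every node loads this same VCL, so every node can compute
    # which peer owns a given URL: no separate routing tier needed.
    new cluster = directors.shard();
    cluster.add_backend(node1);
    cluster.add_backend(node2);
    cluster.add_backend(node3);
    cluster.reconfigure();
}

sub vcl_backend_fetch {
    # On a miss, hash the URL onto the ring and fetch from the
    # owning peer (which may be this node itself).
    set bereq.backend = cluster.backend(by=URL);
}
```

Because the hash ring is consistent across nodes, a request landing on any node is forwarded at most one hop to the node that caches that object.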
Most of our customers use a content delivery network. Quite a few of them are using Akamai, and over the years we’ve helped them integrate Varnish with Akamai to ensure that the two work together as effectively as possible.
This summer Varnish Software decided to do something very ambitious. We decided to make the web a faster place. Now that statement may leave you scratching your head. Doesn't Varnish Cache already exist and hasn't it already made the web a lot faster?