November 3, 2016
3 min read time

Straight from the Varnish lab: Parallel ESI

[Animation: parallel ESI]

We’ve been working on adding parallel ESI in Varnish Plus for the last couple of months. The code is now feature-complete and waiting on more testing before being deemed production-ready.

Based on preliminary testing I can share a bit of information on how the implementation works and what benefits it will offer.

For those of you who don’t know ESI well, here is the basic principle. Many web pages share content among themselves. If you work in media, you may list the most recently published articles for easy access; if you work in e-commerce, you may want to offer your users a list of trending products. Without ESI or a similar technology in place, you’ll lose your entire cache as soon as one of these global page elements changes. With ESI, all you need to do is update that element and all your pages automatically start serving the updated content, without the need to rebuild every page.
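If you have never seen ESI markup, a fragment is pulled into the containing page with an <esi:include> tag. Here is a minimal sketch (the fragment URL is made up for illustration):

    <html>
      <body>
        <h1>Front page</h1>
        <!-- Varnish replaces this tag with the fragment's content
             at delivery time; the fragment is cached as its own object. -->
        <esi:include src="/fragments/trending-products"/>
      </body>
    </html>

Update the object behind /fragments/trending-products and every page that includes it serves the new content on its next delivery.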

One problem still remains. Unless your cache-hit ratio is very high, ESI will slow down delivery, because Varnish processes only one include at a time. So, if you have 10 includes and they are all cache misses, Varnish will fetch them one by one.

What’s new with parallel ESI?

The parallel ESI implementation does this differently. When a page is considered for delivery, Varnish immediately works out which fragments need to be fetched and fetches them all at the same time. Delivery starts, and when a fragment is needed for inclusion Varnish stops and waits for the object to arrive; if it is already there, it is delivered right away. You will see a significant difference in how much faster your page is delivered.
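For context, ESI processing itself is switched on per object in VCL, just as with serial ESI. A minimal sketch follows (the URL pattern is illustrative, and how the parallel behaviour is enabled in Varnish Plus is a configuration detail that may change before release):

    sub vcl_backend_response {
        # Ask Varnish to parse this response for <esi:include> tags.
        # Only container pages need this; fragments are ordinary objects.
        if (bereq.url ~ "^/articles/") {
            set beresp.do_esi = true;
        }
    }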


[Animations: serial ESI vs. parallel ESI]

There are obvious performance gains here. If you have 10 includes that need fetching, parallel ESI fetches them all at once, so the page is only delayed by its single slowest fragment rather than by the sum of all the fetch times. To put rough numbers on it: if each of those 10 fragments takes about 100 ms to fetch, serial ESI spends around a second on fetches alone, while parallel ESI finishes in roughly the 100 ms of the slowest one.

The parallel ESI implementation is slightly more efficient than serial ESI, most likely due to less waiting and synchronization between threads. Remember that serial ESI already uses separate threads for backend fetches.

Impact on VCL execution

The client-side VCL runs sequentially in the “master thread”. One backend thread is spawned for each fragment, and the backend VCL runs in that thread.
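One practical note: whether fragments are fetched serially or in parallel, your client-side VCL can tell fragment requests apart from the top-level request with req.esi_level. A sketch (the header name is made up for illustration):

    sub vcl_recv {
        # req.esi_level is 0 for the top-level request and greater
        # than 0 for ESI subrequests.
        if (req.esi_level > 0) {
            # For example, mark fragment requests so the backend
            # can see them.
            set req.http.X-ESI-Fragment = "true";
        }
    }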

Impact on pages with a single fragment

One not-so-obvious gain is the performance impact on pages that only require a single fragment to be fetched. Since the fetch starts right away, it might already be done by the time the fragment needs to be delivered, so delivery just continues without halting. This can have a significant speed impact on larger pages and on pages delivered to bandwidth-constrained clients, e.g., mobile clients on cellular connections.

[Animations: single include, serial vs. parallel]

In the images above you can see how delivery (the red arrow) proceeds unhindered and includes the fragment immediately, because the fetch started before page delivery began.

All in all, parallel ESI can have a significant positive performance impact on websites already using ESI. Our experimental Edgestash (edge-side Mustache processing) will also benefit greatly from the infrastructure created here. If you are looking for a new Varnish feature to improve your mobile content delivery, parallel ESI might be it.

If you are interested in helping us test parallel ESI, or you want to evaluate it once it is production-ready, please drop me an email: perbu@varnish-software.com.

Photo © 2013 Paul Albertella, used under a Creative Commons license.