Getting all the elements of a consumer-friendly, high-performance online retail experience right is a technical challenge, filled with potential pitfalls. We want to help.
Downtime is always bad, and best avoided. But sometimes you can get away with it. This is one of the magic things about content caching and being able to serve stale content: you can mask downtime (to some extent) and keep your website looking up and running at full speed, even when your backend is down.
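The mechanism can be illustrated with a toy cache that keeps entries past their freshness lifetime and falls back to them when the backend fails, the same idea behind Varnish's grace mode and the stale-if-error directive of RFC 5861. This is a minimal sketch, not a Varnish API; the class and parameter names are illustrative:

```python
import time

class StaleServingCache:
    """Serve fresh content when possible; serve stale content when the
    backend errors out, for up to `grace` seconds past the TTL."""

    def __init__(self, fetch, ttl=60, grace=3600):
        self.fetch = fetch    # callable: key -> content (raises on backend failure)
        self.ttl = ttl        # seconds an entry counts as fresh
        self.grace = grace    # extra seconds stale content may still be served
        self.store = {}       # key -> (content, stored_at)

    def get(self, key, now=None):
        now = time.time() if now is None else now
        entry = self.store.get(key)
        if entry and now - entry[1] < self.ttl:
            return entry[0]                 # fresh cache hit
        try:
            content = self.fetch(key)       # try to refresh from the backend
        except Exception:
            # Backend is down: hide the outage by serving stale content
            # as long as we are still inside the grace window.
            if entry and now - entry[1] < self.ttl + self.grace:
                return entry[0]
            raise                           # nothing usable cached: real downtime
        self.store[key] = (content, now)
        return content
```

The visitor only sees an error when the backend is down *and* the cached copy has aged out of the grace window, which is exactly the "appearance of being up" described above.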
How can you tell whether the bots visiting your website actually are who they say they are? Are they bad bots with malicious intent, or good bots crawling your site to index your content properly? It’s a given that if you want your content to appear in search results, you have to let search engines do the automated work of indexing it. Which means, of course, opening the door to the search engines’ web crawlers, including the door to premium content behind a paywall.
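One widely used answer is a double DNS lookup: resolve the visiting IP back to a hostname, check that the hostname belongs to the search engine's published domain, then resolve that hostname forward and confirm it points at the same IP. A minimal sketch in Python, with the resolver functions injectable so it can be exercised offline; the suffix list and function names are illustrative, not an official verification API:

```python
import socket

def verify_crawler(ip,
                   allowed_suffixes=(".googlebot.com", ".google.com"),
                   reverse=socket.gethostbyaddr,
                   forward=socket.gethostbyname):
    """Return True only if `ip` reverse-resolves to an allowed crawler
    domain AND that hostname forward-resolves back to the same `ip`."""
    try:
        host = reverse(ip)[0]          # reverse lookup: IP -> hostname
    except OSError:
        return False                   # no PTR record: can't be verified
    if not host.endswith(allowed_suffixes):
        return False                   # hostname isn't in a trusted domain
    try:
        return forward(host) == ip     # forward-confirm: hostname -> same IP
    except OSError:
        return False
```

The forward-confirming step matters: anyone can set a PTR record claiming to be `googlebot.com`, but only the real operator controls the forward DNS for that domain.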
Transport Layer Security (TLS) is the de facto standard for sending and receiving secure HTTP traffic. With this in mind, Varnish long ago built a standalone TLS proxy, the open-source Hitch project. It delivers secure transport and doesn’t interfere with content delivery performance, but for a number of reasons, it’s not always the right choice for every implementation.
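To make the architecture concrete, a typical Hitch deployment terminates TLS on port 443 and hands the decrypted traffic to Varnish over the PROXY protocol on a loopback port. The sketch below uses Hitch's configuration syntax, but the certificate path and backend port are illustrative and depend on how your Varnish listener is set up:

```
# hitch.conf (sketch): terminate TLS, forward plaintext to Varnish
frontend = "[*]:443"                        # accept TLS connections from clients
backend = "[127.0.0.1]:6086"                # a Varnish listener accepting PROXY protocol
pem-file = "/etc/hitch/example.com.pem"     # certificate + key bundle (illustrative path)
write-proxy-v2 = on                         # preserve the client IP via PROXY protocol v2
```

Because Hitch only shuttles bytes between the TLS connection and Varnish, the caching layer stays on the fast path; the trade-off is running and coordinating a second daemon, which is one reason it isn’t the right fit for every implementation.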