TL;DR: By redesigning aging infrastructure with Varnish Enterprise for better traffic management and speed, a global financial institution managed traffic spikes, replaced its Adobe Dispatcher caching solution, and made its Akamai CDN edges more efficient and cost-effective.
Most enterprises share at least one thing in common: massive and/or multiple websites, with tens of thousands of both static and dynamic pages. Managing the ever-increasing volume of content is a challenge even in the least complex of circumstances. But for major financial and investment institutions, content is a tangled and sprawling web of customer-facing websites and portals, non-public systems, regulatory compliance concerns, data privacy, sensitive information handling, and the concomitant layers of access management and security that come with this type of content.
Flexibility in managing this kind of content is paramount, but in the experience of one large global financial and investment institution based in Europe, it was not easy to achieve with the solutions they had in place.
Finding flexibility without Adobe Dispatcher
Managing fluctuating but always large amounts of traffic, alongside the challenges of secure, scalable content management, proved almost insurmountable with Adobe Dispatcher, the built-in caching and load balancing engine that is part of the Adobe Experience Manager (AEM) CMS. The bank faced an uphill battle with Dispatcher: no one on the team knew how to configure it, and Dispatcher is not known for being flexible or granular enough to meet custom needs in any case. Adobe Dispatcher’s standard caching and cache invalidation rules may be enough for a company with straightforward requirements, but tuning performance and defining a custom cache policy requires the flexibility to write your own rules.
This is where the bank took a look at Varnish Enterprise, which has frequently been the answer to AEM Dispatcher’s inflexibility. Varnish Enterprise plays nicely with the rest of the stack: it can replace Adobe Dispatcher outright and open the door to flexible configuration.
Redesigned for speed and efficiency
Despite some initial internal resistance to adopting another potentially complex component, the bank moved forward with Varnish, in large part because Varnish engineers worked closely with the team and supported them through the entire infrastructure redesign and implementation.
Varnish replaced Adobe Dispatcher with little friction. Varnish provided the initial configuration, which the bank’s team was able to take over once they learned the C-based Varnish Configuration Language (VCL), giving them full control over their setup and the ability to customize it. One engineer praised VCL as “easy” and stated, “We were able to replace the existing solution with Varnish without changing anything in the backend because all the logic is done on the Varnish side.”
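To give a sense of what that looks like in practice, here is a minimal VCL 4.1 sketch of the kind of Dispatcher-style logic that can live on the Varnish side: one AEM publish backend plus a custom cache policy. The backend address, URL patterns and TTLs are illustrative assumptions, not the bank’s actual configuration.

```vcl
vcl 4.1;

# Hypothetical AEM publish instance; host and port are placeholders.
backend aem_publish {
    .host = "aem-publish.internal.example";
    .port = "4503";
}

sub vcl_recv {
    # Never cache servlet or system endpoints.
    if (req.url ~ "^/(bin|system)/") {
        return (pass);
    }
    # Static assets do not need cookies; dropping them makes the objects cacheable.
    if (req.url ~ "\.(css|js|png|jpg|svg|woff2?)$") {
        unset req.http.Cookie;
    }
}

sub vcl_backend_response {
    # Example custom cache policy: one day for assets, one hour for rendered HTML.
    if (bereq.url ~ "\.(css|js|png|jpg|svg|woff2?)$") {
        set beresp.ttl = 1d;
    } elseif (beresp.http.Content-Type ~ "text/html") {
        set beresp.ttl = 1h;
    }
}
```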
Efficiency, security and redundancy gains
The bank achieved some key benefits with Varnish:
- Caching efficiency: CPU load stays below 10%
- Security: VCL enables tighter backend protection and cookie handling (see the sketch after this list)
- Redundancy: If one server fails, another Varnish server can continue to serve content in its place, which also allows routine maintenance to be performed without disrupting service
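A rough sketch of what the security item above might look like in VCL, assuming a hypothetical /my-account area as the only path that needs cookies; the paths and rules are illustrative, not the bank’s actual policy.

```vcl
sub vcl_recv {
    # Backend protection: reject methods the backend does not serve.
    if (req.method != "GET" && req.method != "HEAD" && req.method != "POST") {
        return (synth(405, "Method Not Allowed"));
    }
    # Cookie handling: only personalised paths keep their cookies;
    # everything else is served anonymously and is therefore cacheable.
    if (req.url !~ "^/my-account") {
        unset req.http.Cookie;
    }
}

sub vcl_backend_response {
    # Never let a Set-Cookie header from a cacheable page leak into the cache.
    if (bereq.url !~ "^/my-account") {
        unset beresp.http.Set-Cookie;
    }
}
```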
The next phase of Varnish: performance at the edge
The bank’s relationship with Varnish has continued to expand over its many years of using Varnish solutions. Increasing performance, i.e., improving the cache hit rate and maximizing the use of edge servers, became the goal, leading to:
- Improved cache invalidation: When the bank introduced edge-side invalidation, their edge hit rate improved dramatically. With the Cache-Control header’s s-maxage directive they could control how long edge servers keep content, e.g., the browser may only keep a page for 60 seconds while the CDN is instructed to keep the same content for 14 days (see the sketch after this list).
In another example of Varnish working in harmony with other critical pieces of software, one bank engineer explained, “Varnish was a game changer for us. We did this for most content on the website and immediately our edge hit rate skyrocketed. We were using Akamai as our CDN and by adopting the Akamai Connector for Varnish, we saw the edge hit ratio jump up to 90 to 95% using this technology compared to 50 to 65% we experienced before.”
- Cost savings: Traffic between the edge servers and the backend is billed, while traffic between the edge and clients is not, so serving more content from the edge lowers costs and reduces load and stress on the backend.
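As a concrete illustration of those split lifetimes, the following sketch sets the Cache-Control header so browsers keep a page for 60 seconds while shared caches, including the CDN edge, may keep it for 14 days. The content type test and exact values are assumptions for the example, not the bank’s configuration.

```vcl
sub vcl_backend_response {
    # Browsers revalidate after 60 seconds (max-age), while shared caches may
    # keep the object for 14 days = 14 * 24 * 3600 = 1209600 seconds (s-maxage).
    if (beresp.http.Content-Type ~ "text/html") {
        set beresp.http.Cache-Control = "public, max-age=60, s-maxage=1209600";
        # Keep Varnish's own copy for the same 14 days.
        set beresp.ttl = 14d;
    }
}
```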
Sustainable content delivery solutions for the future
Beyond the efficiency and redundancy gains described above, the bank’s tech team also realized:
- Time savings and faster time to market: The team is able to create new solutions within hours rather than days or weeks.
- Flexibility and efficiency: The flexible Varnish layer on top of Akamai eliminates duplicated effort and gives the team more flexibility and control over edge caching policy; it is also faster to restart (Varnish restarts in seconds, while Akamai takes 30+ minutes).
- Increased capacity: With Adobe Dispatcher out of the picture, server load has been reduced and performance has increased.
- Simplified configuration: Bringing everything under the Varnish umbrella means the entire configuration lives in VCL: no more duplication, and no more Adobe Dispatcher.
One bank IT employee explained, “The more familiar we become with Varnish, the more we realize we can use it for. Whether it is 404 and 500 page errors, third-party solution integration and protecting user data, or finding a quick workaround for mobile deep linking issues, Varnish continues to offer viable, sustainable solutions that will carry us into the future.”
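As one example among several of the error page handling mentioned in that quote, a minimal, hypothetical VCL sketch can intercept backend failures and Varnish-generated errors and serve a branded page instead of a raw error; the markup below is a placeholder.

```vcl
sub vcl_backend_error {
    # The backend is down or the fetch failed: synthesize a friendly page
    # instead of exposing the raw failure to visitors.
    set beresp.http.Content-Type = "text/html; charset=utf-8";
    synthetic({"<html><body><h1>We'll be right back.</h1></body></html>"});
    return (deliver);
}

sub vcl_synth {
    # Errors generated inside Varnish itself get the same treatment.
    set resp.http.Content-Type = "text/html; charset=utf-8";
    synthetic({"<html><body><h1>Something went wrong.</h1></body></html>"});
    return (deliver);
}
```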
Want to learn more? Get in touch to find out how we can help you deliver better traffic management and speed, even during peaks.