In our recent webinar with STL, From CDN to Edge Content Delivery: Quantifying the Big Opportunity, we received a lot of great questions from our audience about the nature of the telco edge and what edge computing actually means. The edge is still a frontier space for many companies and raises plenty of questions. Here we’ll try to dispel some of that mystery by answering the most common questions we get about edge computing and Varnish’s role as Edge Content Delivery Software.


Q: What is the Edge? And what is Edge Computing?

A: The “edge” in edge computing simply refers to the outermost edge of any web platform, the band of digital space that is closest to the end-user. Edge computing makes use of the edge by placing nodes and computing services in that space to better serve users and offer another layer of protection for origin servers.


In Varnish’s case, this includes caching content at the edge, as close to the user as possible, improving user experience by minimizing latency in content delivery.
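As a rough illustration of edge caching in practice, here is a minimal VCL sketch that caches successful origin responses and serves stale content while revalidating in the background. The backend hostname and the TTL/grace values are illustrative assumptions, not a recommended production policy:

```vcl
vcl 4.1;

# Hypothetical origin server behind the edge cache.
backend origin {
    .host = "origin.example.com";
    .port = "80";
}

sub vcl_backend_response {
    if (beresp.status == 200) {
        # Serve from the edge cache for 5 minutes...
        set beresp.ttl = 5m;
        # ...and keep serving stale content for up to an hour
        # while a fresh copy is fetched from the origin.
        set beresp.grace = 1h;
    }
}
```

With a policy like this, repeat requests are answered directly from the edge node, and the origin only sees the occasional background fetch.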


Q: Is Varnish Software designed for the edge?

A: Varnish is Edge Content Delivery Software, employing edge computing principles to enable efficient and secure content delivery. Varnish began as a reverse proxy caching solution, sitting in front of centralized servers and acting as a midpoint between them and their clients. In a sense, our software has been fulfilling the core edge computing objectives of network decentralization and shortened request journeys since Varnish’s very inception.


Q: Can Edge solutions like Varnish support Video on Demand as well as live streaming?

A: Yes. Edge content delivery is a highly efficient way to stream video content of all kinds, including live video and VoD. While VoD can require higher storage capacity, Varnish Enterprise’s Massive Storage Engine (MSE) is specifically designed to handle heavier content like VoD. MSE combines memory-based and persistent disk storage, and is able to keep up with VoD demands at scale with very low latency.
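To give a feel for how MSE pairs memory with disk, here is an illustrative sketch of an MSE configuration file. All IDs, paths, and sizes are made-up examples; consult the Varnish Enterprise documentation for the exact options supported by your version:

```
# Illustrative /etc/varnish/mse.conf sketch (values are examples only)
env: {
    id = "myenv";

    # In-memory cache layer for hot objects.
    memcache_size = "100GB";

    # A "book" holds metadata for one or more on-disk stores.
    books = ( {
        id = "book1";
        directory = "/var/lib/mse/book1";
        database_size = "1GB";

        # Large persistent store for heavier content such as VoD.
        stores = ( {
            id = "store1";
            filename = "/var/lib/mse/stores/store1.dat";
            size = "900GB";
        } );
    } );
};
```

A configuration like this is then passed to Varnish Enterprise as its storage backend (typically via a `-s mse,/etc/varnish/mse.conf` style argument to varnishd), letting hot objects live in memory while the long tail of VoD assets persists on disk.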


Q: How do you manage Varnish servers and nodes at the edge of the network?

A: Each Varnish server is automatically discovered, and can also be tagged. Every deployment has a specific set of tags associated with it, enabling the creation of different caching tiers that can be configured according to a range of caching customization options, such as sorting by content type or user location.


Additional tools like Varnish Controller and Varnish Traffic Router offer a more in-depth overview of all the nodes in your network, with features like node health monitoring and CDN mapping to maximize observability at the edge.


Q: If a low latency solution can be found on a centralized cloud environment, is making use of the edge still necessary?

A: Edge computing reduces latency by shortening the distance content has to travel to reach the user. Some cloud vendors have data centers located near end-users for this precise purpose, and regardless of the terminology used, that is a form of edge computing. If your network is already making use of the edge by moving resources closer to your users, that’s great; if not, you may find you can reduce latency further by exploiting the edge as a resource. Varnish solutions can run in a wide range of environments, including the edge as well as cloud, on-prem, bare metal, and VMs.


Reach out to us for more answers

While this is only a handful of the many questions we help our customers with, hopefully it helps clarify the edge and Varnish’s role as a modern edge solution. If you have any further questions, don’t hesitate to reach out to our team of experts.

From CDN to Edge Content Delivery On-Demand Webinar