How can you tell whether the bots visiting your website actually are who they say they are? Are they bad bots with malicious intent, or good bots crawling your site to index your content properly? It’s a given that if you want your content to appear in search results, you have to let search engines do the automated work of indexing it. That means opening the door to different search engines’ web crawlers, including the door to premium content behind a paywall.
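One common way to answer that question is the double-DNS check that major search engines publicly recommend: take the visitor’s IP, do a reverse DNS lookup, confirm the hostname belongs to the crawler’s stated domain, then do a forward lookup on that hostname and confirm it resolves back to the same IP. A minimal sketch in Python, using Googlebot’s published hostname suffixes as an example (the resolver functions are injectable here purely so the logic can be exercised without live DNS):

```python
import socket

# Hostname suffixes that genuine Google crawlers resolve to,
# per Google's published crawler-verification guidance.
GOOGLEBOT_SUFFIXES = (".googlebot.com", ".google.com")

def is_verified_crawler(ip, suffixes=GOOGLEBOT_SUFFIXES,
                        reverse=socket.gethostbyaddr,
                        forward=socket.gethostbyname):
    """Reverse-DNS the IP, check the hostname's domain, then
    forward-DNS the hostname and confirm it maps back to the IP."""
    try:
        hostname = reverse(ip)[0]          # e.g. crawl-66-249-66-1.googlebot.com
    except OSError:
        return False                       # no PTR record: treat as unverified
    if not hostname.endswith(suffixes):
        return False                       # hostname spoofable via User-Agent, not via DNS
    try:
        return forward(hostname) == ip     # forward-confirm to defeat fake PTR records
    except OSError:
        return False
```

The forward-confirmation step matters: anyone who controls their own reverse DNS can make an IP claim to be `something.googlebot.com`, but only Google can make that hostname resolve back to the claiming IP.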
Transport layer security (TLS) is the de facto standard for sending and receiving secure HTTP traffic. With this in mind, Varnish long ago built Hitch, a standalone open-source TLS proxy. It delivers secure transport without interfering with content delivery performance, but for a number of reasons it’s not always the right choice for every implementation.
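In a typical deployment, Hitch terminates TLS at the edge and hands decrypted traffic to Varnish over a loopback listener. A minimal configuration sketch (the paths, ports, and worker count below are illustrative, not a recommendation):

```
# /etc/hitch/hitch.conf -- terminate TLS on 443, proxy plaintext to Varnish
frontend = "[*]:443"
backend  = "[127.0.0.1]:8443"            # Varnish PROXY-protocol listener (example port)
pem-file = "/etc/hitch/example.com.pem"  # private key + certificate + chain, concatenated
workers  = 4                             # scale with available CPU cores
write-proxy-v2 = on                      # preserve the client IP via PROXY protocol v2
```

`write-proxy-v2` is what lets Varnish see the original client address rather than Hitch’s loopback address, which matters for logging and access control behind the proxy.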
Cybersecurity requires more than a single action or approach because threats exist on many levels. Some of the biggest threats, in fact, are internal: employees who aren’t fully aware of threats like phishing are a risk, and incorrect configurations within your infrastructure can bring your website down. As several high-profile CDN and DNS outages have shown, a single configuration error can knock large swaths of the internet offline. Cybersecurity isn’t just about securing web traffic through TLS or putting up a firewall, even though these are necessary fundamentals.
Authentication and authorization policies exist in companies of all types and sizes to govern access control and determine who can see and work with specific data. At the same time, people are one of the biggest security vulnerabilities in any organization. That makes granular access control one of the most important means of securing your sites, apps, and your business as a whole, particularly at a time when ransomware and other attacks are on the rise.