
A CISO’s guide to building an effective cyber budget in today’s market

How can today’s CISOs use this combination to spend on cyber more efficiently and rationally?

In 2023, cybersecurity remains a hot topic. Experts and vendors are still debating whether security budgets are at risk of the cuts seen in other tech segments, or whether spending on cyber will continue to grow. A report by YL Ventures indicates that roughly 30% of Fortune 1000 cyber budgets remained unchanged and 30% decreased, while approximately 23% were frozen and approximately 12% increased. Despite these numbers, sound cyber projects and artificial intelligence (AI) are still considered the two safe harbours in tech, leading more tech companies to consider an investor-approved combination of both. But how can today’s CISOs use this combination to spend on cyber more efficiently and rationally? The answer may lie in investing in the company’s web traffic management.

Nik Rozenberg, Co-founder and CEO of BotGuard

The vast majority of cyber spending is dictated by fear. Companies buy protection against potential threats that may or may not occur, knowing that if a threat ever materializes, the price could be prohibitively high. However, the right protection can also deliver an immediate benefit through cost savings. One example is the web traffic management provided by WAAP (web application and API protection) services. Every visitor to your public-facing website costs your company money, whether malicious or not. Malicious visitors usually cost much more per visit, but even a benign visit has its price. Let’s take a look at these expenses in detail.

The total cost of any visit consists of the uplink (traffic) cost, server resource consumption, and other business costs specific to your company. In most global locations, web traffic is very inexpensive, and you can usually ignore this component. Server resource consumption determines how many visitors you can handle and when you need to add capacity (for example, buy another server or upgrade an existing one). In a situation where up to half of your incoming traffic is made up of robots (a typical figure for e-commerce and some other industries), you may waste a significant chunk of your resources handling visitors that bring in no value. The visit depth, or number of pages visited, is on average much greater for a bot than for a human, as bots are tasked with crawling entire sites in search of content or vulnerabilities. For this reason alone, blocking unnecessary bots can lead to significant savings, as the rough estimate below shows.
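To make the arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. Every figure in it (visit volumes, bot share, visit depths, cost per million pageviews) is a hypothetical placeholder, not data from the article; substitute your own infrastructure numbers.

```python
# Back-of-the-envelope estimate of what unwanted bot traffic costs.
# Every figure below is a hypothetical placeholder: substitute your own.

human_visits_per_month = 1_000_000      # benign human visits
bot_share_of_visitors = 0.50            # up to half of visitors can be bots
human_pages_per_visit = 5               # typical human visit depth
bot_pages_per_visit = 50                # bots crawl far deeper than humans
cost_per_million_pageviews = 40.0       # USD: compute, bandwidth, ops, etc.

# If bots are 50% of all visitors, there are as many bot visits as human ones.
bot_visits = human_visits_per_month * bot_share_of_visitors / (1 - bot_share_of_visitors)

human_pageviews = human_visits_per_month * human_pages_per_visit
bot_pageviews = bot_visits * bot_pages_per_visit

total_cost = (human_pageviews + bot_pageviews) * cost_per_million_pageviews / 1e6
bot_cost = bot_pageviews * cost_per_million_pageviews / 1e6

print(f"Monthly serving cost:       ${total_cost:,.2f}")
print(f"Spent on bot traffic alone: ${bot_cost:,.2f} ({bot_cost / total_cost:.0%})")
```

With these placeholder figures, bots generate roughly 90% of the serving cost despite being only half of the visitors, purely because of their greater visit depth.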

Among the more specific business threats, some are constantly present, constituting a kind of grey zone, while others are relatively rare. The first category includes competitive intelligence gathering, the risk of losing control over proprietary content, distorted analytics, and SEO degradation. Even if you never precisely identify these grey-zone visitors, you will almost certainly encounter them in your traffic; the losses they cause are real, although harder to calculate accurately. Classic cyber threats occur less frequently, but the damage from them can be very high: a loyal customer who can’t buy a ticket because bots bought them all for resale, the owner of a media site discovering a clone of their website, or a DDoS attack that makes your site inoperable. Operating robots is big business, much of it borderline criminal. Bots are responsible for competitive analysis, data scraping (e-commerce, ChatGPT), botnets, DDoS, hacker attacks, and various kinds of fraud. But bots can also be useful and helpful for SEO (search engines, marketplaces, social networks), monitoring (webhooks and payment systems), and general automation (IFTTT, RPA).

Distinguishing between good and malicious robots is not an easy task, and none of the common methods solves it completely. Robots.txt does not work at all: malicious bots simply ignore it. CAPTCHA is limited, surmountable by many modern bots, and inherently unable to distinguish “good” from “bad” when that distinction differs between companies and contexts. Filtering traffic by IP or country at the firewall is too blunt an instrument for most businesses. Using a WAF can be likened to driving nails with a microscope: a good tool applied to the wrong job, and one that always requires a qualified expert to carefully configure and maintain the rules. Custom scripts on the web server side create implementation and support headaches, and filtering by user-agent does not work at all, since a user-agent string is whatever the client claims it is (see the sketch below). What you need is a solution specialized for web traffic management.
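Two quick illustrations of why the naive methods fall short. First, the user-agent header is self-reported, so any scraper can claim to be Googlebot in one line. Second, the accepted way to verify a crawler that claims to be a major search engine is a reverse-then-forward DNS check, which Google documents for its own crawlers. The sketch below uses only the Python standard library, with error handling trimmed for brevity; the target URL is a placeholder.

```python
import socket
import urllib.request

# 1. User-agent filtering fails because the header is self-reported:
#    any client can claim to be Googlebot.
req = urllib.request.Request(
    "https://example.com/",
    headers={"User-Agent": "Mozilla/5.0 (compatible; Googlebot/2.1; "
                           "+http://www.google.com/bot.html)"},
)
# urllib.request.urlopen(req)  # the server would log a "Googlebot" visit

# 2. Verifying a real search-engine crawler takes a reverse DNS lookup
#    followed by a forward lookup confirming the answer.
def is_verified_googlebot(ip: str) -> bool:
    try:
        # Reverse lookup: the PTR record must end in a Google crawler domain.
        hostname, _, _ = socket.gethostbyaddr(ip)
        if not hostname.endswith((".googlebot.com", ".google.com")):
            return False
        # Forward lookup: the hostname must resolve back to the same IP,
        # otherwise the PTR record itself could be spoofed.
        return ip in {info[4][0] for info in socket.getaddrinfo(hostname, None)}
    except OSError:
        return False

print(is_verified_googlebot("66.249.66.1"))  # inside a documented Googlebot range
print(is_verified_googlebot("203.0.113.7"))  # TEST-NET address: False
```

The reverse-DNS check works for the handful of well-behaved crawlers that publish it, but it does not scale to the thousands of unnamed bots in real traffic, which is exactly the gap a dedicated traffic-management layer has to fill.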

BotGuard is a user-friendly, easy-to-integrate solution that allows webmasters and site owners to decide exactly who they want to let in. Its universal tool set filters out harmful and useless web traffic, giving the user complete, granular control over incoming traffic at an affordable rate.

Web traffic management is one of the few cyber investments that pays off almost immediately. But what about AI? It is impossible to effectively analyse the behaviour of web visitors without advanced AI technology: only an AI can successfully fight another AI, especially one mimicking a human being. That is why at BotGuard we have already developed a number of AI models, mainly for analysing traffic patterns, that allow us to stay ahead of the curve.
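As a purely illustrative toy, and emphatically not BotGuard’s actual models (which the article does not detail), the sketch below shows the general shape of AI-based traffic-pattern analysis: turn each visitor session into a feature vector and let an unsupervised model flag sessions whose behaviour deviates from the human baseline. The features, distributions, and thresholds are invented for the example; it assumes numpy and scikit-learn are installed.

```python
# Toy illustration of traffic-pattern anomaly detection.
# All features and data are invented; real systems use far richer signals.
import numpy as np
from sklearn.ensemble import IsolationForest

np.random.seed(0)

# One row per visitor session:
# [pages_per_minute, avg_seconds_between_requests, fraction_of_404s]
human_sessions = np.column_stack([
    np.random.normal(2.0, 0.5, 500),    # humans read a couple of pages/min
    np.random.normal(20.0, 5.0, 500),   # with long pauses between requests
    np.random.uniform(0.0, 0.02, 500),  # and almost never hit missing pages
])

model = IsolationForest(contamination=0.01, random_state=0)
model.fit(human_sessions)

# A crawler hammering the site looks nothing like the human baseline:
crawler = np.array([[120.0, 0.5, 0.15]])  # 120 pages/min, 15% of hits are 404s
print(model.predict(crawler))             # -1 => flagged as anomalous
print(model.predict(human_sessions[:3]))  # mostly 1 => normal
```

In production such a model would run continuously on live traffic and feed its verdicts into the blocking layer, but the principle is the same: learn what human patterns look like and flag what does not fit.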