How bot traffic impacts web analytics and how to avoid it | Articles


In today's digital landscape, the significance of web analytics cannot be overstated. The insights gleaned from these metrics fuel strategic decisions, steer user experience enhancements, and drive businesses towards success. However, this data-driven journey does not come without its challenges. One of the most pervasive and often underestimated obstacles is bot traffic. Beyond merely adding to visitor counts, bot traffic can profoundly distort the accuracy of web analytics, potentially leading to misguided decisions and missed opportunities.

In this article, we delve into the intricate relationship between bot traffic and web analytics. We unravel the multifaceted impact of bots on these crucial metrics, shedding light on the implications for businesses striving for genuine user insights. From the deceptive inflation of traffic figures to the distortion of conversion rates, we uncover how bot-driven interactions can skew the very foundation of informed decision-making.


Bot traffic refers to all traffic to a website or application that originates from non-human sources. Although the term "bot traffic" is frequently associated with negative implications, the evaluation of bot traffic as positive or negative is contingent on the intentions behind the deployment of these bots.

Positive bot traffic is generated by bots that engage in valuable functions, including:

  • Web crawlers employed by search engines to index websites, ensuring your site's visibility in search results.
  • Chatbots designed to address queries and resolve customer problems, enhancing customer service and contentment.
  • Monitoring bots that oversee website analytics, providing insights into user behaviour and potential enhancements.
  • Testing bots that assess website performance, detecting and resolving issues proactively to prevent user disruptions.
  • Marketing bots that optimise display advertisements, ensuring targeted and timely ad delivery.
  • Virtual assistants that enhance productivity by automating tasks, freeing up time for higher-priority activities.

Negative bot traffic arises from bots engaged in harmful activities, including:

  • Credential stuffing, where bots attempt unauthorised access to websites using stolen login details. This technique can lead to account breaches and the theft of personal information.
  • Data scraping, in which bots illicitly gather information from websites. The harvested data might then be exploited for purposes such as undercutting prices or competitive analysis.
  • DDoS attacks, where bots inundate a website with traffic to incapacitate it. This tactic hinders legitimate users from accessing the site.
  • Scalping and denial of inventory attacks, involving bots purchasing and reselling inventory at inflated prices. Such actions obstruct genuine users from fairly acquiring products.


Bot traffic can wield significant influence over a website or application, posing several risks and challenges. Among these, it has the potential to inflate website traffic, leading to increased hosting costs and performance issues. Moreover, it can generate fake leads, thereby wasting valuable marketing investments. Additionally, bot traffic compromises the accuracy of website analytics, making it difficult to draw meaningful insights.

Furthermore, it can obstruct legitimate users' access to the website or app, causing frustration and potentially leading to user abandonment. Lastly, bot traffic can trigger data breaches and other security vulnerabilities, putting sensitive information at risk.

These reasons underscore the importance of taking action against bot traffic to safeguard the integrity and accuracy of the data collected on our website or application. 


To effectively identify and mitigate bot traffic on a website, focus on four primary strategies. First, regularly analyse traffic patterns for unusual activity, such as sudden spikes or repetitive actions. Second, examine user agent strings in server logs to identify distinctive bot signatures. Third, consider implementing CAPTCHA tests on forms and interactive elements to challenge automated bots. Finally, explore specialised bot detection services such as ClickCease or Cloudflare to strengthen your detection capabilities. Combining these approaches can significantly improve your website's defence against unwanted bot activity.
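The second strategy, examining user agent strings in server logs, can be sketched in a few lines of Python. This is a minimal illustration, not a production detector: it assumes access logs in the common Apache/Nginx "combined" format, and the signature list and sample log lines are hypothetical.

```python
import re
from collections import Counter

# Substrings commonly seen in bot user agents (illustrative, not exhaustive)
BOT_SIGNATURES = ("bot", "crawler", "spider", "scraper", "curl", "python-requests")

# Sample lines in combined log format (hypothetical data)
LOG_LINES = [
    '66.249.66.1 - - [31/May/2023:10:00:01 +0000] "GET / HTTP/1.1" 200 512 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.7 - - [31/May/2023:10:00:02 +0000] "GET /pricing HTTP/1.1" 200 1024 "-" '
    '"python-requests/2.28.1"',
    '198.51.100.4 - - [31/May/2023:10:00:03 +0000] "GET /about HTTP/1.1" 200 2048 "-" '
    '"Mozilla/5.0 (Windows NT 10.0; Win64; x64)"',
]

# Capture the client IP (first field) and the user agent (last quoted field)
LOG_PATTERN = re.compile(r'^(\S+) .*"([^"]*)"$')

def classify(line):
    """Return (ip, is_bot) for one combined-format log line."""
    match = LOG_PATTERN.match(line)
    if not match:
        return None, False
    ip, user_agent = match.group(1), match.group(2).lower()
    return ip, any(sig in user_agent for sig in BOT_SIGNATURES)

# Count requests per IP whose user agent matches a bot signature
bot_hits = Counter()
for line in LOG_LINES:
    ip, is_bot = classify(line)
    if is_bot:
        bot_hits[ip] += 1

print(dict(bot_hits))
```

Note that user agent strings are self-reported and trivially spoofed, which is one reason the other strategies (traffic-pattern analysis, CAPTCHAs, dedicated detection services) remain necessary complements.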


For Atol, the decision was made to take a stance against bot traffic by using a bot detection service, namely ClickCease. The tool was implemented in March 2023 on Atol’s Google Ads campaigns, and in early May 2023 on Atol’s Meta campaigns. The results were evaluated on 31 May 2023. The test was based on two hypotheses:

  1. The click-through rate will go down because blocking bot IP addresses decreases traffic.
  2. Although the click-through rate goes down, the conversion rate will go up, as the traffic brought to the website is of better quality.
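The arithmetic behind these two hypotheses can be made concrete with a short sketch. The figures below are hypothetical, chosen only to show the mechanism: blocking bot clicks lowers the click count (and CTR) while leaving genuine conversions untouched, so the conversion rate rises.

```python
def ctr(clicks, impressions):
    """Click-through rate: fraction of impressions that led to a click."""
    return clicks / impressions

def conversion_rate(conversions, clicks):
    """Fraction of clicks that led to a conversion."""
    return conversions / clicks

# Hypothetical figures, before and after filtering bot clicks
impressions = 100_000
clicks_before, conversions = 2_000, 40   # bot clicks inflate the click count
clicks_after = 1_700                     # bot IPs blocked; fewer, but human, clicks

print(f"CTR before: {ctr(clicks_before, impressions):.2%}")              # 2.00%
print(f"CTR after:  {ctr(clicks_after, impressions):.2%}")               # 1.70%
print(f"CVR before: {conversion_rate(conversions, clicks_before):.2%}")  # 2.00%
print(f"CVR after:  {conversion_rate(conversions, clicks_after):.2%}")   # 2.35%
```

In this toy scenario the 300 blocked clicks never converted, so the conversion rate improves even though CTR falls, which is exactly the pattern the two hypotheses predict.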

On Google, it was hard to draw a significant conclusion from the results. That said, according to ClickCease estimates, in a period of just under three months, ClickCease blocked 22,849 IP addresses on Google Ads and allowed Atol to save about $25,625.

On Meta, however, both hypotheses were verified when the results were drawn. We observed a 7% drop in clicks and a 9% drop in CTR period on period, while keeping investment stable. This is due to the limitation on clicks imposed by ClickCease. The most impressive results concerned the second hypothesis: Atol observed an 84% increase in its conversion rate period on period, which translated directly into an increase in Atol’s main conversion. The traffic brought to Atol’s website via Meta ads was therefore of much better quality than before implementing the bot detection service. According to ClickCease estimates, in a period of a bit less than one month, ClickCease blocked 308 IP addresses and allowed Atol to save about $346.


In conclusion, it is undeniable that malicious bot traffic hurts the trustworthiness of web analytics data. Although the impact varies from one business to another, it is a reality. Fortunately, there are tools and techniques that make it possible to limit the impact of bot traffic on web analytics, and using them can significantly improve the quality of the traffic brought to a website. These tools are not free, but in the long run they indirectly save money by avoiding the complications bot traffic can bring. Moreover, most detection tools offer free trials to identify whether bot traffic is present on a specific website, so they are definitely worth a try.

Xander Schepens



Semetis | Rue de l'Escaut 122, 1080 Brussels - Belgium