A Guide To Exclude Hits From Bots, Spiders & Referrals In Google Analytics


You probably came here because you want to exclude all hits from known bots and spiders.

Good news: it's pretty simple – just a few adjustments in your Google Analytics account.

I have noticed lately that many of my websites have started to receive strange traffic from various referral sources in Google Analytics.

The number of web bot hits is spiking like crazy. And it's quite annoying.

I've noticed it before too, but chose to ignore it because the amounts were small. However, it has started to become a real problem, and the volume of bot & spider traffic right now is quite severe.

I really don't want to see bots and spiders as part of my Analytics statistics.

Why am I receiving bot traffic as referrals?

These websites use bots and scripts to get your attention. They want you to buy whatever they're selling, so they're hoping curiosity will lead you to visit their sites.

This technique has been used as a marketing tactic for shady services for a while now. But it has reached a point where all that bot traffic completely disrupts your Google Analytics statistics and you can't make sense of the data anymore.

On some websites, the bulk of the traffic is “fake traffic” coming from bots and spiders trying to get my attention as a webmaster. It's annoying and it makes the numbers inaccurate.

And I'm guessing you want to get rid of those web bot hits as much as I did.


Bot filters – how to exclude bot & spider traffic in Analytics

I have managed to solve the problem very easily in two steps, which I will guide you through here.

It works fine. The only drawback is that you have to repeat it for every website – it can't be applied to all the websites in your Analytics account at once.

1. Turn on the built-in “Bot Filtering” feature

Google maintains a list of spammy services and websites that can be excluded from your Google Analytics statistics, but only if you enable it. It doesn't fix the whole problem, but a great number of bots will be eliminated. The list is called the IAB/ABC International Spiders & Bots List and is updated frequently. Still, some bots might get through.

  1. Go to Google Analytics -> Admin. Choose your account, property, and edit the view by clicking “View settings”.
  2. Check “Exclude all hits from known bots and spiders” and save.
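If you manage many views, clicking through the UI for each one gets tedious. The same checkbox can be toggled through the Google Analytics Management API (v3), where it corresponds to the `botFilteringEnabled` field on a view (profile). A minimal sketch, assuming the `google-api-python-client` package and authorized credentials; the account, property, and view IDs are placeholders:

```python
def bot_filtering_patch_body(enabled: bool = True) -> dict:
    """Request body for the profiles.patch call that toggles
    “Exclude all hits from known bots and spiders” on a view."""
    return {"botFilteringEnabled": enabled}


def enable_bot_filtering(credentials, account_id, web_property_id, profile_id):
    """Flip the Bot Filtering checkbox on one view via the Management API.

    Requires: pip install google-api-python-client, plus OAuth credentials
    authorized for the Analytics edit scope.
    """
    from googleapiclient.discovery import build

    service = build("analytics", "v3", credentials=credentials)
    return (
        service.management()
        .profiles()
        .patch(
            accountId=account_id,          # e.g. "12345678" (placeholder)
            webPropertyId=web_property_id,  # e.g. "UA-12345678-1" (placeholder)
            profileId=profile_id,           # e.g. "87654321" (placeholder)
            body=bot_filtering_patch_body(True),
        )
        .execute()
    )
```

Looping `enable_bot_filtering` over all the views in your account gets around having to do this manually per website.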

2. Add custom filters based on campaign source

Now to the second step, which is a bit more complicated, yet totally worth implementing if you want to be sure you get rid of it all.

Google Analytics has a very smart and sophisticated way of filtering out any traffic you don't want in your data. There are literally endless filter options, and many ways to get rid of unwanted traffic.

The tricky part is finding the right filters to use. After Googling and trying many different filters, I'll share what has worked best for me.

These filters cover many different URLs and websites. If one is missing, just add it yourself – and please leave a comment below so I can update these filters. You will add a total of 3 filters; you can name them filter 1, filter 2, and filter 3.

  1. Go to Google Analytics -> Admin. Choose your account, property and click “Filters” under the view.
  2. Click Add filter, choose Custom and then pick “Campaign source” as that is the one we will target.
  3. Under “Filter Pattern”, add one of the filters. Click save.
  4. Repeat until all filters are added.
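The “Filter Pattern” field accepts a regular expression, so each filter is essentially a pipe-separated list of campaign sources to exclude. A minimal sketch of how such a pattern matches, using a few well-known historical referral spammers as example entries (the actual filter lists from this post are not reproduced here):

```python
import re

# Example filter pattern in the style a GA custom filter expects:
# a pipe-separated regular expression of campaign sources to exclude.
# These domains are illustrative examples of known referral spammers.
SPAM_SOURCES = r"semalt\.com|darodar\.com|buttons-for-website\.com"


def is_spam_source(campaign_source: str) -> bool:
    """Return True if the campaign source matches the exclude pattern,
    the same way GA's “Exclude” filter would drop the hit."""
    return re.search(SPAM_SOURCES, campaign_source, re.IGNORECASE) is not None
```

So `is_spam_source("semalt.com")` returns `True` while a legitimate source like `"google"` returns `False`. Note that the dots are escaped: an unescaped `.` would match any character, making the pattern looser than intended.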

Bot Filter 1:


Bot Filter 2:


Bot Filter 3:


Why does Google allow excess web bot traffic to occur?

Good question. I don't know why they don't fix this issue themselves.

Hopefully as they get smarter, they will find ways to automatically exclude hits from bots and spiders without us having to add these filters!

Written by
Johannes Larsson

  • Why does Google allow excess bot traffic to occur?
    That’s a good question. Especially with regard to the fact that 56% of ad impressions don’t actually appear in front of human eyes (source – )
    There are too many parties who benefit from fraud and bot traffic and it’s the advertisers who really suffer the most from it, as they’re paying their money for something they wouldn’t buy if they knew it from the start.

    The real quality of bot exclusion with GA also remains a question for debate, with some claiming it to be not all that effective and time-consuming. Others suggest that fingerprinting can solve the problem. I’d like to hear your stand on the topic of fingerprinting and would definitely enjoy a post about it (might even contribute to it). Do tools like Datadome, Forensiq or really solve the problem, as they claim to?

  • Create a system, and loophole-hunters are ready to destroy it. Fix the loopholes and they come back again. The only solution is continuous vigilance, updating & learning.

  • Useful info – “Google has a list of spammy services and websites that are excluded from your statistics at Google Analytics, but only if you enable it”. Much obliged. My site is facing the same problem.

  • When it comes to money I would divide the world into 2 groups – 1. Hard-working, sincere, knowledgeable, in it for the long haul & 2. Hardly working, after quick money, always looking for ways to bend the rules. Unfortunately, today the vast majority think of the second group as smart people but do not realise their end is tragic. They may accumulate wealth in the short term only to watch it wither away, as they never did the hard work to get it. Reading this post I realize the internet too is filled with such scamsters. The post is an eye-opener to this shady area and how to prevent the damage. Thank you!

  • As I was preparing for post-covid times (now that there has been a significant decline) I got hit by bots. Will try your tips. Hopefully things will get sorted out. Thanks.