Whilst the figures reported in Google Analytics are not completely accurate, they do give you a good idea of what’s going on with your website. When it comes to traffic, they’re a great way of understanding how much you are getting, where it’s coming from and the quality of that traffic.
Bots are an increasing problem, leading to noticeable skews in data. These bots send automated traffic to your website, which is at best of no value and at worst malicious. Bot visits bump up your traffic stats, making your site look more popular than it really is, while negatively affecting many other stats.
Bots Skew Your Website Bounce & Conversion Rates
Your website bounce rate is an important performance indicator and can tell you several things. A low bounce rate can mean that people are engaged with your site and that you are attracting good quality, relevant traffic.
Conversely, a high bounce rate could mean that you have a problem with the quality of the traffic you are attracting, or that you have serious aesthetic or usability issues with your website that cause people to distrust it and leave immediately.
It is estimated that 56% of all website traffic comes from bots
Consider that for a moment. Over half of the visits to your website could be bots rather than potential customers – no wonder your conversion rates appear so low!
Bot traffic typically shows a 100% bounce rate, a visit duration of 0 and, of course, a goal conversion rate of zero – this in turn skews your overall bounce and conversion rates, making your website’s performance look worse than it really is.
How To Filter Bot Traffic Out Of Google Analytics Reports
New bots are popping up all the time, and it is incredibly difficult to reliably guard against them all. Whilst you can block some or all bots from visiting your site at all via your .htaccess file, a good way of dealing with this problem is to filter bot visits out of Google Analytics using manual filters. Alternatively, as of July 2014, Google Analytics lets you exclude all known bots from your stats with a single click.
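If you do want to block bots at the server level, the .htaccess route mentioned above might look something like the following. This is a minimal sketch using Apache’s mod_rewrite; the referrer domain is the example bot used later in this article, and the user agent strings are purely illustrative – substitute the bots you actually see in your own logs.

```apache
# Illustrative sketch: refuse requests whose referrer or user agent
# matches known spam bots. Requires mod_rewrite to be enabled.
RewriteEngine On
# Block by spam referrer (domain is the example from this article)
RewriteCond %{HTTP_REFERER} buttons-for-website\.com [NC,OR]
# Block by user agent substring (illustrative names only)
RewriteCond %{HTTP_USER_AGENT} (badbot|spambot) [NC]
# Return 403 Forbidden for any matching request
RewriteRule .* - [F,L]
```

Note that this stops bots reaching your site at all, whereas the Google Analytics methods below only stop them appearing in your reports.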
When you exclude a referral source, traffic that arrives to your site from the excluded domain doesn’t trigger a new session.
Once your chosen method is in place, you are less likely to see skewed figures in future, although be aware that your reported traffic may appear to drop once these exclusions take effect.
1. How To Exclude All Bots
- Log in to your Google Analytics account
- Go to ‘All web site data’
- Choose ‘View settings’
- Scroll to the bottom until you see the ‘Bot Filtering’ heading and a tick box
- Tick the box to ‘exclude all known bots and spiders’
- Click save
2. Filtering Out Individual Bots
- Go to ‘Admin’
- Click on ‘Filters’, under ‘All Web Site Data’
- Click ‘+New Filter’
- Check ‘Create New Filter’
- Give the filter a name e.g. “ButtonsForWebsites BOT”
- Under Filter Type, choose ‘Custom’
- Select ‘Exclude’ and choose ‘Referral’ from the ‘Filter Field’ dropdown
- In ‘Filter Pattern’, enter the domain of the bot you want to block, e.g. buttons-for-website.com
- Click save
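One thing worth knowing about the ‘Filter Pattern’ field is that Google Analytics treats it as a regular expression, so an unescaped dot matches any character and can catch domains you didn’t intend. The snippet below is a small Python sketch demonstrating the difference; the domain is the example bot from the steps above.

```python
import re

# GA filter patterns are regular expressions. An unescaped pattern like
# "buttons-for-website.com" also matches strings where the dot stands in
# for any character, e.g. "buttons-for-websiteXcom".
unescaped = "buttons-for-website.com"
escaped = re.escape("buttons-for-website.com")  # dots become literal '\.'

def matches(pattern: str, referrer: str) -> bool:
    """Return True if the pattern is found anywhere in the referrer."""
    return re.search(pattern, referrer) is not None

print(matches(unescaped, "buttons-for-websiteXcom"))  # True (too broad)
print(matches(escaped, "buttons-for-websiteXcom"))    # False
print(matches(escaped, "http://buttons-for-website.com/page"))  # True
```

In practice, escaping the dots (e.g. `buttons\-for\-website\.com`) keeps the filter from accidentally excluding legitimate sources.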
3. Create A Referral Exclusion
This is our preferred method.
- Go to ‘Admin’
- Click on ‘Tracking Info’ in the property column
- Click ‘Referral Exclusion List’
- Add a new domain you want to exclude by clicking +Add Referral Exclusion
- Enter the domain name
- Click ‘Create’ to save
Common Bot Offenders To Look Out For
Spotting bot traffic is easy: look for referrers with a consistent 100% bounce rate. Chances are it’s a bot and should be filtered out.
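If you export your referral report from Google Analytics as a CSV, you can scan it for these 100% bounce rate sources automatically. The sketch below assumes columns named “Source” and “Bounce Rate” – match these to whatever headings appear in your actual export.

```python
import csv
import io

def suspected_bots(csv_text: str) -> list:
    """Return referral sources whose bounce rate is exactly 100%.

    Assumes a CSV with "Source" and "Bounce Rate" columns; the column
    names are a guess at a typical GA export and may need adjusting.
    """
    suspects = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        rate = row["Bounce Rate"].strip().rstrip("%")
        if float(rate) == 100.0:
            suspects.append(row["Source"])
    return suspects

# Illustrative data only – the bot domain is the example from this article.
sample = """Source,Sessions,Bounce Rate
buttons-for-website.com,120,100.00%
google,950,42.10%
"""
print(suspected_bots(sample))  # ['buttons-for-website.com']
```

Anything this flags is a candidate for the referral exclusion or filter methods above – check it manually before excluding, since a small legitimate referrer can also bounce 100% of the time by chance.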