Excluding bots & crawlers

You can exclude bots & crawlers using an additional parameter

By default, our service processes bots and crawlers the same way as real visitors, as this is essential for SEO and ranking. Google, for example, states that it is critical to treat bots and real visitors the same: if a real visitor is redirected, a bot should be redirected as well.

An extended list of crawlers can be found at https://whatmyuseragent.com/bots

If you prefer to exclude bots and crawlers, we have a few suggestions. As a start, you can use whitelisting. Just as importantly, avoid 'not equal to' location rules and the 'all other locations' segment, for the reasons below (illustrated in the sketch after this list):

  • If you have an EU store, instead of redirecting everyone who is not in the EU, you should explicitly redirect the list of North American countries to the US site.

  • Googlebot comes from a blank location. If you explicitly list the countries to be redirected, the blank-location Googlebot won't match any rule and won't get redirected.

  • However, if you redirect everyone outside the EU using a country 'not equal to' EU-countries rule, the blank-location Googlebot will match that rule and get redirected.

  • The final step you can take is to use hreflang tags, as they help crawlers understand that there are multiple regional sites to crawl (a small example follows the sketch below).
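To make the difference concrete, here is a minimal TypeScript sketch of the two rule styles, assuming the visitor's country code may be blank. All names and domains in it (KNOWN_BOT_PATTERNS, redirectTargetExplicit, us.example.com, and so on) are hypothetical illustrations, not our product's actual configuration or API.

```typescript
// Hypothetical sketch — names and domains are illustrative placeholders.

// User-agent substrings for common crawlers (extend from the list at
// https://whatmyuseragent.com/bots).
const KNOWN_BOT_PATTERNS = ["googlebot", "bingbot", "duckduckbot", "yandexbot"];

// Explicit "equal to" rule: only these countries are sent to the US store.
const NORTH_AMERICAN_COUNTRIES = new Set(["US", "CA", "MX"]);

const EU_COUNTRIES = new Set(["DE", "FR", "IT", "ES", "NL" /* ... */]);

interface Visitor {
  userAgent: string;
  countryCode: string | null; // null models a "blank" location, e.g. Googlebot
}

function isKnownBot(userAgent: string): boolean {
  const ua = userAgent.toLowerCase();
  return KNOWN_BOT_PATTERNS.some((pattern) => ua.includes(pattern));
}

// Explicit whitelist approach: a blank location matches nothing, so bots stay put.
function redirectTargetExplicit(visitor: Visitor): string | null {
  if (isKnownBot(visitor.userAgent)) return null; // whitelisted: never redirect
  if (visitor.countryCode !== null && NORTH_AMERICAN_COUNTRIES.has(visitor.countryCode)) {
    return "https://us.example.com";
  }
  return null; // blank or unlisted locations are left alone
}

// "Not equal to" approach: a blank location is *not* in the EU set,
// so the rule matches and the bot gets redirected — the behaviour to avoid.
function redirectTargetNotEqual(visitor: Visitor): string | null {
  if (visitor.countryCode === null || !EU_COUNTRIES.has(visitor.countryCode)) {
    return "https://us.example.com";
  }
  return null;
}
```

For a visitor like { userAgent: "Googlebot/2.1", countryCode: null }, redirectTargetExplicit returns null (no redirect), while redirectTargetNotEqual returns the US URL — exactly the mismatch the explicit approach avoids.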

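If you are unsure what hreflang markup looks like, here is a small, hypothetical TypeScript helper that generates the <link> tags for each regional site; the domains are placeholders for your own stores.

```typescript
// Hypothetical helper: builds one hreflang <link> tag per locale/URL pair
// so crawlers know every regional variant exists. Domains are placeholders.
function hreflangTags(variants: Record<string, string>): string {
  return Object.entries(variants)
    .map(([lang, url]) => `<link rel="alternate" hreflang="${lang}" href="${url}" />`)
    .join("\n");
}

console.log(
  hreflangTags({
    "en-us": "https://us.example.com/",
    "en-gb": "https://uk.example.com/",
    "de-de": "https://de.example.com/",
    "x-default": "https://example.com/",
  })
);
```

The resulting tags belong in the <head> of every regional variant, so each crawler visit reveals the full set of sites.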
If this doesn't help, we also have an article that details Google's overall approach to geolocation redirects and how they affect SEO.

Please contact our support if something is not clear; we will be happy to help!
