Yes, bots and spiders crawling your website may process our installed script, and if they do, they will count as requests.

By default, we allow bots and crawlers to process our services, as this is essential for SEO and ranking purposes. For example, Google notes that it is important to treat bots and real visitors the same: if a real visitor is redirected, a bot should be redirected as well.

Read more about search engine crawlers, geo personalization and SEO


Excluding bots and crawlers from consuming requests
If you prefer to exclude bots and crawlers from processing your services, you can disallow them by adding an additional query parameter to the API URL.

The query parameter is 'bots=disallow', and it needs to be appended to the API URL for the particular service:

georedirect?id=-LXSCIDZcUv1Z2gUh3zd&bots=disallow


Example: Redirect script without 'bots=disallow'

<script>
var geotargetlyredirect1548825736330 = document.createElement('script');
geotargetlyredirect1548825736330.setAttribute('type','text/javascript');
geotargetlyredirect1548825736330.async = 1;
geotargetlyredirect1548825736330.setAttribute('src', '//geotargetly-1a441.appspot.com/georedirect?id=-LXSCIDZcUv1Z2gUh3zd&refurl='+document.referrer);
document.getElementsByTagName('head')[0].appendChild(geotargetlyredirect1548825736330);
</script>


Example: Redirect script with 'bots=disallow'

<script>
var geotargetlyredirect1548825736330 = document.createElement('script');
geotargetlyredirect1548825736330.setAttribute('type','text/javascript');
geotargetlyredirect1548825736330.async = 1;
geotargetlyredirect1548825736330.setAttribute('src', '//geotargetly-1a441.appspot.com/georedirect?id=-LXSCIDZcUv1Z2gUh3zd&bots=disallow&refurl='+document.referrer);
document.getElementsByTagName('head')[0].appendChild(geotargetlyredirect1548825736330);
</script>
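
If you maintain a single snippet and want to switch bot exclusion on or off from one place, a variation like the one below can be used. This is a sketch based on the two examples above; the 'excludeBots' variable is our own illustrative name and is not part of the standard Geotargetly snippet.

<script>
// Set to true to append 'bots=disallow' (exclude crawlers); set to false to keep the default behaviour.
// Note: 'excludeBots' is an illustrative variable name, not part of the standard snippet.
var excludeBots = true;

var geotargetlyredirect1548825736330 = document.createElement('script');
geotargetlyredirect1548825736330.setAttribute('type','text/javascript');
geotargetlyredirect1548825736330.async = 1;

// Build the API URL, optionally appending the 'bots=disallow' parameter before 'refurl'.
var geotargetlysrc = '//geotargetly-1a441.appspot.com/georedirect?id=-LXSCIDZcUv1Z2gUh3zd';
if (excludeBots) {
  geotargetlysrc += '&bots=disallow';
}
geotargetlysrc += '&refurl=' + document.referrer;

geotargetlyredirect1548825736330.setAttribute('src', geotargetlysrc);
document.getElementsByTagName('head')[0].appendChild(geotargetlyredirect1548825736330);
</script>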
