AI For Bot Detection
February 22, 2023

Protecting Sensitive Paths from ATO

Protecting sensitive paths is critical to maintaining your site's security. Just as every house has a weak access point, every endpoint with a login is vulnerable to Account Takeover (ATO).

Trying to manage bots with robots.txt is like running a college bar with no ID checks. You may get away with it for a while, but sooner or later it's going to lead to illegal behaviour and a bad headache.

Many of you will be all too familiar with the challenges of robots.txt.

Hackers use your Disallow directives as a calling card to go straight to the areas you want to protect. It's like signposting which flower pot the spare key to your house is hidden under.
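To see how little effort that signposting takes to exploit, here is a minimal sketch of how anyone can enumerate the paths a robots.txt file asks crawlers to avoid. The domain in the usage comment is a placeholder, not a real target.

```python
# Illustrative sketch: list every path a site's robots.txt asks crawlers
# to avoid. An attacker can do exactly this to find the "protected"
# areas the file was meant to hide.
from urllib.request import urlopen

def disallowed_paths(domain: str) -> list[str]:
    """Return every path listed in a Disallow directive."""
    with urlopen(f"https://{domain}/robots.txt") as resp:
        lines = resp.read().decode("utf-8", errors="replace").splitlines()
    paths = []
    for line in lines:
        line = line.split("#", 1)[0].strip()          # drop comments
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

# Example (the domain is a placeholder):
# print(disallowed_paths("example.com"))   # e.g. ['/admin', '/wp-login.php']
```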

It's all too easy to allow a bad bot free rein to crawl your site. Hackers write bad bots that impersonate well-known search engine user agents, and they are hard to spot. On top of that, 'good' bots change their user-agent strings and behaviour frequently, and rarely document any of the changes. A bot you allowed last year may be doing things you don't want this year.
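One standard way to catch an impersonator is the reverse-then-forward DNS check that the major search engines document for verifying their own crawlers. The sketch below assumes you already have the connecting client's IP address; the hostname suffixes shown are the ones Google publishes for Googlebot.

```python
# Minimal sketch of the reverse-then-forward DNS check for a client that
# claims a search engine user agent. A bot merely spoofing the Googlebot
# user agent from an unrelated IP address will fail this check.
import socket

def is_verified_crawler(ip: str,
                        allowed_suffixes=(".googlebot.com", ".google.com")) -> bool:
    try:
        hostname, _, _ = socket.gethostbyaddr(ip)            # reverse DNS
        if not hostname.endswith(allowed_suffixes):
            return False
        forward_ips = socket.gethostbyname_ex(hostname)[2]   # forward DNS
        return ip in forward_ips                             # must round-trip
    except OSError:
        return False

# Example: a residential IP sending a Googlebot user agent returns False.
# print(is_verified_crawler("203.0.113.7"))
```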

We scan your robots.txt and automatically match it against our database of legitimate bot services. We show you each bot's behaviour profile by category, and provide a recommendation engine so you can apply the suggestions automatically.
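To make the idea concrete, here is a simplified sketch of that kind of matching: take the User-agent tokens a robots.txt file addresses and look them up in a database of known bot services. The KNOWN_BOTS mapping is a tiny hypothetical stand-in for a real, maintained database, not the actual VerifiedVisitors engine.

```python
# Simplified sketch: audit the User-agent tokens in a robots.txt file
# against a (hypothetical) database of known bot services.
KNOWN_BOTS = {
    "googlebot": {"category": "search engine", "recommendation": "allow"},
    "bingbot":   {"category": "search engine", "recommendation": "allow"},
    "ahrefsbot": {"category": "SEO crawler",   "recommendation": "review"},
}

def audit_robots_txt(robots_txt: str) -> list[dict]:
    """Report on every User-agent token that appears in the file."""
    report = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()
        if line.lower().startswith("user-agent:"):
            token = line.split(":", 1)[1].strip().lower()
            profile = KNOWN_BOTS.get(
                token, {"category": "unknown", "recommendation": "block"})
            report.append({"user_agent": token, **profile})
    return report

sample = """
User-agent: Googlebot
Disallow: /admin
User-agent: MysteryCrawler
Disallow:
"""
print(audit_robots_txt(sample))
# The unknown MysteryCrawler comes back flagged for blocking.
```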


Domains & Paths on VerifiedVisitors

Using our Command & Control centre, shown above, you can instantly select the domains and login paths that are receiving account takeover bot traffic and simply block access to them.

You can add the specific login and admin paths you need to protect and create dynamic rules that adapt to the threats seen on each path.
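As a rough illustration of the kind of rule this means, the sketch below watches failed logins per client on a protected path and flags the client once a threshold is crossed within a sliding window. The paths, thresholds, window length and in-memory store are all assumptions made for the example, not the product's rule engine.

```python
# Illustrative sketch of a dynamic path rule: block a client that racks up
# too many failed logins on a protected path within a sliding time window.
import time
from collections import defaultdict, deque

PROTECTED_PATHS = {"/login", "/admin"}
WINDOW_SECONDS = 300      # look at the last five minutes
MAX_FAILURES = 10         # block after ten failed attempts

failures: dict[str, deque] = defaultdict(deque)

def record_failed_login(client_ip: str, path: str) -> bool:
    """Return True if the client should now be blocked on this path."""
    if path not in PROTECTED_PATHS:
        return False
    now = time.time()
    window = failures[client_ip]
    window.append(now)
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()              # drop attempts outside the window
    return len(window) >= MAX_FAILURES

# Example: the tenth rapid failure from one address trips the rule.
# for _ in range(10):
#     blocked = record_failed_login("203.0.113.7", "/login")
# print(blocked)  # True
```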

You can now finally enforce robots.txt against the vast majority of bots that don't obey it, and keep your sensitive paths hidden from the bots and the hackers for good.
