Bot Reconnaissance and Web Scanners

What is Bot Reconnaissance, and how do Web Scanners work?

A bot reconnaissance mission is the use of automated bots to systematically discover weaknesses and vulnerabilities in a target's infrastructure while remaining undetected. The mission uses web scanning bots to covertly observe the target and extract data and intelligence from weaknesses observed in its platform defences.

It’s a hard life being a hacker. You constantly have to think of new ways of scamming people. Why not be proactive? Constantly run crawler bots to check for vulnerabilities, and simply exploit the weak sites that are going to be easy. Target the vulnerable, the old and those in poor health. Crawl millions of websites and let the opportunities come to you.

Crawler or spider bots are the bot army that carries out this basic reconnaissance mission. They fake user agent strings to look like "normal" crawlers and visit each page on your site, just like regular search engines do. They are begging you to whitelist them. Once whitelisted, they can roam with impunity behind 'enemy' lines and set up all sorts of further attacks. See our bot database for more information.
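Because the user agent string is trivially faked, a crawler should never be whitelisted on the string alone. Google, for example, documents a reverse-then-forward DNS check for verifying a client claiming to be Googlebot. A minimal sketch of that check (the resolver functions are injectable here so the logic can be tested without network access):

```python
import socket

def verify_googlebot(ip_addr: str,
                     reverse=socket.gethostbyaddr,
                     forward=socket.gethostbyname_ex) -> bool:
    """Reverse-then-forward DNS check for a client claiming to be Googlebot.

    The PTR record for the IP must end in googlebot.com or google.com,
    and resolving that hostname must return the original IP address.
    """
    try:
        hostname, _, _ = reverse(ip_addr)      # reverse (PTR) lookup
    except OSError:
        return False
    if not hostname.endswith((".googlebot.com", ".google.com")):
        return False
    try:
        forward_ips = forward(hostname)[2]     # forward-confirm the hostname
    except OSError:
        return False
    return ip_addr in forward_ips
```

The same pattern works for other major crawlers that publish verification domains; anything that fails the check is an impostor wearing a search engine's user agent.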

The reconnaissance missions are analogous to the use of special reconnaissance units in the Army, who operate covert operations often behind enemy lines to build up intel for future strikes, and more recently to the use of Drones to provide specific intelligence around targets behind enemy lines.

Bot Reconnaissance Missions need to be stopped before they lead to more attacks

These types of reconnaissance bots focus on three main areas:

  1. Account login paths and other critical admin or vulnerable paths.
  2. Footprint probes to establish the technical stack.
  3. Known-vulnerability checks, looking for exploits that have worked in the past.
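All three probe types tend to leave a recognisable trail in access logs. A rough sketch of surfacing it, assuming standard combined-log-format lines and a hypothetical watchlist of commonly probed paths (extend the list for your own stack):

```python
import re
from collections import Counter

# Hypothetical watchlist of paths that scanner bots commonly probe.
PROBE_PATHS = ("/wp-login.php", "/.env", "/phpmyadmin",
               "/admin", "/.git/config", "/xmlrpc.php")

# Matches the client IP and request path in a combined-log-format line.
LOG_LINE = re.compile(r'^(\S+) \S+ \S+ \[[^\]]+\] "(?:GET|POST|HEAD) (\S+)')

def probing_ips(log_lines):
    """Count how many watchlisted paths each client IP has requested."""
    hits = Counter()
    for line in log_lines:
        m = LOG_LINE.match(line)
        if not m:
            continue
        ip, path = m.groups()
        if path.startswith(PROBE_PATHS):
            hits[ip] += 1
    return hits
```

An IP that racks up hits against several of these paths in quick succession is almost certainly a reconnaissance bot, not a lost visitor.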

Typically, these are not custom-scripted bots; they are the standard scripts run against millions of sites. Most web admins will ignore them on the assumption that the bots aren't doing any actual harm.

Many sites allow these crawler bots to systematically access all the pages on the website. Nothing happens for a while - you probably don't even notice.

However, ignoring them is not wise. If they discover any potential vulnerabilities, these bots will typically report their findings back to the hackers, who can then decide on the next course of action.

Account Login Paths 

If the account login paths are seen as vulnerable and lack sufficient protections, the next stage may well be a credential stuffing or even full-blown account takeover attack, depending on the perceived value of the target property.
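A basic defence on login paths is throttling attempts per client. A minimal sliding-window limiter, as a sketch only (the class name and limits are assumptions, and real deployments back this with a shared store such as Redis so the limit holds across web servers):

```python
import time
from collections import defaultdict, deque

class LoginRateLimiter:
    """Allow at most `limit` login attempts per `window` seconds per IP."""

    def __init__(self, limit=5, window=60.0, clock=time.monotonic):
        self.limit = limit
        self.window = window
        self.clock = clock  # injectable for testing
        self.attempts = defaultdict(deque)  # ip -> timestamps of recent attempts

    def allow(self, ip: str) -> bool:
        now = self.clock()
        q = self.attempts[ip]
        # Drop attempts that have aged out of the window.
        while q and now - q[0] > self.window:
            q.popleft()
        if len(q) >= self.limit:
            return False
        q.append(now)
        return True
```

Even a crude limiter like this makes the login path far less attractive to a scanner, because the probe itself gets slowed or blocked before it can measure anything useful.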

Footprint Probes

These footprint probing missions identify the properties of the web service or API, such as platform configuration, tech stack, and security defences, in order to map the underlying logic, structure, and methods of the security defence architecture.
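One small way to make footprint probes less fruitful is to stop advertising the stack in response headers such as `Server` and `X-Powered-By`, which many platforms emit by default. A hypothetical WSGI middleware sketch that strips them:

```python
# Headers commonly used to fingerprint the tech stack (lowercased for comparison).
FINGERPRINT_HEADERS = {"server", "x-powered-by", "x-aspnet-version", "x-generator"}

def strip_fingerprint_headers(app):
    """Wrap a WSGI app so fingerprinting headers never reach the client."""
    def wrapped(environ, start_response):
        def filtered_start(status, headers, exc_info=None):
            headers = [(k, v) for k, v in headers
                       if k.lower() not in FINGERPRINT_HEADERS]
            return start_response(status, headers, exc_info)
        return app(environ, filtered_start)
    return wrapped
```

This doesn't stop fingerprinting outright (error pages, cookie names, and response timing also leak stack details), but it removes the cheapest signals the probe is looking for.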

Known Vulnerabilities

Known exploits are usually found in older versions of software, which is why version control really matters: it ensures the stack is up to date and carries the latest patches for any known vulnerabilities. Although this is basic cybersecurity best practice and routine hygiene, the bots scan millions of sites every day, so it's just a matter of running the numbers until they find a website that is vulnerable.
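Keeping the stack patched can itself be partially automated. A toy sketch comparing installed component versions against minimum safe versions (the `MIN_SAFE` table here is invented for illustration; in practice it would be fed from an advisory source such as the OSV or NVD feeds):

```python
# Hypothetical minimum patched versions per component.
MIN_SAFE = {"wordpress": (6, 4, 3), "openssl": (3, 0, 13)}

def parse_version(s: str) -> tuple:
    """Turn '6.4.3' into (6, 4, 3) for tuple comparison."""
    return tuple(int(p) for p in s.split("."))

def vulnerable_components(installed: dict) -> list:
    """Return components running below their minimum patched version."""
    return [name for name, ver in installed.items()
            if name in MIN_SAFE and parse_version(ver) < MIN_SAFE[name]]
```

Running a check like this on every deploy flags outdated components before the scanner bots find them for you.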

Frequently Asked Questions

Can you ignore web scanning bots?

No! Ignore them at your peril. Applying zero trust to bots at the edge of the network reduces the chance of further, more targeted attacks.

Why do bots constantly hit my login paths?

The bots are trying to detect weaknesses in your login protection, or may be launching credential stuffing or account takeover attacks.