Information leakage of the web application's directory or folder path

Sooraj V Nair
Published on
04 Jul 2018
2 min read

Web spiders, robots, and crawlers retrieve web pages and recursively traverse the hyperlinks those pages contain to discover further content. Well-behaved robots follow the Robots Exclusion Protocol, whose rules are defined in the robots.txt file found in the web application's web root folder. This file lists the directories and paths that crawlers are asked to ignore. However, a spider, robot, or crawler can intentionally disregard the Disallow directives in robots.txt, and such robots can be found on many social networks. For this reason, robots.txt is not a safe way to restrict how web content is used by third parties; instead, it can hand an attacker a convenient list of paths the site owner wanted kept out of sight.
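As a sketch of how the protocol behaves, Python's standard-library `urllib.robotparser` honours these directives the way a compliant crawler would (a malicious crawler simply skips this check; the rules and URLs below are illustrative):

```python
from urllib import robotparser

# Parse an in-memory robots.txt, exactly as a compliant crawler would.
rules = robotparser.RobotFileParser()
rules.parse([
    "User-agent: *",
    "Disallow: /private",
])

# A well-behaved crawler asks before fetching; a malicious one does not.
print(rules.can_fetch("*", "https://example.com/private/report"))  # False
print(rules.can_fetch("*", "https://example.com/index.html"))      # True
```

Nothing in the protocol enforces these answers; the check is purely voluntary on the crawler's side.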

An attacker can retrieve robots.txt with a simple HTTP client such as wget (for example, `wget https://www.example.com/robots.txt`, where the hostname is whatever target is under test).


The output of such a wget request will resemble the following (the target's URLs have been elided):-

        --2018-09-01 13:13:59--  …
        Resolving … (…)... 2a03:2880:f12f:87:face:b00c:0:50fb
        Connecting to … (…)|…|:80... connected.
        HTTP request sent, awaiting response... 302 Found
        Location: … [following]
        --2018-09-01 13:13:59--  …
        Connecting to … (…)|…|:443... connected.
        HTTP request sent, awaiting response... 302 Found
        Location: … [following]
        --2018-09-01 13:13:59--  …
        Reusing existing connection to …
        HTTP request sent, awaiting response... 200 OK
        Length: unspecified [text/html]
        Saving to: ‘robot.txt’
        robot.txt               [     <=>            ] 559.13K   560KB/s    in 1.0s
        2018-09-01 13:14:01 (560 KB/s) - ‘robot.txt’ saved [572554]



The following is an example of the contents of a robots.txt file:-

        User-agent: *
        Disallow: /search
        Disallow: /sdch
        Disallow: /groups
        Disallow: /images
        Disallow: /catalogs



  • By reading robots.txt, an attacker can easily enumerate the hidden folders used by the application.
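To illustrate the point above, a short Python sketch can pull every Disallow entry out of a robots.txt body; the file content here is the example shown earlier (fetching it from a live host is left out):

```python
def disallowed_paths(robots_txt: str) -> list[str]:
    """Return every path listed in a Disallow directive."""
    paths = []
    for line in robots_txt.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if path:
                paths.append(path)
    return paths

robots_txt = """\
User-agent: *
Disallow: /search
Disallow: /sdch
Disallow: /groups
Disallow: /images
Disallow: /catalogs
"""

print(disallowed_paths(robots_txt))
# ['/search', '/sdch', '/groups', '/images', '/catalogs']
```

Each returned path is a candidate directory for the attacker to probe directly, which is exactly why robots.txt should never be treated as an access-control mechanism.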

Mitigation / Precaution

Beagle recommends the following fixes:-

  • Make sure the robots.txt file does not reveal details of the application's directory or internal folder structure.
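As one way to act on this recommendation, a small script can flag Disallow entries whose names hint at internal structure before the file is deployed; the keyword list below is illustrative, not exhaustive:

```python
# Illustrative keywords that suggest an entry leaks internal structure.
SENSITIVE_HINTS = ("admin", "backup", "config", "internal", "private", "db")

def risky_entries(robots_txt: str) -> list[str]:
    """Flag Disallow paths whose names hint at sensitive internals."""
    flagged = []
    for line in robots_txt.splitlines():
        line = line.strip()
        if line.lower().startswith("disallow:"):
            path = line.split(":", 1)[1].strip()
            if any(hint in path.lower() for hint in SENSITIVE_HINTS):
                flagged.append(path)
    return flagged

example = "User-agent: *\nDisallow: /admin/\nDisallow: /images\nDisallow: /backup.zip\n"
print(risky_entries(example))  # ['/admin/', '/backup.zip']
```

Entries that trip the check are better protected with authentication or removed from the web root entirely, rather than merely hidden from crawlers.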

Written by
Sooraj V Nair
Cyber Security Engineer