What is a Robots.txt file?

A robots.txt file gives web-crawling robots directions for how to crawl a site. This can be used to discourage the crawling, and therefore the indexing, of certain parts of your site.
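At its simplest, a robots.txt file lives at the root of the domain (e.g. example.com/robots.txt) and pairs a User-agent line with one or more Disallow rules. Here is a minimal sketch, where /private/ is a hypothetical directory used only for illustration:

    # Rules below apply to all crawlers
    User-agent: *
    # Ask crawlers not to fetch anything under /private/
    Disallow: /private/

An empty Disallow line (Disallow: with no path) tells crawlers the entire site is open to them.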


More About Robots.txt files

Robots.txt files are not strictly necessary. Without one, a spider will simply crawl the site using its default behavior. However, most sites can benefit from including a robots.txt file to help guide crawlers.

A robots.txt file can be used to encourage or dissuade the crawling of certain pages. For example, if a website has a private staging area, you wouldn’t want that page showing up in search results, so you would block it from being crawled. Robots.txt files can also be used to point crawlers toward the pages that should be crawled, or to keep poorly optimized pages from being crawled and ranked. Note that blocking a page in robots.txt discourages crawling but does not guarantee the page stays out of the index; a page can still be indexed if other sites link to it.
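As a sketch of how those directives look in practice, the file below blocks a hypothetical /staging/ directory, carves out one page within it, and points crawlers at a sitemap. All of the paths shown are illustrative assumptions, not values from this article:

    # Rules below apply to all crawlers
    User-agent: *
    # Discourage crawling of the private staging area
    Disallow: /staging/
    # Carve out one page inside the blocked directory
    # (Allow is honored by most major crawlers, including Googlebot)
    Allow: /staging/public-preview.html
    # Help crawlers find the pages that should be crawled
    Sitemap: https://www.example.com/sitemap.xml

The Sitemap directive complements the Allow/Disallow rules: Disallow keeps crawlers away from pages you want hidden, while the sitemap lists the pages you want crawled.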
