Why is robots.txt important for your SEO?

The robots.txt file is used to tell search engine spiders which pages or sections of your site they can and cannot crawl.

This prevents crawlers from overloading your site with unnecessary requests and keeps duplicate or low-quality content from hurting your SEO.

The robots.txt file is therefore important for optimizing how your site is crawled and for highlighting your most relevant pages.


What are the instructions in a robots.txt file?

The most common instructions are Allow, Disallow and User-agent.

User-agent specifies which bot the following instructions apply to.

Allow tells a robot that it can access a directory or page of your website.

Disallow prevents robots from visiting that part of your website.

You can also use other instructions such as Crawl-delay (which defines a delay between requests) or Sitemap (which specifies the location of your sitemap).
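For example, a robots.txt combining these instructions might look like the snippet below; the /private/ directory, the page name and the sitemap URL are placeholders to adapt to your own site:

    # Applies to all bots
    User-agent: *
    # Block the /private/ directory...
    Disallow: /private/
    # ...but allow one specific page inside it
    Allow: /private/public-page.html
    # Ask bots to wait 10 seconds between requests (not honored by every crawler)
    Crawl-delay: 10

    # Location of the XML sitemap
    Sitemap: https://www.example.com/sitemap.xml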

Where can I find a template robots.txt file for my website?

Here is a list of default robots.txt templates:

Default website robots.txt (allow all)

    User-agent: *
    Allow: /

This robots.txt tells all bots and search engines that they can visit every page of the website.


Default WordPress robots.txt

    User-agent: *
    # Block the WordPress admin area...
    Disallow: /wp-admin/
    # ...but keep admin-ajax.php accessible, since front-end features rely on it
    Allow: /wp-admin/admin-ajax.php

    Sitemap: {url of your sitemap.xml}

How can I test my robots.txt?

Several free online tools let you test your robots.txt instructions, and you can also check it programmatically, as in the sketch below.
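Here is a minimal sketch using Python's standard-library urllib.robotparser; the example.com URL, the user agent and the path are placeholders to replace with your own values:

    from urllib.robotparser import RobotFileParser

    # Point the parser at your own site's robots.txt (example.com is a placeholder)
    parser = RobotFileParser()
    parser.set_url("https://www.example.com/robots.txt")
    parser.read()

    # True if the given user agent is allowed to crawl the given URL
    print(parser.can_fetch("*", "https://www.example.com/wp-admin/"))

    # Crawl-delay declared for this user agent, if any (None when absent)
    print(parser.crawl_delay("*"))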
