The robots.txt file is used to tell search engine spiders which pages or sections of your site they can and cannot crawl.
This prevents your site from being overloaded with unnecessary requests and keeps your SEO from being penalized by exposing duplicate or low-quality content to crawlers.
The robots.txt file is therefore important for optimizing your site's crawl and highlighting your most relevant pages.
The most common instructions are Allow, Disallow and User-agent.
User-agent specifies which bot the following rules apply to.
Allow tells a bot that it may access a given directory or page of your website.
Disallow prevents bots from visiting the specified parts of your website.
You can also use other instructions such as Crawl-delay (to set a delay between requests) or Sitemap (to specify the location of your sitemap).
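To see how a crawler interprets these instructions, here is a minimal sketch using Python's built-in urllib.robotparser module; the domain, paths and values are placeholders chosen for illustration, not rules you need to copy.

import urllib.robotparser

# A sample robots.txt combining the instructions described above.
# The domain, paths and values are placeholders for illustration only.
rules = """
User-agent: *
Allow: /private/public-report.html
Disallow: /private/
Crawl-delay: 10
Sitemap: https://www.example.com/sitemap.xml
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Ask how a generic bot ("*") should treat a few URLs.
print(parser.can_fetch("*", "https://www.example.com/private/"))                    # False: blocked by Disallow
print(parser.can_fetch("*", "https://www.example.com/private/public-report.html"))  # True: explicitly allowed
print(parser.crawl_delay("*"))  # 10 (seconds requested between requests)
print(parser.site_maps())       # ['https://www.example.com/sitemap.xml'] (Python 3.8+)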
Here is a list of default robots.txt templates:
User-agent: *
Allow: /
This robots.txt tells all bots and search engines that they can visit every page of the website.
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Sitemap: {url of your sitemap.xml}
This template is commonly used on WordPress sites: it blocks bots from the admin area, keeps admin-ajax.php accessible, and points crawlers to your sitemap.
Here is a list of free tools to test your robots.txt instructions:
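If you prefer to test programmatically, the same urllib.robotparser module from Python's standard library can download a live robots.txt file and tell you whether a given bot is allowed to crawl a given URL; the domain below is a placeholder for your own site.

import urllib.robotparser

# Point the parser at your own site's robots.txt (placeholder URL).
parser = urllib.robotparser.RobotFileParser()
parser.set_url("https://www.yoursite.com/robots.txt")
parser.read()  # downloads and parses the file

# Check a few URLs against the live rules, as Googlebot would see them.
for url in ("https://www.yoursite.com/", "https://www.yoursite.com/wp-admin/"):
    print(url, "->", parser.can_fetch("Googlebot", url))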
About: Hello, we are FreeSEO.app. Our goal is to help small businesses & startups with their SEO. This tool is absolutely free to use.