CRAWLING: What is robots.txt and what should it look like?

This is a file that tells search engine crawlers which parts of a site they are allowed to crawl. You can find it by going to YOURDOMAIN/robots.txt. A minimal robots.txt that lets every crawler access the whole site looks like this:

User-agent: *
Disallow:

(Be careful: adding a single slash, Disallow: /, does the opposite and blocks crawlers from the entire site.)

Here’s mine for this site. It disallows some admin and members-only areas and includes a link to my sitemap:

User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /members-home/

Sitemap: https://YOURDOMAIN/sitemap.xml

Note: You don’t need to include your sitemap in your robots.txt; it’s just nice to have.
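If you want to check how a crawler would interpret rules like the ones above, Python ships a robots.txt parser in its standard library. Here's a small sketch; the domain and paths are placeholders, and note that Python's parser applies the first matching rule, so the narrow Allow line is listed before the broader Disallow:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical rules mirroring the example file (domain is a placeholder).
rules = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
Disallow: /members-home/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Public pages are crawlable; the members area and admin pages are not,
# except for the explicitly allowed admin-ajax.php endpoint.
print(parser.can_fetch("*", "https://example.com/blog/"))                    # True
print(parser.can_fetch("*", "https://example.com/members-home/"))            # False
print(parser.can_fetch("*", "https://example.com/wp-admin/options.php"))     # False
print(parser.can_fetch("*", "https://example.com/wp-admin/admin-ajax.php"))  # True
```

This is handy for sanity-checking a robots.txt before deploying it, though real crawlers like Googlebot use slightly different precedence rules (longest matching path wins), so always verify with the search engine's own testing tools too.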