Google Search Console Tracking - robots.txt
1 min read
robots.txt is a text file that defines site scanning rules for search engine crawlers.
VTEX has a native interface for editing and customizing the robots.txt file.
Go to: Store Settings > Storefront > Settings > SEO > Robots.txt
To better understand the file's content, see below a description of the two basic directives, illustrated in the example after this list:
- **Allow:** allows search engine crawlers to browse and index the address given.
- **Disallow:** blocks crawlers from accessing the address given.
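As an illustration only, a minimal robots.txt along these lines could be entered in that interface. The store name and paths below are hypothetical placeholders, not VTEX defaults:

```txt
# Hypothetical example - adjust the paths and sitemap URL to your own store
User-agent: *
Allow: /
Disallow: /checkout/
Disallow: /account/

Sitemap: https://www.mystore.com/sitemap.xml
```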
In order to validate the content, you must:
- Check whether the URLs listed really need a rule in the robots.txt file;
- Check whether the rules were correctly applied to the intended URLs (a quick way to test this is sketched after this list);
- Verify that the sitemap.xml file is listed correctly.
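If you want to spot-check the published rules programmatically, Python's built-in `urllib.robotparser` can read the file and report whether a given URL is allowed for a given user agent. This is only a sketch; the store URL and sample paths are hypothetical:

```python
# Sketch only: the store URL and test paths below are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

# Point the parser at your store's published robots.txt.
parser = RobotFileParser("https://www.mystore.com/robots.txt")
parser.read()

# Check whether the generic crawler ("*") may fetch a few sample URLs.
for path in ["/", "/checkout/", "/account/orders"]:
    allowed = parser.can_fetch("*", f"https://www.mystore.com{path}")
    print(f"{path}: {'allowed' if allowed else 'blocked'}")
```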
robots.txt file editing for your store is also available at: [accountname].vtexcommercestable.com.br/admin/Site/ConfigSEOContents.aspx.
To properly set up Search Console, the next steps are to check the store's sitemap, present your store's structure to facilitate crawler browsing, and accelerate page indexing.