Google Search Console Tracking - robots.txt

robots.txt is a text file placed at the root of a site that tells search engine crawlers which parts of the site they may or may not scan.

VTEX has a native interface for editing and customizing the robots.txt file.

Go to: Store Settings > Storefront > Settings > SEO > Robots.txt

For a better understanding of the content, see below a detailed description of the basic directives:

  • **Allow:** permits the search engine crawler to browse and index the given address.
  • **Disallow:** blocks the crawler from accessing the given content.
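To illustrate the directives above, here is a minimal robots.txt sketch. The paths and domain are hypothetical examples, not VTEX defaults; adjust them to your store's actual URLs:

```
User-agent: *
Disallow: /checkout/
Allow: /
Sitemap: https://www.example.com/sitemap.xml
```

Note that more specific rules should come before broader ones: here `/checkout/` is blocked while the rest of the site remains crawlable, and the `Sitemap` line points crawlers to the sitemap.xml file.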

In order to validate the content, you must:

  • Check whether the listed URLs actually require a rule in the robots.txt file;
  • Check whether the rules were correctly applied to the intended URLs;
  • Verify if the sitemap.xml file was listed correctly.
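The validation steps above can be checked programmatically. A minimal sketch using Python's standard `urllib.robotparser` module, with a hypothetical robots.txt body and example URLs (replace them with your store's real file and addresses):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content used for illustration only
robots_txt = """\
User-agent: *
Disallow: /checkout/
Allow: /
Sitemap: https://www.example.com/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Check whether specific URLs are crawlable under these rules
print(parser.can_fetch("*", "https://www.example.com/"))           # allowed
print(parser.can_fetch("*", "https://www.example.com/checkout/"))  # blocked

# Verify that the sitemap.xml file was listed (Python 3.8+)
print(parser.site_maps())
```

In a real check you would fetch the live file (e.g. with `parser.set_url(...)` and `parser.read()`) instead of parsing an inline string.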

You can also edit your store's robots.txt file directly at: [accountname].vtexcommercestable.com.br/admin/Site/ConfigSEOContents.aspx.

To complete the Search Console setup, the next steps are to check your store's Sitemap, which presents the structure of your store to facilitate crawler browsing and accelerates page indexing.
