How to declare the Sitemap directive in a robots.txt file
The Sitemap directive in the robots.txt file declares the URL(s) of your XML sitemap file(s). It notifies search engines that a sitemap in the XML Sitemaps format is available for the site.
When should a directive be declared in a robots.txt file?
Declare it as soon as you have created the XML Sitemap and uploaded it to the web server.
When should the Sitemap directive be updated?
The directive should only be updated if the current Sitemap URL has changed. If the Sitemap file itself is updated but its URL remains the same, no further action is required.
To add the Sitemap directive to the robots.txt file, add the following line:
Sitemap: sitemap-file-uri
where sitemap-file-uri is the full URL of your Sitemap file (for example: http://yourwebsite.tld/sitemap.xml).
As a result, you should get the following contents in the robots.txt file:
User-Agent: *
Sitemap: http://yourwebsite.tld/sitemap.xml
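To confirm that the directive is picked up, you can parse the file with Python's standard urllib.robotparser module; the sketch below uses the placeholder domain from the example above and the parser's site_maps() method (Python 3.8+):

```python
from urllib.robotparser import RobotFileParser

# The robots.txt contents from the example above (placeholder domain).
robots_txt = """\
User-Agent: *
Sitemap: http://yourwebsite.tld/sitemap.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# site_maps() returns the list of declared Sitemap URLs,
# or None if the file contains no Sitemap directive.
sitemaps = parser.site_maps()
print(sitemaps)  # ['http://yourwebsite.tld/sitemap.xml']
```

In production you would point the parser at the live file with set_url() and read() instead of parsing an in-memory string.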
If you need to update the Sitemap directive, just replace the current line in the robots.txt file:
Sitemap: http://yourwebsite.tld/old-sitemap-uri
with a new one:
Sitemap: http://yourwebsite.tld/new-sitemap-uri
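If you maintain robots.txt programmatically, the swap is a plain text replacement; here is a minimal sketch using the placeholder URLs above, operating on an in-memory string (in practice you would read from and write back to the robots.txt file on your server):

```python
# The old and new directive lines, using the placeholder URLs above.
old_line = "Sitemap: http://yourwebsite.tld/old-sitemap-uri"
new_line = "Sitemap: http://yourwebsite.tld/new-sitemap-uri"

# Current robots.txt contents (read from the file in practice).
robots_txt = "User-Agent: *\n" + old_line + "\n"

# Replace the whole directive line, leaving the rest untouched.
updated = robots_txt.replace(old_line, new_line)
print(updated)
```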
What if robots.txt contains multiple User-Agent sections?
If you use separate User-Agent sections for different search robots, you can add a Sitemap directive to each of them (or only to the sections for those robots for which you want to specify a Sitemap URL). Note, however, that most crawlers treat the Sitemap directive as independent of User-Agent sections, so its placement within a section usually does not restrict which robots read it.
As a result, you should get robots.txt like this:
# For all robots except Google and Yandex
User-Agent: *
Sitemap: http://yourwebsite.tld/sitemap.xml

# Only for Google bot
User-Agent: Googlebot
Sitemap: http://yourwebsite.tld/sitemap-for-googlebot.xml

# Only for Yandex bot
User-Agent: Yandex
Sitemap: http://yourwebsite.tld/sitemap-for-yandex.xml
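Many parsers, including Python's urllib.robotparser, collect Sitemap lines from the whole file regardless of which User-Agent section they appear in. A quick check with the example above (placeholder domain) illustrates this:

```python
from urllib.robotparser import RobotFileParser

# The multi-section robots.txt from the example above.
robots_txt = """\
# For all robots except Google and Yandex
User-Agent: *
Sitemap: http://yourwebsite.tld/sitemap.xml

# Only for Google bot
User-Agent: Googlebot
Sitemap: http://yourwebsite.tld/sitemap-for-googlebot.xml

# Only for Yandex bot
User-Agent: Yandex
Sitemap: http://yourwebsite.tld/sitemap-for-yandex.xml
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# All three Sitemap URLs are reported, regardless of section.
sitemaps = parser.site_maps()
print(len(sitemaps))  # 3
```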