

Use `robots.txt` in a multilingual site

Question:

Tag: seo,sitemap,robots.txt

I manage a multilingual site where users are redirected to a local version of the site, like

myBurger.com/en // for users from the US, UK, etc.
myBurger.com/fr // for users from France, Switzerland, etc.

How should the robots.txt file be organized in relation to the sitemaps?

myBurger.com/robots.txt // with - Sitemap: http://myBurger.com/??/sitemap
OR
myBurger.com/en/robots.txt  // with - Sitemap: http://myBurger.com/en/sitemap
myBurger.com/fr/robots.txt  // with - Sitemap: http://myBurger.com/fr/sitemap

knowing that the en and fr sites are in fact independent entities that do not share content, even though they look similar.


Answer:

You need to put one robots.txt at the top level.

The robots.txt file must be in the top-level directory of the host, accessible through the appropriate protocol and port number.

https://developers.google.com/webmasters/control-crawl-index/docs/robots_txt
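
As a minimal sketch, the single top-level robots.txt can list both per-language sitemaps, since the Sitemap directive takes a full URL and may appear multiple times (the exact sitemap filenames below are assumptions based on the paths in the question):

```
# myBurger.com/robots.txt — one file for the whole host
User-agent: *
Allow: /

# Sitemap URLs are absolute, so subdirectory sitemaps are fine here
Sitemap: http://myBurger.com/en/sitemap.xml
Sitemap: http://myBurger.com/fr/sitemap.xml
```

This way each language section keeps its own independent sitemap, while crawlers still find everything from the one robots.txt location they actually check.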

