If I look in my Google Webmaster Console, I see 66 warnings because some URLs indexed by Google are blocked by robots.txt.
The URLs are almost identical and look like:
Robots.txt and sitemaps should always be in sync. In other words: if the XenForo sitemap sends a URL to search engines, that URL should not be disallowed in robots.txt, or it will cause crawl errors.
Conversely: if a directory is not disallowed in robots.txt, then it should not be inaccessible...
I noticed on this website's robots.txt that their disallow rules ended with an `Allow: /`.
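For reference, a robots.txt in that shape might look like the following (the disallowed paths here are just illustrative, not taken from that site):

```
User-agent: *
Disallow: /find-new/
Disallow: /members/
Allow: /
```

The trailing `Allow: /` makes the default explicit: everything not matched by an earlier `Disallow` line stays crawlable.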
I think it would be a good idea if there was an option to make the sitemap follow robots.txt rules. That way the sitemap would more accurately reflect what you actually want indexed.
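As a rough illustration of what that option could do, here is a minimal sketch that filters a list of sitemap URLs through robots.txt rules using Python's standard `urllib.robotparser`. The rules, URLs, and function name are all hypothetical, not XenForo's actual implementation:

```python
# Sketch: keep only sitemap URLs that robots.txt allows, so the sitemap
# and robots.txt stay in sync. All paths/URLs below are illustrative.
import urllib.robotparser

ROBOTS_TXT = """\
User-agent: *
Disallow: /members/
Disallow: /find-new/
"""

def crawlable(urls, robots_txt, agent="*"):
    """Return only the URLs the given robots.txt allows for this agent."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in urls if parser.can_fetch(agent, u)]

sitemap_urls = [
    "https://example.com/threads/some-topic.123/",
    "https://example.com/members/some-user.45/",  # disallowed above
]

# Only the thread URL survives; the member URL is dropped.
print(crawlable(sitemap_urls, ROBOTS_TXT))
```

A sitemap generator built this way could never emit a URL that search engines are forbidden to crawl, which is exactly the mismatch producing the warnings above.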
See this post here: by @cmeinck...