Lack of interest: Automatically sync robots.txt with sitemaps

This suggestion has been closed automatically because it did not receive enough votes over an extended period of time. If you still wish to see this implemented, please search for an open suggestion and, if you don't find one, post a new one.

Alpha1

Well-known member
Robots.txt and sitemaps should always be in sync. In other words: if the XenForo sitemap submits a URL to search engines, that URL should not be disallowed in robots.txt, or it will cause crawl errors.
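To illustrate the kind of conflict described here, below is a minimal sketch (in Python, with invented URLs and Disallow rules; this is not XenForo code) of how a sitemap URL that falls under a Disallow rule could be detected:

```python
# Sketch of the conflict described above: URLs listed in the sitemap that
# robots.txt disallows. The URLs and rules below are invented examples.
from urllib.parse import urlparse

SITEMAP_URLS = [
    "https://example.com/forums/general-discussion.1/",
    "https://example.com/forums/staff-room.2/",  # also disallowed below -> crawl error
]

DISALLOW_PREFIXES = [
    "/forums/staff-room.2/",
]

def find_conflicts(sitemap_urls, disallow_prefixes):
    """Return sitemap URLs whose path falls under a Disallow rule."""
    conflicts = []
    for url in sitemap_urls:
        path = urlparse(url).path
        if any(path.startswith(prefix) for prefix in disallow_prefixes):
            conflicts.append(url)
    return conflicts

print(find_conflicts(SITEMAP_URLS, DISALLOW_PREFIXES))
# -> ['https://example.com/forums/staff-room.2/']
```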

Conversely: if a directory is not disallowed in robots.txt, then it should not be inaccessible to guests.

Since XenForo has sitemaps built in, it should also keep robots.txt in sync, to avoid conflicts between the two and the crawl errors that result.

What I am proposing is not at the URL level, but at the directory level: if guests have no access to node_X, then robots.txt should disallow node_X.
Robots.txt should be rebuilt every time the sitemap is rebuilt.
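A minimal sketch of that proposed behaviour, assuming a simple list of nodes with guest-access flags; the node list, paths, and sitemap URL are invented for illustration and do not reflect XenForo's actual data model:

```python
# Sketch: regenerate robots.txt from node-level guest permissions.
# The node list, paths, and sitemap URL are hypothetical examples.

# Each entry: (URL path prefix of the node, whether guests may view it)
NODES = [
    ("/forums/general-discussion.1/", True),
    ("/forums/staff-room.2/", False),   # guests denied -> should be disallowed
    ("/forums/archive.3/", False),
]

def build_robots_txt(nodes, sitemap_url="https://example.com/sitemap.xml"):
    """Build robots.txt content: one Disallow per guest-inaccessible node."""
    lines = ["User-agent: *"]
    lines += [f"Disallow: {path}" for path, guests_can_view in nodes if not guests_can_view]
    lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

if __name__ == "__main__":
    # Would run as part of the same job that rebuilds the sitemap.
    print(build_robots_txt(NODES))
```

Run as part of the same job that rebuilds the sitemap, this would keep the two files consistent.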
 
I respectfully disagree. Such a feature could conflict with other software that keeps its own rules in robots.txt, or with packages that generate robots.txt dynamically to handle various crawlers.
 
Good point, but robots.txt rules that fall outside XenForo could be handled by a field for additional rules.
And of course such a function should have an On/Off toggle.
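A hedged sketch of how that might look, continuing the idea above; the option names, the toggle, and the additional-rules field are invented for illustration and are not real XenForo options:

```python
# Sketch: merge generated rules with an admin-supplied "additional rules"
# field, guarded by an On/Off toggle. Option names and values are invented.

def merge_robots_txt(generated: str, extra_rules: str, enabled: bool, current: str) -> str:
    """Return the robots.txt content to write.

    generated   -- rules derived from node permissions (see earlier sketch)
    extra_rules -- free-form rules for software outside XenForo
    enabled     -- the proposed On/Off toggle
    current     -- the existing robots.txt, left untouched when the toggle is off
    """
    if not enabled:
        return current
    parts = [generated.rstrip()]
    if extra_rules.strip():
        parts.append(extra_rules.strip())
    return "\n".join(parts) + "\n"

if __name__ == "__main__":
    generated = "User-agent: *\nDisallow: /forums/staff-room.2/"
    extra = "Disallow: /admin.php"
    print(merge_robots_txt(generated, extra, enabled=True, current=""))
```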
 