Conversely: if a directory is not disallowed in robots.txt, then it should not be inaccessible to guests.
Since XenForo has sitemaps built in, it should also sync robots.txt with them, so that the sitemap and robots.txt never contradict each other and cause crawl errors.
What I am proposing is not on a URL level but on a directory level: if guests have no access to node_X, then robots.txt should disallow node_X.
Robots.txt should be rebuilt every time the sitemap is rebuilt.
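A minimal sketch of what I mean, in pseudocode-style Python (the node names, the `guest_viewable` flag, and the sitemap URL are all made up for illustration, not XenForo's actual API):

```python
# Hypothetical node list with a guest-visibility flag per node.
nodes = [
    {"name": "node_public", "guest_viewable": True},
    {"name": "node_X", "guest_viewable": False},
    {"name": "node_staff", "guest_viewable": False},
]

def build_robots_txt(nodes, sitemap_url="https://example.com/sitemap.xml"):
    """Disallow every node guests cannot view; reference the sitemap."""
    lines = ["User-agent: *"]
    for node in nodes:
        if not node["guest_viewable"]:
            lines.append(f"Disallow: /{node['name']}/")
    lines.append(f"Sitemap: {sitemap_url}")
    return "\n".join(lines) + "\n"

print(build_robots_txt(nodes))
```

Running this whenever the sitemap is rebuilt would keep the two files consistent by construction: guest-inaccessible nodes end up disallowed, guest-visible ones stay crawlable.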