Today I received an email from Google: the bots crawling my site are reporting an increasing number of crawl errors caused by certain "usergroup protected" forums, i.e. normal users can access the forum and read thread titles, but can NOT read the thread content within.
What's the best way around this? Writing a robots.txt file? I'm not exactly sure how to write the file, since I don't know how to cover the three forums (each of which has 2-3 subforums) given the directory structure that robots.txt requires.
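For reference, here's roughly what I imagine the file would look like, assuming hypothetical forum paths like `/forums/members-only/` (my actual paths differ, which is part of my confusion):

```
# Block all well-behaved crawlers from the protected forums.
# Each Disallow is a URL-path prefix, so it should also cover
# any subforums nested under that directory.
User-agent: *
Disallow: /forums/members-only/
Disallow: /forums/staff/
Disallow: /forums/premium/
```

As I understand it, because `Disallow` matches by path prefix, one rule per top-level forum directory should cover its 2-3 subforums automatically, without listing each subforum separately. Is that right, and how do I find the correct paths for my forum software?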