
Errors from Googlebot - Private forums

maxwolfie

Active member
#1
Hi all,

Today I received an email from Google: the bots crawling my site are reporting an increasing number of crawl errors caused by certain "usergroup protected" forums, i.e. normal users can access the forum and read thread titles, but can NOT read the thread content within.

What's the best way around this? Write a robots.txt file? I'm not exactly sure how to write the file, since I don't know how to cover the three forums (each of which has 2-3 subforums) with the directory structure that robots.txt needs.
 

Jake Bunce

XenForo moderator
Staff member
#3
Today I received an email from Google: the bots crawling my site are reporting an increasing number of crawl errors caused by certain "usergroup protected" forums, i.e. normal users can access the forum and read thread titles, but can NOT read the thread content within.
Is that expected? Do you restrict thread content so guests can't view it? Did Google instruct you to fix this, or was it just an FYI? This may not be a problem at all.
 

maxwolfie

Active member
#4
Is that expected? Do you restrict thread content so guests can't view it? Did Google instruct you to fix this, or was it just an FYI? This may not be a problem at all.
Well, I guess it is expected. I was wondering if there is a way to use robots.txt to block certain forums and their subforums.

I will need to block:
http://www.hackslashrepeat.com/forums/diablo-3-chapter.287/ and subforum http://www.hackslashrepeat.com/forums/diablo-3-events.295/
http://www.hackslashrepeat.com/forums/path-of-exile-chapter.289/ and subforum http://www.hackslashrepeat.com/forums/path-of-exile-events.296/
http://www.hackslashrepeat.com/forums/torchlight-2-chapter.297/ and subforum http://www.hackslashrepeat.com/forums/torchlight-2-events.298/

As well as anything else that should be blocked for XF and XenPorta.

Does anyone know how to write a robots.txt that does this?
 

Jake Bunce

XenForo moderator
Staff member
#5
Code:
User-agent: *
Disallow: /forums/diablo-3-chapter.287/
Disallow: /forums/diablo-3-events.295/
Allow: /
Add a new Disallow line for each forum URL.
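For the three forums and their subforums listed above, the complete file would look like this (a sketch assuming your forum runs at the domain root and the paths in your URLs are correct):

Code:
User-agent: *
# Usergroup-protected forums and their event subforums
Disallow: /forums/diablo-3-chapter.287/
Disallow: /forums/diablo-3-events.295/
Disallow: /forums/path-of-exile-chapter.289/
Disallow: /forums/path-of-exile-events.296/
Disallow: /forums/torchlight-2-chapter.297/
Disallow: /forums/torchlight-2-events.298/
Allow: /

Each Disallow rule is a path prefix, so blocking /forums/diablo-3-chapter.287/ also covers every thread URL beneath that forum. As for XF in general, there is no required list, but entries along these lines are commonly suggested to keep crawlers out of utility pages (route names assume a default XenForo setup; add them to the same User-agent: * group):

Code:
# Common XenForo utility routes that don't need indexing
Disallow: /find-new/
Disallow: /account/
Disallow: /goto/
Disallow: /posts/
Disallow: /login/
Disallow: /admin.php

For XenPorta, check which routes the add-on actually creates before blocking anything. You can verify the finished file with the robots.txt testing tool in Google Webmaster Tools before relying on it.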