Allow Google Index Spider / bot in only certain forums?

faeronsayn

Well-known member
I was wondering if it's possible to not allow the Google index bot in certain forums? For example, if I have a section for spamming, I wouldn't want the Google index bot to see that section, since I wouldn't want it to index discussions that are useless. I know the easy way is to simply hide it from guests, but the problem is that I want other bots, like the AdSense bot, to still see it.

So what I want to do here is disallow only the Google index bot from certain forums, but allow all other bots (e.g. the AdSense bot) to see and use the words there to display ads.

I don't know how to do this through robots.txt. I was wondering if it's possible through the template system, or maybe adding individual bots to different usergroups would work perfectly fine as well.

Please help here :(, I totally need help :(
 
Disallow: /forums/forum-name.forumid should work. I'm not sure if just the ID alone will work with robots.txt, someone else should be able to clarify.
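
A sketch of what the per-bot groups might look like, assuming default XenForo-style URLs (the spam forum path here is made up, and Mediapartners-Google is the user-agent token Google documents for the AdSense crawler):

```text
# Google's search crawler: keep it out of the spam forum
User-agent: Googlebot
Disallow: /forums/spam-section.10/

# AdSense crawler: no restrictions
User-agent: Mediapartners-Google
Disallow:

# Everyone else: no restrictions
User-agent: *
Disallow:
```

Each crawler obeys only the most specific group that matches it, so Googlebot would use the first group and ignore the rest.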

Thank you for such a fast reply, so with something like this,

Example:

User-agent: Googlebot
Disallow: /forums/news-and-announcements.4/

This would disallow Google from the forum "News and Announcements". Does this mean that all threads within the News and Announcements forum will automatically be inaccessible to this bot? Since threads obviously have a different URL, I'm not sure how that would work.

So please shed some light on this topic for me if you would.
 
It's a complicated process, I'm not sure if threads would be disallowed as that'd just stop them accessing the forum listings.

The simple way to do so is to just disallow guests from viewing the Spam section.
 
It's a complicated process, I'm not sure if threads would be disallowed as that'd just stop them accessing the forum listings.

The simple way to do so is to just disallow guests from viewing the Spam section.

That's what I have done right now, but I don't want that, since I want guests to be able to read the spam section; there are a lot of interesting things in there. I also want relevant ads displayed in there.
 
You might have to create an add-on if you want to allow guests but disallow spiders.

The thing is I want to allow guests and allow spiders, but restrict some.

So it's like making a usergroup / permission set for each spider, that type of thing.

By the way, is there any way to test this?
 
I've learned an incredible amount by editing the robots.txt file, then waiting to see how Google reacts. It takes a lot of time, though.
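
If you want a quicker sanity check than waiting on Google, Python's standard urllib.robotparser evaluates robots.txt rules the same way a well-behaved crawler would. A minimal sketch (the rules and URLs are made-up stand-ins):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt blocking only Googlebot from one forum
rules = """
User-agent: Googlebot
Disallow: /forums/spam-section.10/

User-agent: *
Disallow:
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

url = "https://example.com/forums/spam-section.10/"
print(parser.can_fetch("Googlebot", url))             # False: blocked
print(parser.can_fetch("Mediapartners-Google", url))  # True: falls back to *

# Thread URLs don't live under /forums/, so they slip through:
print(parser.can_fetch("Googlebot",
                       "https://example.com/threads/some-thread.123/"))  # True
```

Note the last check: the forum-level rule doesn't touch /threads/ URLs at all.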

So when you block off the forum URL, does it automatically make all threads within that forum inaccessible to this bot as well?
 
So when you block off the forum URL, does it automatically make all threads within that forum inaccessible to this bot as well?
From what I can tell, googlebot is quite literal.

Disallow: /forums/ means it will not crawl anything within that path, but if the same content can be reached via /threads/, that will still be indexed.

I've only started experimenting in the last few weeks, I'm certainly no expert. I could be wrong.
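
That matches how XenForo builds its URLs: a thread lives at /threads/thread-title.id/ with no forum component in the path, so there's nothing a forum-level rule can match. A sketch of the gap (paths are made up):

```text
User-agent: Googlebot
Disallow: /forums/spam-section.10/   # blocks the forum's listing pages only

# A thread posted in that forum lives at a URL like:
#   /threads/some-spam-thread.12345/
# The rule above never matches it, and robots.txt has no way to
# express "all threads belonging to forum 10", so each thread
# would need its own Disallow line.
```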
 
From what I can tell, googlebot is quite literal.

Disallow: /forums/ means it will not crawl anything within that path, but if the same content can be reached via /threads/, that will still be indexed.

I've only started experimenting in the last few weeks, I'm certainly no expert. I could be wrong.

Ugh, that sucks. Still looking for a way around this :(
 