
XF 1.4 Lots of Crawler Errors (403 - Access Denied)

#1
Hi guys,
First of all, this is my robots.txt:
Code:
User-Agent: *
User-agent: Mediapartners-Google
Disallow: /?page=
Disallow: /find-new/
Disallow: /account/
Disallow: /attachments/
Disallow: /goto/
Disallow: /posts/
Disallow: /login/
Disallow: /admin.php
Disallow: /members/
Disallow: /conversations/
Disallow: /brqct-create-thread

Sitemap: http://www.xyz.com/sitemap.php
I just went into Google Webmaster Tools and found over 25,000 403 crawler errors. They all look something like this:
http://www.xyz.com/threads/thread.nnn/reply?quote=nnnnn
I can understand that the crawler can't access those, since it's akin to asking for a page like this to be crawled without being logged in:
https://xenforo.com/community/threa...opment-documentation.16868/reply?quote=885293

Can someone please tell me what I need to put in my robots.txt for that?

Mike

XenForo developer
Staff member
#2
They should go away over time, as that link shouldn't be shown to users who can't use it.

Regardless, this should work:

Code:
Disallow: /threads/*/reply
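If you want to sanity-check which URLs a wildcard rule like that covers, here's a minimal sketch. Note that Python's built-in `urllib.robotparser` does not handle `*` wildcards, so this hand-rolls the prefix-plus-wildcard matching that major crawlers like Googlebot apply to robots.txt rules (the `rule_matches` helper is my own illustration, not part of any library):

```python
import re

def rule_matches(rule: str, path: str) -> bool:
    # Translate a robots.txt rule into a regex: '*' matches any run of
    # characters, everything else is literal, and the rule is a prefix
    # match (it may stop partway through the URL path).
    pattern = ''.join('.*' if ch == '*' else re.escape(ch) for ch in rule)
    return re.match(pattern, path) is not None

# The reply links from the crawler errors are caught by the rule...
print(rule_matches('/threads/*/reply', '/threads/thread.123/reply?quote=456'))  # True
# ...while the thread pages themselves are still crawlable.
print(rule_matches('/threads/*/reply', '/threads/thread.123/'))  # False
```

This is only an approximation for checking rules locally; the authoritative behaviour is whatever Google reports in the Webmaster Tools robots.txt tester.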