XF 1.4 Robots.txt blocking more pages than indexed

dethfire

Well-known member
#1
According to Google, my robots.txt is blocking 958,419 pages, while 876,771 pages are indexed. Below is my robots file. To me that seems crazy high. Anyone have ideas on why so many pages are blocked? Does it seem reasonable?


User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /find-new/
Disallow: /account/
Disallow: /goto/
Disallow: /posts/
Disallow: /login/
Disallow: /admin.php

Sitemap: https://www.physicsforums.com/sitemap.php
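One way to sanity-check the rules above is Python's standard-library `urllib.robotparser`, which applies robots.txt matching the same way most crawlers do. The sketch below inlines the posted rules and tests a few sample paths; the specific paths (a thread URL, a `/posts/` URL, a `/goto/` URL) are made up for illustration and aren't from the post.

```python
import urllib.robotparser

# The robots.txt rules from the post, inlined so the check is self-contained.
RULES = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /find-new/
Disallow: /account/
Disallow: /goto/
Disallow: /posts/
Disallow: /login/
Disallow: /admin.php
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

base = "https://www.physicsforums.com"
# Hypothetical sample URLs: one normal thread page, plus two disallowed patterns.
for path in ("/threads/example-thread.12345/", "/posts/67890/", "/goto/post?id=67890"):
    status = "allowed" if rp.can_fetch("*", base + path) else "blocked"
    print(path, status)
```

Since every individual post gets a `/posts/` redirect URL and every post also gets a `/goto/` link, it's plausible for the blocked-URL count to exceed the number of indexed thread pages by a wide margin.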