XF 1.4 Robots.txt blocking more pages than indexed

dethfire

Well-known member
According to Google, my robots.txt is blocking 958,419 pages while 876,771 pages are indexed. Below is my robots file. To me that ratio seems crazy high. Does anyone have ideas on how so many pages could be blocked? Does it seem reasonable?


User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /find-new/
Disallow: /account/
Disallow: /goto/
Disallow: /posts/
Disallow: /login/
Disallow: /admin.php

Sitemap: https://www.physicsforums.com/sitemap.php
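In case it helps with debugging, here is a minimal sketch that tests which URLs these rules actually block, using Python's standard urllib.robotparser. The sample URLs are hypothetical paths made up for illustration; swap in real URLs from your Search Console report to see which Disallow line is catching them.

from urllib.robotparser import RobotFileParser

# The robots.txt rules from the post above
rules = """\
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /find-new/
Disallow: /account/
Disallow: /goto/
Disallow: /posts/
Disallow: /login/
Disallow: /admin.php
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Hypothetical example URLs (not taken from the actual site)
urls = [
    "https://www.physicsforums.com/threads/example-thread.123/",  # not matched by any Disallow
    "https://www.physicsforums.com/posts/456/",                   # matched by Disallow: /posts/
    "https://www.physicsforums.com/goto/post?id=789",             # matched by Disallow: /goto/
]

for url in urls:
    status = "allowed" if parser.can_fetch("*", url) else "blocked"
    print(f"{status}: {url}")

Since every post in a thread gets its own /posts/ and /goto/ URL, the number of blocked URLs can easily exceed the number of indexed thread pages, which may explain the counts you are seeing.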