Hey there, I would really appreciate a one-to-one with someone who is very experienced with Google Webmaster Tools and SEO practices. We have recently been suffering from a load of Google crawl errors, and we would like to know:

1. Why Google is repeatedly trying to reach areas of the site that are not publicly available (nodes) and then reporting these as "Access Denied". Is there a way to tell them to stay out of specific nodes and de-list the existing indexed content?

2. Why Google is still trying to index content that hasn't been present on the site for over a year, yet reports it as an error. We switched to xF in April 2012, but very old vBulletin and WordPress content appears all the time in the "Soft 404" error list. There are no remnants of vBulletin or WordPress on the server.

3. What to do with all those "not found" 404s, all of which are threads or content that has been permanently removed from the site.

Here's our robots.txt if that helps at all.

Code:
User-agent: Mediapartners-Google
Disallow:

User-agent: *
Disallow: /xf/find-new/
Disallow: /xf/account/
Disallow: /xf/attachments/
Disallow: /xf/goto/
Disallow: /xf/login/
Disallow: /xf/admin.php
Disallow: /xf/mobiquo/
Disallow: /xf/members/
Allow: /

Would love to chat with someone about this. Thanks.
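On the first question, would adding an explicit Disallow for the private node area be the right approach, something like the block below? (Note: /xf/nodes/ is just a guess at the path on our side, not something confirmed from our logs, and I'm aware a Disallow alone may not remove URLs that are already indexed.)

Code:
User-agent: *
Disallow: /xf/nodes/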