XF 1.5 XenForo causing thousands of 403 errors (from an SEO website audit)

Marco Famà

hey guys,
I've just run a deep SEO audit using an external tool, in order to optimize my website as much as possible.

Unfortunately, tons of the remaining unfixed errors come from my XenForo instance:
http://timelapseitalia.com/forum/

Basically, the SEO tool keeps reporting thousands of 403 (Forbidden) errors on the forum's URLs.
How can this be fixed?

thanks
Marco
 
Hi @Marco Famà,

It's normal to get a 403 for members who have chosen to hide their profile from everyone except their followers, so of course Google runs into the same thing, as it crawls with guest permissions.

I can see you've already disallowed profiles in your robots.txt; you'll probably also want to do these two things:
  1. Prevent user profiles from being included in the XenForo sitemap:
    /admin.php?options/list/sitemap
    (Google will proactively try to follow the sitemap URLs)

  2. Add this at the top of your member_view template:
    Code:
    <xen:container var="$head.robots"><meta name="robots" content="noindex" /></xen:container>
    This adds "noindex" to member profiles only, which helps push Google to remove them faster than robots.txt alone.
 
Yes, it currently looks like this:

User-agent: *
Allow: /
Disallow: /forum/members/
Disallow: /forum/attachments/
My aim was to tell Google "hey, do NOT index those, as I basically don't care about them".

Did I set things up incorrectly, then?
How should I amend it to avoid the 4xx errors while still keeping Google from indexing those pages? Is that possible, or is it better to delete the two Disallow lines and let Google index members and attachments too?

thanks a lot @n00bsaibot
 
  1. Prevent user profiles from being included in the XenForo sitemap:
    /admin.php?options/list/sitemap
    (Google will proactively try to follow the sitemap URLs)

  2. Add this at the top of your member_view template:
    Code:
    <xen:container var="$head.robots"><meta name="robots" content="noindex" /></xen:container>
    This adds "noindex" to member profiles only, which helps push Google to remove them faster than robots.txt alone.

you rock, mate :)
thanks a lot, much appreciated too!

I have now set the XenForo sitemap not to include tags and users in order to avoid that, and changed the template too, so that it now starts with:

Code:
<xen:container var="$head.robots"><meta name="robots" content="noindex" /></xen:container>
<xen:title>{$user.username}</xen:title>
<xen:h1></xen:h1> <xen:comment>H1 empty, do not render.</xen:comment>

Hope this is all right now :)

Will try launching another audit and see how it goes...
 
Marco, it doesn't need "fixing" - it is behaving as expected and you won't receive any penalty for it.

The reported 403s are just for information, in case you're getting them somewhere you don't expect to.

The header change will not make any difference, because Google isn't indexing those pages anyway (they return 403).

What it's doing is following the links to member profiles from your homepage, thread view, post view, and any other page where member avatars/usernames link to a profile - and it's going to carry on doing this, so you're going to carry on getting 403s. It's perfectly normal and nothing at all to worry about.

I've just checked my Google Search Console and I'm running at around 48,000 or so 403 errors - which is to be expected as I block guest access to member profile pages.

The only way to stop these from being reported is to add rel="nofollow" to all of the member profile links, to tell Google et al. to stop following them - a lot of work for no benefit (see the sketch below).
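
For illustration only, here's roughly what that would mean in each template that links to a profile. This is just a sketch: the template name and surrounding markup are assumptions on my part, not the exact XenForo 1.5 code, so check your own templates for the real link markup.

Code:
<xen:comment>Hypothetical excerpt from a template such as message_user_info (assumed markup)</xen:comment>
<a href="{xen:link members, $user}" class="username" rel="nofollow">{$user.username}</a>

You'd have to repeat that for every template that links a username or avatar to a profile, which is exactly why it's not worth the effort.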

Cheers,
Shaun :D
 
hey Shaun,

Oh, so you're saying that this is NOT badly affecting my rankings? They say you should definitely try to clear the Google Search Console of any reported errors, as much as you can.

I thought 403s would definitely be a bad thing, though...

PS: I checked this on Moz and they say the same thing for non-important pages:
https://moz.com/community/q/do-403-forbidden-errors-from-website-pages-hurt-rankings

I am discarding this notice, then :)

Take very good care, and thanks for helping me, all!
 