Google indexing: Page is not indexed: Duplicate without user-selected canonical for the resource / rate and updates pages

woei

Well-known member
Affected version
2.3.2
Hi,

I have some issues with resources not being indexed because, according to Google, they are duplicate content.

From Google Search Console:

Page indexing: Page is not indexed: Duplicate without user-selected canonical
Discovery: Sitemaps
Temporary processing error
Referring page

I see this issue with the updates and rate pages of resources. Is there an easy way to prevent these pages from being marked as duplicate content? I suppose a noindex or a user-declared canonical should fix it?
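For reference, a rough sketch of what those two options would look like in a page's <head> (the domain and resource URL are placeholders, and how the tag gets injected, via a template edit or an add-on, is a separate question):

Code:
<!-- Option 1: ask crawlers not to index the sub-page at all -->
<meta name="robots" content="noindex, follow" />

<!-- Option 2: declare the main resource page as the canonical version of the sub-page -->
<link rel="canonical" href="https://www.example.com/resources/example-resource.123/" />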
 
You could try this.

Step 1: Locate the PAGE_CONTAINER template

• Log in to your XenForo admin control panel.

• Navigate to Appearance > Templates (in your XenForo style).

• Look for the PAGE_CONTAINER template, which is the main container for your site and is used to include the header, footer, and other global elements on all pages.

Step 2: Add the conditional canonical tag

• Once in the PAGE_CONTAINER template, find the <head> section.

• Add the following code just above the closing </head> tag to insert a canonical tag only if one is not already defined:

Code:
<xf:if is="{$xf.canonicalUrl}">
    <link rel="canonical" href="{$xf.canonicalUrl}" />
<xf:else />
    <link rel="canonical" href="{$xf.options.boardUrl}{$xf.request.getRequestUri()}" />
</xf:if>

• This code first checks whether a canonical URL is already defined by XenForo via {$xf.canonicalUrl} and, if so, outputs it unchanged.

• If it is not defined, it falls back to building a canonical URL from the board URL and the current request URI.

• Because PAGE_CONTAINER wraps every page, this also adds a canonical URL to XFRM resource pages (see the illustration below).
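For illustration, here is roughly what the fallback branch would render on an XFRM sub-page that has no canonical of its own (the domain and resource name are placeholders):

Code:
<link rel="canonical" href="https://www.example.com/resources/example-resource.123/updates" />

Note that this fallback points at the sub-page itself rather than at the main resource page.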

Step 3: Save and verify

• Click Save changes to apply them.

• Test several pages of your site to make sure the canonical URL is being added, either by viewing the page source in Firefox or by prefixing the page URL in Google Chrome with:
Code:
view-source:

Example of a source-view URL:
Code:
view-source:https://www.your-site.com/

Search for the line starting with:
Code:
<link rel="canonical" href=
This is the canonical URL that was added by the script.

I tested this script myself and it works perfectly: it adds a canonical URL to posts and resources that do not have one, and it does not add one where a canonical has already been set manually or by an add-on such as [Xen-Soluce] SEO Optimization 2.7.0 Fix 3.

After adding the canonical tag, monitor your site's SEO performance to make sure the change has a positive impact. Use tools such as Google Search Console to check the indexed canonical URLs.
 
Canonical URLs are provided to choose a single "blessed" URL for content which may be displayed on several different URLs. The approach above will result in single pieces of content having multiple canonical URLs, which defeats the entire point. I can empathize that people want to do their best to appease Google, but as with structured data warnings the workarounds often wind up being detrimental if they aren't robust.
 
Great minds still cannot solve this issue in any CMS; it is a worldwide problem. In my opinion the XF team could handle it better. For example:

Code:
<link rel="canonical" href="https://xenforo.com/community/resources/remove-image-metadata.9417/">
<link rel="canonical" href="https://xenforo.com/community/resources/remove-image-metadata.9417/updates">

Looking at both URLs you can see the canonical handling is incorrect: the second one, the updates page, should have the same canonical URL as the main resource page.


There is a way out: a rule that maps each resource's sub-page pattern to its main URL, for example:

Code:
^\/community\/resources\/remove-image-metadata.9417\/[0-9]+\/$
/community/resources/remove-image-metadata.9417/

But there is no add-on that can write these rules for each page.

Doing it manually would take ages, but it is possible; one way of expressing such a rule is sketched below.
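Purely as a sketch of what such a rule could look like without an add-on, assuming a forum installed at the web root, mod_rewrite and mod_headers available, and example.com standing in for the real domain: Google also accepts rel="canonical" delivered as an HTTP Link header, so a single generic pattern can emit one for every resource sub-page. Bear in mind that XenForo's own HTML canonical tag would still point at the sub-page, so the two signals would conflict; treat this as an illustration of the rule, not a recommended fix.

Code:
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Capture the base resource URL for numeric sub-pages, /updates and /reviews
    # (the path prefix and pattern are assumptions; adjust them to your installation,
    # and place this before the rule that rewrites everything to index.php)
    RewriteCond %{REQUEST_URI} ^/(resources/[^/]+\.[0-9]+)/([0-9]+|updates|reviews)/?$ [NC]
    RewriteRule .* - [E=CANONICAL:%1]
</IfModule>

<IfModule mod_headers.c>
    # Send the canonical as an HTTP Link header; the REDIRECT_ variant covers the
    # case where the variable is renamed after the rewrite to index.php
    Header set Link "<https://www.example.com/%{CANONICAL}e/>; rel=\"canonical\"" env=CANONICAL
    Header set Link "<https://www.example.com/%{REDIRECT_CANONICAL}e/>; rel=\"canonical\"" env=REDIRECT_CANONICAL
</IfModule>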
 
You are absolutely right; it is no longer a question of thousands of unindexed pages but of hundreds of thousands. The XenForo developers should focus on this problem, which is handicapping our SEO on Google. I think they should start taking an interest in it and provide a fix that takes Google's new restrictions into account, because for two years my Google ranking has dropped dramatically even though I publish many interesting tutorials on my forum every day. How can such a gap be left to drag on without any solution being provided?

(attached screenshot: seo.webp)
 
We all have the same problem with XenForo: the SEO is pitiful. I used phpBB and vBulletin 4 (with vBSEO) before, and the SEO was up to par; even phpBB 3 ranks better than XenForo. All the system updates are well and good, but the most important thing, SEO, never gets the focus. A discussion forum has to rank well to get visitors, and frankly I am very disappointed with the results.

I have tried everything to improve my ranking on Google and nothing works; I continue to stagnate despite all the investment I have made.
 
I checked that on your forum: it does affect posts 2-3-4-5, but that means Google won't index those posts, which is not suitable for everyone.
 
Here is my robots.txt, which I have modified:

Code:
User-agent: PetalBot
User-agent: AspiegelBot
User-agent: AhrefsBot
User-agent: SemrushBot
User-agent: SemRush
User-agent: DotBot
User-agent: MauiBot
User-agent: MJ12bot
User-agent: seekportbot
Disallow: /

User-agent: *
Disallow: /account/
Disallow: /admin.php
Allow: /

Sitemap: https://www.tutoriaux-excalibur.com/sitemap.php

Here is my .htaccess:

Code:
# Mod_security can interfere with uploading of content such as attachments. If you
# cannot attach files, remove the "#" from the lines below.
#<IfModule mod_security.c>
#    SecFilterEngine Off
#    SecFilterScanPOST Off
#</IfModule>

ErrorDocument 401 default
ErrorDocument 403 default
ErrorDocument 404 default
ErrorDocument 405 default
ErrorDocument 406 default
ErrorDocument 500 default
ErrorDocument 501 default
ErrorDocument 503 default

<IfModule mod_rewrite.c>
    RewriteEngine On

    # If you are having problems with the rewrite rules, remove the "#" from the
    # line that begins "RewriteBase" below. You will also have to change the path
    # of the rewrite to reflect the path to your XenForo installation.
    RewriteBase /

    # This line may be needed to workaround HTTP Basic auth issues when using PHP as a CGI.
    #RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]

    RewriteCond %{REQUEST_FILENAME} -f [OR]
    RewriteCond %{REQUEST_FILENAME} -l [OR]
    RewriteCond %{REQUEST_FILENAME} -d
    RewriteRule ^.*$ - [NC,L]
    RewriteRule ^(data/|js/|styles/|install/|favicon\.ico|crossdomain\.xml|robots\.txt) - [NC,L]
    RewriteRule ^.*$ index.php [NC,L]
</IfModule>

<IfModule mod_rewrite.c>
    RewriteEngine On
    RewriteBase /

    # Disallow indexing of specific pages
    RewriteRule ^whats-new/profile-posts/?$ - [L,NC]
    RewriteRule ^whats-new/news-feed/?$ - [L,NC]
    RewriteRule ^whats-new/posts/?$ - [L,NC]
    RewriteRule ^css.php$ - [L,NC]
    RewriteRule ^forums/-/index.rss$ - [L,NC]

    # Redirect the old URL to the new URL
    RewriteRule ^whats-new/latest-activity/?$ /whats-new/ [R=301,L,NC]
</IfModule>

<IfModule mod_headers.c>
    # Add the X-Robots-Tag header for specific URLs
    <FilesMatch "^(profile-posts|news-feed)$">
        Header set X-Robots-Tag "noindex, nofollow"
    </FilesMatch>
</IfModule>

# Deny and Allow bots by User-Agent
SetEnvIfNoCase User-Agent "bot|crawler|fetcher|headlesschrome|inspect|search|spider|seekportbot" bad_bot
SetEnvIfNoCase User-Agent "duckduckgo|googlebot|yahoo" good_bot
Deny from env=bad_bot
Allow from env=good_bot

RewriteEngine On
RewriteRule ^\.well-known/traffic-advice$ - [T=application/trafficadvice+json,END]
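A caveat on the block above: FilesMatch applies to file names, and a pass-through RewriteRule does not affect indexing, so URLs such as /whats-new/profile-posts (which XenForo routes through index.php rather than serving as files) will not actually receive the X-Robots-Tag header this way. As a rough sketch only, assuming mod_rewrite and mod_headers and a forum installed at the web root, one alternative is to flag those paths with an environment variable and attach the header to it; the rewrite part would need to sit before the rule that rewrites everything to index.php:

Code:
<IfModule mod_rewrite.c>
    RewriteEngine On
    # Flag the what's-new feeds by request path, since they are not real files
    RewriteCond %{REQUEST_URI} ^/whats-new/(profile-posts|news-feed|posts)/?$ [NC]
    RewriteRule .* - [E=NOINDEX_PAGE:1]
</IfModule>

<IfModule mod_headers.c>
    # Send noindex for the flagged URLs; the REDIRECT_ variant covers the case
    # where the environment variable is renamed after the rewrite to index.php
    Header set X-Robots-Tag "noindex, nofollow" env=NOINDEX_PAGE
    Header set X-Robots-Tag "noindex, nofollow" env=REDIRECT_NOINDEX_PAGE
</IfModule>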
 