XF 1.5 Submitting sitemaps does not work any longer

fredrikse

Active member
Hi, I need some help to figure out why the sitemap is not submitted to the search engines any longer. I switched to HTTPS earlier this week.

[attachment: errorSubmittingSiteMap.webp — screenshot of the sitemap submission errors]

Br

Fredrik
 
These are mostly just general network errors that unfortunately XF has no real control over. This may indicate a DNS issue on the server or a firewall blocking the outgoing requests. Your server admin may be able to help diagnose it.
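A quick way to rule DNS in or out from PHP itself (a minimal sketch; run it on the forum's server, not on a workstation):

```php
<?php
// If DNS resolution fails from PHP, gethostbyname() returns the
// unmodified hostname rather than an IP address.
$host = 'www.google.com';
$ip = gethostbyname($host);

if ($ip === $host) {
    echo "DNS lookup failed for $host\n";
} else {
    echo "Resolved $host to $ip\n";
}
```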
 
Is the web address supposed to start with ssl://? I've never seen that before.

Zend_Http_Client_Adapter_Exception: Error submitting sitemap to Google: Unable to Connect to ssl://www.google.com:443. Error #0: - library/Zend/Http/Client/Adapter/Socket.php:235
 
Yes, that's correct.

Unfortunately, your web host would really need to test via PHP itself, using sockets, for it to be a fair test. It's important to run the test as the same user PHP runs as -- and as a real PHP process -- because there can be per-process/per-user restrictions applied.
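A fair test along those lines might look like this (a minimal sketch using the same ssl://www.google.com:443 target as the error message; this is illustrative, not XenForo's own code):

```php
<?php
// Minimal sketch: open an SSL socket the same way Zend's Socket adapter does.
// Run this as a real PHP process under the same user the webserver uses.
// Returns true on success, or an "Error #N: message" string on failure.
function testSslSocket($host, $port, $timeout = 10)
{
    $errno  = 0;
    $errstr = '';
    $socket = @stream_socket_client("ssl://$host:$port", $errno, $errstr, $timeout);
    if ($socket === false) {
        return "Error #$errno: $errstr";
    }
    fclose($socket);
    return true;
}

$result = testSslSocket('www.google.com', 443);
echo $result === true ? "Connection succeeded\n" : "Connection failed. $result\n";
```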
 
Picking up this thread again. I've been working with my service provider to try and pinpoint the error.

They ran a script and for Google it returned a 200 OK message:
PHP:
<?php
// create a cURL handle
$ch = curl_init();

// set the URL to test
$url = "https://www.google.com";
echo "URL: " . $url . "<br>";
curl_setopt($ch, CURLOPT_URL, $url);

// return the transfer as a string instead of printing it directly
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);

// $output contains the response body, or false on failure
$output = curl_exec($ch);

// var_dump() prints directly and returns nothing, so call it on its own
echo "Output: <br>";
var_dump($output);
echo "CURLINFO_HTTP_CODE: " . curl_getinfo($ch, CURLINFO_HTTP_CODE);

// close the cURL handle to free up system resources
curl_close($ch);
?>

But I still get these errors:

[screenshot: sitemap submission errors in the error log]

Any ideas where to go from here?
 
Unfortunately, that isn't really the code path being triggered here. Based on the error, the request is made through PHP sockets rather than cURL. The code in question is the Zend Framework code in library/Zend/Http/Client/Adapter/Socket.php. There is more code before this to configure things like the stream context, but this is the line that fails to connect:

Code:
$this->socket = @stream_socket_client($host . ':' . $port,
                                      $errno,
                                      $errstr,
                                      (int) $this->config['timeout'],
                                      $flags,
                                      $context);

It may be down to things like the SSL certificates not being set up properly in PHP itself.
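One way to check that is to inspect PHP's own view of its CA certificate locations (a sketch using standard PHP functions; openssl_get_cert_locations() requires PHP 5.6+):

```php
<?php
// Show where PHP's OpenSSL extension looks for CA certificates.
// If none of these paths exist on disk, peer verification will fail
// and ssl:// stream connections can error out much like the log shows.
$locations = openssl_get_cert_locations();
print_r($locations);

// These ini overrides, if set, take precedence:
echo "openssl.cafile: " . ini_get('openssl.cafile') . "\n";
echo "openssl.capath: " . ini_get('openssl.capath') . "\n";
```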
 
Hi,

At the beginning of November I switched to SSL on my Xenforo website. Everything was working fine at a glance but then I started seeing more and more entries in the error log.

[screenshot: server error log entries]

I kept getting errors related to submitting the sitemap to various search engines. For the past month or so I've been working with my service provider to sort this out (https://xenforo.com/community/threads/submitting-sitemaps-does-not-work-any-longer.155699/). To date, it still has not been resolved.

Today I was going to submit the sitemap manually in Google Search Console. In the overview I saw a dramatic drop in clicks and impressions. Can somebody explain to me what this means? Is it related to the switch to SSL?

When I look in the sitemap file itself for the threads I see this:

[screenshot: thread sitemap entries with http:// URLs]

It says HTTP at the beginning of each URL when it should be HTTPS. My service provider offers a free Let's Encrypt certificate, and during its installation there is an option to redirect all HTTP traffic to HTTPS.

It also seems like Google can read the sitemaps generated by XenForo:

[screenshot: Google Search Console showing the sitemap was read]

Any guidance on this would be very much appreciated. It feels like I'm not getting the traffic I used to before I switched to SSL.
 
The service provider tried to access Google with wget from the web server hosting my forum. And it worked. Where do I go from here? :unsure:
I am sorry for having to say this, but:
Your service provider is incompetent.
An HTTPS request with wget from the command line is a completely different environment than using curl/socket operations from PHP within the webserver (mod_php) or PHP-FPM.
Jesus Christ, one month for investigating such an issue is insane; normally this should be resolved within minutes.
I'd expect this to be an issue with certificates, an outdated OpenSSL, or firewall settings.
To check whether it is a firewall issue, I'd test if a raw socket connection (without TLS) to www.google.com port 443 can be opened from a PHP script executed by the webserver.
If this succeeds, go on with checking the PHP OpenSSL config.
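That firewall check could be sketched as follows; tcp:// skips the TLS handshake entirely, so if this succeeds while ssl:// fails, the problem is more likely on the OpenSSL/certificate side than the firewall:

```php
<?php
// Raw TCP connection test: no TLS handshake, so a firewall block and a
// certificate problem can be told apart. Run this through the webserver.
$errno  = 0;
$errstr = '';
$socket = @stream_socket_client('tcp://www.google.com:443', $errno, $errstr, 10);

if ($socket === false) {
    // TCP itself fails: look at DNS or an outgoing firewall rule.
    echo "TCP connect failed. Error #$errno: $errstr\n";
} else {
    // TCP works: if ssl:// still fails, suspect the OpenSSL/CA setup.
    echo "TCP connect succeeded\n";
    fclose($socket);
}
```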
 
This is what my robots.txt looks like:
User-agent: *
Disallow: /account/
Disallow: /find-new/
Disallow: /help/
Disallow: /goto/
Disallow: /login/
Disallow: /lost-password/
Disallow: /misc/style/
Disallow: /online/
Disallow: /posts/
Disallow: /recent-activity/
Disallow: /register/
Disallow: /search/
Disallow: /admin.php
Disallow: /index.php?account/
Disallow: /index.php?find-new/
Disallow: /index.php?help/
Disallow: /index.php?goto/
Disallow: /index.php?login/
Disallow: /index.php?lost-password/
Disallow: /index.php?misc/style/
Disallow: /index.php?online/
Disallow: /index.php?posts/
Disallow: /index.php?recent-activity/
Disallow: /index.php?register/
Disallow: /index.php?search/
Disallow: /admin.php
Allow: /
Sitemap: https://www.mydomain.com/sitemap/sitemap.xml.gz
In another thread I saw that the Sitemap URL nowadays is /sitemap/sitemap.php. Should I change the robots.txt to that?
 
XenForo pings Google etc. whenever a new sitemap has been generated; this allows search engines to grab updates "immediately".

Otherwise they would have to re-check sitemaps at certain intervals (which they also do) to find updates, though that takes longer than with pings.
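The ping itself is just an HTTP GET with the sitemap URL as a query parameter. A sketch of building such a ping URL (the Google /ping endpoint shown here reflects the XF 1.5 era and has since been deprecated by Google; buildSitemapPingUrl is a hypothetical helper, not XenForo's own code):

```php
<?php
// Sketch of the sitemap ping mechanism: a plain GET request to the search
// engine's ping endpoint with the sitemap URL as a query parameter.
// Note: Google's /ping endpoint has since been deprecated; this mirrors
// what XF 1.5-era software did, not current practice.
function buildSitemapPingUrl($endpoint, $sitemapUrl)
{
    return $endpoint . '?sitemap=' . urlencode($sitemapUrl);
}

$pingUrl = buildSitemapPingUrl(
    'https://www.google.com/ping',
    'https://www.mydomain.com/sitemap/sitemap.xml.gz'
);
echo $pingUrl . "\n";
```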
 