XML Sitemap for XenForo 1.3 [Not needed, included in 1.4]

On the command line

Code:
$ php library/XfAddOns/Sitemap/Cli.php
Yes, I tried it, but something is wrong.

$ php /home/xxx/public_html/forum/library/XfAddOns/Sitemap/Cli.php
When I enter that, it gives this error:
Running from directory: /home/xxx/public_html/forum
An unexpected error occurred. Please try again later.
 
Could you open the Cli.php file, and change
Code:
XenForo_Application::setDebugMode(false);
to
Code:
XenForo_Application::setDebugMode(true);

And run again?
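
For reference, the top of a XenForo 1.x CLI script normally bootstraps the framework before doing anything else. The sketch below shows the usual pattern, not the add-on's exact Cli.php, so you can see where that setDebugMode() line lives and why the script needs to locate the forum's library directory:
Code:
<?php
// Minimal XenForo 1.x CLI bootstrap (illustrative sketch only).
$fileDir = getcwd();   // forum root; the real Cli.php may compute this differently
echo 'Running from directory: ' . $fileDir . "\n";

require($fileDir . '/library/XenForo/Autoloader.php');
XenForo_Autoloader::getInstance()->setupAutoloader($fileDir . '/library');

XenForo_Application::initialize($fileDir . '/library', $fileDir);
XenForo_Application::set('page_start_time', microtime(true));

// true = print the real exception instead of "An unexpected error occurred"
XenForo_Application::setDebugMode(true);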
 
I have this error coming up in my Google Webmaster Tools

Errors: 1
Invalid URL: This is not a valid URL. Please correct it and resubmit.
Sitemap: yoursite/sitemap/sitemap.urls.1.xml.gz
Parent tag: url
Tag: loc
 
It would seem that in the "Extra URLs" section you have an extra space or blank line at the end, since it generates an empty entry at the very end:

Code:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.triumphtalk.com/</loc>
    <lastmod>2014-03-15</lastmod>
  </url>
  <url>
    <loc>http://www.triumphtalk.com/gallery/</loc>
    <lastmod>2014-03-15</lastmod>
  </url>
  <url>
    <loc>http://www.triumphtalk.com/events/</loc>
    <lastmod>2014-03-15</lastmod>
  </url>
  <url>
    <loc>http://www.triumphtalk.com/portal/</loc>
    <lastmod>2014-03-15</lastmod>
  </url>
  <url>
    <loc></loc>
    <lastmod>2014-03-15</lastmod>
  </url>
</urlset>
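
If you want to guard against that on the generation side, the textarea value just needs blank lines stripped before the entries are written. A rough sketch of the idea; the option contents and variable names below are made up for illustration, this is not the add-on's actual code:
Code:
<?php
// Illustrative only: split an "Extra URLs" style textarea value into clean URLs,
// dropping blank lines so an empty <loc> entry is never written.
$extraUrlsOption = "http://www.triumphtalk.com/gallery/\nhttp://www.triumphtalk.com/events/\n\n";

$urls = array_filter(array_map('trim', preg_split('/\r?\n/', $extraUrlsOption)), 'strlen');

foreach ($urls as $url)
{
    echo '  <url><loc>' . htmlspecialchars($url) . '</loc></url>' . "\n";
}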
 
@Rigel Kentaurus I have removed all the Additional URLs, but I still seem to get the same error. I also deleted all the files in my sitemap folder and ran the cron again to regenerate them.

I actually completely removed this add-on and installed it again as well.
 
The URLs sitemap does not exist anymore:
http://www.triumphtalk.com/sitemap/sitemap.urls.1.xml.gz

I imagine that is because you removed the URLs, since I don't see the urls sitemap in the index anymore:
http://www.triumphtalk.com/sitemap/sitemap.xml.gz

Code:
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>http://www.triumphtalk.com/sitemap/sitemap.forums.1.xml.gz</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.triumphtalk.com/sitemap/sitemap.threads.1.xml.gz</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.triumphtalk.com/sitemap/sitemap.forums.pags.1.xml.gz</loc>
  </sitemap>
  <sitemap>
    <loc>http://www.triumphtalk.com/sitemap/sitemap.threads.pags.1.xml.gz</loc>
  </sitemap>
</sitemapindex>

Which means the error will most likely go away the next time Google parses the sitemap.
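
If you don't want to wait for Google, you can confirm that locally by decoding the index and printing the child sitemaps it references. A quick sketch, assuming PHP 5.4+ for gzdecode() and that allow_url_fopen is enabled:
Code:
<?php
// Sketch: decode the gzipped index and print the child sitemaps it references.
$ns  = 'http://www.sitemaps.org/schemas/sitemap/0.9';
$xml = gzdecode(file_get_contents('http://www.triumphtalk.com/sitemap/sitemap.xml.gz'));

foreach (simplexml_load_string($xml)->children($ns)->sitemap as $child)
{
    echo (string) $child->loc . "\n";
}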
 
Yes, I deleted it all and then ran the cron again so only the relevant files would be in the folder. So I just need to hang in there and see what it does now, thanks.
 
My options are set to a maximum of 50,000 entries per file, but after building the sitemap I got sitemap.threads.pags.1.xml with 100,000+ page links.

... using the last supported version for XF 1.3.
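
For reference, a per-file cap is normally enforced by rolling over to a new numbered file once a counter hits the limit, roughly like the generic illustration below (made-up names and numbers, not the add-on's code); if one file ends up with 100,000+ links, that rollover is evidently not happening for the thread-pagination sitemap:
Code:
<?php
// Generic illustration: entries roll over to a new numbered file once the
// per-file counter reaches the configured maximum.
$maxPerFile   = 50000;    // hypothetical option value
$totalEntries = 120000;   // e.g. thread-pagination URLs to write

$fileIndex   = 1;
$countInFile = 0;
for ($i = 1; $i <= $totalEntries; $i++)
{
    if ($countInFile >= $maxPerFile)
    {
        // close sitemap.threads.pags.<n>.xml.gz and start the next file
        $fileIndex++;
        $countInFile = 0;
    }
    $countInFile++;   // ...write one <url> entry to the current file...
}
echo "$fileIndex files, last file holds $countInFile entries\n";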
 
Hello,

I upgraded the add-on to the latest version and found that GWT only shows the following two sub-sitemaps:

/sitemap/sitemap.forums.1.xml.gz

/sitemap/sitemap.pages.1.xml.gz

All the other sub-sitemaps aren't picked up by Google. I've tried submitting and re-submitting the sitemap, but it didn't help.

Test results show:
Sitemap: /sitemap/sitemap.xml.gz
Type: Sitemap index
Number of children in this Sitemap index: 2
Error details: No errors found.
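
Before blaming GWT it may be worth checking that every child listed in the index can actually be downloaded and gunzipped, since Google quietly ignores children it cannot read. A quick sketch; the domain is a placeholder, and it assumes PHP 5.4+ and allow_url_fopen:
Code:
<?php
// Sketch: fetch every child sitemap named in the index and report whether it
// can be downloaded and gunzipped. Replace the domain with your own.
$indexUrl = 'http://www.example.com/sitemap/sitemap.xml.gz';
$ns       = 'http://www.sitemaps.org/schemas/sitemap/0.9';

$index = simplexml_load_string(gzdecode(file_get_contents($indexUrl)));
foreach ($index->children($ns)->sitemap as $child)
{
    $url = (string) $child->loc;
    $raw = @file_get_contents($url);
    $ok  = ($raw !== false && @gzdecode($raw) !== false);
    echo ($ok ? 'OK      ' : 'BROKEN  ') . $url . "\n";
}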
 
Did the last update solve this error?

Zend_Exception: No entry is registered for key 'session' - library/XenForo/Application.php:959

Stack Trace
#0 /usr/local/lsws/DEFAULT/html/library/XenForo/Application.php(1497): XenForo_Application::get('session')
#1 /usr/local/lsws/DEFAULT/html/library/Waindigo/MaxGuestViews/Extend/XenForo/Model/Thread.php(28): XenForo_Application::getSession()
#2 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Thread.php(826): Waindigo_MaxGuestViews_Extend_XenForo_Model_Thread->canViewThread(Array, Array, '', Array, Array)
#3 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/Sitemap/ThreadPagination.php(136): XenForo_Model_Thread->canViewThreadAndContainer(Array, Array, '', Array, Array)
#4 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/Sitemap/ThreadPagination.php(75): XfAddOns_Sitemap_Sitemap_ThreadPagination->canView(Array)
#5 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/Sitemap/ThreadPagination.php(42): XfAddOns_Sitemap_Sitemap_ThreadPagination->generateStep('10000')
#6 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/Model/Sitemap.php(137): XfAddOns_Sitemap_Sitemap_ThreadPagination->generate()
#7 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/CronEntry/RebuildSitemap.php(40): XfAddOns_Sitemap_Model_Sitemap->runAllAvailableSiteMaps()
#8 [internal function]: XfAddOns_Sitemap_CronEntry_RebuildSitemap::run(Array)
#9 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Cron.php(356): call_user_func(Array, Array)
#10 /usr/local/lsws/DEFAULT/html/library/XenForo/Deferred/Cron.php(24): XenForo_Model_Cron->runEntry(Array)
#11 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Deferred.php(256): XenForo_Deferred_Cron->execute(Array, Array, 7.9999961853027, '')
#12 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Deferred.php(390): XenForo_Model_Deferred->runDeferred(Array, 7.9999961853027, '', false)
#13 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Deferred.php(335): XenForo_Model_Deferred->_runInternal(Array, NULL, '', false)
#14 /usr/local/lsws/DEFAULT/html/deferred.php(23): XenForo_Model_Deferred->run(false)
#15 {main}
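
That trace is the Waindigo Max Guest Views add-on calling getSession() inside canViewThread() while the sitemap is being rebuilt by the cron/deferred runner, where no visitor session has been registered. Until one of the two add-ons handles that, a common workaround in XenForo 1.x command-line scripts is to register a guest session before the rebuild starts. A sketch of the usual pattern, not a patch to either add-on; check the XenForo_Session method signatures against your version:
Code:
<?php
// Sketch: make sure a guest session is registered before anything that calls
// XenForo_Application::getSession() runs outside a normal web request.
// Run this after the standard XenForo CLI bootstrap.
if (!XenForo_Application::isRegistered('session'))
{
    $session = new XenForo_Session();
    $session->start();
    XenForo_Application::set('session', $session);
}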
 
I have this running on two XF forums. On the one where XF is installed in the root directory this works great. On the one where XF is installed in a sub-folder, the add-on does not work: the robots.txt served by the add-on shows the sitemap path as www.mydomain.com/sitemap/ instead of www.mydomain.com/community/sitemap, so of course the robots.php file has the wrong path.
Yeah, this is an already reported unfixed issue.
You could just add your own robots.txt and not use the robots.php feature from the add-on until it is fixed.
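
For a sub-folder install that just means maintaining the file yourself and pointing the Sitemap line at the real location, something like this at the domain root (adjust the domain and folder names to your own setup):
Code:
User-agent: *
Disallow:

Sitemap: http://www.mydomain.com/community/sitemap/sitemap.xml.gz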
 
XF 1.3. Error with v1.4.1 (current) ....
Code:
Exception: The path /root/sitemap is not writable. Maybe you need to chmod 777 - library/XfAddOns/Sitemap/CronEntry/RebuildSitemap.php:28
Generated By: Unknown Account, Today at 06:01
Stack Trace
#0 [internal function]: XfAddOns_Sitemap_CronEntry_RebuildSitemap::run(Array)
#1 /var/www/netrider/library/XenForo/Model/Cron.php(356): call_user_func(Array, Array)
#2 /var/www/netrider/library/XenForo/Deferred/Cron.php(24): XenForo_Model_Cron->runEntry(Array)
#3 /var/www/netrider/library/XenForo/Model/Deferred.php(256): XenForo_Deferred_Cron->execute(Array, Array, 7.9999978542328, '')
#4 /var/www/netrider/library/XenForo/Model/Deferred.php(390): XenForo_Model_Deferred->runDeferred(Array, 7.9999978542328, '', false)
#5 /var/www/netrider/library/XenForo/Model/Deferred.php(335): XenForo_Model_Deferred->_runInternal(Array, NULL, '', false)
#6 /var/www/netrider/deferred.php(23): XenForo_Model_Deferred->run(false)
#7 {main}
Request State
array(3) {
  ["url"] => string(7) "http://"
  ["_GET"] => array(0) {
  }
  ["_POST"] => array(0) {
  }
}

It appears to be incorrectly targeting /root/sitemap/ rather than my web path of /var/www/netrider/sitemap/
 
Are you running through the command line, or through the cron system?
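
If it turns out to be the command line or a root cron job, one thing worth ruling out (an assumption on my part, not a confirmed cause) is the working directory: /root is exactly what a root shell or cron reports when it is not pointed at the forum first, and the CLI prints the directory it runs from. Something like this keeps the sitemap path under the web root:
Code:
# run the rebuild from the forum root instead of /root
$ cd /var/www/netrider && php library/XfAddOns/Sitemap/Cli.php

# or, as a root crontab entry (illustrative schedule)
0 3 * * * cd /var/www/netrider && php library/XfAddOns/Sitemap/Cli.php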
 