XML Sitemap for XenForo 1.3 [Not needed, included in 1.4]

Google indexing is sucking this up great, thanks.

But robots.txt isn't changing when I add info, save, and even run the cron. I've added the rewrite, but...?
 
Copy it next time and paste it here :) I'll fix it if it's something to fix

These are the errors:

Code:
Zend_Exception: No entry is registered for key 'session' - library/XenForo/Application.php:959
Created by: unknown member, today at 07:00
Stack Trace
#0 /usr/local/lsws/DEFAULT/html/library/XenForo/Application.php(1497): XenForo_Application::get('session')
#1 /usr/local/lsws/DEFAULT/html/library/Waindigo/MaxGuestViews/Extend/XenForo/Model/Thread.php(28): XenForo_Application::getSession()
#2 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Thread.php(826): Waindigo_MaxGuestViews_Extend_XenForo_Model_Thread->canViewThread(Array, Array, '', Array, Array)
#3 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/Sitemap/ThreadPagination.php(136): XenForo_Model_Thread->canViewThreadAndContainer(Array, Array, '', Array, Array)
#4 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/Sitemap/ThreadPagination.php(75): XfAddOns_Sitemap_Sitemap_ThreadPagination->canView(Array)
#5 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/Sitemap/ThreadPagination.php(42): XfAddOns_Sitemap_Sitemap_ThreadPagination->generateStep('10000')
#6 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/Model/Sitemap.php(137): XfAddOns_Sitemap_Sitemap_ThreadPagination->generate()
#7 /usr/local/lsws/DEFAULT/html/library/XfAddOns/Sitemap/CronEntry/RebuildSitemap.php(40): XfAddOns_Sitemap_Model_Sitemap->runAllAvailableSiteMaps()
#8 [internal function]: XfAddOns_Sitemap_CronEntry_RebuildSitemap::run(Array)
#9 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Cron.php(356): call_user_func(Array, Array)
#10 /usr/local/lsws/DEFAULT/html/library/XenForo/Deferred/Cron.php(24): XenForo_Model_Cron->runEntry(Array)
#11 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Deferred.php(256): XenForo_Deferred_Cron->execute(Array, Array, 7.9999959468842, '')
#12 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Deferred.php(390): XenForo_Model_Deferred->runDeferred(Array, 7.9999959468842, '', false)
#13 /usr/local/lsws/DEFAULT/html/library/XenForo/Model/Deferred.php(335): XenForo_Model_Deferred->_runInternal(Array, NULL, '', false)
#14 /usr/local/lsws/DEFAULT/html/deferred.php(23): XenForo_Model_Deferred->run(false)
#15 {main}
Request State
array(3) {
["url"] => string(37) "http://forum.ccccc.com/deferred.php"
["_GET"] => array(0) {
}
["_POST"] => array(3) {
["_xfRequestUri"] => string(12) "/online/408/"
["_xfNoRedirect"] => string(1) "1"
["_xfResponseType"] => string(4) "json"
}
}
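
For context, my reading of the trace (not confirmed by either add-on author): the Waindigo Max Guest Views override calls XenForo_Application::getSession() while the sitemap cron is running, and cron/deferred runs have no visitor session registered, hence the Zend_Exception. A hypothetical sketch of the kind of guard that would avoid it, using XenForo_Application::isRegistered() (a real XF1 method; the surrounding override body is assumed, not taken from the actual add-on):

PHP:
```php
// Hypothetical guard inside a canViewThread() override such as the one in
// Waindigo/MaxGuestViews/Extend/XenForo/Model/Thread.php. Cron and deferred
// runs have no visitor session, so check before fetching it.
public function canViewThread(array $thread, array $forum, &$errorPhraseKey = '',
    array $nodePermissions = null, array $viewingUser = null)
{
    if (!XenForo_Application::isRegistered('session'))
    {
        // No session (e.g. cron/CLI context): fall back to the stock check.
        return parent::canViewThread($thread, $forum, $errorPhraseKey,
            $nodePermissions, $viewingUser);
    }

    $session = XenForo_Application::getSession();
    // ... the add-on's guest-view counting logic would go here ...
    return parent::canViewThread($thread, $forum, $errorPhraseKey,
        $nodePermissions, $viewingUser);
}
```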
 
I just tried to submit our sitemap to Google for the first time and it rejected it because of eleven 404 errors out of 1,887,104 pages.
I had a look through the problem files and I think they relate to forums which I have removed from public view. After moving all the threads out, I made them private and removed them from the node list.
E.g. /sitemap/sitemap.forums.pags.4.xml.gz with a processed date of March 6th references
Code:
  <url>
    <loc>http://www.avforums.com/forums/forza-xbox-one.553/page-2</loc>
    <lastmod>2014-03-07</lastmod>
  </url>
and the Xbox forza forum was retired a couple of days ago.
Now I'm damn sure I deleted all the sitemap files before I ran the sitemap generation today. So why would the processed date be two days ago, a time before I removed those forums from public view?
Another error was with /sitemap/sitemap.threads.113.xml.gz which has 10,000 links in it. The processed date on this is March 6th also.
Why would these URLs be included in the latest site map?
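
One quick way to confirm whether the regenerated files really still contain the retired forum's URLs (file and forum names taken from the post above) is to search inside the gzipped XML with zgrep, without unpacking it. Against the real files that would be `zgrep -c 'forza-xbox-one' sitemap/sitemap.forums.pags.4.xml.gz`; the sketch below uses a throwaway file so the commands are runnable anywhere:

Code:
```shell
# Build a throwaway gzipped sitemap fragment containing one of the 404 URLs.
printf '<url><loc>http://www.avforums.com/forums/forza-xbox-one.553/page-2</loc></url>\n' \
  | gzip > /tmp/sitemap.demo.xml.gz

# Count lines matching the retired forum's slug inside the gzipped file.
zgrep -c 'forza-xbox-one' /tmp/sitemap.demo.xml.gz   # prints 1
```

If the count against the freshly generated files is zero, the stale URLs are coming from old files Google already fetched, not from the new generation run.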
 

=| I saw this before, but I thought it was something normal, maybe the server time or the add-on code doing this =\ !!
That's interesting ^_^
 
It says in the admin (admin.php?options/list/xfa_sitemap_robots):
"You can have XenForo generate the robots.txt file for you. Please note for this to work, you need to do a mod rewrite that goes:"
robots.txt => index.php?xfa-robots/index

If I put this in my .htaccess, the website throws an error.
 
I don't really give support for the robots.txt option, since there are many possible server configurations, and it would require 1:1 support just to know what people are using.

That is an optional feature though, the sitemap works perfectly without it :) It should be generated under sitemap/
 
Sitemaps works great. thanks!

But I like the robots option. Where does this rewrite go? Which directory, or the root .htaccess?
 
If you already have the rewrite (it should be posted somewhere earlier in the discussion; I know I posted one for nginx), then it goes in the root .htaccess.
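
For Apache specifically, a minimal sketch of that rewrite for the root .htaccess (mod_rewrite must be enabled; the target route is the one quoted from the add-on's options page, so verify it against your own setup before relying on it):

Code:
```apache
RewriteEngine On
# Serve robots.txt through the add-on's route instead of a static file
RewriteRule ^robots\.txt$ index.php?xfa-robots/index [L]
```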
 
Rigel,

I haven't looked into the code in detail at all, but some sitemap generation code I have written isn't executing when run from the CLI script, while it works fine when executed via the Admin CP cron entry.

When the CLI script is run, are all of the necessary XF dependencies loaded that would allow events like load_class_model to fire?
 
It should.

PHP:
// disable limits
@set_time_limit(0);
ini_set('memory_limit', '256M');

chdir(dirname(__FILE__) . '/../../..');

if (!is_file('./library/config.php'))
{
    print 'We do not appear to be running from the correct directory; needs to be XF root and is: ' . getcwd() . "\r\n";
    exit;
}
else
{
    print 'Running from directory: ' . getcwd() . "\r\n";
}

$startTime = microtime(true);
require('./library/XenForo/Autoloader.php');
XenForo_Autoloader::getInstance()->setupAutoloader('./library');

XenForo_Application::initialize('./library');
XenForo_Application::set('page_start_time', $startTime);
// XenForo_Application::setDebugMode(true);

$db = XenForo_Application::getDb();
$db->setProfiler(false);

// and run the sitemap
XfAddOns_Sitemap_CronEntry_RebuildSitemap::run();


Basically it sets up the autoloader, disables the profiler, and runs the cron...
I'll test locally myself; the only thing it does not do is preload Dependencies_Public (and some of the preloaded tables come from there). Will let you know.
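
If anyone wants to schedule that CLI script, a hypothetical crontab entry could look like the one below. The script path is an assumption on my part (the chdir in the script implies it sits three directories below the XF root); point it at wherever the add-on's CLI file actually lives on your server.

Code:
```
# m h dom mon dow   command -- run the sitemap rebuild nightly at 03:00
0 3 * * * /usr/bin/php /path/to/xenforo/library/XfAddOns/Sitemap/cli/sitemap.php
```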


Edit: Fixed in the latest update
 