XML Sitemap for XenForo 1.3 [Not needed, included in 1.4]

I "think" I have the rewrite converted to nginx format, but would someone a little more familiar with it double check my rewrite rule
Code:
rewrite /(robots.txt)$ /robots.php last;

EDIT:
And it DOES work... I just went to http://twowheeldemon.com/robots.txt and it kicked up the options selected and the additional info I put in.
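As an aside, a slightly stricter form of the same rule anchors the match and escapes the dot, so only a literal /robots.txt gets rewritten (this assumes robots.php sits in the web root, as above):
Code:
rewrite ^/robots\.txt$ /robots.php last;
The unanchored version above works too, since nginx only needs the regex to match somewhere in the URI; the anchored form just can't accidentally match anything else.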

Terrific! Thanks.
Where within the site config file do I put that? Inside a server{} or one level below inside a location{}?
 
Quick sample of the applicable part of my site nginx vhost config.
Code:
server {
    listen 80;
    server_name twowheeldemon.com;

    error_log /var/log/nginx/twd-error.log warn;
    access_log /var/log/nginx/twd-access.log;
    root /var/www/twowheel;

    location /nginx_status {
        stub_status on;
        access_log off;
        allow 127.0.0.1;
        allow 192.73.236.43;
        allow 24.49.69.204;
        deny all;
    }

    location = /favicon.ico { alias /var/www/twowheel/favicon.ico; }

    #  Main Site Related.
    rewrite /(robots.txt)$ /robots.php last;

    location / {
        try_files $uri $uri/ /index.php?$uri&$args;
        index index.html index.htm index.php;

        gzip
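To answer the placement question above: the rewrite sits directly inside the server{} block, not inside a location{}. Reduced to just the relevant lines (same names and paths as the sample, with a robots.php assumed in the web root):
Code:
server {
    server_name twowheeldemon.com;
    root /var/www/twowheel;

    # forward robots.txt to the sitemap add-on's robots.php
    rewrite /(robots.txt)$ /robots.php last;
}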
 
I added an option to let the sitemap add-on manage your robots.txt file. To accomplish this you will need to set up a mod_rewrite rule; instructions are provided in the zip file. You need to forward requests for robots.txt to the provided robots.php file.
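For Apache that usually means an .htaccess rule along these lines; this is only a sketch assuming robots.php sits in the forum root, and the exact rule shipped in the zip may differ:
Code:
RewriteEngine On
# hand requests for robots.txt to the add-on's robots.php
RewriteRule ^robots\.txt$ robots.php [L]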

OK, I found a considerable bug here and have had to disable this feature.

With /robots.php active, the robots.txt being returned contains:
Code:
Sitemap: http://www.netrider.net.au/sitemap/blog.www.xml.gz
but no such blog.www.xml.gz file exists, so the spiders immediately get an error and fail.
 
It's actually a bug with the blogs add-on (the newest version generates a sitemap per domain, and it thinks www is a user).

The following patch will fix it.
Replace library/XfAddOns/Blogs/ControllerPublicOverride/Robots.php with the attached file.
 



And now it produces
Code:
Sitemap: http://*.netrider.net.au/sitemap/sitemap.xml.gz
 
Update 1.2.6

If you like this add-on, please post a review. It takes 1 minute and I enjoy the feedback! :)

New features
  • Support for Resource Manager, with options to generate sitemap for Resources and Updates

    In the event that the Resource Manager is not installed on your forum, that bit would just be silently ignored
 
Most likely it wouldn't be useful for Showcase since it is not really a collection of pages but rather a dashboard with different sections.

However, there is an option for the sitemap called "extra urls" in the AdminCP, so you can add the Showcase URL there.
 
I am trying to use the feature of this addon though ;-)

My robots.txt rewrites to robots.php, and I have tested that the rewrite itself works; it just serves a blank response when using the robots.php from the add-on.
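One way to narrow down where the blank response comes from (a guess at the failure mode, not a confirmed cause): request both URLs directly and compare, with example.com standing in for the real domain.
Code:
# if this is also blank, PHP/the add-on itself is returning nothing
curl -i http://example.com/robots.php
# if only this is blank, the rewrite/location handling is dropping the output
curl -i http://example.com/robots.txt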
 