Size of Xenforo site

ibaker

Well-known member
Apart from server backups etc., once a month I also download my complete site to my local PC, and this has become increasingly difficult due to the site's size. It isn't a "big board" as such, but it now contains 100,000 files and the database has 3 million records.

On my last download I looked into this a bit and found footprints from old addons I had used. I had been bitten by developers not maintaining their addons, or by just plain bad coding, and had later uninstalled them. Waindigo addons were the biggest culprit: I found many leftover database records and even leftover image files in the data/attachments folder, all taking up space and perhaps costing a millisecond or two of performance... and every millisecond adds up.

Old addons are not the only issue. Migration from other platforms is another, as the potential exists to carry problems into the new platform. After 12 years of going from one platform to another in the early days of my site to meet growth, before finally settling on XF, I wonder how many superfluous files and database records I now have.

I would love to start my site completely from scratch again, knowing what I know now, but I don't think that is an option. I mean create a new standard XF site and migrate only what is needed, like users, threads and posts. But how could you check that it really is only what you need? And what about existing users: what would they have to go through, like having to re-register or re-login? Are cookies impacted? Etc.

Interested to hear thoughts from others on this.
 
I don't think starting clean would be all that much of an issue, but to be honest, it sounds more like a case for some simple DB cleaning. You still have 400k posts, and that can be taxing on a DB. Starting fresh has more cons than pros, especially as you would break every existing posted link, because everything would get new IDs.

As for things in your DB that are no longer used, I don't see how they would affect your site's functioning when nothing calls them. They're clutter, sure, agreed... but you can carefully clean that clutter by removing old tables and columns that are no longer used.

If you use WHM, you can use its backup feature to take a daily backup and link it to an S3 account, so it automatically sends a copy of your DB and files there each day or week (which removes the download issue). Current WHM deletes from S3 as it deletes old backups from your server, per your settings. If you don't use WHM, maybe set up a backup script to send it to S3 instead of downloading it.
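For the non-WHM case, here is a minimal sketch of such a script. The database name, paths, and bucket are all placeholders, and it assumes mysqldump and the AWS CLI are installed and configured; the tool checks just let it degrade gracefully where they are absent.

```shell
#!/bin/sh
# Nightly backup sketch: dump the DB, archive the site files, upload both
# to S3, and rotate old local copies. All names below are placeholders.

DB_NAME="xenforo"
SITE_DIR="${SITE_DIR:-/home/user/public_html}"
BACKUP_DIR="${BACKUP_DIR:-/tmp/forum-backups}"
STAMP=$(date +%Y-%m-%d)

mkdir -p "$BACKUP_DIR"

# 1. Dump and compress the database (skipped if mysqldump is unavailable)
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump --single-transaction "$DB_NAME" | gzip > "$BACKUP_DIR/db-$STAMP.sql.gz"
fi

# 2. Archive the site files into one compressed tarball
if [ -d "$SITE_DIR" ]; then
    tar -czf "$BACKUP_DIR/files-$STAMP.tar.gz" \
        -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"
fi

# 3. Ship today's files to S3 (the bucket name is an assumption)
if command -v aws >/dev/null 2>&1; then
    aws s3 cp "$BACKUP_DIR/" "s3://my-forum-backups/" --recursive \
        --exclude '*' --include "*$STAMP*"
fi

# 4. Rotate: remove local backups older than 7 days
find "$BACKUP_DIR" -name '*.gz' -mtime +7 -delete
echo "Backup run complete for $STAMP in $BACKUP_DIR"
```

Dropped into cron, that replicates the WHM behaviour of pushing to S3 and expiring old copies, without any panel dependency.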

Do you compress your site before downloading? That way you only download a single file instead of thousands of files.

There are always options compared to wiping all IDs and starting anew.
 
Just delete all non-standard files that you don't need, and examine the DB for non-standard tables and fields. Drop those that you don't use anymore.
If you don't feel comfortable with database work, then hire an experienced database specialist with good references. Don't hire a cheap one.
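To illustrate the examination step: leftover add-on tables are usually easy to spot because they carry the author's own prefix rather than the core xf_ naming. A hedged sketch (the table name is purely illustrative, and nothing should be dropped without a verified backup):

```sql
-- Look for tables carrying an old add-on vendor's prefix,
-- e.g. leftovers from Waindigo add-ons.
SHOW TABLES LIKE '%waindigo%';

-- Inspect a suspect table before touching it
-- (the table name here is illustrative, not a real Waindigo table)
SELECT COUNT(*) FROM waindigo_example_table;

-- Only after a verified backup, drop what is confirmed unused
DROP TABLE waindigo_example_table;
```

The same approach works for orphaned columns: compare a table's structure against a clean XF install of the same version before altering anything.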
 
Thanks Guys,
Anthony, I use a three-stage backup. First, the site is backed up three times a week on a rolling basis and stored on a separate SATA hard disk in my server; second, a weekly backup is automatically taken and stored off site; and third, I do a monthly manual backup, zipping the site and downloading it to my PC along with a DB export. The zip file is over 4 GB and the DB export is now getting close to 1 GB.

The manual backup on my PC is then checked by installing it in XAMPP, and is also used for any analysis, development, etc.

It was just that in my last look inside the site I found all these footprints, and I realised I have a massive job ahead of me if I were to check every single part of the site to clean it up, so I thought there must be a quicker way of doing it. The site now has 100,000 files and 3 million database rows.
 
Anthony, no. Unlike vB, the media gallery does not have the option of storing files in the DB, and even if it did, I would assume storing in the DB would take a performance hit compared to serving from disk. My server uses two SSDs in RAID as the main drive.
 
I would bet what's slowing you down the most is images. You have 700 MB+ just in the media addon.
Yeah, stupid me. I just checked the media gallery settings and it seems the max file size was set at 10 MB... doh. I wonder if that is the addon default and I missed it. It only gives you the option of whole MB. I will do some testing and see if it allows 0.512, and if it does, I wonder if I could do a mass file shrink.
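For the mass-shrink idea, one way to sketch it is find plus ImageMagick. Everything here is an assumption (the gallery path, the 512 KB threshold, the 1600px cap), and it should be tried on a copy of the files first:

```shell
#!/bin/sh
# Find gallery images larger than 512 KB and list them; if ImageMagick
# is installed, shrink the JPEGs in place. GALLERY_DIR is a placeholder.
# Run this on a copy first!

GALLERY_DIR="${GALLERY_DIR:-/home/user/public_html/data/xengallery}"

if [ -d "$GALLERY_DIR" ]; then
    # List every JPEG/PNG over 512 KB, largest first
    find "$GALLERY_DIR" -type f \( -name '*.jpg' -o -name '*.png' \) \
        -size +512k -exec ls -lS {} +

    # Cap the longest edge at 1600px and recompress at quality 85.
    # The '>' geometry flag means "only shrink, never enlarge".
    if command -v mogrify >/dev/null 2>&1; then
        find "$GALLERY_DIR" -type f -name '*.jpg' -size +512k \
            -exec mogrify -resize '1600x1600>' -quality 85 {} +
    fi
else
    echo "Gallery directory not found: $GALLERY_DIR"
fi
```

Note that shrinking files on disk will not update any file-size values the gallery has stored in the database, so a rebuild of the gallery cache afterwards would likely be needed.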
 
Does your host offer automatic backups? If they offer automatic/complete backups, then you could just download a backup of your database, since in all likelihood you'll never need a backup you downloaded and just want it for a worst-case scenario. There are also third-party backup services like www.codeguard.com (not an endorsement, just an example).
 
You can learn fairly safely though: install a local server on your system (XAMPP or MAMP), unzip the site files and a DB backup into the directory, and start learning how to clean it without errors... the site files are really just there to check you haven't screwed up, i.e. that the site keeps displaying and working correctly. Once you're happy and know what you want to remove, tackle the real one.

If you use all those backups... why bother downloading a version at all each month?
 
If you use all those backups... why bother downloading a version at all each month?
Thanks Anthony. The manual backup is for two reasons. First, if something is automated, something can go wrong; besides, how many site owners think their backup is OK until the time comes when they need it, only to find some error happened and the backup was corrupt because it was never manually verified? The other reason is so I have a recent copy to do development on in a local development environment. That's just me, though. For example, at the moment I am enhancing the side menu system I developed for my site, so having the latest copy helps, and I know the backup is then completely verified for a worst-case scenario.
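On the verification point: short of a full restore into XAMPP, a couple of quick integrity checks can catch a truncated download early. A sketch, with placeholder file names:

```shell
#!/bin/sh
# Quick integrity checks for a downloaded backup before trusting it.
# File names are placeholders; a full restore is still the real test,
# these just catch truncated or corrupted downloads early.

ZIP_FILE="${ZIP_FILE:-site-backup.zip}"
DUMP_FILE="${DUMP_FILE:-db-export.sql}"

# 1. Verify the zip archive's internal checksums
if [ -f "$ZIP_FILE" ]; then
    unzip -tq "$ZIP_FILE" && echo "zip OK"
else
    echo "zip not found: $ZIP_FILE"
fi

# 2. A complete mysqldump export ends with a "Dump completed" comment
#    line by default; a truncated transfer will not.
if [ -f "$DUMP_FILE" ]; then
    if tail -n 1 "$DUMP_FILE" | grep -q 'Dump completed'; then
        echo "SQL dump looks complete"
    else
        echo "WARNING: dump may be truncated"
    fi
else
    echo "dump not found: $DUMP_FILE"
fi
```

Neither check proves the data is restorable, but both are cheap enough to run against every download.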
 
The zip file is over 4 GB
Easy: don't use the zip format. Use a better compression method that produces smaller files.

Here are my benchmarks from compressing a 1.5+ GB tar file with the compression tools below, on a VPS with 4 CPU threads:

[chart: gzip compression-level benchmark results]
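Results depend heavily on the data, so it is worth benchmarking on your own files. A tiny self-contained sketch comparing gzip's fastest and smallest settings (xz or zstd would slot in the same way, if installed, and usually compress tighter at different speeds):

```shell
#!/bin/sh
# Compare gzip's fastest (-1) and best (-9) compression levels on a
# sample file. gzip is used here because it is available everywhere;
# swap in xz/zstd the same way to benchmark those.

SAMPLE=/tmp/sample.dat

# Build a compressible ~1 MB sample file
yes "the quick brown fox jumps over the lazy dog" | head -c 1048576 > "$SAMPLE"

gzip -1 -c "$SAMPLE" > /tmp/sample.fast.gz   # fastest
gzip -9 -c "$SAMPLE" > /tmp/sample.best.gz   # smallest

# Compare the resulting sizes
ls -l "$SAMPLE" /tmp/sample.fast.gz /tmp/sample.best.gz
```

Wrapping each command in `time` shows the speed side of the trade-off, which matters just as much as size on a busy server.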
 
You lost me now, Eva. I just use the compress function in cPanel File Manager, which brings the site down to 4 GB excluding the DB.
 
backups outside of cpanel system which are usually faster and less resource intensive
Suggestions? I don't know what I'm doing... I'm more dangerous with the command line, but I normally bungle my way through stuff with instructions. I find the WHM backup good, but way too resource-intensive with my main site, which is about 35 GB in size. All the other sites I host on the server are just a blip of nothing to back up... but the main one, ouch!
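One low-tech option, assuming shell access: run the archive step at reduced CPU and disk priority so it stops competing with the live site. A sketch with placeholder paths:

```shell
#!/bin/sh
# Archive the site at low priority so a large backup doesn't starve
# the live server. SITE_DIR and OUT are placeholders.

SITE_DIR="${SITE_DIR:-/home/user/public_html}"
OUT="${OUT:-/tmp/site-backup.tar.gz}"

if [ -d "$SITE_DIR" ]; then
    if command -v ionice >/dev/null 2>&1; then
        # nice -n 19 lowers CPU priority; ionice -c3 ("idle" class)
        # yields disk I/O to other processes whenever they need it.
        nice -n 19 ionice -c3 tar -czf "$OUT" \
            -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"
    else
        # Fall back to CPU niceness alone where ionice is unavailable
        nice -n 19 tar -czf "$OUT" \
            -C "$(dirname "$SITE_DIR")" "$(basename "$SITE_DIR")"
    fi
    ls -l "$OUT"
else
    echo "Site directory not found: $SITE_DIR"
fi
```

The backup takes longer this way, but the forum stays responsive while it runs, which is usually the better trade on a 35 GB site.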
 