Any way to regularly back up the XenForo database?

Eric Russell

Active member
I bought backup smart. The backup completed, but all of my folders were empty. Still waiting to hear back from them. Anyone else have this problem?
 

petertdavis

Well-known member
I was just looking at backup-smart.com based on recommendations in this thread, but the site itself looks rather sketchy (as if an "Internet Marketing Guru" wrote it), and it seems to be associated with Clickbank, which to me is like saying "snake oil", so I'm kind of thinking you're never going to get a reply, Eric.
 

Eric Russell

Active member
Well that is not good news! (n) I'm gonna need to come up with something. My site is taking off a little quicker than expected.
 

MattW

Well-known member
I do basically the same thing as Calamity James -- take the site offline at roughly 4AM (nginx 503 redirect), mysqldump the databases, gzip the sql files, run some db maintenance, then bring the site back online. Once that's finished, a separate script rsyncs to a remote backup server.
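A rough sketch of that kind of maintenance-window run is below; the flag file, paths, credentials, and hostname are placeholders, and the flag-file switch is an assumption about how the 503 redirect is wired up, so treat it as an outline rather than the exact script described above:
Code:
#!/bin/sh
# Maintenance-window backup sketch -- placeholder paths and credentials.
# Assumes nginx is configured to return a 503 while the flag file exists.

db_user="username"
db_pass="password"
db_name="database"
backup_path="/path/to/backup/directory"
remote="user@backuphost:/path/to/remote/backups"

stamp=$(date +%Y%m%d)
dump="$backup_path/backup-db-$stamp.sql"

# take the site offline
touch /var/www/maintenance.flag

# dump and compress the database while nothing is writing to it
mysqldump -u"$db_user" -p"$db_pass" "$db_name" > "$dump"
gzip -9 "$dump"

# bring the site back online
rm -f /var/www/maintenance.flag

# copy the night's backups to the remote backup server
rsync -az "$backup_path/" "$remote/"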

However, I recently discovered Percona Xtrabackup, and am planning on testing that out with the intention of replacing mysqldump with Xtrabackup. The nice thing about Xtrabackup is that it can run on a live db and recreate an internally-consistent data set in the backup file. I always take the site offline if I want to run mysqldump, just because I don't want to have to worry about table locking or partially-complete transactions or anything like that. My XenForo tables are now (mostly) InnoDB, but I still have some MyISAM stuff floating around.

You can read an overview of Xtrabackup here: http://www.percona.com/doc/percona-xtrabackup/how_xtrabackup_works.html
I'm going to have a play with this tonight (y)
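For reference, the basic Xtrabackup flow is a two-step backup/prepare. A minimal sketch is below (option names as in the Percona XtraBackup 2.4 documentation; older releases drive the same flow through the innobackupex wrapper, so check the docs linked above for your version -- the directories and credentials are placeholders):
Code:
# copy the InnoDB data files plus redo log from a *running* server
xtrabackup --backup --user=backupuser --password=secret \
    --target-dir=/path/to/backups/$(date +%Y%m%d)

# apply the redo log so the copied files are internally consistent
xtrabackup --prepare --target-dir=/path/to/backups/$(date +%Y%m%d)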
 

steven s

Well-known member
I bought backup smart. The backup completed, but all of my folders were empty. Still waiting to hear back from them. Anyone else have this problem?
I was just looking at backup-smart.com based on recommendations in this thread, but the site itself looks rather sketchy (as if an "Internet Marketing Guru" wrote it), and it seems to be associated with Clickbank, which to me is like saying "snake oil", so I'm kind of thinking you're never going to get a reply, Eric.
I've been using backup-smart for several months with no issues.
 

petertdavis

Well-known member
I've been using backup-smart for several months with no issues.
I might give it a try anyway; it's just $37. But most of the stuff I've ever bought from Clickbank ends up being crap. I like the idea of a one-click backup for all of my sites, though.
 

Eric Russell

Active member
I've been using backup-smart for several months with no issues.
I can't tell if I'm having a problem or not. When I unzip my files there appears to be nothing in the folders, but the zipped file says it's 50 MB. The forum has been online for a month with a decent number of attachments. 50 MB would seem about right, no?
 

petertdavis

Well-known member
I can't tell if I'm having a problem or not. When I unzip my files there appears to be nothing in the folders, but the zipped file says it's 50 MB. The forum has been online for a month with a decent number of attachments. 50 MB would seem about right, no?
Unless you have a lot of images, I'd expect a zipped backup to be somewhat smaller than that. I'd be concerned if it were empty too, though.
 

Eric Russell

Active member
Unless you have a lot of images, I'd expect a zipped backup to be somewhat smaller than that. I'd be concerned if it were empty too, though.
Yeah, we use a lot of images. I'm thinking my .tar file didn't unzip properly? If that's the case, the software worked as expected and I'm very happy. I think I'll know more when I wake up in the morning. I should have an email waiting for me saying the site was backed up in the middle of the night.
 

Eric Russell

Active member
Woke up this morning to an email showing my site was backed up early this morning while I was asleep. The people at backup smart did respond to every ticket I placed with them.

I would say backup smart appears to be an excellent product.
 

steven s

Well-known member
Woke up this morning to an email showing my site was backed up early this morning while I was asleep. The people at backup smart did respond to every ticket I placed with them.

I would say backup smart appears to be an excellent product.
Good to hear. I was regularly backing up the files for several sites to my home drive, until I found that my ISP is now metering my downloads. :(
 

Ingenious

Well-known member
Just wanted to say thanks again for this helpful thread. In the end I saved a modified version of the script as backupdatabase.sh, stuck it in my root folder, and set up a cron job in the server/host's control panel to run that script every night. Easy as that.

All I want is a daily backup, so I changed the date part to write a file named backupDAY.sql; over the week I get backupmon.sql, backuptues.sql and so on for each weekday, overwriting them each new week. That's because if I used the full date I'd just forget and end up filling up my webspace with backup files (even if they're zipped) :)
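In case it helps anyone setting up the same rotation, the only change needed is naming the dump after the weekday so each new week overwrites the old file, plus a nightly cron entry. A sketch, assuming the db_* and backup_path variables are set earlier in the script (as in the version quoted further down the thread):
Code:
# name the dump after the weekday: backup-mon.sql.gz, backup-tue.sql.gz, ...
day=$(date +%a | tr '[:upper:]' '[:lower:]')
mysqldump -u"$db_username" -p"$db_password" "$db_name" > "$backup_path/backup-$day.sql"
# -f lets gzip overwrite last week's file of the same name
gzip -f9 "$backup_path/backup-$day.sql"

# crontab entry to run the script every night at 4am:
# 0 4 * * * /path/to/backupdatabase.sh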
 

simbolo

Well-known member
I set up a crontab on my server to back everything up...

Code:
#!/bin/sh

# XenForo backup script

# set database connection info
db_username="username"
db_password="password"
db_name="database"

# set filenames
filename_sql="backup-db-$(date +%Y%m%d).sql"
filename_data="backup-data-folder.tar"

# set path to backups and website directory without trailing slashes
backup_path="/path/to/backup/directory"
web_dir="/path/to/website/directory"

# dump the database
mysqldump -u$db_username -p$db_password $db_name > $backup_path/$filename_sql

# gzip the database dump at max compression to save space
gzip -9 $backup_path/$filename_sql

# if it's Friday, back up the website directory too
if [ "$(date '+%a')" = "Fri" ]; then
    # create a tarball of the web folder
    tar -cf $backup_path/$filename_data $web_dir/*
    # gzip the tarball (gzip replaces the .tar with a .tar.gz)
    gzip -f9 $backup_path/$filename_data
fi

# log that the backup has been done
echo "Backup for $(date +%d)/$(date +%m)/$(date +%Y) complete" >> $backup_path/backup.log
It's a bit crude, but it works fine for me, keeping daily backups of the SQL data and weekly ones of the website directory. If you want daily backups of the website directory too, just remove the "if" and "fi" lines.
I've got basically the same thing, with an scp transfer to another remote server. Another option would be a big tarball to Amazon S3 or something like it.
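A sketch of both off-site options, tacked onto the end of a backup script like the one above (the hostname, bucket, and paths are placeholders, and the S3 line assumes a client such as s3cmd is installed and configured):
Code:
# copy tonight's dump to another server over ssh
scp "$backup_path/backup-db-$(date +%Y%m%d).sql.gz" user@backuphost:/path/to/remote/backups/

# or push it to an S3 bucket instead
s3cmd put "$backup_path/backup-db-$(date +%Y%m%d).sql.gz" s3://my-backup-bucket/xenforo/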
 

MattW

Well-known member
I've got basically the same thing, with an scp transfer to another remote server. Another option would be a big tarball to Amazon S3 or something like it.
That is what I do: I call a perl script inside the bash script to sftp the database files to two remote locations, and keep a rolling 5 days' worth of database files.
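The same idea works without perl, too. A sketch using sftp batch mode plus a find-based cleanup (hostnames and paths are placeholders, and it assumes key-based ssh logins are already in place, since batch mode can't prompt for a password):
Code:
# upload tonight's dump to two remote locations via sftp batch mode
for host in backup1.example.com backup2.example.com; do
    echo "put $backup_path/backup-db-$(date +%Y%m%d).sql.gz /remote/backups/" \
        | sftp -b - backupuser@"$host"
done

# keep a rolling 5 days of local dumps, deleting anything older
find "$backup_path" -name 'backup-db-*.sql.gz' -mtime +5 -delete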
 

simbolo

Well-known member
That is what I do: I call a perl script inside the bash script to sftp the database files to two remote locations, and keep a rolling 5 days' worth of database files.
I keep 5 daily backups, 1 from the prior week, and 1 from the prior month. The last two are probably overkill, but I keep them on the off chance that I get hacked, the db files get changed, and I don't notice for a week or more.
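That kind of tiered retention can be handled by promoting the daily dump into weekly and monthly slots that simply get overwritten. A rough sketch, assuming the daily script has already written today's dump into $backup_path:
Code:
today="$backup_path/backup-db-$(date +%Y%m%d).sql.gz"

# on Sundays, keep a copy as the prior-week backup
[ "$(date +%a)" = "Sun" ] && cp "$today" "$backup_path/backup-weekly.sql.gz"

# on the 1st of the month, keep a copy as the monthly backup
[ "$(date +%d)" = "01" ] && cp "$today" "$backup_path/backup-monthly.sql.gz"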
 

Digital Doctor

Well-known member
I keep 5 daily backups, 1 from the prior week, and 1 from the prior month. The last two are probably overkill, but I keep them on the off chance that I get hacked, the db files get changed, and I don't notice for a week or more.
Not overkill ... those are probably your best backups!
 

simbolo

Well-known member
Not overkill ... those are probably your best backups!
Yeah, they are certainly the disaster backups.
Side note: I just made a plugin to take a snapshot of my forum's stats (i.e. post count, member count, attachment count, etc.) and save it to a table each week. I enjoy seeing those historical stats.
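For anyone who'd rather not write an add-on, a weekly cron job and a few counts against the standard XenForo 1.x tables can do something similar. A sketch, where my_forum_stats is a hypothetical table you'd create yourself (snapshot_date, users, threads, posts, attachments), reusing the db_* variables from the script above:
Code:
# weekly cron: snapshot a few headline counts into a custom stats table
mysql -u"$db_username" -p"$db_password" "$db_name" <<'SQL'
INSERT INTO my_forum_stats (snapshot_date, users, threads, posts, attachments)
SELECT CURDATE(),
       (SELECT COUNT(*) FROM xf_user),
       (SELECT COUNT(*) FROM xf_thread),
       (SELECT COUNT(*) FROM xf_post),
       (SELECT COUNT(*) FROM xf_attachment);
SQL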
 

akia

Well-known member
How did people get on with using Xtrabackup?

I've been trying to get my head around it.
 

Deepmartini

Well-known member
What are people using now to back up their sites? Any easier solutions? I'm thinking of something like http://www.vaultpress.com, which is for WordPress, but something similar for XenForo. An automated cloud-based solution, maybe Amazon S3?
 