Any way to regularly back up the XenForo database?

Marco Famà

Active member
Dear all

is there a function I might have missed in the admin panel that lets me set up a trigger to automatically back up the XenForo instance, and receive the backup file via email, maybe zipped?

I'm starting to worry about this, as my community is rapidly growing in content.

Thanks a lot for your time,
Marco
 
I was just looking at backup-smart.com based on recommendations in this thread, but the site itself looks rather sketchy (as if an "Internet Marketing Guru" wrote it), and it seems to be associated with Clickbank, which to me is like saying "snake oil", so I'm kind of thinking you're never going to get a reply, Eric.
 
I do basically the same thing as Calamity James -- take the site offline at roughly 4AM (nginx 503 redirect), mysqldump the databases, gzip the sql files, run some db maintenance, then bring the site back online. Once that's finished, a separate script rsyncs to a remote backup server.
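In script form, that nightly routine looks roughly like this (just a sketch: the paths, credentials, and maintenance-flag convention are placeholder examples, not my actual config):

Code:
#!/bin/sh
# rough sketch of the nightly routine; all names below are placeholders
stamp=$(date +%Y%m%d)
touch /var/www/maintenance.flag        # nginx returns 503 while this file exists
mysqldump -u backup -pSECRET forum_db > "/backups/forum_db-$stamp.sql"
gzip -9 "/backups/forum_db-$stamp.sql"
mysqlcheck -u backup -pSECRET --optimize forum_db    # some db maintenance
rm -f /var/www/maintenance.flag        # bring the site back online
# a separate script then rsyncs the archives to the remote backup server
rsync -az /backups/ backupuser@backuphost.example.com:/srv/backups/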

However, I recently discovered Percona XtraBackup, and am planning on testing it out with the intention of replacing mysqldump with XtraBackup. The nice thing about XtraBackup is that it can run on a live db and recreate an internally-consistent data set in the backup file. I always take the site offline if I want to run mysqldump, just because I don't want to have to worry about table locking or partially-complete transactions or anything like that. My XenForo tables are now (mostly) InnoDB, but I still have some MyISAM stuff floating around.

You can read an overview of Xtrabackup here: http://www.percona.com/doc/percona-xtrabackup/how_xtrabackup_works.html
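The basic invocation is pretty simple. A minimal sketch, assuming a recent XtraBackup is installed and can connect to the local server (older releases wrapped this in the innobackupex script instead), with placeholder credentials and paths:

Code:
# take a backup of the running server into an empty target directory
xtrabackup --backup --user=backup --password=SECRET --target-dir=/backups/xtra/
# "prepare" replays the transaction log so the copied files are consistent
xtrabackup --prepare --target-dir=/backups/xtra/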
I'm going to have a play with this tonight (y)
 
I bought backup smart. Backup completed and all of my folders were empty. Still waiting to hear back from them. Anyone else have this problem?
I've been using backup-smart for several months with no issues.
 
I can't tell if I'm having a problem or not. When I unzip my files, there appears to be nothing in the folders, but the zipped file says it's 50 MB. The forum has been online for a month with a decent amount of attachments. 50 MB would seem about right, no?
 
Unless you have a lot of images, I'd expect a zipped backup to be somewhat smaller than that. I'd be concerned if it were empty too, though.
 
Yeah, we use a lot of images. I'm thinking my .tar file didn't unzip properly? If that's the case, the software worked as expected and I'm very happy. I think I'll know more when I wake up in the morning. I should have an email waiting for me saying the site was backed up in the middle of the night.
 
Woke up this morning to an email showing my site was backed up early this morning while I was asleep. The people at backup smart did respond to every ticket I placed with them.

I would say backup smart appears to be an excellent product.
 
Good to hear. I was regularly backing up the files for several sites to my home drive until I found that my ISP is now metering my downloads. :(
 
Just wanted to say thanks again for this helpful thread. In the end I saved a modified version of the script as backupdatabase.sh, stuck it in my root folder, and set up a cron job in the server/host's control panel to run that script every night. Easy as that.
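For anyone setting it up by hand rather than through a control panel, the crontab entry would look something like this (the 4 AM time and paths are just my example):

Code:
# m h dom mon dow command -- run the backup script at 4:00 every night
0 4 * * * /bin/sh /root/backupdatabase.sh >> /root/backup-cron.log 2>&1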

All I want is a daily backup, so I changed the date part to write a file named backupDAY.sql: over the week I get backupmon.sql, backuptue.sql, and so on for each day, overwriting them each new week. I did it this way because if I used the full date I'd just forget and end up filling up my webspace with backup files (even if they're zipped) :)
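Something like this is all it takes (a sketch, using the same placeholder variable values as the script quoted below):

Code:
#!/bin/sh
# sketch of the day-name variant; variable values are placeholders
db_username="username"; db_password="password"; db_name="database"
backup_path="/path/to/backup/directory"

day=$(date +%a | tr '[:upper:]' '[:lower:]')    # mon, tue, wed, ...
mysqldump -u"$db_username" -p"$db_password" "$db_name" > "$backup_path/backup$day.sql"
gzip -f9 "$backup_path/backup$day.sql"           # -f overwrites last week's file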
 
I set up a crontab on my server to back everything up...

Code:
#!/bin/sh

# XenForo backup script

# set database connection info
db_username="username"
db_password="password"
db_name="database"

# set filenames (gzip appends the .gz extension itself)
filename_sql="backup-db-$(date +%Y%m%d).sql"
filename_data="backup-data-folder.tar"

# set path to backups and website directory without trailing slashes
backup_path="/path/to/backup/directory"
web_dir="/path/to/website/directory"

# dump the database
mysqldump -u"$db_username" -p"$db_password" "$db_name" > "$backup_path/$filename_sql"

# gzip the database at max compression to save space
gzip -9 "$backup_path/$filename_sql"

# if it's Friday, we'll back up the website directory too
if [ "$(date +%a)" = "Fri" ]; then
    # create a tarball of the web folder
    tar -cf "$backup_path/$filename_data" "$web_dir"
    # gzip the tarball; gzip replaces it with a .tar.gz, so no separate cleanup is needed
    gzip -f9 "$backup_path/$filename_data"
fi

# log that the backup has been done
echo "Backup for $(date +%d/%m/%Y) complete" >> "$backup_path/backup.log"

It's a bit crude, but it works fine for me, keeping daily backups of the SQL data and weekly ones of the website directory. If you want daily backups of the website directory too, just remove the "if" and "fi" lines.
I do basically the same thing, with an scp transfer to another remote server, although another option would be a big tarball to Amazon S3 or something like it.
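A couple of lines tacked onto the end of the script would do either one (the host, bucket, and filenames here are made-up examples, and the S3 route assumes you have the AWS CLI installed and configured):

Code:
# ship tonight's database archive to another box...
scp "$backup_path/backup-db-$(date +%Y%m%d).sql.gz" backupuser@remote.example.com:/srv/backups/
# ...or push the weekly tarball to S3 instead
aws s3 cp "$backup_path/backup-data-folder.tar.gz" s3://my-forum-backups/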
 
That is what I do: I call a perl script inside the bash script to sftp the database files to two remote locations, and keep a rolling 5 days' worth of database files.
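The retention part is easy to sketch in plain shell too (I'm using find and scp here rather than the perl sftp script; hosts and paths are examples, and $backup_path comes from the script above):

Code:
# delete gzipped dumps older than 5 days, keeping a rolling window
find "$backup_path" -name 'backup-db-*.sql.gz' -mtime +5 -delete
# copy tonight's dump to both remote locations
scp "$backup_path/backup-db-$(date +%Y%m%d).sql.gz" user@remote1.example.com:/srv/backups/
scp "$backup_path/backup-db-$(date +%Y%m%d).sql.gz" user@remote2.example.com:/srv/backups/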
 
I keep 5 dailies, 1 from the prior week, and 1 from a month back. The last two are probably overkill, but I keep them on the off chance that I get hacked and db files are changed and I don't notice for a week or more.
 