
XenForo Database Backup Shell Script 1.0.8


Mopquill

Active member
If you appreciate my work or find this script useful, consider a donation! :D You can do so here. I'd really appreciate it!

XenForo Backup Shell Script (version 1.0.8) - Use this script to back your database up by simply pointing it to your config.php

This script's intended use is to be run as a cron job or called from the command line to quickly back up your forum's database, but I also added a parameter to have it prompt you for your database information. Anyhow, it saves the database in the format database_name_yyyy-mm-dd_hh-mm-ss.sql.gz, because I sometimes have more than one backup from the same day, and having the time helps with that.
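For anyone curious how that naming works, here is a minimal sketch of building such a timestamped filename in bash (the variable names are placeholders, not necessarily what the script itself uses):

Code:
#!/bin/bash
# Hypothetical sketch: build a dump filename matching the
# database_name_yyyy-mm-dd_hh-mm-ss.sql.gz pattern described above.
db_name="xenforo"                          # placeholder database name
timestamp=$(date +%Y-%m-%d_%H-%M-%S)       # e.g. 2012-07-31_14-05-09
backup_file="${db_name}_${timestamp}.sql.gz"
echo "Backup would be written to: $backup_file"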

Read more about this resource here.

Most Recent Changelog

1.0.8 - July 31st, 2012
  • Added support for backing up all databases on the server with the MySQL root account, via a new mode parameter, backup-all-databases
  • Optionally backs up the data and internal_data directories (suggested by craigiri) with configurable locations -- this is enabled by default, and assumes they are siblings of your library directory
  • Added a data-only mode parameter to optionally back up just the data directories -- useful for backing up data at different times or at different intervals than your database
  • Optionally combines data and internal_data into the same gzipped tar archive -- this avoids a warning from tar about parent directories, and is enabled by default (see the sketch after this changelog)
  • Optionally creates the database backup directory for you if it does not exist -- this is enabled by default
  • Now includes --single-transaction in the mysqldump command (suggested by Coop1979), which should run much more smoothly on big boards where you don't want to lock a bunch of tables
  • Includes various other stability tweaks to the MySQL commands
  • Changed the inner functions to be more module-oriented -- this makes the script more efficient
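As a rough illustration of the combined data/internal_data archive mentioned above, here is a minimal sketch, assuming both directories sit under a single forum root (the paths and variable names are placeholders, not the script's own):

Code:
#!/bin/bash
# Hypothetical sketch of combining data and internal_data into one gzipped tar
# archive; forum_root and backup_dir are assumed placeholder locations.
forum_root="/var/www/forum"
backup_dir="/home/backups"
timestamp=$(date +%Y-%m-%d_%H-%M-%S)

# -C changes into the forum root first, so the paths stored in the archive are
# relative -- one common way to keep tar from complaining about leading path components.
tar -czf "${backup_dir}/xf_data_${timestamp}.tar.gz" \
    -C "$forum_root" data internal_data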
 
Thanks - tried an initial version and the only problem I had was the DOS line breaks and other such stuff... fixed with a bit of Perl I dug up from googling the problem...

Quick note - the destination directory must exist! This does not create it for you.

Definitely makes it a bit handier even if I use it manually (my ISP backs up my stuff each day also, but I like to have a copy of the main db!).
 
Thanks - tried an initial version and the only problem I had was the DOS line breaks and other such stuff... fixed with a bit of Perl I dug up from googling the problem...
There are no DOS line-breaks to my knowledge- I use a text editor that allows me to specify which line-breaks I use, and I always use standard LF- which I mention in my post, actually. You can convert a plaintext file's line-breaks with a utility called dos2unix, which I also mentioned in my post- no Perl necessary. I'm willing to bet your issue has more to do with whatever you edited it with. I just double-checked by uploading the very file from my zip:

Code:
~$ cp xf_backup.sh xf_backup_d2u.sh && dos2unix xf_backup_d2u.sh && md5sum xf_backup*
dos2unix: converting file xf_backup_d2u.sh to UNIX format ...
9dc53144ba773bd1ec5552924822f548  xf_backup_d2u.sh
9dc53144ba773bd1ec5552924822f548  xf_backup.sh

Apologies if I'm missing something on my end, but near as I can tell, my line-breaks are all LF- unix-style.

Quick note - the destination directory must exist! This does not create it for you.

Definitely makes it a bit handier even if I use it manually (my ISP backs up my stuff each day also, but I like to have a copy of the main db!).
My experience with Linux tells me that the Unix philosophy is to be explicit- this is why rmdir won't remove a non-empty directory, and why touch won't create a file in a directory that doesn't exist. I can possibly add an option that does this, but I certainly wouldn't want it enabled by default.
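If it helps picture it, an opt-in version might look something like this -- a minimal sketch with placeholder variable names rather than the script's actual ones:

Code:
#!/bin/bash
# Hypothetical sketch of an opt-in "create the backup directory" check.
create_backup_dir=false        # off by default, per the "be explicit" point above
backup_dir="/home/backups"     # placeholder destination

if [ ! -d "$backup_dir" ]; then
    if [ "$create_backup_dir" = true ]; then
        mkdir -p "$backup_dir"
    else
        echo "Error: backup directory $backup_dir does not exist." >&2
        exit 1
    fi
fi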

Glad to hear it. (y) I actually use a slightly modified one that backs up *all* my databases into a single gzipped tar archive, though, I don't know if anyone else on here would find that useful.
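For anyone curious, the single-file, all-databases idea boils down to roughly this -- a minimal sketch with placeholder credentials, not my exact modified copy:

Code:
#!/bin/bash
# Hypothetical sketch of dumping every database in one go; the credentials
# and paths here are placeholders.
root_pw="changeme"
backup_dir="/home/backups"
timestamp=$(date +%Y-%m-%d_%H-%M-%S)

mysqldump --user=root --password="$root_pw" --all-databases \
    | gzip > "${backup_dir}/all_databases_${timestamp}.sql.gz"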
 
Just FYI, I downloaded to a Mac, unzipped and uploaded with Fetch.
Then I edited with nano.

Nano knew (or said) it was a DOS type file.

I probably should have uploaded as the ZIP and unzipped on the server.
Also, Fetch (FTP prog) may set the wrong upload type for an .sh extension.....

Who knows? So many possibilities.....
 
Sorry, I have zero experience with Macs. Nano is an excellent program, so by the time you opened it, the file was likely already in DOS format; it could have been the unzip program, or the FTP client could have uploaded it in ASCII mode- and might have been configured to use CRLF for some reason. I just checked, and nano is indeed aware of Mac format- I converted the file to CR-only line breaks, and it said "Converted from Mac Format", so it definitely knows. My nano didn't mention a conversion for either of the aforementioned files, however.
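For what it's worth, a couple of quick ways to check a file's line endings from the shell (these are just common tools, nothing specific to the script):

Code:
file xf_backup.sh            # reports "with CRLF line terminators" for DOS-format files
grep -c $'\r' xf_backup.sh   # counts lines containing a carriage return; 0 means LF-only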

Anyhow, I definitely appreciate the notification- had I uploaded a shell script with CRLF, that would have been an incredibly silly mistake. :p
 
^ All of them.

EDIT: Sorry, let me elaborate. I have no way of knowing which tables might be created by add-ons, and there shouldn't be tables from other software in the database, so it makes a lot more sense to me to just back up the whole thing.
 
If your tables are InnoDB, you might want to add
Code:
--single-transaction
so that it can back up tables as they are being used. This is especially useful for the posts table, where someone is bound to be inserting data on a busy forum.
 
If your tables are InnoDB, you might want to add
Code:
--single-transaction
so that it can back up tables as they are being used. This is especially useful for the posts table, where someone is bound to be inserting data on a busy forum.
Thanks for the suggestion. =]
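For context, this is roughly where the flag ends up in the dump command -- a minimal sketch with placeholder connection details, not the script's exact invocation:

Code:
# Hypothetical sketch: --single-transaction takes a consistent InnoDB snapshot
# instead of locking tables for the duration of the dump.
db_user="forum_user"      # placeholder credentials, not read from config.php here
db_pass="secret"
db_host="localhost"
db_name="xenforo"

mysqldump --user="$db_user" --password="$db_pass" --host="$db_host" \
    --single-transaction "$db_name" \
    | gzip > "${db_name}_$(date +%Y-%m-%d_%H-%M-%S).sql.gz"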
 
I'm working on an update to this; it'll optionally back up data and internal_data (as per craigiri's suggestion), and it will include Coop1979's suggestion, as well as a few consistency fixes and such from me.
 
Oops. I updated this script, but forgot to upload it; I've been busy. I'll zip it up and upload it hopefully soon, when I have time to write up a proper changelog.

EDIT: It backs up the data directories now, adds an option to create the backup directory if it doesn't exist, and includes the single-transaction suggestion mentioned above, along with some other changes, mostly related to stability, robustness, and error-handling.
 
Mopquill updated XenForo Database Backup Shell Script with a new update entry:

XenForo Backup Shell Script 1.0.8

Changelog

1.0.8 - July 31st, 2012
  • Added support for backing up all databases on the server with the MySQL root account, via a new mode parameter, backup-all-databases
  • Optionally backs up the data and internal_data directories (suggested by craigiri) with configurable locations -- this is enabled by default, and assumes they are siblings of your library directory
  • Added a data-only mode parameter to optionally back up just the data directories -- useful for...

Read the rest of this update entry...
 
Sorry for taking so long to update, guys; I have been -- and will likely continue to be -- quite busy. I have a limited testing environment, so do let me know of any bugs you run into. I use the script via bash on Debian Squeeze 64-bit, if it helps any. Good luck, and enjoy. =]
 
I have no experience with pigz, but from looking at the help, it looks like it takes the same commands as gzip, and running which pigz gives me its install path. Since I don't know pigz, I'll defer to you for this -- if I add a tiny change where, when some arbitrary variable is set to true, it uses pigz instead of gzip (setting gzip_dir to the pigz path instead), will all the other commands work as normal but take advantage of the multiple cores? If that will work, I'll add it right now.

EDIT: Actually, nevermind, I guess your post implies that. XD I'll add it in.
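Roughly, the toggle would look something like this -- a sketch with assumed variable names (use_pigz, gzip_bin) that may not match what ends up in the script:

Code:
#!/bin/bash
# Hypothetical sketch of a gzip/pigz switch; pigz accepts the same basic flags
# as gzip but compresses on multiple cores.
use_pigz=true

if [ "$use_pigz" = true ] && command -v pigz >/dev/null 2>&1; then
    gzip_bin=$(command -v pigz)
else
    gzip_bin=$(command -v gzip)
fi

echo "Compressing with: $gzip_bin"
# The existing pipeline would then call "$gzip_bin" wherever it calls gzip today.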
 
An option to delete older backups automatically, or to just keep the latest x backups in the folder, would be awesome :)

thanks for this script!
 
An option to delete older backups automatically, or to just keep the latest x backups in the folder, would be awesome :)

thanks for this script!
I'm currently working on this bit of it. I don't like messing around with things that delete things; the last thing I want to do is be responsible for wiping some poor user's backup folder. So, this will be off by default, and I'm only going to update once I think I've got the command all worked out.
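To give an idea of the direction, the retention piece would be something along these lines -- a cautious sketch with placeholder names, off by default, and definitely not the final command:

Code:
#!/bin/bash
# Hypothetical sketch of keeping only the newest N backups; prune_old_backups,
# keep_count, and backup_dir are placeholder names.
prune_old_backups=false
keep_count=7
backup_dir="/home/backups"

if [ "$prune_old_backups" = true ]; then
    # List .sql.gz files newest first, skip the first $keep_count, remove the rest.
    # Assumes GNU xargs and backup filenames without spaces, which the
    # date-stamped naming scheme makes a safe bet.
    ls -1t "$backup_dir"/*.sql.gz 2>/dev/null \
        | tail -n +"$((keep_count + 1))" \
        | xargs -r rm --
fi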
 
Thanks for the scripts.

A few suggestions if I may...
  • An option to save the file via FTP to another server - having a backup reside on the same server isn't great if the server fails; NFS is fine for servers on the same network, but FTP is needed for external servers
  • An option to email the file offsite - for hosted XenForo sites, being able to have the backup emailed to the user means that offsite backup is easily handled for those with no FTP availability
  • A restore option (see the sketch below). Remember the old saying that a backup is only as good as its restore.
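For example, the bare-bones version of a restore, with placeholder credentials and file name, is just piping the dump back into MySQL:

Code:
# Hypothetical sketch of restoring one of the gzipped dumps; the file name and
# connection details are placeholders.
db_user="forum_user"
db_pass="secret"
db_name="xenforo"

gunzip -c xenforo_2012-07-31_14-05-09.sql.gz \
    | mysql --user="$db_user" --password="$db_pass" "$db_name"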
 