Any way to regularly back up the XenForo database?

Marco Famà

Active member
Dear all

Is there a function I might have missed in the admin panel that lets me set up a trigger to automatically back up the XenForo instance and receive the backup file via email, maybe zipped?

I'm starting to worry about this, as my community's content is growing rapidly.

Thanks a lot for your time,
Marco
 
What are people using now to back up their sites? Any easier solutions? I'm thinking of something like http://www.vaultpress.com, which is for WordPress, but something similar for XenForo. An automated cloud-based solution. Maybe Amazon S3?
I still have the same script running, but edited it to upload the databases to Amazon S3 and keep 14 days worth in there, along with an FTP upload of the same files to my NAS at home
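For the FTP leg, a minimal sketch using curl, assuming an FTP-capable NAS; the host, credentials and paths below are placeholders:

Code:
# push the day's compressed database dump to the NAS over FTP
curl -T "/path/to/backup/directory/backup-db-$(date +%Y%m%d).sql.gz" \
     --user backupuser:backuppass "ftp://nas.example.local/backups/"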
 
I still have the same script running, but edited it to upload the databases to Amazon S3 and keep 14 days worth in there, along with an FTP upload of the same files to my NAS at home
What are you using to prune the S3 bucket? Or is it just something you do manually?

To upload the backups I use s3put, but it's getting quite full now!

If anyone's interested, I updated my script to use pigz, a multi-core implementation of gzip, which speeds up the compression dramatically (at the cost of resources, of course). It also uses aws/s3put to upload the SQL backup to an Amazon S3 bucket.

Attached below:

Code:
#!/bin/sh

# XenForo backup script
# Made by James because I hate doing this manually
# Updated 16th July 2012 to replace gzip with pigz for multi-core support

#### CONFIG - YOU MUST EDIT THIS ####

# set database connection info
db_username="username"
db_password="password"
db_name="database"

# set path to backups and website directory without trailing slashes:

# your backup location
backup_path="/path/to/backup/directory"

# your web root location
web_dir="/path/to/website/directory"

# set amazon s3 bucket name
bucket_name="bucket/directory"

#### END CONFIG - YOU PROBABLY DON'T NEED TO EDIT BELOW HERE ####


# set filenames (pigz appends .gz to whatever it compresses, hence .sql.gz)
filename_sql="backup-db-$(date +%Y%m%d).sql"
filename_sql_gz="backup-db-$(date +%Y%m%d).sql.gz"
filename_data="backup-data-folder.tar"
filename_data_gz="backup-data-folder.tar.gz"

# dump the database
mysqldump -u"$db_username" -p"$db_password" "$db_name" > "$backup_path/$filename_sql"

# gzip the database at max compression to save space
# update: using pigz as it is a crap-ton faster!
pigz -9 "$backup_path/$filename_sql"

# create a tarball of the web folder (the whole directory, so dotfiles like .htaccess are included)
tar -cf "$backup_path/$filename_data" "$web_dir"

# move the previous run's archive to old_ (skipped on the first run, when it doesn't exist yet)
[ -f "$backup_path/$filename_data_gz" ] && mv "$backup_path/$filename_data_gz" "$backup_path/old_$filename_data_gz"

# gzip the website directory
pigz -9 "$backup_path/$filename_data"

# upload the compressed database dump to Amazon S3...
s3put "$bucket_name" "$backup_path/$filename_sql_gz"

# write to the backup log
echo "Backup for $(date +%d/%m/%Y) complete" >> "$backup_path/backup.log"
 
What are you using to prune the S3 bucket? Or is it just something you do manually?

To upload the backups I use s3put, but it's getting quite full now!

I have a lifecycle configured on the bucket where the databases are stored. It removes any files more than 14 days old automatically.

I use pbzip2 to compress mine. Shrinks them down more than gzip (along with the parallel processing, so hammers all 14 cpu cores while it's doing it......but it's bloody quick!)
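For anyone wanting to script that rule rather than click through the S3 console, a minimal sketch using the AWS CLI; the bucket name, rule ID and lifecycle.json file are placeholders, and the rule simply expires everything in the bucket after 14 days:

Code:
# lifecycle.json (placeholder name) would contain:
# {
#   "Rules": [
#     { "ID": "expire-backups",
#       "Filter": { "Prefix": "" },
#       "Status": "Enabled",
#       "Expiration": { "Days": 14 } }
#   ]
# }

aws s3api put-bucket-lifecycle-configuration \
    --bucket your-backup-bucket \
    --lifecycle-configuration file://lifecycle.json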
 
I have a lifecycle configured on the bucket where the databases are stored. It removes any files more than 14 days old automatically.

I use pbzip2 to compress mine. Shrinks them down more than gzip (along with the parallel processing, so hammers all 14 cpu cores while it's doing it......but it's bloody quick!)
I didn't know you could do that. What a good idea!

Thanks for the tip on pbzip2, I'll swap out pigz and use that! We've only got a Core i7 but that should still help massively :)
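For anyone making the same swap in the script posted above, a rough sketch of the change; pbzip2 writes .bz2 rather than .gz, so the compressed filenames change too (variable names follow the earlier script):

Code:
# compress with pbzip2 instead of pigz (output gets a .bz2 extension)
pbzip2 -9 "$backup_path/$filename_sql"
pbzip2 -9 "$backup_path/$filename_data"

# and the compressed filename used for the S3 upload changes to match:
filename_sql_bz2="backup-db-$(date +%Y%m%d).sql.bz2"
s3put "$bucket_name" "$backup_path/$filename_sql_bz2"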
 
Hi!

Sorry, I'm pretty new to this stuff and I hope this is the right thread to ask... just wondering if what I'm doing to back up my forum is OK:

I'm using cPanel: I go to Backups and download the forum database and home directory to my computer. Is this OK, and enough if I ever need to restore something from my backup?

Btw, about restoring: will it be enough to restore my database and files just by uploading them through the cPanel restore function?

Thanks in advance for every answer.
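For reference, the database half of such a backup can also be loaded by hand; a minimal sketch, assuming shell access and a plain .sql dump (the database, user and file names below are placeholders):

Code:
# create the database if needed, then load the downloaded dump into it
mysql -u forum_user -p -e "CREATE DATABASE IF NOT EXISTS forum_db"
mysql -u forum_user -p forum_db < forum_db_backup.sql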
 
I installed pigz under /home/account1/etc/pig and typed make. It built. But when I run the script I get this message:
/home/account1/etc/xenbackup.sh: line 39: pigz: command not found
tar: Removing leading `/' from member names

Does pigz have to be installed in the same folder as the backup script? What does the leading / from member names mean?
 
I installed pigz under /home/account1/etc/pig and typed make. It built. But when I run the script I get this message:
/home/account1/etc/xenbackup.sh: line 39: pigz: command not found
tar: Removing leading `/' from member names

Does pigz have to be installed in the same folder as the backup script? What does the leading / from member names mean?
Did you run make install?
 
I typed make and it ran. I went to the directory again, tried again, and got "make: `pigz' is up to date." When I type pigz, I get -bash: pigz: command not found. Is it something in the permissions? Here is what I see over FTP. When I type make, it installs in the same directory, right?

(screenshot of the directory listing attached)
 
I typed make and it ran. I went to the directory again, tried again, and got "make: `pigz' is up to date." When I type pigz, I get -bash: pigz: command not found. Is it something in the permissions? Here is what I see over FTP. When I type make, it installs in the same directory, right?

(screenshot of the directory listing attached)
No, it should install it into whatever directory was defined in the makefile configuration (/usr/bin, /usr/sbin, etc.).
 
I installed pigz under /home/account1/etc/pig and typed make. It built. But when I run the script I get this message:
/home/account1/etc/xenbackup.sh: line 39: pigz: command not found
tar: Removing leading `/' from member names

Does pigz have to be installed in the same folder as the backup script? What does the leading / from member names mean?
What do you get if you type:

Code:
# which pigz

e.g.:

Code:
[root@astra public]# which pbzip2
/usr/bin/pbzip2
 
I'm not sure it specified anything in the makefile configuration, but I don't understand what it says. There's nothing in the help documentation or readme (or at least I didn't see anything).

When I type which pigz it doesn't print anything; it just brings up another prompt.
 
OK, I just typed "make". When I try "make install" it says make: *** No rule to make target `install'. Stop.
Not sure exactly what you have done.
Try going into the source directory and running:

Code:
./configure
make
make install

and see what it does then. If the make install doesn't work, try copying the executable created to /usr/local/sbin or /usr/local/bin and see if it picks it up. If not, move it into either /usr/bin or /usr/sbin.
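If it does come to the manual copy, a minimal sketch, assuming the build left a pigz binary in the source directory mentioned earlier (run as root or with sudo):

Code:
cd /home/account1/etc/pig        # the pigz source directory
cp pigz /usr/local/bin/
chmod 755 /usr/local/bin/pigz
which pigz                       # should now print /usr/local/bin/pigz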
 