xtrabackup, how do you use it?

Nuno

Well-known member
Hi,

I started playing with xtrabackup recently and it's an amazing tool.
I'm still learning, and I'd like to know how you all manage your backups.

Do you make daily full backups (backup + prepare)?
Do you do incremental backups?

Any tools to automate this?

Thanks
 
We run a daily backup and prepare at night. Great tool, it has helped us out a few times :)
If you want I can share the cron job we built.

FYI: our backup is large, about 65 GB
 
If you want I can share the cron job we built.
Please do :)
Yes, it's a great tool. So no one does incrementals with xtrabackup?
I admit incrementals are not a good fit for me, since the restore has more steps.

I am testing some features like --compress, but with that we can't apply the logs directly, so that's no way to go either.
Lastly, tar-and-compressing the backup is very CPU intensive, so the way to go seems to be to just create and prepare the backup.
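That said, apparently more recent xtrabackup versions can decompress a --compress backup before the prepare step (it needs the qpress binary installed), it just adds a pass. Something like this, if I understand the docs right:
Code:
innobackupex --compress --no-timestamp /opt/backup/compressed
innobackupex --decompress /opt/backup/compressed
innobackupex --apply-log /opt/backup/compressed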
 
  • I am using a bash script with a single command to do a mysqldump of the database into a folder on the server - runs once every night
  • Another bash script deletes dumps older than 5 days from that folder, so I always have the databases from the past 5 days - runs once every night
  • Another database backup is done automatically with an excellent addon from @SneakyDave. The backup is put in a folder on the server and a copy is automatically uploaded to Dropbox with the help of another @SneakyDave addon - runs once every night
  • And finally, I am using Rsnapshot to rsync all database backup folders, the nginx/conf directory and the complete domain folder to another server (backup server) - runs once every night, last in order, and keeps the last 5 backups
My database is about 700 MB unzipped, the files in the domain directory are about 11 GB, and Rsnapshot does all that in about 17 minutes.
I can give you the cron settings, bash scripts etc. if you want. One addon from @SneakyDave is free, and the other one (Dropbox) costs only 5 dollars
 
Hello @Sunka
I'll gladly take your scripts too :)

This is what I'm planning to do with xtrabackup:

Code:
#!/bin/bash
bkdate=$(date "+%Y-%m-%d-%H-%M-%S")
bkdir=/opt/backup

innobackupex --no-timestamp ${bkdir}/${bkdate}
innobackupex --use-memory=1G --apply-log ${bkdir}/${bkdate}

find ${bkdir} -mindepth 1 -maxdepth 1 -type d -mtime +5 | while read -r dir
do
  rm -rf "$dir"
done

This creates and prepares the backup and removes backups older than 5 days.
Then I just pull it with rsnapshot to my backup server.
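On the backup server side, I'm thinking of something like this in rsnapshot.conf (paths and retention are just examples; the fields must be tab-separated):
Code:
snapshot_root	/backup/snapshots/

# 'interval' on older rsnapshot versions
retain	daily	5

# pull the prepared xtrabackup directories from the web server
backup	root@webserver:/opt/backup/	webserver/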
 
I am not a coder, so don't laugh at these commands or settings. They work for me, and that is the most important thing.
Change the settings for your forum, and the username and password too.
Below are the plain database & file commands. It's better to use the full path to each command.

Database backup script
Code:
#!/bin/bash
/usr/bin/mysqldump --opt database_name > /path/to/database/backup/folder/database_backup_`date +%d-%m-%Y---%H-%M`.sql

Database clean script (delete database backups older than 5 days)
Code:
#!/bin/bash
/bin/find /path/to/database/backup/folder -mtime +5 -type f -exec rm -f {} \;

Files backup script (gzipped)
Code:
#!/bin/bash
/usr/bin/tar -cf - /path/to/domain/folder | /bin/gzip -c > /path/to/files/backup/folder/files_backup_`date +%d-%m-%Y---%H-%M`.tar.gz

Files clean script (delete gzipped backup files older than 10 hours)
Code:
#!/bin/bash
/bin/find /path/to/files/backup/folder -mmin +600 -type f -exec rm -f {} \;

Save all four as scripts (some_name.sh), put them into a folder and call them from crontab at the desired times. Run each delete script before its save script, for both pairs (delete database - save database - delete files - save files), and leave enough time between them.
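For example, the crontab could look something like this (the script names and times here are made up, change them for your setup):
Code:
# delete old dumps, then dump the database
55 2 * * * /path/to/scripts/db_clean.sh
0 3 * * * /path/to/scripts/db_backup.sh
# delete old file backups, then tar the domain folder
25 3 * * * /path/to/scripts/files_clean.sh
30 3 * * * /path/to/scripts/files_backup.sh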

And after all that, I call Rsnapshot, installed on the backup server, to rsync all the backup folders and the plain domain directory.
So in the end I have the database backup on the server (both gzipped and unzipped), one database copy on Dropbox, the gzipped domain directory on the server, and all of that on the backup server too
 
Yes, it's a great tool. So no one does incrementals with xtrabackup?
Personally I'm not much of a fan of incremental backups as the only method of backup. Incrementals are only as good as the last known-good full snapshot backup; without that last full working snapshot, the incremental backups are usually useless.
Yes, I wrote and use dbbackup.sh (mysqldump based), or an advanced variation of it, to back up databases every 4, 6 or 8 hrs to local disk + Amazon S3, with rsnapshot on a remote server pulling backups every 4 hrs, daily, weekly and monthly. Multi-threaded compression support also means faster compressed backup routines. dbbackup.sh can automatically detect whether a database uses InnoDB or MyISAM tables and dynamically adjust the mysqldump options to optimally back up InnoDB- or MyISAM-based databases. The script also backs up databases in ascending order of size, so the smallest databases are backed up first and you have the greatest chance of saving the most data if you run into problems due to resources, network connectivity issues or aborted backup processes :)
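The size-ordering part is simple enough to sketch (this isn't the actual dbbackup.sh, just an illustration; it assumes credentials in ~/.my.cnf and InnoDB tables, hence --single-transaction):
Code:
#!/bin/bash
# list user databases in ascending order of total size, then dump each one
databases=$(mysql -N -e "SELECT table_schema FROM information_schema.tables
  WHERE table_schema NOT IN ('information_schema','performance_schema','mysql')
  GROUP BY table_schema ORDER BY SUM(data_length + index_length) ASC")

for db in $databases; do
    mysqldump --single-transaction "$db" | gzip > "/backup/${db}_$(date +%F).sql.gz"
done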

I've also written custom database backup scripts using the multi-threaded backup processes available in mydumper and Percona XtraBackup for my private paying clients :D
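For reference, a basic multi-threaded dump and restore with mydumper looks something like this (the directories and thread counts are just examples):
Code:
# dump all databases with 4 threads, compressing each table file
mydumper --threads 4 --compress --outputdir /backup/mydumper/$(date +%F)

# restore from a dump directory, also multi-threaded
myloader --threads 4 --directory /backup/mydumper/2015-01-01 --overwrite-tables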
 
So I looked at many different options for my backup system; the one I went with was as follows:

- Use cron to create my xtrabackup incremental and full backups (weekly full, daily incremental; a sketch follows this list)
- Have a Raspberry Pi at home which has BackupPC installed on it
- BackupPC goes off once per day and does an incremental file system backup plus a backup of the xtrabackup database directory
- Have a second Pi that controls my simple NAS. This device logs into the first Pi and takes a directory backup.
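The cron side looks roughly like this (the paths and the Sunday-full convention here are illustrative, not my exact script):
Code:
#!/bin/bash
# weekly full on Sunday, daily incrementals against it the rest of the week
BASE=/opt/xtrabackup
if [ "$(date +%u)" -eq 7 ]; then
    rm -rf $BASE/full $BASE/inc-*
    innobackupex --no-timestamp $BASE/full
else
    innobackupex --no-timestamp --incremental \
        --incremental-basedir=$BASE/full $BASE/inc-$(date +%u)
fi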

Why do I do it like this?
The idea is that, first of all, nowhere on my server is there any login information for my home network. So if I am hacked (the worst-case scenario for me), the only way for an attacker to get at my backups is by polluting sshd, and by that stage I have usually already identified the problem. Incoming connections to my Pi raise alerts, so I can cut things off and be no more than one day behind on data.
 
Sorry for the late reply.

Our script:
Code:
#!/bin/sh
# Database Backup script


username=xxxxxx
password=xxxxxxx

now=`date +"%Y-%m-%d_%H-%M"`

# First: remove backup directories and logs older than 20 hours (1200 minutes)

find /path/to/backupdir -maxdepth 1 -name 'db.*' -mmin +1200 -exec rm -rf {} +

# Create the Backup
export PATH=$PATH:/path/to/xtrabackup
innobackupex --slave-info --user=$username --password=$password --no-timestamp /path/to/backupdir/db.$now 1>/path/to/backupdir/db.$now.log 2>&1
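# Each successful innobackupex run writes "completed OK" to this log twice,
# hence the checks for counts of 2 (after backup) and 4 (after prepare) below.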

success=`grep -c "completed OK" /path/to/backupdir/db.$now.log`

# Prepare the Backup
if [ "$success" -eq "2" ]
then
        innobackupex --apply-log /path/to/backupdir/db.$now 1>>/path/to/backupdir/db.$now.log 2>&1
fi

success=`grep -c "completed OK" /path/to/backupdir/db.$now.log`

# Mail on Error
if [ "$success" -ne "4" ]
then
        echo "Subject: Database Backup Failure $now" > /path/to/backupdir/mailoutput
        cat /path/to/backupdir/db.$now.log >> /path/to/backupdir/mailoutput
        /usr/sbin/sendmail you@mail.com < /path/to/backupdir/mailoutput
        rm -f /path/to/backupdir/mailoutput
fi

As you can see, no incrementals...
 