What is your backup plan?

First Daily Process:
One cron job copies the primary config files to a predetermined folder, which sits alongside another folder holding a separate mysqldump for each database. A third cron job then rolls up all of the websites plus the config/mysqldump folders into a single tar.gz, which is moved to a SAN in the same data center.
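The rollup step might look something like the sketch below. Every path here (`/tmp/backup-demo/...`) is a placeholder I've invented for illustration; substitute your real config, dump, web-root, and SAN mount locations.

```shell
#!/bin/sh
# Sketch of the nightly rollup: bundle configs, DB dumps, and web roots
# into one dated tar.gz, then move it to the SAN mount.
# All paths below are illustrative placeholders, not the poster's real layout.

BASE=/tmp/backup-demo
CONFDIR=$BASE/configs        # where the first cron copies config files
DUMPDIR=$BASE/mysqldumps     # where the per-database dumps land
WEBROOT=$BASE/www            # the website document roots
SANMOUNT=$BASE/san           # mounted SAN target

# Create demo content so the sketch is runnable end to end
mkdir -p "$CONFDIR" "$DUMPDIR" "$WEBROOT" "$SANMOUNT"
echo "server {}" > "$CONFDIR/nginx.conf"
echo "-- dump" > "$DUMPDIR/site1.sql"
echo "<html></html>" > "$WEBROOT/index.html"

STAMP=$(date +%Y-%m-%d)
ARCHIVE=/tmp/full-backup-$STAMP.tar.gz

# Roll everything into one archive, then move it onto the SAN
tar -czf "$ARCHIVE" -C "$BASE" configs mysqldumps www
mv "$ARCHIVE" "$SANMOUNT/"

ls -l "$SANMOUNT"
```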

Second Daily Process:
This is essentially a series of separate backups, one for each individual site: another database dump plus a copy of the files, with one cron job per site that handles it all and puts the tar.gz backups on the same external SAN. I used to struggle with one particular WordPress e-commerce plugin, and this process let me roll back whenever the problems got too severe.
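As a rough illustration, one-cron-per-site could be laid out in the crontab like this (the script names and times are invented for the example):

```shell
# Hypothetical crontab: one self-contained backup script per site,
# each dumping its database and tarring its files to the SAN mount.
15 2 * * * /usr/local/bin/backup-site1.sh
30 2 * * * /usr/local/bin/backup-site2.sh
45 2 * * * /usr/local/bin/backup-site3.sh
```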

Third Daily Process:
This runs in the second twelve hours of the day, after the processes above have finished. It is a proprietary backup software solution that covers the sites, databases, and server configs, and it puts the backups in separate offsite storage.

Fourth Daily Process:
This was an encrypted sync to an AWS S3 bucket, but I disabled it a few months ago because it produced a minor error message almost every night. It did help me with a few incremental single-file restores, though.
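For anyone wanting to try something similar, an encrypted nightly push to S3 can be done with the AWS CLI's `s3 sync` command and server-side encryption; the bucket name, paths, and schedule below are made up for the example:

```shell
# Hypothetical crontab entry: nightly one-way sync of the local backup
# directory to S3, with server-side encryption (AES-256) on upload.
0 4 * * * /usr/local/bin/aws s3 sync /backup s3://example-backup-bucket/backup --sse AES256
```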

I haven't experienced a catastrophic failure in a while, but one of the first three processes above has saved me more than once. Restores tend to be done using backups from processes #1 and #2.

If you consider the total existence of my XF site, including the years it was a vB3 site, it has experienced at least ten catastrophic failures. Each time, one of the backup processes above brought it back with a loss of just a few hours of posts.
 
For the people who might find it handy, I use the script below to make a consistent backup of my MySQL databases with mysqlhotcopy. Call the script whatever you want (I just name it mysqlhotcopy.sh), put it in /usr/local/bin, give it execute rights (chmod +x /path/to/script.sh), and run it, e.g. ./mysqlhotcopy.sh
It will copy the databases and keep multiple versions before the oldest one gets deleted. Set up a cron job to run it nightly.

Code:
#!/bin/sh

# List of databases to be backed up separated by space
dblist="databasename1 databasename2"

# Directory for backups
backupdir=/backup/mysql

# Number of versions to keep
numversions=14

# Full path for MySQL hotcopy command
hotcopycmd=/usr/bin/mysqlhotcopy

# MySQL Username and password
userpassword=" --user=fill_username_here --password=your_password"

# Create directory if needed
mkdir -p ${backupdir}
if [ ! -d ${backupdir} ]
then
    echo "Invalid directory: ${backupdir}"
    exit 1
fi

# Hotcopy begins here
echo "Hotcopying MySQL Databases..."
RC=0
for database in $dblist
do
    echo "Hotcopying $database ..."
    $hotcopycmd $userpassword $database ${backupdir}
    RC=$?
    if [ $RC -gt 0 ]
    then
        break;
    fi

    # Rollover the backup directories: delete the oldest copy, then shift
    # every version up one slot (the fresh hotcopy ends up as .1)
    i=$numversions
    mv ${backupdir}/${database} ${backupdir}/${database}.0 2> /dev/null
    rm -fr ${backupdir}/${database}.$i 2> /dev/null
    while [ $i -gt 0 ]
    do
        mv ${backupdir}/${database}.`expr $i - 1` ${backupdir}/${database}.$i 2> /dev/null
        i=`expr $i - 1`
    done
done

if [ $RC -gt 0 ]
then
    echo "MySQL Hotcopy failed!"
    exit $RC 
else
    # Hotcopy is complete. List the backup versions!
    ls -l ${backupdir}
    echo "MySQL Hotcopy is complete!"
fi
exit 0

I have this in my crontab so I get an email after it runs.
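If anyone wants the same email-on-run behavior, cron mails each job's output to the address in `MAILTO`; the address and schedule below are placeholders:

```shell
# Hypothetical crontab: cron emails each job's stdout/stderr to MAILTO
MAILTO=admin@example.com
30 3 * * * /usr/local/bin/mysqlhotcopy.sh
```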
 
Being somewhat of a newbie when I first started with XenForo, I can't count how many times after installing the software, whether through my own fault or something else, I couldn't figure out a problem and had to start completely over. I usually depended on my host to do a restore, but let me tell you, you simply cannot count on that with any host. Some hosts don't do backups at all, and others do them maybe once a week at best.

After getting tired of that scenario, I purchased a third-party service that backs up twice a day and alerts me to any file changes on my site around the clock. Just the other day I was messing around, did something, and got the white screen of death; without the third-party service, and with no way of figuring it out myself, I would have been right back in the same situation of starting all over. Now whenever I have an issue, I can restore my site from any previous restore point I choose with a single click and be back in business within ten minutes. It's a very secure, peaceful feeling, since I'm still a newbie learning this stuff and will make plenty more mistakes I can't figure out. It's great peace of mind for sure.
 
My backup:
1. Data and the DB are backed up to the local HD every day, and I also transfer them to an external FTP server every night.
2. I use rsync to sync all changed data to rsync.net every night. (Really helpful guys over there.)
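A nightly rsync push like that typically boils down to a single cron entry; the username, remote path, and key file here are invented for the example:

```shell
# Hypothetical crontab: push only changed files to rsync.net over SSH.
# -a preserves permissions/times, -z compresses, --delete mirrors removals.
0 1 * * * rsync -az --delete -e "ssh -i /root/.ssh/backup_key" /backup/ user@user.rsync.net:backup/
```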
 
I do a daily backup to my home NAS with Navicat, which also backs up to CrashPlan. My host also keeps frequent backups available on the off chance I need them.
 
Is there a good way for noobs to back up the DB and home directory to the local HD on a schedule?
Those PHP scripts are too much for me; I don't know how to use them.
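Not the original poster, but for a beginner a minimal sketch like the one below may be enough: dump the database (if mysqldump is available), tar the home directory, and let cron run it nightly. Every path, database name, and credential here is a placeholder to adapt.

```shell
#!/bin/sh
# Minimal beginner backup sketch: one dated tar.gz of the home directory
# plus a database dump, written to a local backup folder.
# All names below are placeholders, not anyone's real setup.

DBNAME=example_db
HOMEDIR=${HOMEDIR:-/tmp/demo-home}   # normally something like /home/youruser
DEST=${DEST:-/tmp/demo-backups}

mkdir -p "$DEST" "$HOMEDIR"          # demo dirs so the sketch runs anywhere
STAMP=$(date +%Y-%m-%d)

# Database dump (skipped gracefully if mysqldump isn't installed here)
if command -v mysqldump >/dev/null 2>&1; then
    mysqldump --user=youruser --password=yourpass "$DBNAME" > "$DEST/$DBNAME-$STAMP.sql"
fi

# Archive the home directory
tar -czf "$DEST/home-$STAMP.tar.gz" -C "$(dirname "$HOMEDIR")" "$(basename "$HOMEDIR")"

echo "Backup written to $DEST"
```

Save it as e.g. /usr/local/bin/simple-backup.sh, `chmod +x` it, and add a crontab line such as `30 2 * * * /usr/local/bin/simple-backup.sh` to run it nightly.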
 
I have a dedicated server that runs rsnapshot and takes 4 x daily snapshots of the folders I tell it to from my hosting servers. I then have a couple of scripts in place which run from cron for the database backups, and I keep 30 rolling days worth of DB backups.
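For reference, four snapshots a day with rsnapshot is usually just a matter of cron entries like these (the interval names must match your rsnapshot.conf; mine are assumptions for the example):

```shell
# Hypothetical crontab: 4x daily rsnapshot snapshots plus a daily rotation.
0 */6 * * * /usr/bin/rsnapshot alpha
50 23 * * * /usr/bin/rsnapshot beta
```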
 
I put my backups on S3, but sometimes the process gets terminated, and I don't know why; it seems random. I think I'm going to open some tickets with DO and AWS.
 
I take a database backup every 24 hours via cron, and a file backup every 7 days. Two weeks of database backups are stored locally, and every week a cron job sends the backups on to my other server. The other server has a lot of storage, so it keeps backups for 30 days.
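Age-based retention like that 30-day window can be handled with `find`; here is a small runnable sketch on throwaway files under /tmp (in real use the directory would be your backup folder):

```shell
#!/bin/sh
# Sketch of age-based retention: delete backups older than 30 days.
# Demo files under /tmp stand in for real backup archives.

DIR=/tmp/retention-demo
mkdir -p "$DIR"

touch "$DIR/fresh.tar.gz"                     # made just now, should survive
touch -t 202001010000 "$DIR/ancient.tar.gz"   # dated Jan 2020, should be pruned

# Remove anything whose modification time is more than 30 days old
find "$DIR" -name '*.tar.gz' -mtime +30 -exec rm -f {} +

ls "$DIR"
```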
 