Any Tips or Suggestions for Backing up Database?

We will have a forum (our very first one) go live very soon, so everything is getting more and more serious. I am looking for good suggestions or tips about backing up the database. Could anyone share your experience with this particular part? It would be especially helpful if you could give a few more details about the tools/software/best practices that you use for backups.

Greatly appreciated!
 
Check with your host to see if they do it. As Brogan suggests, don't count on that one though!

For smaller databases, you can use phpMyAdmin and just dump the db.
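If you prefer the command line instead, a minimal mysqldump sketch (the database name, user, and output path here are hypothetical) would be something like:

Code:
# dump the forum database and compress it in one pass (names and paths are hypothetical)
mysqldump -u forum_user -p --single-transaction forum_db | gzip > /home/user/backups/forum_db_$(date +%F).sql.gz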

If you are handy with a little *nix, you can tar up the directory which contains the MySQL data files.
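A rough sketch of that approach, assuming the default data directory and that you stop MySQL first (or use a filesystem snapshot) so the copied files are consistent:

Code:
# stop MySQL so the data files are consistent, archive them, then restart (paths assumed)
service mysql stop
tar -czf /home/user/backups/mysql_data_$(date +%F).tar.gz /var/lib/mysql
service mysql start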

There is also a script here on XF which does backups....

How often to do it depends on exactly what you can afford to lose. If it would be no big deal to lose a couple of days of posts, then once a week (in addition to your host's daily backup) might be fine.
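Whatever schedule you settle on, cron can run it for you; a minimal sketch (the script path is hypothetical):

Code:
# run a weekly backup every Sunday at 03:00 (script path is hypothetical)
0 3 * * 0 /home/user/bin/backup_forum.sh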

Here is the script - or at least one of them
http://xenforo.com/community/resources/auto-database-backup-script.1664/
 
I've just started backing my databases up to Amazon S3 storage using a Perl script. I was previously backing them up to a second cheap VPS I had.

EDIT: My resource mentioned earlier has been changed slightly because I'm now using S3 and my NAS at home.

You'll need the Amazon::S3::FastUploader Perl module installed.

Code:
#!/usr/bin/perl -w

use strict;
use Amazon::S3::FastUploader;

my $local_dir   = '/home/XXX/YYY/databases/';  # directory holding the database dumps
my $bucket_name = 'my_database_bucket';
my $remote_dir  = '';                          # upload to the root of the bucket
my $uploader    = Amazon::S3::FastUploader->new({
    aws_access_key_id     => 'my_access_key',
    aws_secret_access_key => 'my_secret_key',
    process   => 10, # number of processes to run in parallel
    secure    => 0,  # use SSL (0 = off)
    encrypt   => 0,  # use server-side encryption (0 = off)
    retry     => 1,
    verbose   => 1,  # print log to stdout
    acl_short => 'public-read',  # private if omitted; note that public-read makes the uploads publicly readable
});

$uploader->upload($local_dir, $bucket_name, $remote_dir);
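If you'd rather not install Perl modules, the official AWS CLI can do a similar copy. A minimal sketch, assuming the CLI is installed and already configured with credentials (same paths and bucket as above):

Code:
# sync the local dump directory to the bucket, with server-side encryption
aws s3 sync /home/XXX/YYY/databases/ s3://my_database_bucket/ --sse AES256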
 
If you can, keep your latest backup off your site's server, in case disaster strikes and your server goes down.

I'd keep a backup of your xF code too.

Can you clarify a bit about keeping a backup of the xF code? Isn't everything already in the database except your avatars and attachments, which are stored on the file system?

For the stuff that is only stored on the file system, how do you usually back it up?
 
Well, if you have all of your xF files, all of your add-on files, and any third-party themes available, you may not need to back up your code. I like to keep the database and code in sync, so I back them both up.

If you're on Linux and have access to a terminal session, you can run a tar command to create an archive of your root xF folder, and then gzip it to compress it.

Something like this (I'll double check this when I'm not on mobile)

Code:
tar -cf my_code_backup.tar /location/of/your/xf/root/community
gzip my_code_backup.tar

That will create a gzipped file called my_code_backup.tar.gz that you can keep in a safe place.
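As a side note, tar can compress in the same step with the z flag, which produces the .tar.gz directly (same path assumed):

Code:
tar -czf my_code_backup.tar.gz /location/of/your/xf/root/community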
 
Matt, I have some questions I hope you can answer. Thanks.
  1. What do you do with your attachments? Are they backed up as well?
  2. How big is the database, which I assume is gzipped when sending it to Amazon?
  3. Do you keep overwriting the database on Amazon? If not, how many copies do you keep and how do you delete the older ones?
  4. How much is Amazon storage costing you?
 
  1. I back up my entire public_html directory to my NAS at home via rsync (I've also been testing doing it to S3).
  2. I use bzip2 for compression, and the database is currently ~40 MB compressed (it was larger before moving search over to Elasticsearch).
  3. No, I keep 10 rolling days' worth. On my NAS, the script has a command to delete the file matching the date 10 days previous. On S3, you can set a lifecycle rule on the bucket that automatically deletes files that haven't changed in X days (see the sketch after this list).
  4. At the minute, nothing, as I'm on their free usage tier.
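In case it's useful, here is a rough sketch of setting that kind of lifecycle rule with the AWS CLI (the bucket name and 10-day window are taken from above; the rule ID is made up):

Code:
# expire (delete) objects 10 days after they were created
aws s3api put-bucket-lifecycle-configuration \
  --bucket my_database_bucket \
  --lifecycle-configuration '{"Rules":[{"ID":"expire-old-backups","Filter":{"Prefix":""},"Status":"Enabled","Expiration":{"Days":10}}]}'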
 