Any Tips or Suggestions for Backing up Database?

Discussion in 'XenForo Questions and Support' started by Kiwi, May 23, 2013.

  1. Kiwi

    Kiwi Member

Our forum (our very first one) will go live very soon, so things are getting serious. I am looking for good suggestions or tips about backing up the database. Could anyone share your experience in this particular area? It would be especially helpful if you could give a few details about the tools, software, or best practices you use for backups.

    Greatly appreciated!
     
    Vicki likes this.
  2. Brogan

    Brogan XenForo Moderator Staff Member

    You will get many recommendations on what to use and how to do it.

    I will simply say back up at least once a day. Every day.
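A daily backup like this can be scheduled with cron. A minimal sketch, assuming a placeholder database name (`forum_db`) and backup path; adjust credentials via a `~/.my.cnf` or `--user`/`--password` as appropriate:

```shell
# Run every night at 03:30: dump the forum database and compress it.
# Add with `crontab -e`. Note % must be escaped as \% inside a crontab.
30 3 * * * mysqldump --single-transaction forum_db | gzip > /home/backups/forum_db_$(date +\%F).sql.gz
```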
     
    Kiwi likes this.
  3. craigiri

    craigiri Well-Known Member

    Check with your host to see if they do it. As Brogan suggests, don't count on that one though!

For smaller DBs, you can use phpMyAdmin and just dump the DB.

If you are handy with a little *nix, you can tar up the directory which contains the MySQL files.
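For example, the two approaches above might look something like this from a shell (database name and paths are placeholders; the raw data directory should only be copied with mysqld stopped, or from a snapshot):

```shell
# Logical backup: dump one database and compress it
mysqldump --single-transaction forum_db | gzip > forum_db.sql.gz

# File-level backup: tar up the MySQL data directory
# (only safe while mysqld is stopped)
tar -czf mysql_datadir.tar.gz /var/lib/mysql
```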

    There is a script here on XF which does backup also....

How often to do it depends on exactly what you can afford to lose. If it would be no big deal to lose a couple of days of posts, then once a week (in addition to your host's daily backup) might be fine.

    Here is the script - or at least one of them
    http://xenforo.com/community/resources/auto-database-backup-script.1664/
     
    Kiwi, Vicki, MattW and 1 other person like this.
  4. SneakyDave

    SneakyDave Well-Known Member

    If you can, keep your latest backup off your site's server, in case a disaster happens, and your server goes down.

    I'd keep a backup of your xF code too.
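One simple way to get the latest backup off the server is scp or rsync over SSH to another machine; the hostname and paths below are made up:

```shell
# Copy last night's dump to a remote machine over SSH
scp /home/backups/forum_db.sql.gz backupuser@offsite.example.com:/srv/backups/

# Or mirror the whole backup directory, transferring only changed files
rsync -avz /home/backups/ backupuser@offsite.example.com:/srv/backups/
```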
     
    Kiwi likes this.
  5. MattW

    MattW Well-Known Member

I've just started backing my databases up to Amazon S3 storage using a Perl script. I was previously backing them up to a second cheap VPS I had.

EDIT: My resource mentioned earlier has been changed slightly because I'm now using S3 and my NAS at home.

You'll need the Amazon::S3::FastUploader Perl module installed:

Code:
#!/usr/bin/perl -w
 
use Amazon::S3::FastUploader;
 
    my $local_dir = '/home/XXX/YYY/databases/';
    my $bucket_name = 'my_database_bucket';
    my $remote_dir = '';
    my $uploader = Amazon::S3::FastUploader->new({
        aws_access_key_id => 'my_access_key',
        aws_secret_access_key => 'my_secret_key',
        process => 10, # number of processes in parallel
        secure  => 0,  # set to 1 to use SSL
        encrypt => 0,  # set to 1 for server-side encryption
        retry  => 1,
        verbose => 1,  # print log to stdout
        acl_short => 'public-read',  # private if omitted
    });
 
    $uploader->upload($local_dir, $bucket_name, $remote_dir);
     
    Kiwi and SneakyDave like this.
  6. RoldanLT

    RoldanLT Well-Known Member

  7. Vicki

    Vicki Active Member

Can you clarify a bit about keeping a backup of the xF code? Isn't everything already in the database, except your avatars and attachments stored on the file system?

For the stuff stored only on the file system, how do you usually back it up?
     
    Kiwi likes this.
  8. SneakyDave

    SneakyDave Well-Known Member

Well, if you have all of your xF files, all of your addon files, and any third-party themes available, you may not need to back up your code. I like to keep the database and code in sync, so I back them both up.

    If you're on Linux and have access to a terminal session, you can run a tar command to create an archive of your root xF folder, and then gzip it to compress it.

    Something like this (I'll double check this when I'm not on mobile)

    Code:
    tar -cf my_code_backup.tar /location/of/your/xf/root/community
    gzip my_code_backup.tar
    
That will create a gzipped file called my_code_backup.tar.gz that you can keep in a safe place.
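With GNU tar the two steps can also be combined into a single command (same placeholder path as the post above):

```shell
# -c create, -z gzip-compress, -f output file name
tar -czf my_code_backup.tar.gz /location/of/your/xf/root/community
```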
     
    Vicki and Kiwi like this.
  9. 0ptima

    0ptima Well-Known Member

    Matt, I have some questions I hope you can answer. Thanks.
    1. What do you do with your attachments? Are they backed up as well?
    2. How big is the database, which I assume is gzipped when sending it to Amazon?
    3. Do you keep overwriting the database on Amazon? If not, how many copies do you keep and how do you delete the older ones?
    4. How much is Amazon storage costing you?
     
    Kiwi and SneakyDave like this.
  10. TheBoss

    TheBoss Well-Known Member

  11. MattW

    MattW Well-Known Member

1. I back up my entire public_html directory to my NAS at home via rsync (I've also been testing doing it to S3)
2. I use bzip2 for compression, and the database is currently ~40MB compressed (it was larger before moving search over to ES)
3. No, I keep 10 rolling days' worth. On my NAS, the script has a command to delete the file matching the date 10 days previous. On S3, you can set a Lifecycle rule on the bucket, where it automatically deletes files that haven't changed in X days
4. At the minute, nothing, as I'm on their free usage tier.
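A sketch of the 10-day rotation described in point 3; this version uses `find -mtime` rather than matching the dated filename, and `BACKUP_DIR` is a placeholder for your NAS mount:

```shell
#!/bin/sh
# Keep 10 rolling days of backups; delete anything older.
# BACKUP_DIR is a placeholder; point it at your real backup location.
BACKUP_DIR="${BACKUP_DIR:-./backups}"
mkdir -p "$BACKUP_DIR"
# Delete compressed dumps last modified more than 10 days ago
find "$BACKUP_DIR" -name '*.sql.bz2' -mtime +10 -delete
```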
     
    0ptima, Vicki, Kiwi and 1 other person like this.
  12. SneakyDave

    SneakyDave Well-Known Member

    If you have Windows and would like to keep a copy of your site backups on it, you can use WinSCP, an FTP client, to schedule a task to run a batch script to transfer your backup files.

    More information here:

    http://winscp.net/eng/docs/scripting
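A minimal WinSCP script file for such a scheduled task might look like this (the hostname, key, and paths are made up); save it as backup.txt and run it from Task Scheduler via `winscp.com /script=backup.txt`:

```shell
# backup.txt - run with: winscp.com /script=backup.txt
# First run may require accepting the server's host key (or pass -hostkey=...)
open sftp://backupuser@forum.example.com/ -privatekey=C:\keys\backup.ppk
get /home/backups/*.sql.gz C:\site-backups\
exit
```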
     
  13. Kiwi

    Kiwi Member

Many thanks to Brogan, craigiri, SneakyDave, MattW, RoldanLT, Vicki, 0ptima, TheBoss! I really appreciate your kind sharing. I need to take a little time to digest this. In the meantime, I hope to hear more from the XF community.
     
    whynot, MattW and SneakyDave like this.