Add-on Backup Entire XenForo Install + Database To Amazon S3

TheBigK

I just performed a quick search for this but didn't find what I'm looking for. My search skills suck, so if this add-on has already been requested or coded, please point me to the right link. If not, here's something that'd be totally awesome!

I want a XenForo backup add-on that will:
  • Create a gzipped tar of the entire XenForo installation in a separate folder on the site.
  • Upload it to the Amazon S3 service.
  • Maintain 'X' number of copies on Amazon (configurable from the AdminCP).
  • BONUS: Restore with a 'click'. This can be optional, but it'd be great if it could just fetch the gzipped tarball back into the site's folder (same or a different folder) so that we can begin the manual restoration process.
  • Send an email or alert the admin on the forum that the backup was successful.
  • Transfer money automatically from Warren Buffett's account to yours. I guess that's enough for version 1.0.
I've been using Amazon's service and they are super-duper-awesome in terms of pricing. I want my website backed up on their servers for a few dollars a month for some upgraded peace of mind. A rough sketch of the core steps is below.
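Something like this, perhaps (a minimal sketch only, not a real add-on; it assumes the AWS CLI is installed and configured, and the bucket name, paths, and retention count are all placeholders):

Code:
#!/bin/bash
# Hypothetical sketch of the requested flow -- bucket, paths and
# retention count are placeholders; assumes the AWS CLI is configured.
SITE=/var/www/xenforo
BUCKET=s3://my-backup-bucket/xenforo
KEEP=10   # the 'X' copies to retain

mkdir -p /var/backup
STAMP=$(date +%Y%m%d%H%M)
tar -czf "/var/backup/xenforo-$STAMP.tar.gz" "$SITE"

# Upload the new archive, then delete all but the newest $KEEP copies.
aws s3 cp "/var/backup/xenforo-$STAMP.tar.gz" "$BUCKET/"
aws s3 ls "$BUCKET/" | awk '{print $4}' | sort | head -n -"$KEEP" | \
  while read old; do aws s3 rm "$BUCKET/$old"; done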

Hit 'Like' if you want this add-on. (y)
 
Agreed, Walter. I have a set of scripts I use on my own server to upload my important data weekly to Glacier. Maybe you guys would like them?
 
I would be interested.

I'm currently using automysqlbackup to handle the daily/weekly/monthly backups of my DB, and then an amazon-s3-backup.sh script to copy files (including the DB backups) into an archive and upload it to S3 (as well as removing old archives automatically).
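The "removing old archives automatically" part is the least obvious piece; a sketch of how that pruning might look locally, assuming archives are staged in a directory before upload (the path and retention period are placeholders; on the S3 side a lifecycle expiration rule would do the same job):

Code:
# Hypothetical pruning step: delete staged archives older than 7 days
# before creating and uploading the new one.
find /var/backups/site -name '*.tar.gz' -mtime +7 -delete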
 
In short, it does the following:

  • Dumps MySQL to a backup directory (7-day rolling backup).
  • Zips the httpdocs directory and moves it to the backup directory (7-day rolling).
  • Uploads to Amazon Glacier weekly via a useful Java tool I found.

I don't use any automated deletion from Glacier; I remove those manually as and when I see fit.
 
Why do you use Glacier instead of S3? Doesn't S3 give you instant access to your backup in case something goes wrong?
 
I keep an FTP repository for instant access. Glacier is used as a backup to my backups in case something really bad hits the fan, as well as keeping several months of backups available for legal reasons.
 
I'd want to store backups in Glacier as well as in S3. I hope it's possible to move files from S3 to Glacier without much effort?

I'd want to keep 10 copies of backups in S3, and a monthly copy would go to Glacier. Could you share the script you've found?
 
Amazon S3 Now Supports Archiving Data to Amazon Glacier


Dear Amazon Web Services Customer,
We are pleased to introduce a new storage option for Amazon S3 that enables you to utilize Amazon Glacier’s extremely low-cost storage service for data archival. Amazon Glacier stores data for as little as $0.01 per gigabyte per month, and is optimized for data that is infrequently accessed and for which retrieval times of several hours are suitable. With the new Amazon Glacier storage option for Amazon S3, you can define rules to automatically archive sets of Amazon S3 objects to Amazon Glacier for even lower cost storage.
To store Amazon S3 objects using the Amazon Glacier storage option, you define archival rules for a set of objects in your Amazon S3 bucket, specifying a prefix and a time period. The prefix (e.g. “logs/”) identifies the object(s) subject to the rule, and the time period specifies either the number of days from object creation date (e.g. 180 days) or the specified date after which the object(s) should be archived (e.g. June 1st 2013). Going forward, any Amazon S3 standard or Reduced Redundancy Storage objects past the specified time period and having names beginning with the specified prefix are then archived to Amazon Glacier. To restore Amazon S3 data stored using the Amazon Glacier option, you first initiate a restore job using the Amazon S3 API or the Amazon S3 Management Console. Restore jobs typically complete in 3 to 5 hours. Once the job is complete, you can access your data through an Amazon S3 GET request.
You can easily configure rules to archive your Amazon S3 objects to the new Amazon Glacier storage option by opening the Amazon S3 Management Console and following these simple steps:
  1. Select the Amazon S3 bucket containing the objects that you wish to archive to Amazon Glacier.
  2. Click on “Properties”. Under the “Lifecycle” tab, click “Add rule.”
  3. Enter an object prefix in the “Object prefix:” input box. This rule is now applicable to all objects with names that start with the specified prefix.
  4. Choose whether you want to archive your objects based on the age of a given object or based on a specified date. Click the “Add Transition” button and specify the age or date value. Click the “Save” button.
The Amazon Glacier storage option for Amazon S3 is currently available in the US-Standard, US-West (N. California), US-West (Oregon), EU-West (Ireland), and Asia Pacific (Japan) Regions. You can learn more by visiting the Amazon S3 Developer Guide or joining our Dec 12 webinar.
Sincerely,
The Amazon S3 Team
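For reference, the rule and the restore job described above map to two API calls; a sketch using the AWS CLI, where the bucket name, prefix, and object key are placeholders:

Code:
# Hypothetical example -- bucket, prefix, and key are placeholders.
# Archive everything under backups/ to Glacier 30 days after creation:
aws s3api put-bucket-lifecycle-configuration --bucket my-backup-bucket \
  --lifecycle-configuration '{
    "Rules": [{
      "ID": "archive-backups",
      "Filter": { "Prefix": "backups/" },
      "Status": "Enabled",
      "Transitions": [{ "Days": 30, "StorageClass": "GLACIER" }]
    }]
  }'

# Later, initiate a restore job (completes in hours, per the mail above),
# keeping the restored copy readable for 7 days:
aws s3api restore-object --bucket my-backup-bucket \
  --key backups/site.tar.gz --restore-request '{"Days": 7}'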
 
Yeah, got the mail. Can someone quickly share a script that lets us back up our web directories to S3? :)
 
Slavik, can you please share the scripts? Thank you.
 
Did you get a chance to put the scripts together yet, Slavik?

Been working on this: http://xenforo.com/community/threads/a-r-f-i-a-really-fast-vb4-importer-paid.43779/page-2 so haven't really had time.


Here's what you'll need to do, though (I've added comments to help):

Get https://github.com/MoriTanosuke/glacieruploader

Make a cron job to run the script below every day/week etc. as needed.

Code:
#!/bin/bash
#Example Backup Script to upload website to Amazon Glacier by Slavik at XenForo.com
#May be re-distributed if above credits left intact

suffix=$(date +%w%a) #e.g. "1Mon"; gives a 7-day rolling set of archive names

rm -Rf /var/backup/* #remove all files in the backup directory

mysqldump -h localhost -uusername -ppassword databasename > /var/backup/database.sql #dump your database, copy this line as many times as needed

cp -R /location/to/files /var/backup/files #copies all files from your main directory e.g. httpdocs to the backup directory, copy as many times as needed

tar -cvf /var/$suffix.tar /var/backup/* #tars everything into a file with permissions etc. intact

mv /var/$suffix.tar /var/backup #move the archive into the backup directory before upload

java -jar /var/glacieruploader.jar --endpoint https://glacier.eu-west-1.amazonaws.com --vault vaultname --upload /var/backup/$suffix.tar
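
For the cron job mentioned above, a weekly entry might look like this (the script path and log file are hypothetical):

Code:
# Hypothetical crontab line: run the backup every Sunday at 03:00 and
# append all output to a log file.
0 3 * * 0 /root/glacier-backup.sh >> /var/log/glacier-backup.log 2>&1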
 