Amazon Glacier for XenForo server backup

Andy.N

Well-known member
Hi guys,
I got an email from Amazon this morning about their new service called Glacier. At $0.01 per gigabyte per month, I think it's pretty cheap to do daily backups of our XenForo server.

Do we have any tool that can auto backup our server directly to Glacier?

We are excited to announce the immediate availability of Amazon Glacier – a secure, reliable and extremely low cost storage service designed for data archiving and backup. Amazon Glacier is designed for data that is infrequently accessed, yet still important to retain for future reference. Examples include digital media archives, financial and healthcare records, raw genomic sequence data, long-term database backups, and data that must be retained for regulatory compliance. With Amazon Glacier, customers can reliably and durably store large or small amounts of data for as little as $0.01/GB/month. As with all Amazon Web Services, you pay only for what you use, and there are no up-front expenses or long-term commitments.

Amazon Glacier is:
  • Low cost – Amazon Glacier is an extremely low-cost, pay-as-you-go storage service that can cost as little as $0.01 per gigabyte per month, irrespective of how much data you store.
  • Secure – Amazon Glacier supports secure transfer of your data over Secure Sockets Layer (SSL) and automatically stores data encrypted at rest using Advanced Encryption Standard (AES) 256, a secure symmetric-key encryption standard using 256-bit encryption keys.
  • Durable – Amazon Glacier is designed to provide average annual durability of 99.999999999% for each item stored.
  • Flexible – Amazon Glacier scales to meet your growing and often unpredictable storage requirements. There is no limit to the amount of data you can store in the service.
  • Simple – Amazon Glacier allows you to offload the administrative burdens of operating and scaling archival storage to AWS, and makes long-term data archiving especially simple. You no longer need to worry about capacity planning, hardware provisioning, data replication, hardware failure detection and repair, or time-consuming hardware migrations.
  • Designed for use with other Amazon Web Services – You can use AWS Import/Export to accelerate moving large amounts of data into Amazon Glacier using portable storage devices for transport. In the coming months, Amazon Simple Storage Service (Amazon S3) plans to introduce an option that will allow you to seamlessly move data between Amazon S3 and Amazon Glacier using data lifecycle policies.
Amazon Glacier is currently available in the US-East (N. Virginia), US-West (N. California), US-West (Oregon), EU-West (Ireland), and Asia Pacific (Japan) Regions.

A few clicks in the AWS Management Console are all it takes to setup Amazon Glacier. You can learn more by visiting the Amazon Glacier detail page, reading Jeff Barr’s blog post, or joining our September 19th webinar.
 
Wonder if I could get my Linode server hooked into this. It would be a WAY cheaper backup than what Linode is offering.
 
Would love to use this to back up my clients' data.

From what I can see, you have to use Amazon's Java or C APIs to connect to it; there are no legacy connections such as FTP?
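For anyone poking at the raw API: every Glacier upload has to carry a SHA-256 "tree hash" computed over 1 MiB chunks of the archive, which is part of why plain FTP-style access doesn't fit. A minimal sketch of that hash in Python (no AWS libraries needed for the hash itself; this is just the checksum step, not a full upload client):

```python
import hashlib

CHUNK = 1024 * 1024  # Glacier tree hashes are built from 1 MiB leaves


def tree_hash(data: bytes) -> str:
    """SHA-256 tree hash as described in the Glacier upload API docs."""
    # One SHA-256 digest per 1 MiB chunk (empty data hashes as one empty leaf).
    leaves = [hashlib.sha256(data[i:i + CHUNK]).digest()
              for i in range(0, len(data), CHUNK)] or [hashlib.sha256(b"").digest()]
    # Pairwise combine digests until a single root remains.
    while len(leaves) > 1:
        paired = []
        for i in range(0, len(leaves), 2):
            if i + 1 < len(leaves):
                paired.append(hashlib.sha256(leaves[i] + leaves[i + 1]).digest())
            else:
                paired.append(leaves[i])  # an odd leaf is promoted unchanged
        leaves = paired
    return leaves[0].hex()
```

For anything under 1 MiB the tree hash is just the plain SHA-256 of the data, which makes it easy to sanity-check.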
 
Only Amazon with their MASSIVE scale can provide this kind of price. I have moved a lot of my site services to them (SES for email, Cloudfront for CDN, S3 for file hosting, etc).
It only makes sense not to use our hosting provider's expensive services. They can't compete with Amazon on price or scale.
 
Once Glacier works with tools like rsync or csync2, it will be a valuable commodity. Until then, I will stick with local backup solutions.
 
Just note that you will be charged if you delete data too early (deletion is free once the data is at least 3 months old). Also, I believe I've read somewhere that the average retrieval time is 3.5–4 hours. Still a good deal for some use cases, but not for everybody.
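To put that early-deletion charge in numbers: assuming the fee is prorated over the remainder of the roughly 90-day minimum at the announced $0.01/GB-month rate (the exact proration scheme here is my assumption, not something from the announcement), a quick back-of-envelope:

```python
STORAGE_RATE = 0.01   # USD per GB-month, per the announcement
MIN_STORAGE_DAYS = 90  # roughly the "3 months" noted above


def early_delete_fee(gb: float, days_stored: int) -> float:
    """Assumed prorated fee for the unserved remainder of the 90-day minimum."""
    remaining_days = max(0, MIN_STORAGE_DAYS - days_stored)
    # Charge the storage rate for the remaining fraction of the window
    # (treating a month as 30 days for simplicity).
    return round(gb * STORAGE_RATE * remaining_days / 30, 2)
```

So deleting 100 GB after only a month would cost on the order of a couple of dollars, not a dealbreaker, but worth knowing before rotating daily backups through Glacier.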
 
This is what I do now for one site: incremental backups on a six-disk ZFS RAID-Z2 NAS for physical files (about 200GB) and full backups for the MySQL database (10GB). I use rsync in archive mode to sync the physical files (PHP files, attachments, etc.), so only new or changed files are transferred, and do mydumper snapshots every 24hrs for MySQL, keeping each dump for 2 weeks. Technically, I can go back in time up to 14 days. It would be interesting to see if it is possible to replicate the same behavior with Glacier.
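The 14-day retention part of that setup is easy to script independently of the storage backend. A hedged sketch (the file naming and directory are made up, not from the setup above) that prunes dumps older than two weeks by modification time:

```python
import os
import time

RETENTION_DAYS = 14  # matches the two-week window described above


def prune_old_dumps(dump_dir, now=None):
    """Delete dump files older than the retention window; return removed names."""
    now = time.time() if now is None else now
    cutoff = now - RETENTION_DAYS * 86400
    removed = []
    for name in sorted(os.listdir(dump_dir)):
        path = os.path.join(dump_dir, name)
        # Only prune regular files whose mtime falls before the cutoff.
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return removed
```

Run nightly from cron after the mydumper snapshot, this keeps exactly the rolling two-week history regardless of whether the dumps land on a NAS or get shipped to Glacier.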
 
Glacier is OK to store your monthly snapshot backups, but I wouldn't do much else. The time delay and transfer speed in data retrieval is just too large if you need to get something back, and it will seem like an eternity if you need to rely on it for important/disaster recovery purposes.

I've been using S3 for offsite backups for almost 12 months, and it's fantastic. I use AutoMySQLBackup to handle my daily/weekly/monthly full and incremental database dumps, and then use a nightly cron script (with the mysqldump line removed, since AutoMySQLBackup handles this) to upload my server contents to Amazon S3. The upload script is basically a short and simple wrapper for the duplicity backup tool.
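For anyone curious what such a wrapper amounts to: duplicity takes an `s3+http://bucket` target URL, so the script mostly just assembles a command line and runs it on a schedule. A sketch that only builds the command (the paths and bucket name are illustrative, and AWS credentials are assumed to come from the usual environment variables):

```python
def duplicity_cmd(source, bucket, full=False):
    """Build a duplicity command line for an S3 target (names are illustrative)."""
    cmd = ["duplicity"]
    if full:
        cmd.append("full")  # force a full backup instead of an incremental
    cmd += [source, "s3+http://" + bucket]
    return cmd
```

With no `full` argument, duplicity does an incremental against the last backup chain, which is what makes the nightly cron run cheap on bandwidth.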

Paying approx. $22 USD per month (~$17 for storage and ~$5 for data transfer of ~100 GB per month).
 