Off server storage for backups - what do you use?

webbouk

Well-known member
I'm currently running two nightly backups, both using JetBackup - one backs up all cPanel accounts to the server itself, and the other backs up the same data off server to Amazon AWS, just as insurance should the server ever fail.
Both backups are set to store daily for seven days and then overwrite, leaving rolling seven-day backups.
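A rolling retention window like that can be sketched in a few lines (a hypothetical illustration of the pruning logic, not JetBackup's actual implementation):

```python
from datetime import date, timedelta

def backups_to_delete(backup_dates, today, keep_days=7):
    """Return backup dates that fall outside the rolling retention window."""
    # Keep today's backup plus the previous keep_days - 1 nightly backups.
    cutoff = today - timedelta(days=keep_days - 1)
    return sorted(d for d in backup_dates if d < cutoff)

# Ten nightly backups ending today; only the oldest three fall out of the window.
today = date(2021, 1, 10)
dates = [today - timedelta(days=n) for n in range(10)]
stale = backups_to_delete(dates, today)
```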

However, the AWS free tier only allows 2,000 PUT, COPY, POST, or LIST requests for Amazon S3, and I've exceeded that in less than two days, so I'm starting to accrue charges.

What other options are available for large website/forum backups off server, preferably free or cheap? :)
 
For what AWS S3 offers, it's relatively cheap already - especially if you know how to properly utilise S3 bucket lifecycle management and the Standard-Infrequent Access storage class's reduced pricing.

I write my own custom backup scripts that back up locally + remotely, and for remote I use AWS S3 - ~US$35-45/month for up to 4.6+ TB of stored data. Backblaze's S3-compatible storage will be cheaper if you don't mind lower transfer speeds when you're in a location that isn't close to Backblaze's servers.

Other S3-compatible providers include Linode, DigitalOcean, and Wasabi.

FYI, AWS S3 requests are relatively cheap too - here's last month's bill for requests for just one AWS region's S3 buckets:

[Attachment: screenshot of last month's S3 request charges]
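For reference, the lifecycle management mentioned above boils down to a small rule set. Here's a minimal sketch in the shape boto3 expects - the bucket name, prefix, and day thresholds are made-up examples, not recommendations:

```python
# Hypothetical rules: move objects under backups/ to Standard-IA after 30 days
# (S3's minimum age for that transition) and delete them after 90 days.
LIFECYCLE = {
    "Rules": [
        {
            "ID": "backup-rotation",
            "Status": "Enabled",
            "Filter": {"Prefix": "backups/"},
            "Transitions": [{"Days": 30, "StorageClass": "STANDARD_IA"}],
            "Expiration": {"Days": 90},
        }
    ]
}

# Applying it needs AWS credentials, so it's left commented out here:
# import boto3
# boto3.client("s3").put_bucket_lifecycle_configuration(
#     Bucket="my-backup-bucket", LifecycleConfiguration=LIFECYCLE)
```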
 

So how much storage have you got?
 
Not bad, but what if you want to migrate away from them?
Yeah, that would be costly. IIRC, Backblaze had a Black Friday special covering transfer costs on their end for AWS S3 migrations, but that deal has probably ended now. If it's just backup sets, it's probably cheaper to switch your backup scripts to the new provider, change the AWS S3 bucket lifecycle management to a shorter deletion threshold so the backups stored in AWS S3 are purged over time, and maybe migrate just the last XX days of AWS S3 backups to the new provider instead of all of them.

I usually keep a local backup copy for at least 14-30 days, so it's easier and cheaper to transfer those to the new provider instead of transferring from AWS S3.
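Selecting only the most recent backup sets for migration could look something like this - the DDMMYY-HHMMSS filename pattern is borrowed from the tar listings later in the thread and is an assumption, not a universal format:

```python
import re
from datetime import datetime, timedelta

def recent_backups(filenames, now, days=14):
    """Keep filenames whose embedded DDMMYY-HHMMSS stamp is within `days` of now."""
    stamp = re.compile(r"_(\d{6})-\d{6}\.tar")  # e.g. ..._020121-082553.tar.gz
    cutoff = now - timedelta(days=days)
    keep = []
    for name in filenames:
        m = stamp.search(name)
        if m and datetime.strptime(m.group(1), "%d%m%y") >= cutoff:
            keep.append(name)
    return keep

# Only the January set falls inside a 14-day window ending 10 Jan 2021.
names = ["mybackup_public_020121-082553.tar.gz",
         "mybackup_public_011220-082553.tar.gz"]
recent = recent_backups(names, datetime(2021, 1, 10))
```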
 

I was thinking about AWS, but in my opinion it could be cheaper and less of a headache to have your own big hosting plan with lots of storage.
 

You lose the high availability and high redundancy in the case of storage failures - that is AWS S3's key benefit.
 
If occupied storage space is an issue, you can always - at the expense of higher CPU/memory usage during backups - compress with better compression algorithms that result in smaller backup sizes. See my compression algorithm comparison benchmarks for compression speed vs compression ratio at https://community.centminmod.com/th...d-vs-brotli-vs-pigz-vs-bzip2-vs-xz-etc.18669/

With zstd compression, you could improve the compression ratio even more with zstd data dictionary training!

pigz is multi-threaded gzip, and if you want faster and smaller compressed sizes than gzip, look at zstd compression or even lbzip2, which is multi-threaded bzip2.

[Attachment: compression speed vs ratio benchmark chart]
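zstd isn't in the Python standard library, but the speed-vs-ratio tradeoff is easy to see with the stdlib's gzip, bzip2, and xz bindings - a rough sketch on synthetic, highly repetitive data (real backup tarballs will behave differently):

```python
import bz2
import gzip
import lzma
import time

# Repetitive sample data; log-like text compresses far better than typical binaries.
data = b"GET /forum/threads/backup-thread.12345/ HTTP/1.1\n" * 20000

results = {}
for name, compress in [("gzip", gzip.compress),
                       ("bzip2", bz2.compress),
                       ("xz", lzma.compress)]:
    start = time.perf_counter()
    out = compress(data)
    results[name] = (len(out), time.perf_counter() - start)

for name, (size, secs) in results.items():
    print(f"{name:6} {size:>8} bytes in {secs:.3f}s")
```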
 

I can see you are on OVH too? I'm planning to migrate in the future to a bigger host with lots of storage, or some other option.
 
I only use OVH for my dedicated Centmin Mod projects dev server, where I do testing/development for Centmin Mod. I don't use OVH for any production stuff :)
 

Just ran one of my backup scripts testing pigz/gzip level 5 (42MB) vs zstd level 4 (39MB) vs zstd level 9 (33MB) vs zstd level 19 (29MB). Look at the resulting compressed tar sizes for the site's public web root, which is 208MB uncompressed:

Code:
Jan 2   08:26   42M    mybackup_public_020121-082553.tar.gz
Jan 2   08:27   39M    mybackup_public_020121-082655.tar.zst
Jan 2   08:27   33M    mybackup_public_020121-082728.tar.zst
Jan 2   08:30   29M    mybackup_public_020121-082940.tar.zst
 
Here's our billing for this month (new account), one backup run...

[Attachment: screenshot of this month's S3 billing]
 
I use Acronis for off-server backups. It's what my host uses for remote backups, and it's bundled with my server.

I use their product for backups on my PC as well. They both seem to work well.
 
I simply use scp to transfer data from my db- and webserver to my personal NAS.

My servers also create snapshots every x hours.
 
I have:
1. 500GB NVMe - system & forum installed on it, to speed things up & run the forum fast
2. 14TB HDD installed as internal data; all attachments go there. I have 7TB of files
3. 14TB HDD for backups, automatically every Sunday

So, forum & files are divided - minimal risk of losing data.
All files & settings are under my control!

I don't trust cloud hosting
 
What hosting?
 