
Backup everything into the cloud, forever???

Discussion in 'Off Topic' started by =MGN=RedEagle, Feb 6, 2013.

  1. =MGN=RedEagle

    =MGN=RedEagle Well-Known Member

    What do you guys make of this: http://www.bitcasa.com/

    Then a guy like me comes along and stores 6 million files... lol.
  2. Jake Bunce

    Jake Bunce XenForo Moderator Staff Member

    I have an external hard drive.
  3. =MGN=RedEagle

    =MGN=RedEagle Well-Known Member

    Try backing up 6 million files to it... takes forever... then they get stolen :S
  4. Sim

    Sim Well-Known Member

    I just hope people don't store their only copy of files there ... $10 per month (or less) is not a sustainable business model for "unlimited". Make sure you can still access your data once the music stops playing.

    Compare with Amazon's Glacier @ $0.01 per GB per month ... for $10, you can store 1TB on Glacier.

    Then again, I guess if you assume that Bitcasa has a similar cost base such that $0.01/GB/m is profitable, then they only need to hope that on average, their users will store less than 1TB each and still make money on $10pm.
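    The cost comparison is easy to sanity-check. A minimal sketch (the $0.01/GB/month Glacier price and $10/month flat fee are the figures quoted above):

    ```shell
    # Back-of-the-envelope: at Glacier's $0.01/GB/month, how many GB
    # does a flat $10/month buy? That's also the break-even point for a
    # $10/month "unlimited" service with a similar cost base.
    awk 'BEGIN { print 10 / 0.01, "GB" }'   # prints: 1000 GB, i.e. ~1TB
    ```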

    Realistically though, other than for automated processes, actively using and managing > 1TB in the cloud is generally unrealistic unless you have a Gbit internet link and can sustain high data transfer rates all the way to their data centre - effectively making it LAN-like. Anything less is just too slow for anything other than backups or small files.
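    To put numbers on that: even with a perfectly sustained link, moving 1TB takes hours. A rough calculation (assumes ideal sustained throughput with no protocol overhead):

    ```shell
    # Time to transfer 1TB (8e12 bits) at various sustained link speeds
    for mbps in 100 1000; do
      awk -v m=$mbps 'BEGIN { printf "%d Mbps: %.1f hours\n", m, 8e12 / (m * 1e6) / 3600 }'
    done
    # prints:
    # 100 Mbps: 22.2 hours
    # 1000 Mbps: 2.2 hours
    ```

    So even a full Gbit link needs over 2 hours per terabyte, which is why anything short of that is only practical for backups or small files.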
  5. ManagerJosh

    ManagerJosh Well-Known Member

    What is your purpose of storing everything in the cloud?
  6. Andy.N

    Andy.N Well-Known Member

    What do you suggest for automatic daily backup of servers/database/files then?
  7. Mouth

    Mouth Well-Known Member

    Andy.N likes this.
  8. Andy.N

    Andy.N Well-Known Member

  9. ManagerJosh

    ManagerJosh Well-Known Member

    Folks, don't backup data to the cloud just because you can. Please remember to evaluate your data and make sure you're not storing anything in plain text, including, but not limited to:
    • Passwords
    • Credit card numbers
    • Driver's license numbers
    If you are backing up this kind of data, please be sure you're encrypting your backups extremely well with AES or something else difficult :D
    Adam Howard likes this.
  10. Sim

    Sim Well-Known Member

    All I was suggesting was to make sure you use a service you are sure has a sustainable business plan. If you are happy with Bitcasa and are sure you can retrieve your files in the future (and they have a useable API, which I don't think they have published yet?), then fine. I have no experience with them, so can't comment on the quality of their service.

    I run a multi-tier automated backup strategy for my websites (currently about 50 sites, mostly smallish WordPress sites, plus several 500K post forums plus a 1M post forum):
    • I run Linode VPSes, which I have backed up daily using their built-in backup service; it's crude, but effective.
    • I then run daily database dumps, which are compressed and stored on the VPS, plus I make daily compressed archives of the files for each website, again stored on the VPS. These are deleted after 7 days.
    • I upload a copy of the databases and files to Amazon S3 nightly. These files are kept for 90 days. One day I'll set up automatic archiving of those old databases and files to Amazon Glacier to reduce costs and keep them for longer. I currently use about 135GB of RRS on S3 for these backups.
    • Finally, I download a copy of the databases and files to my local file server nightly. These files are kept indefinitely. Currently, these backups consume around 700GB on my file server - they only go back 2 years, earlier data was lost in the "Great Windows Home Server Disaster of 2011".
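    The dump-and-prune part of the routine above could be sketched as a nightly cron script along these lines (a hypothetical sketch only: the database name, paths and S3 bucket are placeholder assumptions, not the actual setup):

    ```shell
    #!/bin/sh
    # Hypothetical nightly backup sketch: dump the database, archive the
    # site files, prune local copies older than 7 days, push a copy to S3.
    # All names below are placeholders.
    BACKUP_DIR=/var/backups/sites
    STAMP=$(date +%Y%m%d)
    mkdir -p "$BACKUP_DIR"

    # 1. Compressed database dump, kept on the VPS
    mysqldump --single-transaction mysite_db | gzip > "$BACKUP_DIR/mysite_db-$STAMP.sql.gz"

    # 2. Compressed archive of the site's files, also kept on the VPS
    tar czf "$BACKUP_DIR/mysite_files-$STAMP.tar.gz" /var/www/mysite

    # 3. Delete local copies older than 7 days
    find "$BACKUP_DIR" -type f -mtime +7 -delete

    # 4. Off-site copy to S3 (the 90-day retention would be a bucket
    #    lifecycle rule, not shown here)
    aws s3 cp "$BACKUP_DIR/mysite_db-$STAMP.sql.gz" s3://example-backup-bucket/mysite/
    aws s3 cp "$BACKUP_DIR/mysite_files-$STAMP.tar.gz" s3://example-backup-bucket/mysite/
    ```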
    For my own computers, I run Crashplan+ and have automated backups to both my local file server and Crashplan's cloud storage. I have a family plan, so I have my 3 computers, 2 from my parents, plus my sister, parents-in-law and sister-in-law all backing up to my Crashplan server + Crashplan cloud. Note that this only backs up key data files. Currently my local Crashplan server uses 360GB of space for all these backups. My Crashplan cloud backup usage across all computers is currently just under 2.1TB (of which 1.7TB is my photo/video collection, which took about 6 months to back up!).

    I also do bare-metal backups of all my local computers (including my media server) to my file server nightly using Acronis.

    My file server runs unRAID and has two sets of drive pools - one for data, the other for backups. I mirror the data pool to the backup pool nightly, so in addition to the built-in redundancy from the parity drive, I always have at least 2 copies of all data files stored on the file server.

    My next plan is to set up a cheap file server at my sister-in-law's place across town (once I can convince her to change ISPs) and run another Crashplan server on it so I can back up my local computers offsite (but physically nearby), so I can quickly restore data in a worst-case scenario (total loss of all computers and servers in my office). I do occasionally copy data to several 2TB drives and drive it over to my sister-in-law's house.

    There is a worser-case-scenario whereby both my house and my sister-in-law's house are destroyed at the same time, but that probably requires a nuclear explosion or asteroid impact, so I'm guessing my data will be the least of my concerns. But even then, I have copies of my most important data stored in the Cloud anyway!

    I do like Crashplan's ability to choose multiple backup destinations (local + nearby + cloud) ... when dealing with such large volumes of data, proximity to that data is important if you ever want to be able to restore it in a reasonable amount of time.

    I can transfer 4TB of data from my sister-in-law's house to my house in about 1 hour (return trip by car!). That's nearly 10Gbps transfer rate! What's more, this scales infinitely, just add more drives. If I added 2 more 2TB drives, I could transfer 8TB in the same time: 20Gbps transfer rate. Even our yet-to-be-built Gbps-capable National Broadband Network here in Australia can't compete with those transfer rates ;)
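    The "car as a network link" figure checks out; the effective bandwidth is just data volume over trip time:

    ```shell
    # Sneakernet bandwidth: N terabytes driven across town in T hours
    awk -v tb=4 -v hours=1 'BEGIN { printf "%.1f Gbps\n", tb * 8e12 / (hours * 3600) / 1e9 }'
    # prints: 8.9 Gbps
    # Doubling the drives doubles the terabytes carried, and hence the
    # effective bandwidth, for the same one-hour trip.
    ```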
    Dinh Thanh and Andy.N like this.
  11. Mouth

    Mouth Well-Known Member

  12. Adam Howard

    Adam Howard Well-Known Member

    <--- Does not depend on Cloud Storage to back up important info.

    Although I may sometimes use it as a 5th or 6th alternative... i.e. having 1-5 alternatives before it. And when I do use the cloud, the minimum security I use is 2048-bit encryption (twice over), then zipped with a password, encrypted again, and finally rar'd with a password.
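    As a rough illustration of layering encryption before anything touches the cloud (not the exact scheme described, which isn't specified; this substitutes AES-256 via openssl, and the filenames and passphrases are placeholders):

    ```shell
    # Layer 1: symmetric encryption of the backup archive.
    # Passphrase and filenames are placeholders for illustration only.
    openssl enc -aes-256-cbc -pbkdf2 -salt \
        -in backup.tar.gz -out backup.tar.gz.enc \
        -pass pass:placeholder-passphrase

    # Layer 2: wrap the encrypted blob in a password-protected zip
    zip -P placeholder-zip-password backup.zip backup.tar.gz.enc

    # Only backup.zip goes to the cloud; keep every passphrase offline.
    ```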

    Sensitive stuff never makes it online, though (ever).
  13. Sim

    Sim Well-Known Member

    So, I assume this would also apply to your website database backups? For security?

    How about the original database itself on your MySQL servers? Do you also use 2048-bit encryption (twice over) on that, then zip it with a password, encrypt it again and then rar it with a password? Makes it a little bit difficult for PHP to read, doesn't it?

    Personally, I think your MySQL server is far more vulnerable than most cloud backup solutions.
    Markos and SneakyDave like this.
  14. Adam Howard

    Adam Howard Well-Known Member

    Two different topics altogether. The OP was talking about personal cloud backups, nothing to do with sites. At least that is what is being compared when you visit that link.

    As for our site.... We have a few steps to keep it secure. If person X got hold of it and tried to import it elsewhere.... It would be useless to them & odds are high they would only have part of it. We also make it a habit to keep the database elsewhere. If you visit SociallyUncensored.eu the database isn't there (only files).
  15. Mouth

    Mouth Well-Known Member

    How does this provide any level of protection? If a hacker gains file-level access to your webserver, they only need to read library/config.php to see the location and account details to access your DB. And if they have file-level access to the webserver, then they are also most likely able to connect to the DB server and copy/dump its contents.
    SneakyDave likes this.
  16. Adam Howard

    Adam Howard Well-Known Member

    We have steps that I'll not talk about to prevent that. ;)
