Multiple Databases?

I currently have a large XenForo site with a database around 1.5 GB. I have a dedicated server, but I would like to save some money each month by moving to a web hosting setup. The only issue I am running into is that the web hosting companies all advertise unlimited databases BUT limit each database to 1 GB. Is it possible to split a XenForo install across multiple databases?
 
Almost every web hosting company I have checked, from GoDaddy to 1and1.com (I have checked at least 15 different companies), has database size limits but allows unlimited databases. So I could have a thousand 1 GB databases, but not a single database that is 1,000 GB.
 
I currently have a dedicated server that costs $99.99 a month and that I don't use fully. Sadly, staying on dedicated hosting isn't a solution to the cost problem. I will most likely have to move to a VPS to get the cost savings while still having enough system resources to handle my site.
 
I vote VPS too - far more flexible than shared hosting, but also more configurable than dedicated, in that you can generally choose a smaller/cheaper configuration on a VPS than you could on dedicated and save quite a bit of money.
 
I would personally like to archive older posts in a separate database, as a read-only sort of thing.

Why a separate database? It makes it far more difficult to manage the data once it's spread across multiple databases.

Why bother archiving them if you can't access them any more? Either delete them or leave them be.

If saving disk space is your primary concern, I fear you may be doing it wrong.
 
It would be accessible in a read-only format on the web for anyone searching ... the smaller DB would be easier for those of us who like to download a local copy of the DB.

archived data = database 1 (1 GB)
current data = database 2 (200 MB)
 
Compress them first - you'll probably find you get close to 3:1 compression on database dumps. I have a 1 GB database which compresses to a 340 MB file.
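
If you're doing it by hand, the compress step is trivial to script. A minimal Python sketch, assuming you already have a plain-text dump sitting on disk (the file names here are placeholders for illustration):

```python
import gzip
import shutil

# Compress an existing SQL dump; gzip typically gets around 3:1 on SQL text.
# "forum_dump.sql" is a placeholder file name.
with open("forum_dump.sql", "rb") as src, gzip.open("forum_dump.sql.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)
```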

Why do you want to download a local copy anyway? For backup purposes? Try setting up an automated routine to dump your database to disk, compress it, then archive it to Amazon S3.

It's fully automated so it doesn't take any of your time, it's independent of any local internet download speeds or quotas you might have, and since the transfer is server-to-server, it is likely much faster than downloading anyway. You can then pull down any copy you want from S3 when you need it.
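
Here's a rough sketch of that kind of routine in Python, assuming mysqldump is on the PATH and boto3 is installed with AWS credentials already configured; the database name and bucket name below are placeholders, not anything specific:

```python
import gzip
import shutil
import subprocess
from datetime import datetime, timezone

import boto3  # assumes boto3 is installed and AWS credentials are configured

DB_NAME = "forum"            # placeholder database name
BUCKET = "my-forum-backups"  # placeholder S3 bucket name

stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
gz_path = f"{DB_NAME}-{stamp}.sql.gz"

# Dump the database and stream the output straight into a gzip file,
# so the uncompressed dump never has to fit on disk.
proc = subprocess.Popen(
    ["mysqldump", "--single-transaction", DB_NAME],
    stdout=subprocess.PIPE,
)
with gzip.open(gz_path, "wb") as dst:
    shutil.copyfileobj(proc.stdout, dst)
if proc.wait() != 0:
    raise RuntimeError("mysqldump failed")

# Upload server-to-server to S3.
boto3.client("s3").upload_file(gz_path, BUCKET, f"backups/{gz_path}")
```

Drop something like that into cron and the whole thing runs unattended.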

I have several hundred gigabytes' worth of database backups on S3 costing less than $10 per month to store ... this could be cheaper still if I started using Glacier for archiving.
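
The Glacier part can be a simple lifecycle rule on the bucket, so old backups migrate automatically. A sketch using boto3 - the bucket name, prefix, and 30-day cutoff are placeholders, not a recommendation:

```python
import boto3

# Transition objects under backups/ to Glacier after 30 days.
# Bucket name, prefix, and the 30-day threshold are placeholders.
boto3.client("s3").put_bucket_lifecycle_configuration(
    Bucket="my-forum-backups",
    LifecycleConfiguration={
        "Rules": [{
            "ID": "archive-old-backups",
            "Filter": {"Prefix": "backups/"},
            "Status": "Enabled",
            "Transitions": [{"Days": 30, "StorageClass": "GLACIER"}],
        }]
    },
)
```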
 