Database Backup..

FredC

Well-known member
My server is copying the database nightly at around 3 AM for a backup and dumping it I don't know where. The problem (besides not knowing where the backups are being stored) is that this puts quite a strain on my server and causes a half dozen server errors every single night, sometimes more depending on the activity levels on the forum at that time of night.

So I'd be interested in knowing my options for backing up my DB in a way that's less resource intensive.

I know it would help if you knew how I was currently backing up my data, but I don't really know that either.

I am on a dedicated Apache CentOS 6.5 x86_64 standard server with WHM/cPanel.

These backups have rendered my forum 90% useless for an hour each night and have destroyed my international traffic. I really need a better solution, preferably free!

HELP!!
 

Do you know at least how you are backing up your DB? From a control panel? With a shell script?
 
Unfortunately, no.

I know more about unicorns than I do servers. Once upon a time we hired a young guy to manage our server for us. He was actually a member of my forum and worked for the company that hosted my site, and he did all the server maintenance and then some. Unfortunately for me (and good for him) he wound up getting work with Facebook and hasn't been heard from since. That was at least a year ago.
 
Does cPanel have the ability to do a full backup? I haven't played with it that much (I just got a license for it a few days ago but haven't set up a VPS to install it on yet).
If so, that could be part of the problem.
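One way to check whether cPanel's own backup system is the culprit: on a WHM/cPanel box of that generation, the legacy backup settings are usually in /etc/cpbackup.conf and the default destination is /backup. Both paths are assumptions about that cPanel version, so treat this as a sketch to poke around with:

```shell
# Show where cPanel's legacy backup system is configured to write
# (the BACKUPDIR key and the file path are assumptions for older cPanel)
grep -i 'BACKUPDIR' /etc/cpbackup.conf 2>/dev/null || echo "no legacy cpbackup.conf found"

# /backup is the usual default destination -- see what's sitting in there
ls -lh /backup 2>/dev/null || true
```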
 
@FredC
You can launch a search for files bigger than a certain size; that way you'll find where the backups are stored.
Or
take a look at the cron entries: if the guy created a custom script to run the backup process, you'll find it there.
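The two checks above can be done from a shell like this (the 100 MB threshold is just a guess to adjust for your database size):

```shell
# Look for large files (>100 MB) modified within the last day --
# a nightly dump will usually show up here
find / -xdev -type f -size +100M -mtime -1 2>/dev/null

# Check the places a scheduled backup job could be hiding
crontab -l 2>/dev/null || true          # root's personal crontab
cat /etc/crontab 2>/dev/null || true    # system-wide crontab
ls /etc/cron.d /etc/cron.daily 2>/dev/null || true
```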
 
If you're just wanting to do a database backup, you could try this add-on: https://xenforo.com/community/resources/sypex-dumper-backup-and-restore-your-database.1102/
You can then back up your database directly from your XenForo Admin CP and save the backup file to your computer; it also saves a copy in a backup folder that you create within your site's install path.

My database, which is almost 400 MB, becomes a roughly 40 MB backup file and only takes around 30 seconds to back up.

 
When I started using MariaDB 10 with Sypex I had serious problems with it.
That's the reason I stopped using Sypex as my dumper.

I don't know how to fix it; I even asked @Brogan for assistance.
 
Thanks for the advice guys, good stuff. I haven't had a chance to try anything special yet, but I will certainly report back when I start trying to solve and fix this issue.
 
Just my two cents: I created a bash script and added it to crontab. Every day, the database is dumped to a private folder, everything is copied from the public folder to the private one, and both are pushed to my private Bitbucket repo. Not the most secure setup, but it does the job :D

EDIT: and also, this doesn't put that much pressure on the resources :D
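If the dump itself is what strains the box, one common trick (not from this thread, just widely used) is to run mysqldump at the lowest CPU and I/O priority so the forum stays responsive. Everything below, names, paths, and credentials, is a placeholder sketch to adapt:

```shell
#!/bin/sh
# Sketch only: fill in real credentials and paths before use.
DB_NAME='forum'          # placeholder database name
DB_USER='backup_user'    # placeholder user
DB_PASS='secret'         # placeholder password
OUT_DIR='/tmp'           # e.g. /root/backups in real use

# nice -n 19 = lowest CPU priority; ionice -c2 -n7 = lowest best-effort
# I/O priority.  --single-transaction dumps InnoDB tables without locking
# them, and --quick streams rows instead of buffering tables in memory.
nice -n 19 ionice -c2 -n7 \
  mysqldump --single-transaction --quick \
    -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" 2>/dev/null \
  | gzip > "$OUT_DIR/$DB_NAME-$(date +%F).sql.gz"
```

Compressing on the fly with gzip also keeps the disk write small, which is usually the part that hurts a busy forum at 3 AM.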
 
Care to share the script? :)
 
Of course :D

Here's what I've created:
Code:
#!/bin/bash
# Fill in these variables before use
DB_NAME=''
DB_USER=''
DB_PASS=''
BACKUP_DIR=''      # must be an initialised git repo, e.g. /root/backups
BACKUP_FROM=''     # the public directory to archive, e.g. /home/user/public_html
TIMESTAMP=$(date +'%F %T (GMT%:z)')

cd "$BACKUP_DIR" || exit 1

# Dump the database; --single-transaction avoids locking InnoDB tables
mysqldump --opt --single-transaction -u"$DB_USER" -p"$DB_PASS" "$DB_NAME" > "$DB_NAME.sql"

# Archive the public files
tar -zcpf files.tar.gz "$BACKUP_FROM"

# Commit the snapshot on a date-named branch and push it to the remote
BRANCH_NAME=$(date +'%Y%m%d')
git checkout -b "$BRANCH_NAME"
git add -A
git commit -q -m "$TIMESTAMP"
git push origin "$BRANCH_NAME"

# Remove the local copies; they live on in the git history
rm "$DB_NAME.sql" files.tar.gz

Create the .sh file somewhere outside both the public directory and the backup directory,
e.g. /root/daily_backup.sh

Fill in the necessary variables. 'BACKUP_DIR' is the directory where everything will be backed up (e.g. /root/backups); make sure to init a git repo inside that directory and add your Bitbucket repo as a remote:
Code:
cd /root/backups
git init
git remote add origin https://{username}:{pass}@bitbucket.org/{username}/{repo name}.git
'BACKUP_FROM' is the public-facing directory (e.g. /home/user/public_html).

Make sure to give the .sh file execute permission:
Code:
chmod 755 /root/daily_backup.sh

And add a cron entry:
Code:
crontab -e
At the end of the file, create a new line and add:
Code:
0 0 * * * /root/daily_backup.sh > /dev/null 2>&1
And if you want some kind of logging so you can check whether there were any errors, change /dev/null to /root/daily_backup.log
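For reference, the logging variant of that same cron entry looks like this (appending with >> keeps a running history instead of overwriting each night; the log path just mirrors the script location above):

```shell
# Append both stdout and stderr to a log file
0 0 * * * /root/daily_backup.sh >> /root/daily_backup.log 2>&1
```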


Disclaimer
Use at your own risk; I won't be held responsible if your server gets f**ked up :p Though mine is working without any hiccups :D
 