XF 2.1 - Backup XF 2.1 locally

Want to confirm my backup strategy is OK.
Server: CentOS

Dumped the database using mysqldump
Code:
/usr/bin/mysqldump --opt --single-transaction --skip-lock-tables --default-character-set=utf8mb4 --user=<db username> --password <database name> > /home/<homedirectory>/<db name>_<date>_backup.sql

Compressed data and internal_data directories
Code:
tar -zcvf forumdata_backup.tar.gz /home/<user>/public_html/data

tar -zcvf foruminternal_data_backup.tar.gz /home/<user>/public_html/internal_data

Downloaded all to my local machine via FTP

Am I missing anything or is there a better strategy?
 
My local machine is Windows 10. Can you tell me the steps you would use to do so?

rsync is a Linux-only beauty... though Bash on Windows 10 (WSL) would probably support it. Are you looking for a one-time backup for a move, or something repetitive to maintain backups?

What you did will work either way, it just requires you actually doing it as opposed to something automatic.
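
If you do want something repeatable from Windows 10, rsync over SSH from WSL would look roughly like this (host, user and paths are placeholders):
Code:
rsync -avz -e ssh <user>@<server>:/home/<user>/public_html/data/ /mnt/c/Backups/forum/data/
rsync -avz -e ssh <user>@<server>:/home/<user>/public_html/internal_data/ /mnt/c/Backups/forum/internal_data/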
 
 
Why not use phpmyadmin to back up your database and download the backup to your local hard drive?

Also, many VPS hosts (such as Linode, which I use) offer automatic daily backups at a time you specify. I pay $5 a month for the daily backups, along with a weekly snapshot that I take manually every Sunday.
 
Why not use phpmyadmin to back up your database and download the backup to your local hard drive?
That requires cPanel. And isn't mysqldump over SSH better for large databases?

Also, many VPS hosts (such as Linode, which I use) offer automatic daily backups at a time you specify. I pay $5 a month for the daily backups, along with a weekly snapshot that I take manually every Sunday.
We have a regular backup solution. Looking to make a local copy.

We're aware of the add-on, though the support in its discussion thread doesn't look great. We aren't looking to create a backup routine, just to make a local backup.
Are you looking for a one-time backup
Yes, or maybe something we would do once or twice a month for added peace of mind.
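
If we do automate it, I suppose a cron entry on the 1st and 15th would cover that - something like this, assuming the commands above get wrapped in a script (the path is hypothetical):
Code:
0 3 1,15 * * /home/<user>/bin/xf_backup.sh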
 
phpMyAdmin will happily time out and use a large amount of resources if you try that on any decent-sized DB.

Taking a logical backup is a job for tools like mysqldump or mydumper, not pma.
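
On anything sizeable it's also worth piping straight into gzip, so the uncompressed dump never hits disk - along the lines of the OP's command:
Code:
mysqldump --single-transaction --skip-lock-tables --default-character-set=utf8mb4 --user=<db username> --password <database name> | gzip > /home/<homedirectory>/<db name>_$(date +%F)_backup.sql.gz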
I had issues with phpMyAdmin when I was on a shared hosting server, even with a 600 MB database. Since moving to a VPS, I set my own time-outs and have never had an issue. I do a manual phpMyAdmin backup every Sunday morning, and it has worked like a charm since migrating to a VPS last March. Admittedly my database is still under 1 GB, compressing to around 150 MB via gzip.
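
For reference, the time-outs I raised were roughly these (the values are just what worked for me):
Code:
; php.ini
max_execution_time = 600
memory_limit = 512M

// phpMyAdmin config.inc.php (0 would disable the limit entirely)
$cfg['ExecTimeLimit'] = 600;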

mysqldump is superior to phpmyadmin. Command line tools are generally better than GUI tools when it comes to consistency, reliability, and speed.
No question this is the case, but phpMyAdmin is also easier to use.
 
Who says this?

Have a look at xtrabackup from percona.
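
The basic flow there is a physical copy plus a prepare pass - roughly this, with a placeholder target dir (it needs direct access to the MySQL datadir, so shared hosting is out):
Code:
xtrabackup --backup --user=<db username> --password=<db password> --target-dir=/data/backups/base/
xtrabackup --prepare --target-dir=/data/backups/base/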

I was scratching my head when I saw this too.

I'm running Ubuntu 18.04. I don't have cPanel or any other Linux admin GUI anywhere near my VPS, and I run phpmyadmin with no problem.
 
I am using something similar, but on a regular basis (every night).
The only thing missing is a CLI command to put XF in maintenance mode during the backup.
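
Mine boils down to a small script that cron runs every night - a rough sketch (paths are placeholders, and credentials sit in ~/.my.cnf so nothing prompts):
Code:
#!/bin/bash
# Nightly XF backup: consistent dump plus file archives, stamped with the date.
STAMP=$(date +%F)
mysqldump --single-transaction --skip-lock-tables --default-character-set=utf8mb4 <database name> | gzip > /backup/<db name>_${STAMP}.sql.gz
# -C keeps the archive paths relative, so they restore anywhere.
tar -zcf /backup/data_${STAMP}.tar.gz -C /home/<user>/public_html data
tar -zcf /backup/internal_data_${STAMP}.tar.gz -C /home/<user>/public_html internal_data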
I believe the --single-transaction option to mysqldump allows us to do so while the site is live. Compressing internal_data takes a long time and didn't result in substantial file-size savings, and we can't realistically go into maintenance mode for that length of time. Is there any danger in running the commands I posted while the site is live? As far as I can tell, the site wasn't affected. Any possibility the backups will not be viable?
 
Any possibility the backups will not be viable?

Test it - create a new site, restore the database and files to it, and see if it works (you may have to change URLs in the settings to get links working).

Always test your backups - especially when you make a change to how you run them.
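
For what it's worth, --single-transaction only gives a consistent snapshot of InnoDB tables (XF's default engine), so dumping live is generally fine - but a restore test is the real proof. Roughly, with a throwaway database name like xf_test:
Code:
# Check the archives are readable at all
tar -tzf forumdata_backup.tar.gz > /dev/null
tar -tzf foruminternal_data_backup.tar.gz > /dev/null

# Import the dump into a fresh test database
mysql -u <db username> -p -e "CREATE DATABASE xf_test CHARACTER SET utf8mb4"
mysql -u <db username> -p xf_test < <db name>_<date>_backup.sql

# Unpack the file archives under a scratch directory and point the test site at them
mkdir -p /tmp/restore
tar -xzf forumdata_backup.tar.gz -C /tmp/restore
tar -xzf foruminternal_data_backup.tar.gz -C /tmp/restore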
 