SSH Backup Script?

frm

Well-known member
I run these two commands to do an SSH backup, one on the DB server and the other on the files server:

Code:
mysqldump -u xf_u -p xf_p > site-$(date +%F).sql
tar -zcvf site-xfver-date.tar.gz public

What I'd like is a Bash script built from this that connects to the DB server over SSH, checks whether a newer backup exists, downloads it, and verifies at the end that the files match (file size/hash) before disconnecting, then repeats the process for the file server. That way I can put these scripts on different servers for quicker dumps with redundant backups. If possible, it would even run the backup commands on the remote server first and then download the resulting files when done.

Preferably, the server and MySQL passwords would be variables at the top of the .sh script, so they can be edited each time for security purposes.

Is this doable with scripting or would I still be stuck just downloading and uploading manually?
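It is doable. A minimal sketch of the remote-dump-then-verify step, assuming a placeholder `backup` SSH account, example hostnames and paths, and that `xf_p` in the command above is the database name (since `-p` with a space prompts for a password). The password sits in a variable at the top, as described:

```shell
#!/bin/bash
# Sketch only: the host, user, and paths below are placeholders, not real values.
SSH_USER="backup"                 # placeholder SSH account on the DB server
DB_HOST="db.example.com"          # placeholder DB server hostname
MYSQL_PASS="changeme"             # edit before each run
REMOTE_DIR="/home/backup"         # where the dump lands on the remote side
LOCAL_DIR="${LOCAL_DIR:-$HOME/backups}"
DUMP="site-$(date +%F).sql.gz"

# Extract just the hash from sha256sum output so both sides can be compared.
checksum_of() { sha256sum "$1" | awk '{print $1}'; }

fetch_and_verify() {
    # 1) run the dump on the remote box, 2) download it, 3) compare hashes
    ssh "$SSH_USER@$DB_HOST" \
        "mysqldump -u xf_u -p'$MYSQL_PASS' xf_p | gzip > '$REMOTE_DIR/$DUMP'"
    scp "$SSH_USER@$DB_HOST:$REMOTE_DIR/$DUMP" "$LOCAL_DIR/"
    remote_sum=$(ssh "$SSH_USER@$DB_HOST" "sha256sum '$REMOTE_DIR/$DUMP'" | awk '{print $1}')
    local_sum=$(checksum_of "$LOCAL_DIR/$DUMP")
    [ "$remote_sum" = "$local_sum" ] && echo "OK: $DUMP verified" || return 1
}
```

A second copy of the same function pointed at the file server (running the tar command instead of mysqldump) covers the other half; run `mkdir -p "$LOCAL_DIR"` before the first fetch.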
 
I have one set of crons to do the backups. Then another set of crons to do the sync(s) to remote storage. The backups end up on two different storage providers. There is another set that does the cleanup of anything erroneous. About the only thing I verify is server operation, so if my site is down backups are halted.

I use a fair amount of storage this way, but when the unexpected happens I've got more than one source to restore from. It has saved me many times.
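Sketched in crontab form, a setup like the above might look like this (the times, script names, and paths are illustrative only, not my actual entries):

```shell
# crontab sketch -- illustrative times, paths, and script names
# 1) nightly backups on the server itself
30 2 * * *  /usr/local/bin/db-backup.sh
45 2 * * *  /usr/local/bin/files-backup.sh
# 2) sync finished backups to two separate storage providers
30 3 * * *  rsync -a /path/to/directory/ provider-a:backups/
45 3 * * *  rsync -a /path/to/directory/ provider-b:backups/
# 3) prune erroneous/old files, but only if the site is actually up
0 5 * * *   curl -fsS https://example.com >/dev/null && /usr/local/bin/backup-prune.sh
```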
 
Care to share your cron tasks? :)
 
Here is my backup script I tweaked for my needs:
Code:
#!/bin/bash

# Basic configuration: datestamp e.g. YYYYMMDD

DATE=$(date +"%Y%m%d")

# Location of your backups (create the directory first!)

BACKUP_DIR="/path/to/directory"

# MySQL login details

MYSQL_USER="backupuser"
MYSQL_PASSWORD="XXXXXXXXXXXXXXXX"

# MySQL executable locations (no need to change this)

MYSQL=/usr/bin/mysql
MYSQLDUMP=/usr/bin/mysqldump

# MySQL databases you wish to skip

# SKIPDATABASES="information_schema|performance_schema"

# Number of days to keep the directories (older than X days will be removed)

RETENTION=5

# ---- DO NOT CHANGE BELOW THIS LINE ------------------------------------------
#
# Create a new directory into backup directory location for this date

mkdir -p "$BACKUP_DIR/$DATE"

# Retrieve a list of all databases

# databases=`$MYSQL -u$MYSQL_USER -p$MYSQL_PASSWORD -e "SHOW DATABASES;" | grep -Ev "($SKIPDATABASES)"`

# Dump each database to its own file and gzip the .sql output

# for db in $databases; do
# echo $db
$MYSQLDUMP --default-character-set=utf8mb4 --user="$MYSQL_USER" -p"$MYSQL_PASSWORD" --single-transaction dbname | gzip > "$BACKUP_DIR/$DATE/dbname.sql.gz"

# Remove files older than X days

find "$BACKUP_DIR" -mindepth 1 -mtime +"$RETENTION" -delete

FYI: this script backs up a database with close to 30 million posts.
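If you want to guard against truncated dumps, you could bolt an integrity check plus a checksum sidecar onto the end of the script; a small self-contained demo of the idea, using a throwaway file in a temp directory rather than a real dump:

```shell
#!/bin/bash
# Demo: verify a gzipped dump and record its checksum for later re-checking.
# Uses throwaway data; in the real script, point these at "$BACKUP_DIR/$DATE".
demo_dir=$(mktemp -d)
printf 'CREATE TABLE t (id INT);\n' | gzip > "$demo_dir/dbname.sql.gz"

# gzip -t exits non-zero if the archive is truncated or corrupt
if gzip -t "$demo_dir/dbname.sql.gz"; then
    # Write a sidecar checksum so the copy can be re-verified after download
    ( cd "$demo_dir" && sha256sum dbname.sql.gz > dbname.sql.gz.sha256 )
    echo "dump verified"
fi
```

After downloading both files elsewhere, `sha256sum -c dbname.sql.gz.sha256` confirms the copy matches the original.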
 