Help with bash backup script

Brent W

Well-known member
This runs fine from the command line (./backup.sh), but when run from cron the scp part does not work. The tar command completes (the tar file shows up in the directory), but the script never copies it to the remote backup server or removes the local copy.

Code:
#!/bin/bash
PREFIX="aspies"
DATADIR="/home/nginx/domains/aspiescentral.com/public"
BACKUPDIR="/home/filebackup"
REMOTEBACKUPDIR="/home/backup/aspiescentral.com/files/"
NOW=$(date +"%d-%m-%Y")

tar -cf $BACKUPDIR/$PREFIX.$NOW.tar $DATADIR

scp $PREFIX.$NOW.tar root@10.10.10.10:$REMOTEBACKUPDIR && rm -f $BACKUPDIR/$PREFIX.*

I have private keys setup so no password is required. It works, like I said, if I manually run the script.
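One way to see what is actually failing under cron is to log the run; the schedule and log path in this sketch are placeholders:

Code:
# hypothetical crontab entry: capture stdout/stderr so the scp error is visible
30 2 * * * /home/filebackup/backup.sh >> /var/log/filebackup.log 2>&1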
 
Ouch.. you allow root logon? :confused:
Have you thought about using rsync instead of scp? It's much better suited to this.
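Something like this (an untested sketch, reusing your variables) would replace the scp-and-rm pair:

Code:
# push the day's tarball over ssh; --remove-source-files deletes the
# local file only once the transfer has succeeded
rsync -av --remove-source-files "$BACKUPDIR/$PREFIX.$NOW.tar" root@10.10.10.10:"$REMOTEBACKUPDIR"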
 
Yes I'm a terrible person for allowing root. Blah blah blah.

I haven't ever messed with rsync. Just seemed easier to use scp but now I am learning it might not be.
 
Is there an error message logged? Maybe fully qualify the scp command as /usr/bin/scp, or wherever it lives on your system.

If you su to root with a plain "su root" (without sourcing your profile), does the command-line run still work? If not, something in your profile is probably setting a PATH that the cron process doesn't have available.
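For example, cron lets you set PATH at the top of the crontab; the schedule and script path here are illustrative:

Code:
# give cron roughly the PATH an interactive shell would have
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin
0 3 * * * /home/filebackup/backup.sh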
 
Not that you're a terrible person... but if you had CSF installed with email alerts and saw the (on average) 300+ emails (with 5 attempted logons per email) x 12 VPSes that come in from China alone, you might rethink allowing root access from anywhere remote. ;)
 

I do have CSF installed and do realize that. I just don't want this thread to turn into what it is now turning into. It has nothing to do with my question.
 
The recommendation is to use rsync instead of trying to use scp in a batch file/script: more features, less bandwidth usage. scp is fine for command-line use with individual files, but rsync rules for synchronizing files/directories between two servers or between a server and a computer.
That's why I recommended it, and why I made the point that using root for access is not best practice (which you don't have to go by).
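For the directory-sync case, a minimal sketch (user, host, and paths are made up for the example):

Code:
# mirror the local backup directory to the remote server; only files
# changed since the last run are actually transferred
rsync -avz /home/filebackup/ backupuser@backuphost:/home/backup/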
 
To be honest, I second rsync over this, but for a completely different reason and circumstance. I spent about 3 days writing a full dir/sql backup script, and though it works, it feels clunky (I don't think my live web server can handle it, to be honest), like I had to write too much code for what should be a couple of stacked commands. After reading this thread I looked into setting up rsync, and at first glance it seemed faster to work with than anything else I was doing. To get over the hiccup of running Windows without wanting to set up a VM just for backups yet, I installed Cygwin, enabled ssh, rsync, and bash, and ended up with a friendly and familiar-looking place...

[screenshot: a Cygwin bash shell]


Now I'm going to try setting up a VM over the weekend with some super-trimmed-down *nix flavor and a basic LAMP stack, and see if I can automate backups to a virtual machine that I can clone and move around for long-term storage, now that I'm actually focusing my efforts on building my own site.

I know this doesn't exactly solve your problem, but I figured since this thread made me google some stuff and changed my course of action that it might be worth throwing out there if you are on the fence about trying it.


As far as your script goes: if it works manually but not from cron, it probably isn't picking up the key pair. Try specifying the home folder (or the key file) explicitly.
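Concretely, that could mean pointing scp at the key itself rather than relying on cron's HOME; the key path here is an assumption:

Code:
# pass the identity file explicitly so the key is found even when
# cron's HOME differs from the interactive shell's
scp -i /root/.ssh/id_rsa "$BACKUPDIR/$PREFIX.$NOW.tar" root@10.10.10.10:"$REMOTEBACKUPDIR"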
 
What I am looking for seems pretty simple:

mysqldump database
rsync database to another linux server

tar public folder
rsync public folder to another linux server

keep 7 days worth of incremental backups for each
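A bare-bones sketch of that sequence, assuming a hypothetical database name and remote host, and keeping 7 days of full (not truly incremental) backups:

Code:
#!/bin/bash
# sketch only: the DB name, remote host, and paths are assumptions
DB="forum_db"
DATADIR="/home/nginx/domains/aspiescentral.com/public"
BACKUPDIR="/home/filebackup"
REMOTE="backupuser@backuphost:/home/backup/aspiescentral.com"
NOW=$(date +"%d-%m-%Y")

# dump the database (assumes credentials in ~/.my.cnf) and archive the public folder
mysqldump "$DB" | gzip > "$BACKUPDIR/$DB.$NOW.sql.gz"
tar -cf "$BACKUPDIR/files.$NOW.tar" "$DATADIR"

# push everything to the other server over ssh
rsync -av "$BACKUPDIR/" "$REMOTE/"

# prune local copies older than 7 days
find "$BACKUPDIR" -type f -mtime +7 -delete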
 
I'll see if I can scare up the script I used (I don't use it anymore, as I just take automated backups of my VMs and then rsync them to my Mac locally; I can restore the full VM faster).
What it did was keep a week's worth of backups of both the DB (full) and an archive of the forum structure, retaining 7 days' worth on the server. I then used rsync on my Mac, in a cron, to pull them in. For me it wasn't worth hassling with incrementals. If I had to restore, I liked knowing I had the full structure to restore at once instead of having to do it in steps.
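The pull side was just a cron entry on the local machine; a sketch along those lines (host and paths invented for the example):

Code:
# nightly pull of the server-side backups down to the local machine
15 3 * * * rsync -avz backupuser@server:/home/filebackup/ /Users/me/backups/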

EDIT:

Backup script that I used is here: https://servinglinux.com/threads/backup-script.3/

With some slight modification it could be used to do both the DB and the forum structure. You'd just name the copies differently, place them in /usr/local/bin, and call them from a cron job.
 
Just an FYI, I got this working. I did not have the $BACKUPDIR variable in the scp command, so it was looking in the wrong place for the file.
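For completeness, the fix is just using the full path in the scp source (and the matching cleanup):

Code:
# the tarball is written to $BACKUPDIR, so scp needs the full path
scp "$BACKUPDIR/$PREFIX.$NOW.tar" root@10.10.10.10:"$REMOTEBACKUPDIR" && rm -f "$BACKUPDIR/$PREFIX".*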
 