Backups!

spk100

Member
I have a dedicated server with a good host (not naming names).

I have around 8 to 10 cPanel accounts; 99% of the sites are WordPress, with one being XenForo.

Recently I had a disk space issue on my root partition and the server ran into problems, specifically with my XenForo database, which got corrupted; the host had to restore a backup from 4 days ago.

While the host did a great job and managed to get things sorted, I was thinking of having an alternative backup of my sites.

The XenForo site is by far the biggest in terms of DB size; my forum has been up since 2006.

For the WordPress sites, I was thinking I would add one of the several backup plugins (All in One WP or Duplicator) and connect it to my Google Drive for daily backups.

With XenForo, I was wondering if there is any other option.

Looking for cost-effective options. The overall gzipped backup of all accounts put together is somewhere around 200 GB to 250 GB (plus or minus 20%).

The host already provides me with a backup (in addition to the cPanel backup), but I thought I would explore an additional backup option.


PS: I am not a server person. I know my stuff but I am not an expert.
 
I take advantage of WHM's automated backups to back up all of my sites to Amazon S3. It costs me less than $20/month and I back up all my sites daily. That said, I'm not sure what the costs would be for 200-250 GB; none of my sites' backups are anywhere near that big.
 

I contemplated this. I already have a 2 TB subscription to Google Drive and I thought I would take advantage of it.

But my host said this when I asked them and I don't understand it.

Using Google Drive backup will saturate your public interface, which is the same interface that you use to serve websites.
 
Yeah, I'm not sure what that means, either, to be honest. I'd assume they're talking about bandwidth maybe, meaning your site may run slow while you're uploading the backup off-site, but that would only be a guess on my part.
 
I had the same issue. Found that I had to empty the trash can, which is located in cPanel in your directory.
For your backups, look at auto backups, which is what your host has if you have the same host as me.
 

I do have the cPanel backup set up and that saved the day.
 
But my host said this when I asked them and I don't understand it.
Yay, something I can probably answer.
They mean that uploading those backups to a remote site will take up a lot of your public bandwidth.
That is what visitors to your site use.
Depending on the speed of your uplink, you could slow your site down until the transfer is done.
I have read from many people that depending on the host's backup alone is usually not a good choice.
Tracy has a script that runs and backs mine up on his server, and I use WinSCP to transfer.
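
If you do end up pushing the backups to Google Drive yourself, you can throttle the transfer so it doesn't hog the public interface. A rough sketch with rclone (assuming you've already configured a remote called gdrive; the paths and the 10 MB/s cap are just examples):

Code:
# Copy last night's backup archives to Google Drive, capped at 10 MB/s
# so visitors don't feel the upload. Remote name and paths are examples.
rclone copy /backup/daily gdrive:server-backups/daily \
    --bwlimit 10M \
    --transfers 2 \
    --log-file /var/log/rclone-backup.log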
 
I use one admin panel backup technology, and two different automated script processes, resulting in three different automated backup types done daily. My automated scripts are a mix of scripts and free software available with most Linux distros. All databases and files.

Added some object storage within my host (a few dollars), and a separate private S3 storage bucket on AWS (fewer dollars). They are synced daily via automation.

Haven't experienced a critical failure in some time, but the most I've ever lost was four hours of forum posts.
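
The daily sync half of that doesn't need much. A sketch of the sort of cron-driven job I mean, assuming the AWS CLI is installed and configured (bucket name, path and storage class are placeholders):

Code:
#!/bin/sh
# Push the night's backup archives to a private S3 bucket.
# Bucket, path and storage class below are examples only.
aws s3 sync /usr/local/backups s3://example-backup-bucket/daily \
    --storage-class STANDARD_IA \
    --exclude "*" --include "*.zst" --include "*.tar.gz"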
 
Offload your data to S3 in the first place so you don't have high backup overhead. 99% of that is likely attachments, not the DB or site files.

I reduced my backups from 50 GB and taking 5 hours to around 150 MB and taking seconds by storing everything on S3 in the first place. No bandwidth spikes or egress to cover.
 

That's a good idea. I need to explore more or find out if there are any addons available.
 
I run a VPS without any admin panels so I just use scripts.

First off, I use Cloudflare R2 for my attachments so that is sorted already.

I compress the rest of my web data (around 35 GB) and upload it daily to my TrueNAS server at home.
Same for the DB, which is around 9 GB of compressed data.

My TrueNAS server uploads all that data to a cloud backup service for extra redundancy.

And don't forget to regularly restore that data to see if it all still works!
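
If anyone wants to copy this setup, the nightly job really is only a handful of lines. A trimmed-down sketch (hostnames, paths and database names are placeholders, and MySQL credentials are assumed to come from ~/.my.cnf):

Code:
#!/bin/sh
# Nightly backup: dump the DB, pack the web files, ship both to the NAS.
DATE=$(date +%Y%m%d)

# Database dump, compressed on the fly with zstd
mysqldump --single-transaction forum_db | zstd -T0 -o /backups/forum_db.$DATE.sql.zst

# Web data (attachments already live on R2, so this stays fairly small)
tar -cf - /var/www/forum | zstd -T0 -o /backups/forum_files.$DATE.tar.zst

# Copy everything to the TrueNAS box; it handles the off-site cloud copy
rsync -a /backups/ backupuser@truenas.local:/mnt/tank/forum-backups/

# Periodically prove the dump actually restores into a scratch database:
# zstd -dc /backups/forum_db.$DATE.sql.zst | mysql forum_restore_test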
 
Backed up every night to the host backup directory and off-site to an FTP location (my home NAS). Retention equals six days on-site; on the home NAS FTP location, it's months.
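
The six-day on-site retention can be a single cron line; a sketch, with the backup path assumed:

Code:
# Prune anything older than six days from the host backup directory
0 5 * * * find /backup/nightly -type f -mtime +6 -delete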

 
That's a good idea. I need to explore more or find out if there are any addons available.
It's native in XF 2.3. Just need to set the data and internal data directives in the config file (plugin no longer required in new versions).

Just need the config file edits, and of course, move your stuff to the bucket with a tool like s3cmd (how to: https://www.digitalocean.com/docs/s...sage/#put-all-files-in-your-current-directory).
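
For the initial move, it's roughly this (bucket name is a placeholder; whether the data tree needs a public ACL depends on how you serve it, so check the guide above first):

Code:
# One-off: copy the existing data and internal_data trees into the bucket,
# then keep them in sync until the config.php change goes live.
s3cmd sync data/ s3://example-forum-bucket/data/
s3cmd sync internal_data/ s3://example-forum-bucket/internal_data/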
 
I'll share my own database backup script that works with Linux and nginx. It closes the board with a maintenance page while the backup is being taken and compressed for storage with Zstandard (zstd); we can get a 3.28 GB .sql file compressed to 652 MB in a few minutes. Then you can add whatever offsite backup storage command you use. Backblaze is about $6 per terabyte; I'm using that with restic for a 900 GB forum to back up multiple locations like /var/www and /usr/local/backups at once, using one command. It's very good and has saved us a couple of times.

Anyway, say you have a folder called /usr/local/backups and inside you have the following files:
  1. backup.sh
  2. dump.sh
  3. maintenance.html
  4. minime.sql.20250309.zst
Run ./backup.sh and it will:
  • copy maintenance.html from /usr/local/backups to /path/to/webroot (your forum)
  • this replaces your entire forum with the maintenance.html page
  • give maintenance.html correct ownership and permissions
  • run dump.sh, which will:
  • - dump your database with mysqldump
  • - compress the database file with zstd as minime.sql.20250309.zst for storage
  • - delete the original uncompressed .sql file
  • - keep only the last 3 backups
  • delete maintenance.html in the webroot and reopen your forum
Run backup.sh as a cronjob or manually as needed. A sample run looks like this:

Code:
[root@localhost]# ./backup.sh
Maintenance page copied successfully.
Ownership and permissions set.
minime.sql.20250309  : 19.44%   (  3.28 GiB =>    652 MiB, minime.sql.20250309.zst)
Backup created: minime.sql.20250309.zst
-rw-r--r--. 1 root root 683693093 Mar  9 07:52 minime.sql.20250309.zst
Remaining backups:
-rw-r--r--. 1 root root 683693093 Mar  9 07:52 minime.sql.20250309.zst
Backup completed successfully.
Maintenance page removed.
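
To schedule it, a crontab entry along these lines is enough (the time and log path are just examples):

Code:
# Run the backup at 04:30 every night and keep a log of each run
30 4 * * * /usr/local/backups/backup.sh >> /usr/local/backups/backup.log 2>&1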

You'll need to modify your nginx server block along these lines:

Code:
    # index.php
    index index.html index.php;

    location ~ /(internal_data|library|src) {
        internal;
    }

    # Maintenance Mode
    if (-f $document_root/maintenance.html) {
        return 503;
    }
    error_page 503 @maintenance;
    location @maintenance {
        rewrite ^(.*)$ /maintenance.html break;
    }

    # index.php fallback
    location / {
        try_files $uri $uri/ /index.php?$uri&$args;
    }

backup.sh
Enter your webroot for maintenance.html to be copied over.
Set your user:usergroup for PHP-FPM and nginx/apache, depending on what you use.

Code:
#!/bin/sh

# backup.sh
# enter your webroot for maintenance.html
# set your user:usergroup for PHP

# Define paths for better readability
MAINTENANCE_FILE="/path/to/webroot/maintenance.html"
SOURCE_FILE="maintenance.html"

# Copy the maintenance file to the server directory
if cp "$SOURCE_FILE" "$MAINTENANCE_FILE"; then
  echo "Maintenance page copied successfully."
else
  echo "Error: Failed to copy maintenance page." >&2
  exit 1
fi

# Change ownership and set proper permissions
chown user:usergroup "$MAINTENANCE_FILE"
chmod 644 "$MAINTENANCE_FILE"  # Ensure it's readable by web server
echo "Ownership and permissions set."

# Run the backup script (assuming dump.sh is in the current directory)
if ./dump.sh; then
  echo "Backup completed successfully."
else
  echo "Error: Backup failed." >&2
  # Clean up and exit with error
  rm -f "$MAINTENANCE_FILE"
  exit 1
fi

# Remove the maintenance page after backup is done
if rm -f "$MAINTENANCE_FILE"; then
  echo "Maintenance page removed."
else
  echo "Error: Failed to remove maintenance page." >&2
  exit 1
fi

# Run offsite backup dump etc below and exit script

dump.sh
Edit BACKUP_DIR (only if you want them someplace else on your drive) and edit your forum database details.

Code:
#!/bin/sh

#----------------------------------------------------------
# dump.sh
# Database backup script with Zstandard compression
# how to: dnf/yum install zstd
# Retains only the 3 most recent backups
#----------------------------------------------------------

# (1) Set up all the mysqldump variables
FILE="minime.sql.$(date +"%Y%m%d")"
DBSERVER=127.0.0.1
DATABASE=databasename
USER=databaseuser
PASS=userpassword
BACKUP_DIR="/usr/local/backups"

cd "$BACKUP_DIR" || exit 1  # Exit if directory change fails

# (2) Remove previous backup for today if it exists
rm -f "${FILE}" "${FILE}.zst" 2> /dev/null

# (3) Perform MySQL database dump
mysqldump --opt --user="${USER}" --password="${PASS}" "${DATABASE}" > "${FILE}"

# (4) Compress using Zstandard with optimized speed/compression settings
zstd -T0 -12 -f "${FILE}"  # -T0 uses all CPU cores, -12 balances compression & speed

# (5) Remove the uncompressed SQL file after compression
rm -f "${FILE}"

# (6) Keep only the 3 most recent backups, delete older ones
ls -1t minime.sql.*.zst | tail -n +4 | xargs rm -f

# (7) Show the user the result
echo "Backup created: ${FILE}.zst"
ls -l "${FILE}.zst"

# (8) Show remaining backups
echo "Remaining backups:"
ls -lt minime.sql.*.zst

maintenance.html
Edit to suit.

Code:
<!doctype html>
<title>Site Maintenance</title>
<link href="https://fonts.googleapis.com/css?family=Open+Sans:300,400,700" rel="stylesheet">
<style>
  html, body { padding: 0; margin: 0; width: 100%; height: 100%; }
  * {box-sizing: border-box;}
  body { text-align: center; padding: 0; background: #000; color: #fff; font-family: Open Sans; }
  h1 { font-size: 50px; font-weight: 100; text-align: center;}
  body { font-family: Open Sans; font-weight: 100; font-size: 20px; color: #fff; text-align: center; display: -webkit-box; display: -ms-flexbox; display: flex; -webkit-box-pack: center; -ms-flex-pack: center; justify-content: center; -webkit-box-align: center; -ms-flex-align: center; align-items: center;}
  article { display: block; width: 700px; padding: 50px; margin: 0 auto; }
  a { color: #fff; font-weight: bold;}
  a:hover { text-decoration: none; }
  svg { width: 75px; margin-top: 1em; }
</style>

<article>
    <h1>We&rsquo;ll be back in 5 minutes!</h1>
    <div>
        <p>Sorry for the inconvenience. We&rsquo;re performing routine maintenance and updates at the moment.</p>
    </div>
</article>
 
Maybe this would work for Apache 🤔

Code:
    # Maintenance Mode
    <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{DOCUMENT_ROOT}/maintenance.html -f
        RewriteCond %{REQUEST_URI} !^/maintenance\.html$
        RewriteRule ^ - [R=503,L]
    </IfModule>

    # Index fallback to index.php
    DirectoryIndex index.html index.php

    # Error handling for 503 Maintenance Mode
    ErrorDocument 503 /maintenance.html
 
I have KnownHost back up my VPS every 2 days to their remote backup system. For 10 bucks it takes care of that.
 