XF2 [8WR] Database Backup [Paid] 2.1.0.8
Good evening Jaxel,

We use add-on version 2.0.5 because our XenForo installation has not yet been migrated to the latest version (the other add-ons we use have not all been updated by their developers yet).

We want to back up to an EXTERNAL FTP server, but we are running into a connection problem.

The server details are as follows:
  • IP: 45.44.250.43, Port: 2121
  • Directory: /Download/Test/

If we open the URL ftp://45.44.250.43:2121 directly, the connection works => OK.

However, the same parameters do not work in the add-on (see the screen capture below), which logs this message in _ftplog.txt:
Code:
-- Connecting to 45.44.250.43:2121...
-- Failed to connect to remote FTP host...


View attachment 195147

Can you help us and tell us where we went wrong?

Thank you in advance for your answer.

Regards,
 
You need to enable debug mode to be able to edit cron entries.

https://xenforo.com/xf2-docs/manual/config/#debug-mode

According to that link, you should NEVER set debug=true on a live site.

Or you can create your own new cron entry, similar to the original one.

If I use this method, do I just fill in all the blanks with the same info except for the very first one, like this?

View attachment 195215

The only change is that first line where I put my initials at the end. Will that still work?

I also did not realize that when setting the backup time, you have to use the UTC offset for your time zone. Once I caught on to that, I was able to set it for 3:00am without any problems.

I will do my first test run tonight. For now, I am just going to use it to make a backup of my database and then move it off to a different server using the remote feature. I don't really need to back up the whole file system; just the two attachment directories would be nice, especially if that could be done incrementally. With 30+GB of attachments, I would not want to have to back all that up and ship it off to another server every night. Also, I only have 100GB of space on my VPS, so I can't have a lot of huge backup files lying around.
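On the UTC-offset point, a quick way to check the difference between local time and UTC from a shell (the zone name below is just an example):

```shell
# Compare the current time in UTC with a specific zone to see the offset.
TZ=UTC date
TZ=America/New_York date

# Most systems can also print the numeric offset of the current zone:
date +%z
```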
 
Okay, good news is that the backup ran on schedule and my remote connection to another server also worked.

One thing though. If I do a mysqldump of the forum database, I get a file about 1.5GB in size. The backup file is 460MB as a .gz file. Would the database really compress that much!?
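Quite possibly: SQL dumps are plain text full of repeated keywords and column names, which gzip compresses very well, so a 3:1 ratio is unremarkable. A quick local sketch with a synthetic dump (file names are made up):

```shell
# Build a repetitive SQL-like file, then compare raw vs. gzipped size.
for i in $(seq 1 20000); do
  echo "INSERT INTO xf_post (post_id, user_id, message) VALUES ($i, 42, 'hello world');"
done > dump.sql

gzip -c dump.sql > dump.sql.gz

orig=$(wc -c < dump.sql)
comp=$(wc -c < dump.sql.gz)
echo "original: $orig bytes, gzipped: $comp bytes"
```

A real dump won't be quite this repetitive, but large savings on text are normal.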
 
I was thinking about the file system aspect of the backup.

The only thing that changes on a frequent basis in the filesystem is the /forums/data/attachments and /forums/internal_data/attachments directories as users upload new attachments.

It would be nice if there was an ability to generate a full file system backup, transfer that to a remote destination, and then do a daily backup of just the new content of those attachment directories. Then have the backup of that new content added to the already existing full backup, kind of like how ZIP just adds new files to an already existing file. This would be especially nice if the adding to the ZIP file could take place on the remote destination so that the entire full backup would not have to be resent each time.

I am not an IT techie type person, so I don't know if this suggestion is pure nonsense or readily doable. I just thought I'd throw it out there and see if it sticks ;)
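For what it's worth, GNU tar can already do something close to the incremental scheme described above: a snapshot file records what has been archived, so later runs pick up only new or changed files, and only the small incremental archive has to be shipped off-site each night. A rough sketch (directory and file names are invented; requires GNU tar):

```shell
# Full backup: the snapshot file records the state of every file.
mkdir -p attachments
echo "first upload" > attachments/photo1.jpg
tar --listed-incremental=attachments.snar -czf full.tar.gz attachments

# Later run: after new uploads, the same command archives only
# what changed since the snapshot was last updated.
echo "second upload" > attachments/photo2.jpg
tar --listed-incremental=attachments.snar -czf incr.tar.gz attachments

tar -tzf incr.tar.gz   # lists the directory entry and only the new file
```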
 

You should periodically check your backups by restoring them on a test install to see that everything works correctly. I generally do this 1-2 times a week.
 
Generally they work, but I have had instances where backups for multiple sites failed, a few times a year.
 
Nothing is worse than needing to restore from a backup after some catastrophic failure or admin flub, only to find that the last working backup you have is 6+ months old.
 
I was thinking it might work for monitoring the attachments directories and making backups of them to a remote destination. Then this add-on could be used just to backup the database.
 
If I understand it right, it basically acts like rsync, which is just a command-line tool for transferring files from one server to another. However, this one actively monitors directories and transfers new content to a remote location according to a config file.
 
Here is the actual article I followed,


I did have to ask my hosting service to install lsyncd on my VPS, which they did in a matter of minutes.

It took me about five minutes to go through the setup process. The only difference is that the article places the conf file here:

Code:
/etc/lsyncd/lsyncd.conf.lua

whereas the actual conf file is here:

Code:
/etc/lsyncd.conf

and here are the necessary contents of the conf file,

Code:
sync { default.rsyncssh, source="/sourcedir", host="remote_ip_address", targetdir="/targetdir" }
settings {logfile = "/var/log/lsyncd/lsyncd.log", statusFile = "/var/log/lsyncd/lsyncd.status" }

It is running now and making a full uncompressed copy of my live site to my other VPS (which I wasn't using for anything, but which I have for another 2-1/2 years before my contract expires). Now, if I add any files anywhere on my site, they are automatically copied to the remote VPS. When users upload attachments, they are automatically copied to the remote VPS. If I install an add-on, the files are automatically copied to the remote VPS.

The nice thing is that once I have a full copy of everything transferred to the remote VPS, only new files will get transferred after that. So there is the initial big transfer and afterward just small transfers that don't hit my bandwidth real hard like a big backup of everything would do.

I use the add-on only to do nightly backups of the database and then to copy them to the remote VPS. I have it set to retain seven copies.
 
@Jaxel

Does the Remote FTP backup option also delete older backups according to the Retention setting?

Would it be possible to add timestamps to the logs, and perhaps average transfer speeds?

Could there be an option to back up files on a separate frequency than databases?
 
That means ftp_connect() is failing. Does your server not have ftp_connect enabled?
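One thing worth ruling out: opening ftp://45.44.250.43:2121 from a desktop PC only proves the server is reachable from that network; the add-on connects from the web server itself, where outbound traffic on port 2121 may be firewalled. A quick check you could run over SSH on the web server (the helper name is made up; curl exit code 7 means the TCP connection failed, 28 means it timed out):

```shell
# Report whether an FTP host/port accepts TCP connections from here.
check_ftp() {
  curl --silent --connect-timeout 5 "ftp://$1:$2/" > /dev/null 2>&1
  rc=$?
  if [ "$rc" -eq 7 ] || [ "$rc" -eq 28 ]; then
    echo "$1:$2 unreachable (curl exit $rc)"
  else
    echo "$1:$2 reachable (curl exit $rc)"
  fi
}

# Example: check_ftp 45.44.250.43 2121
```

A login failure gives a different exit code (67), which would still mean the host itself was reachable.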

The FAQ has the preferred method: https://xenforo.com/community/resources/xf2-8wr-database-backup.6546/field?field=faq

The /forums/src etc. folders are relatively small. There really isn't much harm in backing them up... it maybe adds only about 50-60MB of space, which really only makes the backup take an extra 2-3 seconds. In reality, it's the attachments that take up the most space. But if you're wanting more in-depth differential backups... you'll need to set up your own more complicated system.
 
Anyone who is having issues with the script timing out... can you guys try editing EWR\Backup\Repository\Backup.php and adding the following at the beginning of the runBackup() function:
Code:
set_time_limit(0);
ignore_user_abort(1);

Tell me if that fixes your issues.

I just purchased this and I'm getting timeout errors. Tried adding this entry to Backup.php, no change.

Should be added like this, right?

Code:
class Backup extends Repository
{
        public function runBackup()
        {
                set_time_limit(0);
                ignore_user_abort(1);
                $options = \XF::options();
                $oldMessage = $options->boardInactiveMessage;
                $newMessage = $options->EWRbackup_message;
 
I have my site set to do daily backups at 3:00am, for the database only.

The first backup worked perfectly. It closed the site, made the backup, copied it to the remote destination, and the site was reopened.

The second backup didn't go so well. The site was closed, the local backup was created, but it was not copied to the remote destination. The site never reopened.

At what point in the process is the site supposed to reopen? Is it immediately after the database backup, or does it wait until the remote transfer completes?

IF it is set to reopen AFTER the remote transfer completes, is there some way to set a timeout so that if the transfer fails, the site still gets reopened? Or perhaps change it to reopen right after the local backup is completed?

For now, I have it set NOT to close the site during the backup. Is there a reason why it actually should be closed during the backup? I never close the site when I do a manual MySQL dump. Is that a bad practice?
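For what it's worth, if the tables are InnoDB, mysqldump can take a consistent snapshot without closing the site at all. A sketch of the manual command (database name and user are placeholders):

```shell
# --single-transaction: dump inside a consistent-read transaction,
# so the snapshot reflects a single point in time even while users post.
# --quick: stream rows instead of buffering whole tables in memory.
mysqldump --single-transaction --quick -u forum_user -p forum_db \
  | gzip > forum_db_$(date +%F).sql.gz
```

Note that this only gives a consistent snapshot for transactional engines; MyISAM tables would still need locking.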
 