How to duplicate a live site to use as a test site

1. I have reverted to the original .htaccess file and I get the error message below when I try to log in to the dev site. I am not able to log in.

[screenshot of the login error message]


2. Using the original .htaccess file, when I click on any forum link on the dev site I always get taken to the main page of my main site.
e.g. clicking on a link (http://www.<SiteName>.com/community2/find-new/threads) or any other link always takes me to (http://www.<SiteName>.com/community/)


Below is my original .htaccess file that I am using.


# Mod_security can interfere with uploading of content such as attachments. If you
# cannot attach files, remove the "#" from the lines below.
#<IfModule mod_security.c>
# SecFilterEngine Off
# SecFilterScanPOST Off
#</IfModule>

ErrorDocument 401 default
ErrorDocument 403 default
ErrorDocument 404 default
ErrorDocument 500 default

<IfModule mod_rewrite.c>


RewriteEngine On
RewriteCond %{HTTP_HOST} !^www\.<livesite>\.com$
RewriteRule ^(.*)$ http://www.<livesite>.com/community/$1 [R=301,L]

# If you are having problems with the rewrite rules, remove the "#" from the
# line that begins "RewriteBase" below. You will also have to change the path
# of the rewrite to reflect the path to your XenForo installation.
RewriteBase /community

# This line may be needed to enable WebDAV editing with PHP as a CGI.
#RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]

RewriteCond %{REQUEST_FILENAME} -f [OR]
RewriteCond %{REQUEST_FILENAME} -l [OR]
RewriteCond %{REQUEST_FILENAME} -d
RewriteRule ^.*$ - [NC,L]
RewriteRule ^(data/|js/|styles/|install/|favicon\.ico|crossdomain\.xml|robots\.txt) - [NC,L]
RewriteRule ^.*$ index.php [NC,L]
</IfModule>
 
Wow... things like this make me want to slap myself. :)

I just updated the rewrite rules from community to community2 and voilà...
I guess I was stuck in a "delete everything that doesn't work" mindset.
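For anyone replaying these steps, the whole path update can be done in one pass with GNU sed instead of editing by hand. This is just a sketch; the directory names here are examples for a copy living under /community2, so adjust them to your own layout:

```shell
# Rewrite every /community path reference in the copied .htaccess
# to /community2; directory names are examples, adjust to taste.
# \b stops /community2 itself from matching again (GNU sed syntax).
sed -i.bak 's#/community\b#/community2#g' /var/www/html/community2/.htaccess

# Eyeball the result before deleting the .bak backup:
grep -n 'community' /var/www/html/community2/.htaccess
```

The -i.bak flag keeps the original file around as .htaccess.bak in case the replacement goes wrong.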

Cheers.

Thank you @woei @Brogan
 
Are there any licensing issues with this? I really like the idea of a test site as XenForo is a little alien to me at the minute.
But would a test site mean you would have to have two licenses?
 
The terms of the license permit one public installation and one private, password-protected installation limited to you and your site staff.
 
I run a shell script to perform an update/refresh of my test site from production ...
Code:
#!/bin/sh

SOURCE_DIR=$1
DEST_DIR=$2

SOURCE_DB_NAME="<your production db name>"
SOURCE_DB_USERNAME="<your production db username>"

DEST_DB_NAME="<your test db name>"
DEST_DB_USERNAME="<your test db username>"

if [ -z "$1" ] || [ -z "$2" ] ; then
    clear
    echo "You need source and destination directory parameters. For example ..."
    echo "$0 /var/www/production /var/www/test"
    exit 1
fi
if ! command -v rsync >/dev/null 2>&1; then
    clear
    echo >&2 "rsync is not installed. Please install it"
    exit 65
fi
clear

# Preserve the test site's own config.php across the rsync.
cp -p "$DEST_DIR/library/config.php" "$DEST_DIR/library/config.replicate.php"
rsync -arugoptlH --progress --itemize-changes --human-readable --stats "$SOURCE_DIR/" "$DEST_DIR"
mv -f "$DEST_DIR/library/config.replicate.php" "$DEST_DIR/library/config.php"

# Dump production, load it into the test database, then clean up.
mysqldump --user="$SOURCE_DB_USERNAME" -p --skip-opt --disable-keys --create-options --extended-insert --single-transaction --add-drop-table --ignore-table="$SOURCE_DB_NAME.archived_import_log" "$SOURCE_DB_NAME" > "/var/tmp/$SOURCE_DB_NAME.sql"
mysql --user="$DEST_DB_USERNAME" -p "$DEST_DB_NAME" < "/var/tmp/$SOURCE_DB_NAME.sql"
rm -f "/var/tmp/$SOURCE_DB_NAME.sql"

# Rename the board, point it at the test URL, and close it to the public.
mysql --user="$DEST_DB_USERNAME" -p --database="$DEST_DB_NAME" --execute="UPDATE xf_option SET option_value='<your test site name> DEV' WHERE option_id='boardTitle';UPDATE xf_option SET option_value='<your test site url>' WHERE option_id='boardUrl';UPDATE xf_option SET option_value='0' WHERE option_id='boardActive';UPDATE xf_option SET option_value='' WHERE option_id='boardInactiveMessage';"
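After a run like that, a quick sanity check that the file copy is complete can be as simple as comparing file counts on both sides. A sketch, with example paths only:

```shell
# Compare file counts between the production and test copies.
# /var/www/production and /var/www/test are example paths.
src_count=$(find /var/www/production -type f | wc -l)
dst_count=$(find /var/www/test -type f | wc -l)
echo "production: $src_count files, test: $dst_count files"
```

The numbers will not always match exactly (the test site keeps its own cache files under internal_data), but a large gap usually points at an interrupted rsync.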
 
Is there a reason not to just make a copy of the database using operations in phpMyAdmin?
Yeah... one word:
Timeouts.
They can kill you, and then the DB you have downloaded is not complete. You think it is, and when you go to restore it (after your system has crashed) you find out that it's not a complete dump.

To get a somewhat reliable, complete dump of a large DB through phpMyAdmin, you have to set the php.ini limits ridiculously high.
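That's the appeal of doing it from the shell instead: mysqldump talks to the server directly, so none of the PHP limits apply, and you can compress on the fly. A sketch, where the user and database names are placeholders:

```shell
# Dump straight through gzip; no PHP in the path, so no PHP timeouts.
# db_user and db_name are placeholders for your own credentials.
mysqldump -u db_user -p --single-transaction db_name | gzip > db_name.sql.gz

# Restore later with:
# gunzip < db_name.sql.gz | mysql -u db_user -p db_name
```

The compressed file is also far smaller to move to another box for safe keeping.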
 
Not really... I've just seen it happen a few times. phpMyAdmin is great for tweaking the DB from a GUI, but there are other backup tools out there.
For backups, this may work - [SolidMean] ForumBackup
Since I run on a dedicated server, I just SSH in and run it from the CLI - and then I also have a script that performs my backup and pushes it to a NAS at the house.
 
Yes, I was advised this, but when I rang my host's tech support to ask how to do this, they just said they would do it for me.
If it's a VPS, then you really need to be doing it yourself as well. I would NEVER trust a hosting provider to back up my data, even though they say they will. Too many times I've read horror stories here from folks who placed their trust in their host's backups.
 
If it's a VPS, then you really need to be doing it yourself as well. I would NEVER trust a hosting provider to back up my data, even though they say they will. Too many times I've read horror stories here from folks who placed their trust in their host's backups.
I know, but that's why I have been using bigdump rather than relying on them
 
If you have SSH access, then it's just as easy to do.
To dump:
Code:
mysqldump -u username -p --skip-lock-tables db_name > my_dumped_db.sql
and to restore:
Code:
mysql -u username -p db_name < my_dumped_db.sql
Yes, I'm sure it's easy, once you know how to do it! It's just that they wouldn't tell me, preferring to do it themselves.
 