[TH] Maximum Guest Views [Deleted]

ThemeHouse

Well-known member
Audentio Design submitted a new resource:

[TH] Maximum Guest Views - Limit the number of posts that guests and/or unconfirmed users can view before registering or confirming their account.

This add-on adds an option to limit the number of posts that guests and/or unconfirmed users can view before they have to register or complete their registration.

You can also exclude certain forums from being affected by this restriction.

Read more about this resource...
 
Yes, unless you assign your search bots to a usergroup
Just wanted to make sure that Google can crawl this at all times without being affected by the guest restriction.

But tell me this... can bots be assigned to a user group? I'm not going to do that, but your statement intrigued me.
 
Anyone using this add-on (formerly Waindigo's) - please, could you confirm that it does not restrict bots?
The answer given by @ThemeHouse left me with some doubt... and he has chosen not to confirm.
 
We used it some time ago (before we deleted all Waindigo add-ons for quality & support reasons) and back then it had blocked Google bots.
 
Anyone using this add-on (formerly Waindigo's) - please, could you confirm that it does not restrict bots?
The answer given by @ThemeHouse left me with some doubt... and he has chosen not to confirm.

We definitely didn't "choose not to confirm"; it was a holiday weekend and we were not in the office. But as stated, anyone in the guest group -- including bots -- will be blocked. I'll look into a fix that does not block bots.

Jake
 
Jake, without a fix for search engine crawlers this add-on is useless. Whether a user-agent array will be the ultimate solution remains to be seen... user agents change over time.

If you want to go with user agents, I suggest something like this:

Code:
$api_request = "http://www.useragentstring.com/?uas=" . urlencode($_SERVER['HTTP_USER_AGENT']) . "&getJSON=all";
$ua = json_decode(file_get_contents($api_request), true); // true = decode to an array so $ua['agent_type'] works
if ($ua['agent_type'] == "Crawler") die();

Their user-agent data is kept up to date.

Don't get me wrong, you have other great add-ons, but this one is not usable in its current version.
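
For anyone who does go the external-API route, below is a minimal sketch of how that lookup could be hardened. It assumes the same useragentstring.com endpoint as above, but the local caching and the fail-open behaviour when the API is unreachable are my own additions, not part of any add-on.

PHP:
// Sketch: classify the visitor via useragentstring.com, cache the answer locally,
// and fail open (treat as non-crawler) if the remote API cannot be reached.
function isCrawlerUserAgent($userAgent, $cacheDir = '/tmp/ua-cache')
{
    if (!is_dir($cacheDir)) {
        mkdir($cacheDir, 0777, true);
    }
    $cacheFile = $cacheDir . '/' . md5($userAgent) . '.json';

    // Reuse a cached answer for 24 hours so not every page view hits the remote API
    if (is_file($cacheFile) && filemtime($cacheFile) > time() - 86400) {
        $data = json_decode(file_get_contents($cacheFile), true);
        return isset($data['agent_type']) && $data['agent_type'] === 'Crawler';
    }

    $url = 'http://www.useragentstring.com/?uas=' . urlencode($userAgent) . '&getJSON=all';
    $context = stream_context_create(array('http' => array('timeout' => 2)));
    $response = @file_get_contents($url, false, $context);
    if ($response === false) {
        return false; // API unreachable: do not block anyone
    }
    file_put_contents($cacheFile, $response);

    $data = json_decode($response, true);
    return isset($data['agent_type']) && $data['agent_type'] === 'Crawler';
}

// Usage: let crawlers bypass the guest view limit
if (isCrawlerUserAgent($_SERVER['HTTP_USER_AGENT'])) {
    // skip the restriction for this request
}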
 
Does it exclude search bots?
Yes, unless you assign your search bots to a usergroup
Just wanted to make sure that Google can crawl this at all times without being affected by the guest restriction.
We used it some time ago (before we deleted all Waindigo add-ons for quality & support reasons) and back then it had blocked Google bots.
But as stated, anyone in the guest group -- including bots -- will be blocked. I'll look into a fix that does not block bots.
Jake, without a fix for search engine crawlers this add-on is useless. Whether a user-agent array will be the ultimate solution remains to be seen... user agents change over time.
I have used this add-on for some time and it does NOT block known bots. My Google Webmaster Tools and Bing Webmaster Tools show no problems with these bots accessing all my content.
It (v1.0.2) uses XF's inbuilt 'is_robot' check to allow them unrestricted access ...
PHP:
        // Exclude robots so search engines can crawl content that guests cannot see
        if (XenForo_Application::$versionId < 1020000) {
            // XenForo 1.1.x and earlier: read the is_robot flag from the viewing user / visitor
            if (isset($viewingUser['is_robot'])) {
                $isRobot = $viewingUser['is_robot'];
            } else {
                $isRobot = XenForo_Visitor::getInstance()->get('is_robot');
            }
        } elseif (XenForo_Application::isRegistered('session')) {
            // XenForo 1.2+: the session carries a robotId for recognised crawlers
            $session = XenForo_Application::getSession();
            $isRobot = ($session->get('robotId'));
        } else {
            // No session registered: treat the request as a robot, i.e. apply no restriction
            $isRobot = true;
        }
        if ($isRobot) {
            // Robots bypass the guest view limit and fall through to the normal permission check
            return parent::canViewThread($thread, $forum, $errorPhraseKey, $nodePermissions, $viewingUser);
        }
 
I have used this add-on for some time and it does NOT block known bots. It (v1.0.2) uses XF's inbuilt 'is_robot' check to allow them unrestricted access ...

Great! Guess I was wrong before :)
 
You can thank @Mouth for referring this to me. He deserves a freebie...or at least a pat on the back and a stiff drink on the house!
 
Would this add-on or the First Page Free add-on be better to use?

I was wondering why you guys at @ThemeHouse don't combine them. Make it "first pages" free and let the admin set the number of pages, which is pretty much what we do here, right?
 
Make it "first pages" free and let the admin set the number of pages, which is pretty much what we do here, right?
First Page Free lets users view the first page of as many threads as they like. They could view the first page of each and every thread without restriction.
This add-on sets a maximum number of posts (thread pages) that the user can see before any further viewing is restricted.
Both have separate restrictions and outcomes, depending on which you think is most relevant for you.
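
To make the difference concrete, here is a rough sketch of the two checks - the function names and the session counter are made up for illustration, not the actual code of either add-on.

PHP:
// Illustration only - hypothetical names, not the real add-on code.

// "First Page Free": the check only cares which page of a thread was requested,
// so a guest can open page 1 of every single thread without ever being stopped.
function canViewFirstPageFree($requestedPage)
{
    return $requestedPage == 1;
}

// "[TH] Maximum Guest Views": the check keeps a running per-session count of
// pages viewed; once the admin-defined cap is reached, nothing further is shown,
// no matter which thread or page the guest asks for.
function canViewMaxGuestViews(array &$session, $maxViews)
{
    $session['guest_views'] = isset($session['guest_views']) ? $session['guest_views'] + 1 : 1;
    return $session['guest_views'] <= $maxViews;
}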
 
Define what a robot is and how it is detected. I still think you need an up-to-date list of user agents, or whatever it is you use to detect "robots". Also, there are many "bad" robots out there - how do you block them?
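
For what it's worth, "robot" in this context just means a visitor whose User-Agent matches a known crawler signature - that is what the is_robot / robotId flags quoted above are based on. As a rough illustration of that general approach (not XenForo's actual code), detection looks something like the sketch below; bad bots that spoof a normal browser User-Agent would slip through this and need to be handled separately (robots.txt, rate limiting, IP bans).

PHP:
// Illustration of user-agent based robot detection (not XenForo's internal code).
// Well-behaved crawlers identify themselves with tokens like these.
function detectRobot($userAgent)
{
    $knownRobots = array(
        'googlebot'   => 'Google',
        'bingbot'     => 'Bing',
        'slurp'       => 'Yahoo',
        'duckduckbot' => 'DuckDuckGo',
        'baiduspider' => 'Baidu',
        'yandexbot'   => 'Yandex',
    );

    $userAgent = strtolower($userAgent);
    foreach ($knownRobots as $token => $name) {
        if (strpos($userAgent, $token) !== false) {
            return $name; // recognised crawler
        }
    }
    return false; // treated as a normal visitor
}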
 
Jake, without a fix for search engine crawlers this add-on is useless. Whether a user-agent array will be the ultimate solution remains to be seen... user agents change over time.

@ThemeHouse - could you please answer?
 