[TAC] Bot Arrestor

I'm really not sure; it's a good question, though.

If they extend the core to make the calls (for new functional areas; it's unlikely they would), then yes, the auto-refresh may cause the trigger.
 
I'm currently looking at ajax calls causing extra page counts (for instance, hovering over a link and calling the thread preview).
It's probably not a good idea to count these (since the user has no idea they are really 'visiting' an extra page).
For threads, I now only increase the count when actually visiting the page (actionIndex); I might have to do this for other areas too.
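
As a rough illustration of what "only counting the actionIndex" looks like in a XenForo 1.x add-on (a minimal sketch, not the add-on's actual code: the _recordRecentVisit() helper is hypothetical, though the session keys match the data shown later in this thread):

Code:
<?php

// Minimal sketch (not the actual add-on code): extend the thread controller via
// XenForo 1.x's XFCP class proxy so that only real page views (actionIndex) are
// counted, not ajax previews. _recordRecentVisit() is a hypothetical helper.
class Tac_DeDos_ControllerPublic_Thread extends XFCP_Tac_DeDos_ControllerPublic_Thread
{
    public function actionIndex()
    {
        // Let the core build the response first (redirects, page titles, etc.)
        $response = parent::actionIndex();

        // Only genuine thread views count towards the recent-visit total
        $this->_recordRecentVisit();

        return $response;
    }

    protected function _recordRecentVisit()
    {
        $session = XenForo_Application::getSession();

        $recentVisits = $session->get('dd_recent_visits');
        if (!is_array($recentVisits))
        {
            $recentVisits = array();
        }

        // Same shape as the session data quoted later in this thread: time + location
        $recentVisits[] = array(
            't' => XenForo_Application::$time,
            'l' => $this->_request->getRequestUri()
        );

        $session->set('dd_recent_visits', $recentVisits);
    }
}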

I've added a new version:
DeDos_v1_0_9b.zip

This should also avoid counting things like the unread redirect for threads; I'm looking at other areas where people might make a lot of ajax requests or redirects.

- The aim of this plugin isn't so much to stop human DOS attacks, as it is to not interfere with humans/spiders (yet still catch resource-selfish scrapers & spam bots)
 
tenants updated DeDos - Anti DOS for spam bot/scrapers with a new update entry:

avoid logged in

I've added an option in the ACP to completely avoid applying DeDos to logged in users (since this plugin mainly focuses on spam bots / scrapers)

I've also now changed the default settings, so instead of upgrading, use uninstall -> install the new plugin

I've changed the Friendly default settings to: 9 requests in 8 seconds
and Malicious: 9 requests in 6 seconds

This means users are far less likely to ever bump into the friendly message (and even less likely to bump into the Malicious message)

It does mean it will catch far fewer bots; however, FBHP catches a high % of spam bots, while DeDos focuses on catching scrapers that would usually use a lot of resources (so just the hard-hitting ones)

Read the rest of this update entry...

Instead of upgrading, I would uninstall and reinstall this version (to use the new settings)
 
StopBotResources is no longer needed when using FBHP and DeDos together (it has been removed).

StopBotResources relied on an API for every single visitor... That means the more forums that used the plugin, the more the API became saturated with requests (which is why I only allowed small forums to use that resource and asked admins to remove the plugin once they no longer met certain conditions)...

Using FoolBotHoneyPot & DeDos together, we have a better approach:
FoolBotHoneyPot detects spam bots registering (currently 100% of spam bots)
DeDos detects spam bots hitting your site heavily
Once detected, they both use the local cache... so we have a very low-query (and low-bandwidth) approach to locking these spam bots / scrapers out
- There is no need for an API, so yes, used together they can help reduce bot resource usage on servers with limited resources / a high number of spam bots, and they can be used on any size forum
 
What is the difference between this and Bad Behavior?
Does this add-on recognise suspect browser fingerprints, for example IE6?
Does this work with chat software which polls the server a few times every second?
I run LiteSpeed web server, which has settings for blocking IPs that connect too often. I had to increase that limit so that chat users would not be banned by LiteSpeed.
 
For a long time, spam bots have been faking the user_agent. If you use FoolBotHoneyPot, you will see that every bot, although it has no JS, attempts to register within seconds, gets detected on various APIs and alters 15-21 different types of hidden fields; every single one fakes the user_agent to look like a modern browser.

In DeDos, we actually use this to our advantage: we can ignore all spiders by only blocking users that look like browsers (and DOS the site). DeDos also looks for known spiders using core functionality (and avoids detecting them). So in a way, the spam bot's disguise is its own downfall... if spam bots faked themselves as Googlebot, then we would have no way of distinguishing them (thankfully, spam bots want to look like humans when they attempt to sign up).
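
As a sketch of that ordering (illustrative only; the spider list, the regex and the function name are placeholders, not DeDos's real lists):

Code:
<?php

// Sketch of the ordering described above (illustrative patterns and name, not
// DeDos's real lists): known spiders are never blocked, and only visitors whose
// user_agent disguises itself as a modern browser remain candidates for blocking.
function dedos_shouldConsiderBlocking($userAgent)
{
    // 1) Skip anything that identifies itself as a known spider
    $knownSpiders = array('googlebot', 'bingbot', 'yandex', 'baiduspider');
    foreach ($knownSpiders as $spider)
    {
        if (stripos($userAgent, $spider) !== false)
        {
            return false;
        }
    }

    // 2) Only user agents that look like modern browsers are considered
    return (bool) preg_match(
        '/(Mozilla|Chrome|Safari|Firefox|Opera|MSIE|Trident)/i',
        $userAgent
    );
}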

- Avoiding false positives for humans/spiders is the direction this is going (much like FBHP, which currently blocks 100% of spam bots with no false positives, and users never notice it)

If we start adding options anywhere to block user_agent browser types, this will only ever affect your humans with false positives (this is bad design); the user_agent can be (and always is) faked. This is nothing like Bad Behavior: I'll do everything I can to avoid false positives, I will not add options to hang yourself with false positives, though I may add options that will avoid them...
DeDos (and FoolBotHoneyPot) are targeted at spam bots & scrapers without real humans ever noticing an impact.

This is still an early version of DeDos; there are a couple more options I would like to add.
Since this is focused on bots (and not humans), next I will add an option so that only JavaScript-disabled users are added to the cache (humans will still get the lower-query/lower-kB warning, but won't ever get added to the cache)

For chat software:
... It depends how the chat software works. I suspect DeDos will ignore the chat software, since DeDos is primarily focused on detecting DOS for core functions (having said that, if the plugin extends the core functional areas, it might not be ignored...). Is there an example of the chat software? (If it's free, I can take a look and tell you quite quickly)
 
Got a couple errors today.

Code:
Error Info
ErrorException: Fatal Error: Call to undefined method XFCP_Tac_DeDos_ControllerPublic_Misc::actionIndex() - library/Tac/DeDos/ControllerPublic/Misc.php:9
Generated By: Unknown Account, 41 minutes ago
Stack Trace
#0 [internal function]: XenForo_Application::handleFatalError()
#1 {main}
Request State
array(3) {
["url"] => string(29) "http://www.talkbass.com/misc/"
["_GET"] => array(0) {
}
["_POST"] => array(0) {
}
}
 
That's an interesting one (likely to have been caused by a bot)

example.com/misc/ does not have an actionIndex, so if you visit that page normally (a human is unlikely to), it would just say:
The requested page could not be found.

I'll add a fix for it soon
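
One possible guard for this (a sketch only, not necessarily the fix that will ship) is to check whether the parent class actually has an actionIndex before calling it:

Code:
<?php

// Sketch of one possible guard (not necessarily the fix that will ship): only call
// the parent action if it actually exists somewhere up the XFCP chain; otherwise
// behave like the core and return the standard "page not found" response.
class Tac_DeDos_ControllerPublic_Misc extends XFCP_Tac_DeDos_ControllerPublic_Misc
{
    public function actionIndex()
    {
        if (!method_exists(get_parent_class(__CLASS__), 'actionIndex'))
        {
            // /misc/ has no index action in the core, so mirror the default behaviour
            return $this->responseError('The requested page could not be found.', 404);
        }

        return parent::actionIndex();
    }
}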
 
@tenants Do you believe this addon could cause load issues on large forums? Just wondering - we've had some "clogging" in the pipes since installing this, and not sure if it's completely unrelated. These "clogs" happen once every 15-30 mins or so: pages don't load for about 8-10 seconds, then everything starts loading fine (webserver load jumps to 10 when it becomes "unclogged" and then settles back down). Server techs say the database server isn't having any issues. Hmm.
 
There are no extra database queries; it simply adds to the session, and adds to the cache if a spam bot is found (the data is tiny - bytes).

So for large forums, it's not going to add thousands of IPs to the global cache. Instead, each user session will have a small amount of data (from the last 10 seconds, and no more than x attempts).
Each user, in their session data, will have something like this:
dd_recent_visits";s:125:"a:1:{i:0;a:2:{s:1:"t";i:1402556250;s:1:"l";s:72:"http://localhost/xenforo/index.php?threads/blabla.3378/";}}"
[This is a small amount of data, and only related to that user for the last 10 seconds]


If, and only if, this data is found to be a DOS attempt, is the data added to the global cache (once again, no extra queries).
This data is stored in the global cache in its smallest form: [ip] => timecached
"dd_known_dos";a:1:{s:9:"127.0.0.1";i:1402493810;}
[This does not build up, since the data is removed once older than the cache time range]

So, not only is there 0 query overhead, the data stored is very small (per user session for recent visits, global cache for known DOS)

The DOS check is called once per page, so it simply grabs dd_known_dos and checks to see if the IP is present (it's a very simple calculation to check if a key exists in a small array, so this is very low PHP overhead).
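
A sketch of that per-page check, assuming XenForo's standard cache object and the dd_known_dos shape quoted above (the function itself and the lifetime value are illustrative):

Code:
<?php

// Sketch of the per-page check described above (the function and the lifetime are
// illustrative; the dd_known_dos shape [ip => time cached] is the one quoted above).
function dedos_isKnownDos($ip, $cacheLifetime = 600)
{
    $cache = XenForo_Application::getCache();
    if (!$cache)
    {
        return false;
    }

    $known = $cache->load('dd_known_dos');
    if (!is_array($known))
    {
        return false;
    }

    // Drop entries older than the cache time range so the array never builds up
    $changed = false;
    foreach ($known as $cachedIp => $timeCached)
    {
        if ($timeCached < time() - $cacheLifetime)
        {
            unset($known[$cachedIp]);
            $changed = true;
        }
    }
    if ($changed)
    {
        $cache->save($known, 'dd_known_dos');
    }

    // The actual check is just a key lookup in a small array: very low PHP overhead
    return isset($known[$ip]);
}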

So no, this has been aimed at being
  • low query overhead (0)
  • low data size (bytes)
  • low PHP calculation (check if key exists in a small array)
It's low impact (far less than most add-ons). It reduces overall queries and downloaded data, but I'm willing to investigate
 
Bought & installed!

One question: I have tested it myself, but the only error message I get is this: http://awesomescreenshot.com/0192ywc799 - there was no "friendly" message with a warning. I haven't changed the settings, it's all default. How can I enable a warning first?

// Ah, never mind. I saw that the Avoid Logged In option is there, so logged in users are skipped by this add-on. Nice!
 
Yes, I'll avoid JavaScript-enabled users too soon (so even real human users that are logged out should also never get added to the cache, but may still see the human-friendly message if triggered)

To see the friendly message first, make sure it is triggered with fewer requests than the malicious settings (I think I may have set the number of requests by default to the same... that's not a good idea)

For instance, I've now changed the defaults to:

Friendly
7 Requests in 8 Seconds
Malicious (added to the cache)
9 Requests in 6 Seconds

This means that if you refresh the page 7 times within 8 seconds you'll see the friendly warning message (and on the 9th request, you'll get added to the cache if you have done this in under 6 seconds)

It also depends on where you refresh; what page was this?

For most areas I catch the actionIndex with the friendly message (this is usually related to the page, for instance the thread page, post page, conversation page), but I don't prompt the friendly message for other areas related to hover-over/ajax, and now avoid checking things such as preview posts (these no longer add to the recentVisits).
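
To make the two tiers concrete, here is a sketch of the threshold check using the defaults above (the function and structure are illustrative; only the numbers and the 't' timestamp key come from this thread):

Code:
<?php

// Sketch of the two-tier check using the defaults above (illustrative function;
// only the thresholds and the 't' timestamp key come from this thread).
function dedos_checkThresholds(array $recentVisits)
{
    $now = time();

    // Count how many recorded visits fall inside a given window
    $countWithin = function ($seconds) use ($recentVisits, $now)
    {
        $count = 0;
        foreach ($recentVisits as $visit)
        {
            if ($visit['t'] > $now - $seconds)
            {
                $count++;
            }
        }
        return $count;
    };

    if ($countWithin(6) >= 9)
    {
        return 'malicious';   // added to the dd_known_dos cache
    }
    if ($countWithin(8) >= 7)
    {
        return 'friendly';    // show the human-friendly warning message
    }
    return 'ok';
}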
 
Dude, the way you explain how your add-ons work makes me want to fork over dough without thought. Already have FBHP on one site, now I want this too and another copy of both for my other site. Dang!!
 
tenants updated DeDos - Anti DOS for spam bot/scrapers with a new update entry:

Many Enhancements - detect cookieless bots

This is still 0 query overhead, and uses cache for previous DOS detection.

This still works out of the box, and you do not need to change any of the default settings. I've added a few options for various forum setups, but the default is designed to work for all forums.
  • I've changed the default times again. Friendly: 8 Requests in 7 Seconds; Malicious (added to the cache): 9 Requests in 6 Seconds.
  • Added a link from the user logs (the logged IP address) to an IP info page (http://whatismyipaddress.com/)
  • Added a link from the cache logs (locked out IPs) to Google, displaying possible Malicious activity for this IP address
  • There are now 3 ACP options to avoid real users getting added to the cache
    • 1) Avoid Cache for Logged In Users: Avoid logged in users ever being added to the cache (regardless of how many times they refresh / visit pages)
    • 2) Avoid All DeDos for Logged In Users: This avoids logged in users ever getting added to the cache, and they also never have to see the friendly warning message
    • 3) Automatically Remove JavaScript Users From Cache: This setting is for peace of mind. False positives should never occur. However, if your DeDos settings are too harsh (or areas force users to DOS certain pages), then a user could be added to the cache. If a user is added to the cache, they will be locked out of the site with a 401 Unauthorised message. On the 401 page, this option detects the JavaScript of the user, and if found it removes them from the cache immediately, prevents them from getting re-added, logs the information, then redirects the page (the 401 message will only blink at the user). This allows you to detect false positives from the logs, adjust your settings accordingly, and avoid real users getting locked out of your site
  • I now get the response of threads before applying DeDos checks; this allows page number / page title redirects to occur before a count is added (it's not fair on humans to count redirects as a recentVisit)
  • I've removed counting Likes, Reports and Quotes from Posts so they don't get added to recentVisits. Some users really do click on 10 - 20 likes in a very short time frame; if real users do it, I don't want to catch them. So for Threads, and now Posts, only the actionIndex (viewing the actual thread/post page) is added to recentVisits.
  • Turn Off Portal DeDos Detection: I've added an option to not apply DeDos to the portal page. Some portals (custom or other) use lazy load. If not designed cautiously, real users will call lots of pages in a very short time (effectively DOS attacking your site). Since the portal page is very customisable and some plugins allow this, I've added an option to turn DOS detection off for the portal page.
  • Experimental Option - Cookieless Bots In Cache: (not to be used on busy forums). Some malicious bots (it would seem about 1 in 10 from my logs) do not use cookies, which means there is no session data on your site for that user. If there is no session data, the recentVisits for that user cannot be logged in their session. One way around this is that, if a user is cookieless, their recentVisits are logged in the global cache (and immediately removed from the global cache if older than 10 seconds, or if the user suddenly has a cookie). The reason this option should not be used on large forums: on the very 1st visit a user makes, the user appears to be cookieless, and only creates the cookie when going to the next page. So on busy forums, there will be many 1st-time visitors' recentVisits logged in the global cache (and only removed after they go to the next page / after waiting 10 seconds). I recommend not ticking this option if you have more than 100 visitors online at any one time. I have hard coded it so there will never be more than 50 users with recentVisits added to the global cache, to avoid issues even if large forums tick this option (a sketch of this hard cap follows this list).
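
Here is a sketch of that hard cap (the function and cache id are illustrative; only the 50 entry cap and the 10 second window come from the description above):

Code:
<?php

// Sketch of the hard cap described in the last option (the function and cache id are
// illustrative; the 50 entry cap and 10 second window come from the description above).
function dedos_recordCookielessVisit($ip, Zend_Cache_Core $cache)
{
    $visits = $cache->load('dd_cookieless_visits');   // hypothetical cache id
    if (!is_array($visits))
    {
        $visits = array();
    }

    // Remove anything older than the 10 second window
    foreach ($visits as $visitorIp => $data)
    {
        if ($data['t'] < time() - 10)
        {
            unset($visits[$visitorIp]);
        }
    }

    // Hard cap: never track more than 50 cookieless visitors at once
    if (!isset($visits[$ip]) && count($visits) >= 50)
    {
        $cache->save($visits, 'dd_cookieless_visits');
        return;
    }

    $visits[$ip] = array('t' => time());
    $cache->save($visits, 'dd_cookieless_visits');
}
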
So we have a fairly robust method of never detecting logged in users, and a method of avoiding detecting logged out real users...
(If logged out real users are ever detected by DeDos, they are immediately removed from the cache and the information is logged and flagged, letting you know you may need to change your settings, or look into that user)

This pretty much just leaves bots.

We already have robust methods to avoid detecting spiders:
We look for core known spiders and remove them, then look at the user agent and remove further possible spiders, then confirm that the user_agent does indeed look like a browser... malicious bots almost always disguise themselves to look like modern browsers (and since we protect both logged in and logged out users, we should just be left with the malicious spam bots/scrapers)

Read the rest of this update entry...
 
Hi @tenants

Small bug after upgrading...

You can no longer view a user's posts archive. If I click on the number of posts on a user's pop-up member card, the page it takes you to fails to load.

eg. this URL: /search/member?user_id=1

Disable this add-on and that page loads.
 
<removed - currently updating, will be back asap (less than 1 day)>

You released a new version, but it's not possible to download?
 