[TAC] Bot Arrestor

[TAC] Bot Arrestor [Paid] 2.0.12

Okay, that uses a local IP database provided by MaxMind.
You need to update it regularly to keep the IP-to-country mapping correct.

They update their database weekly, every Tuesday:
https://support.maxmind.com/geoip-f...he-geoip2-and-geoip-legacy-databases-updated/

If you have the resources to update your entire country database regularly, I would recommend it. StopCountrySpam relies on third parties to do the regular updating.
Anyway, how did we get sidetracked from DeDos?
 
Exactly, yes I do, and yes, let's stay on track with DeDos.

I shall look at my finances over the next few weeks, make some decisions then, and purchase.
 
@tenants Does this add-on automatically add rules via the .htaccess file? I think it has added some. If so, why is the same RewriteRule added twice?

Code:
RewriteRule .*\.()$ - [F,NC]

RewriteRule .*\.()$ - [F,NC]

RewriteCond %{HTTP_REFERER} !^$
RewriteCond %{HTTP_REFERER} !^http://mysite.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://mysite.com$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.mysite.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^http://www.mysite.com$ [NC]
RewriteCond %{HTTP_REFERER} !^https://mysite.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^https://mysite.com$ [NC]
RewriteCond %{HTTP_REFERER} !^https://www.mysite.com/.*$ [NC]
RewriteCond %{HTTP_REFERER} !^https://www.mysite.com$ [NC]
RewriteRule .*\.(jpg|jpeg|gif|png|bmp)$ https://www.mysite.com [R,NC]
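For anyone reading along, the RewriteCond lines above implement hotlink protection: image requests are redirected unless the Referer is empty or points at mysite.com (http or https, with or without www). A sketch of that same decision in Python (mysite.com is the placeholder domain from the rules, and the function name is mine):

```python
import re

# Referers allowed by the RewriteCond lines above (case-insensitive).
ALLOWED_REFERER = re.compile(r"^https?://(www\.)?mysite\.com(/.*)?$", re.IGNORECASE)
# File extensions checked by the final RewriteRule.
IMAGE_EXT = re.compile(r".*\.(jpg|jpeg|gif|png|bmp)$", re.IGNORECASE)

def is_hotlink(path: str, referer: str) -> bool:
    """True if a request would be redirected by the rules above."""
    if not IMAGE_EXT.match(path):
        return False  # only image requests are checked
    if referer == "":
        return False  # empty Referer is allowed (RewriteCond !^$)
    return not ALLOWED_REFERER.match(referer)
```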
 
DeDos does not add those rules; in fact, it only appends between the lines
# Start DeDos
# End DeDos

You also have to turn the DeDos option on for it to update .htaccess (not all environments will be able to do this).
Before it makes any sort of update, it will have created a file named .htaccessddbackup (in the same location as the .htaccess).

If the option is turned on and your environment allows it, DeDos blocks repetitive scraping IP addresses via .htaccess, like so:
Code:
# Start DeDos Deny Block DO NOT MODIFY #
order allow,deny
deny from 46.211.35.211
deny from 185.175.158.172
deny from 2.95.39.192
deny from 138.197.64.38
deny from 174.112.51.157
deny from 51.255.33.0
deny from 109.163.234.4
deny from 37.142.10.243
deny from 46.8.45.18
deny from 188.232.75.216
deny from 178.137.211.13
deny from 155.94.139.118
deny from 46.161.9.14
deny from 27.155.137.158
deny from 37.115.214.184
deny from 46.118.233.101
deny from 37.112.5.212
deny from 72.9.151.194
deny from 199.249.223.76
deny from 64.113.32.29
deny from 171.25.193.20
deny from 91.219.237.229
deny from 204.12.240.34
deny from 91.244.0.211
deny from 37.115.187.228
deny from 46.118.127.142
allow from all
# End DeDos Deny Block DO NOT MODIFY #
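Since the block is fenced by fixed marker comments, it is easy to audit programmatically. A small parsing sketch (the marker strings are copied from the block above; the function itself is mine):

```python
def parse_dedos_block(htaccess_text: str) -> list:
    """Return the IPs denied between the DeDos marker comments."""
    start = "# Start DeDos Deny Block DO NOT MODIFY #"
    end = "# End DeDos Deny Block DO NOT MODIFY #"
    inside = False
    ips = []
    for raw in htaccess_text.splitlines():
        line = raw.strip()
        if line == start:
            inside = True
        elif line == end:
            inside = False
        elif inside and line.startswith("deny from "):
            ips.append(line[len("deny from "):])
    return ips
```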
 

It does not work at preventing website scraping. How can I specifically block a site, or any type of bot used by that site, from scraping my content? It is scraping my content and publishing it as their own. Please help.
 
You only installed this LITERALLY 30 minutes after posting this (after sending me various PMs asking where this add-on is, etc.).
It does prevent website scraping.

What it sounds like you were suggesting in PMs is that a website is mimicking your site. We don't yet know how; it might not be a simple case of basic scraping, but check for it anyway.

They could be using http://www.httrack.com/, which should get picked up by the DoS speed detection (keep an eye on the logs). Bear in mind, if someone really wants your site, they will find a way to bypass many detection methods (just slowing down could avoid DoS detection), so it might not be that simple. DeDos can't stop all scrapers, but it does stop the ones that really hammer your site or do inhuman things (which can clog up your resources), and that is quite a high percentage.

DeDos has various methods, such as:
1) Detecting DoS bots (bots that hit too many pages too quickly)
2) Detecting same-session user-agent switchers (bots that have no JavaScript enabled yet switch their user agent within the same session; these are very unlikely to be humans, and bots sometimes use this trick to avoid DoS detection)
3) Detecting JSON scrapers (bots that have no JavaScript enabled and go from JSON page to JSON page scraping your site's content, rarely visiting any other pages)
4) Blocking various bots from particular hosts (see below)
5) Many more unpublished bot-detection methods
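Method 1 is essentially per-IP rate limiting. As a toy illustration of the idea only (the thresholds and data structures here are mine, not DeDos internals):

```python
from collections import defaultdict, deque

class DosDetector:
    """Toy sliding-window rate check: flag an IP that makes more than
    `limit` requests within `window` seconds."""

    def __init__(self, limit: int = 20, window: float = 10.0):
        self.limit = limit
        self.window = window
        self.hits = defaultdict(deque)  # ip -> recent request timestamps

    def hit(self, ip: str, now: float) -> bool:
        """Record a request at time `now`; True means the IP looks like a DoS bot."""
        q = self.hits[ip]
        q.append(now)
        while q and now - q[0] > self.window:
            q.popleft()  # drop timestamps that fell out of the window
        return len(q) > self.limit
```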


We don't yet know what the website copiers are doing or how they are scraping; look at the DeDos logs and see if they pick up any activity to begin with.

If I were you, I would first look at your access logs, so you know what activity you want to block (your host will know where these are, and there is often a section in cPanel for viewing them). Also have a look at AWStats for anything suspicious.

DEDOS

blocking hosts in the options:

[Screenshot: blocking hosts in the DeDos options (upload_2017-4-7_22-36-54.png)]

IN ADDITION

There are various methods that detect scrapers and report the hosts found in your logs:
admin.php?dedoslogs/

Click on each entry and you will see why it was detected; you will often also be given the host, which you can add back into your settings to catch it quicker next time. But please bear in mind that not all IPs will resolve to a host using gethostname.
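That resolution caveat is easy to see for yourself: reverse DNS simply fails for many addresses. A best-effort sketch in Python (the add-on itself is PHP; this only illustrates the failure mode):

```python
import socket

def resolve_host(ip: str):
    """Best-effort reverse DNS lookup: returns the host name, or None
    when the address does not resolve, as many won't."""
    try:
        return socket.gethostbyaddr(ip)[0]
    except OSError:  # socket.herror and socket.gaierror are OSError subclasses
        return None
```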

Example:

[Screenshot: a DeDos log entry showing the detection details (upload_2017-4-7_23-3-14.png)]

This is where you can turn on the .htaccess option:
admin.php?options/list/dedos
It is found under the option "Update Htaccess".

[Screenshot: the "Update Htaccess" option (upload_2017-4-7_23-16-17.png)]

There is also a cache area. Even if your environment does not allow automatic updates of .htaccess, the bots are added to a cache, which significantly reduces the queries and resources consumed by bots. This can be found under
/admin.php?dedoscache/

[Screenshot: the DeDos cache page (upload_2017-4-7_23-21-20.png)]

You might find that a lot of modern-day scrapers now use JavaScript, so I would consider turning off the option to detect JavaScript (this will become increasingly common as more scrapers adopt JS/browser-based automation).



You will notice a CPU and server resource reduction on smaller servers that are getting hit by bots/scrapers. You won't see this within 30 minutes, but if you look at your AWStats logs over days, you'll notice the peaks when it's not turned on compared to the troughs when it is.
 

I think you forgot to bump the version inside the add-on. After installing the latest version, it still tells me I have version 2.0.08.
 
@tenants I'd suggest not removing expected parameters; instead, remove the data from the view to prevent scraping in another way (ideally by either throwing an error or using a template modification). We have several add-ons that use $controller->params['forum'] and $controller->params['thread'], which one would expect to always be available in XenForo_ControllerPublic_Thread::actionIndex, but they aren't when someone is using this add-on.
 
To keep everything central and tracked, I'm going to start referencing any bugs I take on with the tag [Issue n], where n is the issue number. I've had a request for a ticket system, but I don't think it's currently necessary given my current add-on support load.

[Issue 1]
Multiple open tabs, possibly combined with third-party add-ons that use auto-refreshing, can trigger the arrestor.
 
Feature request: we've been using this add-on for a long time and love it. Could we get the ability to set a simple whitelist of IPs? Some features (same-IP session switcher, secret ingredient 2, etc.) work 95% of the time, but we end up with crazy cases of false positives once in a while. I'd love to be able to keep these options active and simply whitelist a few of our users with unusual setups.

Thanks!
 

You're in luck, as adding whitelisting features is on my work stack (to fix Issue 1) :)
If you could PM me a (redacted if necessary) log showing a false positive, it would help me out, thanks.
 
Its purpose is to fix Issue 1.

If an admin can identify some string that the offending third-party add-on repeatedly sends in its requests, they can add that string to the URI whitelist. The Bot Arrestor will then ignore those requests.
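In other words, the whitelist works as a substring match against the request URI. A sketch of such a check (the function name and the example strings in the test are mine, not the add-on's actual code):

```python
def is_whitelisted(request_uri: str, whitelist) -> bool:
    """True if any admin-configured string occurs in the request URI,
    in which case the detector would ignore the request."""
    return any(s in request_uri for s in whitelist if s)
```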

I doubt this will fix all cases (e.g. if the third-party add-on polls some normal URI with nothing in it indicating the request comes from that add-on, it won't help). But it should narrow things down, so I'll close the issue for now. [Issue 1]: closed

I'll add the IP whitelist next [Feature request 1].
 