I suspect that they botted, then went back as a real user to see why they couldn't bot ... What I try to do as a priority is avoid false positives
Sure, I would want the same thing - all of my settings are tuned to avoid affecting regular users.
Game Changer: Secret Ingredient 1 & 2
This version takes a giant leap in detecting more bots than ever. However, the method should not be revealed.
I have added an option "Tac Dedos Secret Ingredient 1 & 2".
Tac Dedos Secret Ingredient 1 will only detect bots that:
1) Are logged out
2) Have no JavaScript
3) Fake their user_agent to look like a normal browser (but act like a bot)
4) Do not do the things that real browsers must and always do
- It renders many of the other TAC Dedos methods almost redundant, since it catches bots much more quickly and reliably.
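The add-on's actual mechanism is deliberately kept private (see below), so the following is NOT it; it is only a sketch of how the four publicly listed criteria could be combined. The function name and session field names here are all hypothetical.

```php
<?php
// Illustrative sketch only -- NOT the add-on's private mechanism.
// It just combines the four publicly listed criteria; every field
// name below ($session keys) is a hypothetical stand-in.
function looksLikeBot(array $session): bool
{
    $loggedOut = empty($session['user_id']);        // 1) logged out
    $noJs      = empty($session['js_detected']);    // 2) no JavaScript

    // 3) claims to be a normal browser in its user agent...
    $claimsBrowser = (bool) preg_match(
        '/Mozilla|Chrome|Firefox|Safari/i',
        $session['user_agent'] ?? ''
    );

    // 4) ...but skips things real browsers always do (here, a stand-in
    //    check: the session never fetched any static assets).
    $actsLikeBot = empty($session['fetched_assets']);

    return $loggedOut && $noJs && $claimsBrowser && $actsLikeBot;
}
```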
Turn On Secret Ingredient 1:
The mechanism behind this detection will not be published. This stops scrapers and spammers from trying to work around it, and also stops other users from copying the mechanism. It is incredibly effective: it simply distinguishes browsers from bots (so bots that drive real browsers may still get around it, but they are rare). You may find this mechanism is far more effective at catching scrapers and spam bots after just a few attempts.
If you have a genuine reason for discussing the mechanism with me, I may do so in private messages. Please do not talk about this mechanism publicly. It works extremely well, and I don't want bots figuring out ways to bypass it any time soon!
Turn On Secret Ingredient 2: This is mentioned in more detail within the options
Interesting, I turned on the JavaScript detection to avoid false positives and the first one that got deactivated is clearly some kind of bot:
I've started to get a lot of those proxy_header errors as well on my site.
if($proxyHeadersFound)
{
$writer->set('proxy_header', $proxyHeadersFound);
}
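For context, a hedged sketch of what a proxy-header scan feeding `$proxyHeadersFound` might look like. The header list below is an assumption based on common proxy headers; only the `$writer->set('proxy_header', ...)` call above is from the add-on itself.

```php
<?php
// Hypothetical sketch: scan the request for common proxy headers and
// return them as a comma-separated string (empty string when none are
// present, which the logs render as "No Proxy Found").
function findProxyHeaders(array $server): string
{
    // Common headers added by forward proxies -- an assumed list,
    // not necessarily the add-on's own.
    $candidates = [
        'HTTP_X_FORWARDED_FOR',
        'HTTP_VIA',
        'HTTP_FORWARDED',
        'HTTP_CLIENT_IP',
    ];

    $found = [];
    foreach ($candidates as $header) {
        if (!empty($server[$header])) {
            $found[] = $header;
        }
    }
    return implode(', ', $found);
}
```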
fixed the proxy_header bug
re-arranged code for earlier detection
On some forums, a user can get redirected from forum/163 to forum/myforum.163.
Previously I was logging the redirect to recentVisits, which is not fair on humans; I've now fixed this so that XenForo's canonicalizeRequestUrl is applied before recentVisits is counted.
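The fix above is an ordering change, which can be sketched roughly as follows. `recordVisit` and the `$canonicalize` callable are hypothetical stand-ins for the add-on's recentVisits logging and XenForo's canonicalizeRequestUrl; neither is the real code.

```php
<?php
// Hypothetical sketch of the ordering fix: canonicalize the URL first,
// then record the visit, so the pre-redirect URL (e.g. forum/163) is
// never counted against the visitor.
function recordVisit(
    array &$recentVisits,
    string $requestUrl,
    callable $canonicalize,   // stand-in for canonicalizeRequestUrl
    int $now
): void {
    $url = $canonicalize($requestUrl);  // e.g. forum/163 -> forum/myforum.163
    $recentVisits[$now] = $url;         // only the canonical URL is logged
}
```

This way a single human click that triggers a redirect produces one recentVisits entry instead of two.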
@tenants, you are ruthless! Haha. Just updated, 5 mins ago, and have already added about 20 bad IPs to the cache with your "secret ingredients".
I have also had what appears to be one false positive with the Same IP Session Switcher. At 9:45, an IP was added to the cache (same IP session switcher, no JavaScript). At 9:46, JavaScript was detected and the IP was removed from the cache (the username was logged this time).
Interestingly, 20 minutes later, the same IP was hit again by the Same IP Session Switcher (no JavaScript, not logged in). This time the user didn't log in, and the IP was not removed from the cache. Weird.
I'm very appreciative of this addon, and have been using it for months!
v2.00.02:
ErrorException: array_key_exists() expects parameter 2 to be array, boolean given - library/Tac/DeDos/ControllerPublic/DeDos.php:56
Before the fix:
if(array_key_exists($stringIP, $dd_known_dos))
After the fix:
if($dd_known_dos && array_key_exists($stringIP, $dd_known_dos))
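The crash happens because the cached value can come back as `false` rather than an array (for instance on an empty cache read, which is an assumption here), and `array_key_exists()` then raises exactly the ErrorException in the report. A minimal reproduction of the guard:

```php
<?php
// Minimal reproduction: an empty/missing cache read can return false
// instead of an array.
$dd_known_dos = false;            // simulated empty cache read
$stringIP     = '65.8.151.82';

// Buggy:  array_key_exists($stringIP, $dd_known_dos) would raise
// "array_key_exists() expects parameter 2 to be array, boolean given".
// Fixed:  && short-circuits on the falsy cache value, so the
// array_key_exists() call is never reached.
$known = $dd_known_dos && array_key_exists($stringIP, $dd_known_dos);
```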
It's pretty incredible how accurate and quick secret ingredient 1 is. I have been going through the DeDos logs and adding host names to secret ingredient 2, which lets the system pick up the malicious bots even faster. Now the only things with a bandwidth / CPU impact are real spiders and humans, and my resource usage has dropped sharply.
Can you post your logs from when it was caught? (Click the red row and paste the info here, including the recent visits, then do the same for the red flashing row.)
IP address added to cache: 65.8.151.82
Session User Agent: Mozilla/5.0 (X11; Linux x86_64; rv:10.0) Gecko/20100101 Firefox/10.0 (Chrome)
Log Time: Yesterday at 9:43 PM
User: User not logged in
JavaScript Detected: False
Has Cookie: True
Session ID: d8e0b4e05a66de76d87e86febd574ab2
gethostbyaddr(65.8.151.82) = adsl-65-8-151-82.mia.bellsouth.net
Friendly messages in this session: 0
gethostbyname(adsl-65-8-151-82.mia.bellsouth.net) = 65.8.151.82
Proxy Header: No Proxy Found
Recent Visits at Log Time
[unix_timestamp](human_readable_time) => url_location
[1431657835](Yesterday at 9:43 PM) => http://www.talkbass.com/threads/new-project-p-bass-mutt.1152450/
[1431657834](Yesterday at 9:43 PM) => http://www.talkbass.com/threads/new-project-p-bass-mutt.1152450/
Extra Info
a:3:{s:32:"047c5780544be006b15ccddea7e21c3b";a:2:{s:1:"t";i:1431657821;s:1:"i";s:11:"65.8.151.82";}s:32:"4f1a841101a7abf07159e81e60a59410";a:2:{s:1:"t";i:1431657826;s:1:"i";s:11:"65.8.151.82";}s:32:"d8e0b4e05a66de76d87e86febd574ab2";a:2:{s:1:"t";i:1431657835;s:1:"i";s:11:"65.8.151.82";}}
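The "Extra Info" blob above is a PHP-serialized map of session_id => ['t' => unix timestamp, 'i' => IP]. A generic sketch of decoding such a blob (using a shortened one-entry example, not the add-on's own code):

```php
<?php
// Decode an "Extra Info" blob: a serialized map of
// session_id => ['t' => unix_time, 'i' => ip_address].
// This is a shortened single-entry example of the format logged above.
$blob = 'a:1:{s:32:"d8e0b4e05a66de76d87e86febd574ab2";'
      . 'a:2:{s:1:"t";i:1431657835;s:1:"i";s:11:"65.8.151.82";}}';

$sessions = unserialize($blob);

foreach ($sessions as $sessionId => $info) {
    // $info['t'] is the hit time, $info['i'] the IP seen for that session
    printf("%s  %s  %s\n", $sessionId, date('c', $info['t']), $info['i']);
}
```

Three entries sharing one IP within 14 seconds, as in the full blob above, is the session-switching pattern the add-on flags.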
IP address added to cache: 65.8.151.82
Session User Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.152 Safari/537.36
Log Time: Yesterday at 9:44 PM
User: rmvdfor_privacy
JavaScript Detected: True
Has Cookie: True
Session ID: e46aa29ebff411c74ccd67eb3bff1fd7
gethostbyaddr(65.8.151.82) = adsl-65-8-151-82.mia.bellsouth.net
Friendly messages in this session: 0
gethostbyname(adsl-65-8-151-82.mia.bellsouth.net) = 65.8.151.82
Proxy Header: No Proxy Found
Recent Visits at Log Time
I've fixed the install issue
array_key_exists() expects parameter 2 to be array
I've also added one extra check to session switchers (not just a JavaScript check but also secret ingredient 1); this should really put my mind at ease that these are always bots.
This could be 2 users on the same IP, it could also be the same user (one looks more like a bot than the other)
So far what is interesting here is the user agent
When detected as a bot:
Mozilla/5.0 (X11; Linux x86_64; rv:10.0) Gecko/20100101 Firefox/10.0 (Chrome)
That is Firefox 10 (!) on Linux... this is fairly strange regardless of anything else
- Using a very old browser implies automation (I personally hate having to update browsers, update webdrivers and re-write things when automating with selenium/ant/junit/awt robot, so I often keep my browser at the same older version).
- Using a Linux box is also a little indicative (it's usually what I host on / automate from)
- No JS is also indicative
- Session switching could just be opening / closing browsers a few times within seconds
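There is one more tell here: that user agent string is self-contradictory on its face. A genuine Firefox UA never carries a bare "(Chrome)" token, and a genuine Chrome UA always carries "Safari/". A hypothetical check along those lines (not part of the add-on):

```php
<?php
// Hypothetical check (not the add-on's): flag user agents whose
// browser tokens contradict each other.
function contradictoryUa(string $ua): bool
{
    $firefox = (bool) preg_match('/Firefox\//', $ua);
    $chrome  = (bool) preg_match('/Chrome/', $ua);
    $safari  = (bool) preg_match('/Safari\//', $ua);

    // Firefox claiming Chrome, or Chrome missing its Safari/ token.
    return ($firefox && $chrome) || ($chrome && !$safari && !$firefox);
}
```

The flagged UA above trips the first rule (Firefox plus "(Chrome)"), while the genuine Chrome/42 UA from the same IP passes both.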
When detected as a human (at around the same time):
Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.152 Safari/537.36
That is Chrome (current latest version) on Windows 7
This one looks to me like their real browser, and now you know their username
I wonder if they attempted to bot, failed, and so logged in to see what happened. I think grabbing the server access logs for this user will be more revealing
Have a chat with this member. I would play it dumb and just ask if they did anything unusual with their browsers (I would be interested in their response). It might be innocent (browsing with Linux FF10 is a bit strange, but not naughty on its own... no JS detection on that browser is even stranger). Assume innocence (but have a good look at the server access logs for this IP). Is that member a well-known member?