Updated, tested, and working great now! It's running in conjunction with our Cloudflare filters and has knocked down several hundred active scrapers/bots/bad clients. Thanks for the fast response and updates!
Appreciate the fast addition of the request! I installed the latest update and enabled the 403/error detection (with low thresholds for testing). However, I am unable to get it to block a client when loading multiple error pages, regardless of settings. I can trip the 429 too-many-requests limit...
Would it be possible to count 403 or "You must be logged in to do that" pages as a specific counter for blocking? We have scraping bots that ignore the error codes and hit user profiles/restricted threads at rates below the existing page-count thresholds. Counting the access denied pages over...
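To illustrate the request, here is a minimal sketch of the idea: count "access denied" responses (403 / login-required pages) per client in a sliding time window and block the client once a threshold is crossed. All names, thresholds, and the window length are hypothetical, not the add-on's actual API.

```python
import time
from collections import defaultdict, deque

# Illustrative settings (not real add-on options):
DENIAL_THRESHOLD = 5   # denied pages allowed within the window
WINDOW_SECONDS = 60    # sliding-window length in seconds

_denials = defaultdict(deque)  # client IP -> timestamps of denied requests
_blocked = set()               # client IPs that crossed the threshold

def record_denial(ip, now=None):
    """Record one 403/login-required hit; return True if ip is now blocked."""
    now = time.time() if now is None else now
    q = _denials[ip]
    q.append(now)
    # Drop timestamps that have aged out of the sliding window.
    while q and now - q[0] > WINDOW_SECONDS:
        q.popleft()
    if len(q) >= DENIAL_THRESHOLD:
        _blocked.add(ip)
    return ip in _blocked

def is_blocked(ip):
    return ip in _blocked
```

A separate counter like this would catch bots that stay under the normal page-count limits but pile up access-denied responses.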