[TAC] Stop Human Spam

Had this mod a while but only just got round to installing it. Seems great, nice work.

A suggestion...

When a user tries to post and it throws up the requirement errors, is it worth removing the ones that they already meet?

It seems a bit unnecessary to list something like:

The number of days you have been registered must exceed: 1 (Yours: 18)

I also don't make 'Likes' a requirement on my forum so, again, not sure this is needed:

The number of Likes you've received must exceed: -1 (Yours: 0)

Makes it look a bit confusing for genuine users.


EDIT: Just realised I can edit the message shown to users! Ignore me.
 
If you don't use likes for the Link-Post-Rules or Signature-Modify-Rules, you can remove that line from the ACP options:

The number of Likes you've received must exceed: {min_likes_m1} (Yours: {like_count})<br/>

Although, as you suggested, for usability I might put this in a template and use a conditional to remove lines automatically when 0 is selected in the ACP options.
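
For example (just a sketch with made-up variable names, not the add-on's or XenForo's actual code), the idea would be to skip a requirement line whenever its option is 0:

Code:
<?php
// Sketch only: drop a requirement line when its ACP option is set to 0.
// Variable names are hypothetical, not taken from the add-on.
$minDays  = 1;  $userDays  = 18;
$minLikes = 0;  $userLikes = 0;   // likes requirement disabled in the ACP

$lines = [];
if ($minDays > 0) {
    $lines[] = "The number of days you have been registered must exceed: $minDays (Yours: $userDays)";
}
if ($minLikes > 0) {  // skipped entirely when the option is 0
    $lines[] = "The number of Likes you've received must exceed: $minLikes (Yours: $userLikes)";
}
echo implode('<br/>', $lines);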

I'll add it to the to-do list for this plugin: http://xenforo.com/community/threads/stophumanspam-anti-human-spam-paid.45491/#post-488411
 
Just noticed the grammar in the error message text:

Rich (BB code):
Sorry, you were not able save since the content contained a link<br/>

Would be better to say:

Rich (BB code):
Sorry, you were not able to save since the content contained a link<br/>
 
A link to edit the user from the logs would be nice (so I can easily ban those who are trying to add spam links).
 
You know... I had actually missed this add-on's features until now. It looks like a much better replacement for http://xenforo.com/community/threads/splendidpoint-com-antispam-prevent-links-and-emails.26884/, which the developer doesn't update or fix the holes in, so it's only half a human spam prevention rather than the full protection this one offers. Sweet... will give it a run tomorrow.

It's great, been using it for a couple of weeks. Love checking the logs to see spam users give up trying to post their links.

This, combined with tenants' other mod http://xenforo.com/community/resources/foolbothoneypot-bot-killer-spam-combat.1085/, has kept my forum spam free.
 
This doesn't stop spammy URL types in the name field for guest submissions, just so you know.

I used "s p a m . c o m" as a username when guest posting, and it worked fine.
 
To be honest, guest submission isn't something I've tried (I use CustomImgCaptcha for guests, so guest bots have never gotten past).
So, usernames could be used for "sneaky urls" by a human guest spammer...
I'll have to look into that; it is something guest spammers will try to use.
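
Just to illustrate the kind of check that could catch those (purely a sketch, not the add-on's implementation, and the TLD list here is a made-up sample for the example):

Code:
<?php
// Illustrative only: collapse the spacing and look for a domain-like pattern.
// Not the add-on's code; the TLD list is just a small sample for the sketch.
function looksLikeSneakyUrl($username)
{
    $collapsed = preg_replace('/\s+/', '', $username);   // "s p a m . c o m" -> "spam.com"
    return (bool) preg_match('/[a-z0-9-]+\.(com|net|org|info|biz)\b/i', $collapsed);
}

var_dump(looksLikeSneakyUrl('s p a m . c o m'));  // bool(true)
var_dump(looksLikeSneakyUrl('Regular Visitor'));  // bool(false)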
 
Tenants, with the four options controlling link posting, do you only need to meet one of them, or all of the requirements?

You have to meet all of the requirements in order to post links (or signatures / email addresses etc., depending on how you have set it up).

So, I have to be
  • registered for at least 2 days
  • & have a minimum number of 2 likes
  • & have a link: post ratio > 1 %
You can set any of these to a value of 0 (so they don't need to be met)
Or set permissions for certain user groups, so certain groups don't need to meet these conditions
Or, select certain forums to bypass these rules
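
In other words, the check behaves like an AND over whichever conditions are enabled. A rough sketch of that logic (option and field names are hypothetical, not the add-on's actual code):

Code:
<?php
// Rough illustration of "meet ALL enabled requirements before links are allowed".
// Names are hypothetical, not taken from the add-on's source; 0 disables a check.
function canPostLinks(array $user, array $opts)
{
    if ($opts['min_days'] > 0 && $user['days_registered'] < $opts['min_days']) {
        return false;
    }
    if ($opts['min_likes'] > 0 && $user['like_count'] < $opts['min_likes']) {
        return false;
    }
    // ... same pattern for the remaining options (e.g. the ratio check) ...
    return true; // every enabled requirement was met
}

$user = array('days_registered' => 18, 'like_count' => 0);
$opts = array('min_days' => 2, 'min_likes' => 2);
var_dump(canPostLinks($user, $opts)); // bool(false) -- likes requirement not met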
 
In 1.1.2, if a URL bbcode is posted which uses any formatting bbcodes inside the tag content, like:

Code:
[url=http://my.spammy.site][b]Hot Root Vegetable on Vegetable Action![/b][/url]

... this won't get caught by StopHumanSpam, as the contentHasBBUrls() method uses the regex:

Code:
        preg_match('#\[url([^\[]*?)\[/url\]#i', $message, $matches);

... so the negated character class (which excludes [) means it won't trigger because of the embedded tag. So I've been getting hammered by the latest "Michael Kors Outlet" spam, among others, which uses bold bbcode around the content of the URL tag.

I'm testing a different regex:

Code:
preg_match('#\[url\](.*?)\[/url\]|\[url\s*=\s*(\S+)\].*?\[/url\]#i', $message, $matches);

... which so far seems to be working, and no reports from my users of false positives.
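
For anyone who wants to reproduce it, here's a minimal standalone comparison of the two patterns against that sample (just a test script, obviously not the add-on's own code):

Code:
<?php
// Standalone check: the 1.1.2 pattern vs. the proposed replacement,
// run against a [url] tag whose link text is wrapped in [b] tags.
$message = '[url=http://my.spammy.site][b]Hot Root Vegetable on Vegetable Action![/b][/url]';

$old = '#\[url([^\[]*?)\[/url\]#i';                              // used by contentHasBBUrls() in 1.1.2
$new = '#\[url\](.*?)\[/url\]|\[url\s*=\s*(\S+)\].*?\[/url\]#i'; // proposed replacement

var_dump(preg_match($old, $message)); // int(0) -- the embedded [b] tag hides the link
var_dump(preg_match($new, $message)); // int(1) -- the link is caught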

-- hugh
 
I'm going to look at this and find out if I can improve it; what you've done might be enough. Embedding the url with bb code was crafty, but I should be able to pick it up without any issue.

Keeping it simple, this alone should avoid the embedded bbcode:
\[url(.*?)\[/url\]
The negated class ("[^\[") was the mistake; it stops at the embedded tag.

I'll have to do some testing of my own. Thank you for the above; I'll probably add an update soon with the above preg_match (and a couple of other minor changes to other regexes).
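
A quick standalone check of that simplified pattern (with delimiters added for the test; again, not the shipped code) against the sample above:

Code:
<?php
// Quick test of the simplified pattern against the bbcode-wrapped spam example.
$message = '[url=http://my.spammy.site][b]Hot Root Vegetable on Vegetable Action![/b][/url]';
var_dump(preg_match('#\[url(.*?)\[/url\]#i', $message)); // int(1) -- embedded [b] no longer hides it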

 
I thought about just removing the ^\[, but after about 30 years of working with regex, I've found that if you have two distinct variants of a given pattern to match, it's usually best to be specific about it, with an either/or using the |. Partly for readability, and partly just erring on the safe side, rather than creating a more generic expression to match both variants.

Just my $0.02. Anyway, thanks for the quick response, and I'll keep an eye out for your update.

BTW, I asked a question on the AnyApi discussion, if you have a minute, maybe you could reply there?

http://xenforo.com/community/thread...m-anti-fraud-any-api-thing.45358/#post-519821

-- hugh
 
tenants updated StopHumanSpam - Anti Human Spam with a new update entry:

StopHumanSpam - Fixes and Enhancement (bbcode urls + moderated threads/posts) with keywords

  • Fixed an issue related to bbcode embedded urls not being detected
  • It's now possible to moderate messages (threads / posts) containing banned words (rather than just preventing creation)
  • Stretched strings such as "wwwwwwhy" are no longer detected as sneaky urls
  • Minor improvements (avoiding false positives)

Read the rest of this update entry...
 