Better functionality to report and moderate illegal hate speech

Alpha1

Now that the first EU nation has approved a bill imposing hefty multimillion-euro fines on communities that fail to remove abusive content within 24 hours, we are in dire need of better functionality for content flagging, reporting and moderation. More information below.
We need to have functionality to:
  1. Have members and guests flag abusive content and define exactly what type of problem the content has.
    For example by selecting the type of rule breach from a drop-down on the report interface.
  2. Have the content moderated if it falls into the category of hate speech, flaming, slander, insult or fake news.
  3. Show reports with abusive content on top in the report center and clearly mark them as urgent.
  4. Automatically assign the report to the responsible moderators.
  5. Show the time that has passed since the report was opened.
  6. Send an alert to the responsible moderators, supermoderators and administrators.
  7. If the report is not resolved within X hours, display a warning that the report is overdue or send reminder alerts about this.
  8. Optimally there would be one-click macros for the most common moderator actions.
    For example: delete post, warn member & resolve ticket (a rough sketch of how such a report model could fit together follows below).
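To make the suggestion concrete, here is a minimal sketch of such a report model in Python. Everything in it is hypothetical: the class, field and function names are invented for illustration and are not taken from XenForo, and the 24-hour deadline is only an example.

```python
# Hypothetical sketch of the suggested report workflow. None of these class,
# field or function names come from XenForo; they are invented for illustration.
from dataclasses import dataclass, field
from datetime import datetime, timedelta
from enum import Enum


class ReportCategory(Enum):
    HATE_SPEECH = "hate speech"
    FLAMING = "flaming"
    SLANDER = "slander"
    INSULT = "insult"
    FAKE_NEWS = "fake news"
    OTHER = "other"


# Categories that should be moderated and surfaced at the top of the
# report centre as urgent (points 2 and 3).
URGENT_CATEGORIES = {
    ReportCategory.HATE_SPEECH,
    ReportCategory.FLAMING,
    ReportCategory.SLANDER,
    ReportCategory.INSULT,
    ReportCategory.FAKE_NEWS,
}


@dataclass
class Report:
    content_id: int
    author_id: int
    category: ReportCategory                               # point 1: reporter picks the rule breach
    opened_at: datetime
    assigned_to: list[str] = field(default_factory=list)   # point 4: responsible moderators
    resolved: bool = False

    @property
    def is_urgent(self) -> bool:
        return self.category in URGENT_CATEGORIES

    def age(self, now: datetime) -> timedelta:
        """Point 5: time passed since the report was opened."""
        return now - self.opened_at

    def is_overdue(self, now: datetime, deadline_hours: int = 24) -> bool:
        """Point 7: warn when a report is not resolved within X hours."""
        return not self.resolved and self.age(now) > timedelta(hours=deadline_hours)


def one_click_resolve(report: Report, delete_post, warn_member) -> None:
    """Point 8: a one-click macro, e.g. delete post + warn member + resolve."""
    delete_post(report.content_id)
    warn_member(report.author_id)
    report.resolved = True
```

The report centre would then simply sort open reports by is_urgent and age, and fire reminder alerts for anything where is_overdue returns true.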

Due to the extreme fines that webmasters face (up to €50 million), I suggest this for both XF1 and XF2.

Background information:
Germany bill imposing €50M fine for failure to remove online hate crime, fake news fast enough
EU court: pay damages for ineffective Post Report moderation system.
Google will let users flag offensive content
 
The US Senate is threatening a wide array of bills to regulate hate speech, fake news and fake accounts, plus GDPR-like legislation:
https://theadminzone.com/threads/us...-speech-fake-accounts-data-protection.148201/
The biggest issue they will have with that is that some aspects interact with the 1st Amendment... and there's a reason it's listed first.
The founding fathers' view of free speech was at the forefront of their minds. Just because some people don't like the speech doesn't mean it can be regulated. That is one area where the US differs from the EU, among several others.
Same way with "fake" news.
And technically, your account is a "fake" account as it's not who you REALLY are (your legal name).
 
This is now going to impact websites.

Google has now applied rigorous changes to its algorithms and a team of content quality moderators is evaluating websites.

If hate speech, upsetting or offensive content, deceitful/misinforming content, harmful pages, or illegal images are found, then you get the lowest rating. If you get the lowest rating, then this is what happens to your traffic:
[Attached chart: traffic drop after 1 August]
A 60-80% drop in traffic is not out of the ordinary.

And conversely for pages that score well on page quality this has happened:
[Attached chart: corresponding traffic increase]
Source: SemRush.

Quite a lot of sites have been hit with this update. And as the Google Team reviews more sites, more and more websites will be affected.

IMHO this is an important matter that will only become more critical.
 


Coming up in Canada:
 
A new law has been introduced in the UK that applies to online communities with UK users. Communities need to protect against problematic content across a wide array of topics. It covers abuse, hate speech, age restrictions, illegal content and reporting mechanisms with time deadlines, and even suggests using AI scanning of all content for rule breaches.

Ofcom sets out more than 40 safety measures for platforms to introduce from March

Every site and app in scope of the new laws has from today until 16 March 2025 to complete an assessment to understand the risks illegal content poses to children and adults on their platform.



As XenForo resides in the UK, it seems that XF may be required to add functionality to accommodate the new laws.
@Chris D is this something that you are looking into?
 
The EU has thankfully been quite slow to create and implement such laws, but it keeps creating new restrictions. Current restrictions mainly apply to websites with a very large userbase, like major social media, and do not apply to most forums yet. So I assume no forums have been fined under EU regulation, nor will be until more regulation comes in. We can likely expect the rollout of new regulations due to war, cyberwarfare and online disinformation campaigns.

This new UK law seems to be more relevant as it applies to forums, starts in March 2025 and there is a UK watchdog in place. @zappaDPJ sums it up nicely:
until I read through it after one of my corporate tech clients consulted their legal advisers.
....
I would also question whether Xenforo currently provides the necessary administrative functions to fulfil the requirements of the Act.
...
I'm not trying to be alarmist but if you read through the Act it's clear that you will not be fulfilling your obligations under it just because your site holds nothing more than family-friendly content.
I would expect that online forums will get fined by the watchdog for not complying after the law comes into effect.

I'm not sure if this new UK law requires XenForo to do anything yet. The quick guide states:
Our Codes of Practice set out a range of measures in areas including content moderation, complaints, user access, design features to support and protect users, and the governance and management of online safety risks.
From reading the guidelines it seems clear they require:
  1. Content reported for specific areas of concern to be removed from search and general viewing.
  2. Allow people to easily report illegal content and operate a complaints procedure.
  3. Time deadlines for reported content.
  4. Age restrictions.
Some extra rules only apply to sites with a very large userbase and are not relevant to most forums. A rough sketch of how the removal and deadline points could look in software follows below.
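This is only a hypothetical illustration of points 1 and 3: the category names, deadlines and field names below are assumptions for the sake of the example and are not taken from the Act, the Ofcom codes or XenForo.

```python
# Hypothetical sketch: hide reported content from public view and search,
# and give the report a review deadline based on its category.
# All names and deadline values here are invented for illustration.
from dataclasses import dataclass
from datetime import datetime, timedelta


@dataclass
class Post:
    visible: bool = True    # shown to members/guests
    noindex: bool = False   # ask search engines not to index the page


# Assumed review deadlines in hours per report category.
REVIEW_DEADLINES = {
    "illegal content": 24,
    "hate speech": 24,
    "harassment": 48,
    "other": 72,
}


def handle_report(post: Post, category: str, reported_at: datetime) -> datetime:
    """Point 1: pull the post from general viewing and search pending review.
    Point 3: return the deadline by which a moderator should review it."""
    post.visible = False
    post.noindex = True
    hours = REVIEW_DEADLINES.get(category, REVIEW_DEADLINES["other"])
    return reported_at + timedelta(hours=hours)
```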
 
Not currently.

How many forums have actually been fined under the EU rulings from 7 years ago?
This.

I think this whole new law has people overreacting. It shouldn't change anything compared to what we're doing now. See something against your rules? Delete it. The whole GDPR got people freaking out like it was the end of the world for forums. Still hasn't changed a thing.

If you have illegal content and do nothing about it, you will be fined. But it's been like that ever since the beginning.
 
From reading the guidelines it seems clear they require:
  1. Content reported for specific areas of concern to be removed from search and general viewing.
  2. Allow people to easily report illegal content and operate a complaints procedure.
  3. Time deadlines for reported content.
  4. Age restrictions.
Some extra rules only apply to sites with a very large userbase and are not relevant to most forums.
There are of course potential software solutions to these issues, depending on how far one would want to go. But there are human solutions to such things as well which, as they stand, keep sites perfectly compliant.

1. Content can already be reported and then actioned to remove it from search/viewing.
2. Content can easily be reported already. Complaints can already be handled by the contact form, DM, or email.
3. This is a staffing and resource issue, not a software issue.
4. XenForo already imposes age restrictions.

Once again, it is important that we don't have a knee-jerk reaction to such things. Perspective is important. Is Ofcom targeting these rules to catch out professionally run forums? No, absolutely not. It's obvious the kinds of sites they're targeting this towards. It will be the bilge of the forum world, the kinds of forums which, based on our license agreement, we won't even do business with. The kinds of sites that bully, hate, divide and in actual cases, drive people to suicide.

But it's early days. We will keep a close eye on this and where QoL improvements are helpful, we'll consider them in due course. Until then, keep doing what you're doing - well moderated sites are not going to start racking up fines overnight.
 
If the UK sent me a fine, I'd throw it in the harbor just like their tea. Then block UK traffic and give a 😉 don't use a VPN rule.

I operate under the Constitution and interpretations of such by SCOTUS.

Though, QoL improvements for reporting would be nice.

Seems like an overreaction. These laws are broad, but they have a target audience. I don't even think they'd go after a big board, just social media with 100M+ MAU, which has broader implications on changing social perspectives than a forum does.
 
See something against your rules? Delete it.
It applies to illegal content in conversations as well. And that's where a lot of the issues targeted by this Act take place.
The whole GDPR got people freaking out like it was the end of the world for forums. Still hasn't changed a thing.
If it has changed nothing for you then consider yourself lucky. We have hundreds of thousands of members, so we get GDPR deletion requests all the time. Users are well aware that they can make a GDPR request. Unfortunately they expect not only their accounts but also their posts to be deleted, which does not fall under the GDPR. They then threaten legal action and file a GDPR report with Google, making the relevant pages disappear from its index.

Most likely users will also become aware of the OSA and file OSA reports/requests with forums. It seems likely that things will change in the sense of more reports and demands from users. And Google will likely start acting on it as well, as it did with the GDPR.

We are quite happy with the GDPR functionality that XF added. Though it would save a lot of work if it were expanded so that users could delete their own account when certain criteria are met. For example, users who breach the rules or have open reports against them should not be able to delete their account and start harassing with a new one (a rough sketch of such an eligibility check is below).
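A minimal sketch of what such an eligibility check could look like, assuming hypothetical fields; none of these names correspond to actual XenForo columns or APIs.

```python
# Hypothetical sketch of a self-service account deletion check.
# Field names (open_report_count, warning_points, ...) are invented for
# illustration and do not correspond to actual XenForo data.
from dataclasses import dataclass


@dataclass
class Member:
    open_report_count: int = 0
    warning_points: int = 0
    is_banned: bool = False


def may_self_delete(member: Member) -> bool:
    """Allow self-deletion only for accounts in good standing, so that
    rule breakers cannot wipe their history and return under a new name."""
    return (
        not member.is_banned
        and member.open_report_count == 0
        and member.warning_points == 0
    )
```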

I agree that there is no need for panic. But for forums that have a significant number of users in the UK, it's relevant to keep track of the OSA.
 
Seems like an overreaction. These laws are broad, but they have a target audience. I don't even think they'd go after a big board, just social media with 100M+ MAU, which has broader implications on changing social perspectives than a forum does.
They will go after smaller boards. The regulations state as much, and even explicitly call out forums. But perspective is still important. They're not going to go after small teams of people trying their best to run a decent site (even ones that contain a wide range of opinions). But some of the worst forums on the internet that have literally been instrumental in people ending their lives are smaller, and going after those seems reasonable.
 
They then threaten legal action and file a GDPR report with Google, making the relevant pages disappear from its index.
I'd just copy-paste their old post into ChatGPT and instruct it to rewrite it with the same context but remove any PII. It's not theirs anymore.
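A rough sketch of automating that idea, assuming the OpenAI Python client (openai>=1.0); the model name and prompt are illustrative, and the rewritten text should still be checked by a human before it replaces the original post.

```python
# Hypothetical sketch of the "rewrite without PII" idea using the OpenAI
# Python client (openai>=1.0). Model name and prompt are illustrative only.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

SYSTEM_PROMPT = (
    "Rewrite the forum post you are given so that it keeps the same meaning "
    "and context but contains no personally identifiable information: remove "
    "or generalise names, usernames, locations, employers, contact details "
    "and anything else that could identify a real person."
)


def strip_pii(post_text: str) -> str:
    """Return a rewrite of the post with the same context but no PII."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # illustrative model choice
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": post_text},
        ],
    )
    return response.choices[0].message.content
```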

🤷‍♂️
 