EU court: pay damages for ineffective Post Report moderation system.

So any forum/website that meets those 8 criteria is legally responsible for user-posted content?
It seems a little more complicated. The EU court speaks of levels of responsibility, and what exactly those levels are is still unclear. We do know that a website which is purely a technical service provider has no responsibility in this regard, and that each of the aforementioned 8 factors increases the responsibility. So a website that has several of the 8 factors has less responsibility than Delfi, but it still has a responsibility to avoid unlawful posts.

Courts in EU member states may each approach this 'level of responsibility' in a different manner, but it's clear that freedom of speech does not apply here. Most EU member states have implemented stiff laws that make publishing hate speech a crime.

The Sophos security blog has posted a short article which explains the case and sums it up nicely.
There are considerable mistakes in that article.


Also relevant is this German court ruling (Hamburg court), which states that forum owners have unlimited liability for, and are co-culpable for, any comments or discussions that take place on their sites, even if they are posted by third parties (e.g. users) and without the forum owner's knowledge. The report says:
"... Specifically, the court found that the forum operator could be found co-culpable by merely providing a platform for an inadmissible comment. In other words, the operator does not have to agree with the content or have entered the content directly. Furthermore, it does not matter whether the content came from the operator or from external parties, nor does it matter whether the forum provider has any knowledge of the contents. All that matters is that the forum operator provide its own website for the propagation of said content. The judges did, however, say that liability could be prevented if the operator of the website specifically states that the content of a particular comment does not represent its own opinion; it does not suffice if the forum provider generally states that the opinions in user comments are not its own."
"In addition, the court found that there was grounds for liability because Internet forums are a journalistic-editorial platform as specified in the revised Section 54 of the German Broadcasting Agreement (RStV), which stipulates that editors must check the content, origin and veracity of messages published with the utmost reasonable due diligence..."

Also relevant is this UK case, in which the owlstalk.co.uk forum owners were required to provide details of anonymous forum users.
Dan Tench, a solicitor from Olswang, the firm representing Gentoo: “This case illustrates an increasingly important legal issue: proving who is responsible for the publication of anonymous material on the internet. This is likely to be a significant issue in defamation cases in the future.”
The recent EU / Delfi court case is relevant to this, as one of the factors was that it was unlikely Delfi could prove who actually posted the hateful comments (due to proxies, dynamic IPs, public/shared computers, etc.). As Delfi could not prove the identity of the posters, it had an increased level of responsibility.
 
Astonishing.

So the court believes that website owners should be able to identify all anonymous users, even though fully-funded law enforcement sometimes struggles to identify an experienced, determined user. The technology to automatically identify anonymous posters does not exist, yet an EU court seems to be demanding that website operators use it or be held personally responsible.

In the ruling, the court admonished Delfi for not allowing users to edit or delete their posts. In the same breath, they demanded that site owners hire staff to read all comments. Presumably this staff would have to also read every single version of every comment as they undergo edits by users?

Incoming ridiculous yet completely applicable analogy...

Effective immediately, all bridges must introduce anti-suicide technology. Any bridge operator who fails to install such technology will be held liable for any suicides. Bridge operators must hire crisis counselors at all times and see to it that all potential bridge-crossers are interviewed to ascertain their mental well-being and to screen for potential suicidal tendencies. That such a review process would make expedient travel impossible and would bankrupt all companies involved is not the court's problem. That anti-suicide technology does not exist is irrelevant.

To put it another way, the EU seems to want to hold pen and paper manufacturers responsible for what is written using them.
 
It's the EU.
Nonsensical laws and rulings are normal.

I'll be glad when it all collapses, hopefully precipitated by Greece defaulting at the end of this month.
 
So the court believes that website owners should be able to identify all anonymous users
It was only the UK court that ruled forum owners need to prove the identity of forum users. The Estonian courts and the European Court of Human Rights concluded that this is practically impossible, and that the site owner therefore bears the responsibility.
In the ruling, the court admonished Delfi for not allowing users to edit or delete their posts. In the same breath, they demanded that site owners hire staff to read all comments. Presumably this staff would have to also read every single version of every comment as they undergo edits by users?
Yes, this is an insane paradox that should have been explained to the courts.
hopefully precipitated by Greece defaulting at the end of this month
I'm not so sure about that one. If Greece defaults then their new friend Vladimir will be happy to help out and station his 40 shiny new nuclear ballistic missiles there, while the EU watches its 312 billion Euro evaporate into thin air. But that is another story. I concur with the sentiment though and greatly despise the EU privacy creep.
 
I think that if the community software had more user-friendly features that let users report content, both actively and passively, then admins could relax.

Content is king, so we need to take care of it. We need the tools to reduce the chance that a Delfi case happens to us.

IMHO the majority of users don't use the report system because they see it as "playing the spy". Adding a more personal system like Facebook's ("hide this post") with some automation could help us.

Example:
If a "hide this post" (or image, attachment, or any other content) threshold is met, possibly combined with downvotes, an automatic report is filed and/or an action is taken (soft delete / send to moderation queue).

The same could happen if a threshold of flagged words (separate from the censor list) is met in a post/comment, or maybe across a whole thread.
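Here's a rough sketch, in Python, of how that kind of threshold automation could work. Everything in it (the thresholds, the downvote weight, the word list, and the function names) is invented for illustration; it is not XenForo's API or any real forum add-on.

```python
# Hypothetical sketch of the threshold idea above.
# All names, thresholds and weights are invented for illustration;
# this is not an actual forum API.

from dataclasses import dataclass

HIDE_THRESHOLD = 5        # "hide this post" clicks before action is taken
DOWNVOTE_WEIGHT = 0.5     # each downvote counts as half a hide
WORD_THRESHOLD = 3        # flagged-word hits before action is taken
FLAGGED_WORDS = {"examplebadword1", "examplebadword2"}  # separate from the censor list

@dataclass
class Post:
    post_id: int
    text: str
    hides: int = 0
    downvotes: int = 0

def hide_score(post: Post) -> float:
    """Combine passive 'hide this post' flags with downvotes into one score."""
    return post.hides + DOWNVOTE_WEIGHT * post.downvotes

def flagged_word_hits(post: Post) -> int:
    """Count how many words from the flag list appear in the post."""
    return sum(1 for w in post.text.lower().split() if w in FLAGGED_WORDS)

def review(post: Post, moderation_queue: list[Post]) -> None:
    """Auto-file a report / soft-delete once either threshold is met."""
    if hide_score(post) >= HIDE_THRESHOLD or flagged_word_hits(post) >= WORD_THRESHOLD:
        moderation_queue.append(post)  # i.e. soft delete + automatic report
        print(f"Post {post.post_id} sent to the moderation queue")

# Usage: re-run review() on every new hide, downvote, or edit.
queue: list[Post] = []
post = Post(1, "some borderline comment", hides=4, downvotes=3)
review(post, queue)  # score 4 + 0.5*3 = 5.5 >= 5, so it lands in the queue
```

In practice the thresholds would need tuning per community, otherwise a handful of grudge flags could soft-delete legitimate posts.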

Will these measures add overhead? Yes, but IMHO solutions like these are necessary to lower the risk of a Delfi case.
 
It's the EU.
Nonsensical laws and rulings are normal.

I'll be glad when it all collapses, hopefully precipitated by Greece defaulting at the end of this month.
Have to agree. I moved from Australia to the Netherlands, and ever since I have found many aspects of doing business under the EU system extremely inhibiting, and at times ludicrous.
 
I think that if the community software had more user-friendly features that let users report content, both actively and passively, then admins could relax.
It doesn't matter how easy it is to report content; users just won't use that functionality.
 
It doesn't matter how easy it is to report content; users just won't use that functionality.
There is a difference between active and passive reporting. If users can just flag a post as problematic, similar to Post Ratings, then that is something many users will do. With such a flagging function, users will probably catch almost all problematic content.
Actively filing a Report against someone is not something many users will do. Post reports catch only a small amount of problematic content, which means moderators have to actively search for it.
 
Facebook obviously has taken notice and is getting proactive about flagging 'hate speech' and avoiding liability:

Facebook said on Monday that it would work alongside the German government to quell racism and hate speech on its site.

As waves of migrants flood into the country seeking asylum, hate speech on social media sites has spiked, according to The Wall Street Journal.

The German Justice Ministry will work alongside Facebook and other sites to create a task force that determines whether flagged content violates German law. The intention is to speed the process of deleting hateful posts and comments.
More here: http://fortune.com/2015/09/15/facebook-hate-speech-germany/
 