Upgrade the ignore function to 'block' / Blocking Ignored Users

Razasharp

Well-known member
I think ignoring someone on a forum has never really worked from a user's perspective. On vB you could still see that the 'ignored' person had posted, and on XF ignoring someone doesn't prevent them from quoting you, seeing your threads, or posting in them.

Usually when someone 'ignores' someone, what they really want to do is block them.

Any chance of this being included?
 
Upvote 92
You need it sometimes.

[...]

Otherwise you'd probably go absolutely crazy because people drive you nuts.
Personally I don't need an ignore or even block function offered by the software.

I've got nobody on ignore here or on any other forum.

I am pretty resilient against things I don't care about; my brain just ignores such content :)
 
Personally I don't need an ignore or even block function offered by the software.

I've got nobody on ignore here or on any other forum.

I am pretty resilient against things I don't care about; my brain just ignores such content :)
Yeah, but it doesn't stop the pests from contacting you. You can be trolled by people you're ignoring.
 
I've coded a custom add-on for my forum that turns the ignore list into a vanish list. If a user vanishes another user, it's a two-way thing, and neither user will exist for the other on the forum. I was inspired by @AndrewSimm's nice two-way ignore moderator add-on.

I don't see you sharing it via resources. Is it possible you could share this with the rest of us?
 
Personally I don't need an ignore or even block function offered by the software.

I've got nobody on ignore here or on any other forum.

I am pretty resilient against things I don't care about; my brain just ignores such content :)
Yes Kirby, but you don't run a Balkan forum. I need a user block like Facebook's, or like the Ignore Block/Essential add-on.

 
No thanks... but if it is implemented, I hope it's permission-based so we can set "Can block other members" to "Never". I'd also like to think that would be the default. Imagine someone blocks you, then you're reading through a thread they're involved in, and you're only seeing one person's replies, so it looks like they're talking to themselves. For a forum setting, that's just silly.
 
No thanks... but if it is implemented, I hope it's permission-based so we can set "Can block other members" to "Never". I'd also like to think that would be the default. Imagine someone blocks you, then you're reading through a thread they're involved in, and you're only seeing one person's replies, so it looks like they're talking to themselves. For a forum setting, that's just silly.

No different from any other social media. In 2024, people understand that a disrupted flow can come from content being removed, people deleting their own content (which they can already do in XenForo), or people being blocked. Just because you don't need it doesn't mean other forums don't. Yes, I agree, though, that there should be permissions enabling users to block others.
 
No thanks... but if it is implemented, I hope it's permission-based so we can set "Can block other members" to "Never". I'd also like to think that would be the default. Imagine someone blocks you, then you're reading through a thread they're involved in, and you're only seeing one person's replies, so it looks like they're talking to themselves. For a forum setting, that's just silly.
It may be silly, but on the other hand you would have to block a lot of users; sooner or later we (the team) would be discussing threads alone.
I run a Balkan forum; our users are ethnic Serbs, Croats, Albanians, Bosnians, Greeks, Turks, etc., so there are a lot of controversial topics.
 
You've got to remember that when you block someone on social media, they can't even search for you.
With the ignore button, you put someone on ignore, but they can still see your posts and abuse you.
What we want is for some troll to say on here, "oh, I've finally been blocked".
 
No different from any other social media. In 2024, people understand that a disrupted flow can come from content being removed, people deleting their own content (which they can already do in XenForo), or people being blocked. Just because you don't need it doesn't mean other forums don't. Yes, I agree, though, that there should be permissions enabling users to block others.

I'm not saying it absolutely shouldn't be implemented. If the masses want it, then so be it. I'm just stating my opinion and asking that, if it is implemented, we have a way to disable it, unlike the ignore system, which cannot be disabled.
 
2013 suggestion...here goes nothing... +1

This is not about a feature you don't personally need because you are resilient; it's not about YOU.
There are people who do not want to be contacted by certain individuals and currently have no way to stop it.
They are not technically breaking any rules, so punishment is not warranted; however, being able to block someone is a basic feature of any social software.
 
In the context of the new Online Safety Act 2023, it would be nice to have this functionality. The Ignore function (muting) fulfills part of the obligations, but completely blocking would be much better.

I asked ChatGPT for a full rundown:
Sections of the Online Safety Act 2023 Referencing Blocking Other Users

The Online Safety Act 2023 outlines various provisions that require platforms to give users the ability to block other users, helping to protect individuals from harmful content or abusive interactions. Below are the relevant sections of the Act that specifically mention blocking functionalities:

1. Section 14 - Duty to Protect Users from Harmful Content
  • This section emphasizes the obligation of service providers to ensure that users are protected from harmful content. While it primarily focuses on content moderation, it implies the need for functionalities that allow users to manage interactions with other users, including blocking harmful or abusive individuals.
Key points:
  • Service providers are required to implement systems that protect users, and part of this protection involves allowing users to block other users or harmful content.
Relevant Citation:
  • Section 14(1): "A provider of a regulated service must take steps to ensure that users of the service are protected from harmful content, including the provision of tools that allow users to manage or block exposure to harmful interactions or content."
2. Section 15 - Empowerment of Users
  • This section outlines the responsibility of service providers to give users control over their online experience, which includes providing features that allow users to block other users.
Key points:
  • The section explicitly requires platforms to provide users with features that allow them to block or mute other users to manage harmful or abusive interactions.
Relevant Citation:
  • Section 15(2): "A service must include features that allow users to control the content they are exposed to, including the ability to block, mute, or filter out certain types of content or users."
3. Section 18 - Enforcement of Content Moderation and Reporting Features
  • This section outlines the requirements for platforms to provide users with features for reporting and blocking harmful users. It emphasizes the importance of user tools to help manage and mitigate harmful interactions, including blocking abusive individuals.
Key points:
  • Platforms must ensure that users have the ability to block or report harmful users as part of their content moderation features.
Relevant Citation:
  • Section 18(1)(b): "Service providers must provide users with mechanisms to block or report other users who may be engaging in harmful or abusive behavior."
4. Section 19 - Content Customization and User Preferences
  • This section mandates that platforms offer tools that allow users to customize their content exposure, which can include blocking specific users from interacting with them or posting content that appears on their feeds.
Key points:
  • Users must be able to control their exposure to harmful content and users, which includes the ability to block certain users.
Relevant Citation:
  • Section 19(2): "Users must be able to use the service in a way that allows for the customization of content preferences, including the ability to block or avoid certain content or users."

Relevant Sections of Ofcom's Website Referencing Blocking Other Users
Ofcom's website provides additional information on how the Online Safety Act 2023 mandates platforms to offer tools for blocking users. Below are the key sections of the Ofcom website that refer to the requirement for blocking functionality and how users can utilize this feature.

1. Page: "What does the Online Safety Act do?"
  • Ofcom outlines the main objectives of the Online Safety Act, including user empowerment and the need for platforms to provide mechanisms to block harmful content or users.
Key points:
  • The page emphasizes the need for platforms to provide features that allow users to block or mute users who may be engaging in harmful behavior.
Relevant Citation:
  • "The Act requires platforms to give users tools to manage and control the content they encounter. This includes features that allow users to block, report, or mute other users who engage in harmful behavior."
Link: What does the Online Safety Act do?

2. Page: "What are the duties of platforms?"
  • This page focuses on the specific duties of service providers under the Online Safety Act, including the requirement to implement blocking and reporting features.
Key points:
  • Ofcom explains that platforms must ensure users have access to mechanisms that allow them to block or report harmful users, helping to mitigate online abuse and harmful interactions.
Relevant Citation:
  • "Service providers will be required to implement user-friendly systems that allow users to control what they see, such as content filters, content reporting, and blocking functionalities."
Link: What are the duties of platforms?

3. Page: "What are user empowerment tools?"
  • Ofcom provides more detail on user empowerment tools that platforms must provide under the Online Safety Act. This includes functionalities for blocking users, muting, and reporting harmful content or behavior.
Key points:
  • The page discusses how platforms are expected to give users tools to block or mute users who might be engaging in harmful behavior or posting harmful content.
Relevant Citation:
  • "Empowerment tools will include filtering options, allowing users to block content, mute accounts, and set preferences to avoid harmful interactions."
Link: What are user empowerment tools?

4. Page: "What is harmful content?"
  • This page highlights the concept of harmful content and the importance of user controls to block or report harmful users.
Key points:
  • It specifies that platforms must provide tools for users to block or mute users who post harmful content, and to report those who engage in abusive behavior.
Relevant Citation:
  • "Platforms must ensure that users can control the harmful content they encounter, including the ability to filter out certain topics or block content providers."
Link: What is harmful content?


Summary

The Online Safety Act 2023 and Ofcom's website both emphasize the importance of providing users with tools to block other users as part of the wider responsibility to protect users from harmful content and abusive interactions online. Below is a summary of the relevant sections:

From the Online Safety Act 2023:
  • Section 14: Platforms must provide tools that allow users to block harmful content or users.
  • Section 15: Users must have the ability to block or mute other users.
  • Section 18: Service providers must implement reporting and blocking features for harmful interactions.
  • Section 19: Platforms must allow users to block or avoid certain users and content.
From Ofcom's Website:
  • "What does the Online Safety Act do?": Discusses the requirement for platforms to offer blocking features. Link
  • "What are the duties of platforms?": Explains the duties to provide blocking functionalities and reporting tools. Link
  • "What are user empowerment tools?": Highlights the blocking and muting features required under the Act. Link
  • "What is harmful content?": Mentions that platforms must enable users to block or mute harmful content and users. Link
Both the Online Safety Act and Ofcom's website highlight the importance of user control, specifically through features that allow blocking, muting, and reporting harmful users to create a safer online environment.
 
This is not about a feature you don't personally need because you are resilient; it's not about YOU.
That is true of many moderator features here. There are complaints ("no, we don't need this feature"), yet each community has its own personality, and trust me, a forum with a couple of thousand visitors online at any one time has a report queue that is a constant flood of issues. If some of this can be stopped, a feature like this can be one of a handful of tools that make the moderation staff's jobs just a little easier.

Plus, I personally would start participating in forums with two-way blocking, as there are certain members I'd rather not have read and respond to my posts. Granted, they could log out and see them, but logging out and back in constantly is inconvenient (and many don't stop to think they can open a separate session in an Incognito/InPrivate window or a different browser, especially since over 50% of visitors seem to be on mobile).

I'm happy to use an add-on that does this, if there were one. (I believe the popular one has been removed?) But if it's a core feature, all the better.
 
I have removed <xf:showignored wrapperclass="block-outer-opposite" /> from my templates, so ignored content stays hidden. Otherwise members just click "Show ignored content".

And use force ignore when necessary.
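For anyone wanting to try the same template edit, here is a minimal sketch of what it looks like in an XF 2.x content template. The template name (thread_view) and admin path are assumptions based on a stock installation; the exact location of the tag may differ on your forum or with third-party styles:

```html
<!-- In the "thread_view" template (Appearance > Templates in the admin
     control panel), the stock markup includes a tag that renders the
     "Show ignored content" link when a page contains ignored posts: -->

<xf:showignored wrapperclass="block-outer-opposite" />

<!-- Deleting the line above (or wrapping it in <xf:comment>...</xf:comment>)
     means ignored posts stay hidden with no reveal link. Template edits are
     stored as customisations, so reverting the template restores the
     default behaviour. -->
```

Note that this only removes the reveal link in that one template; equivalent tags may exist in other content templates (e.g. for conversations or profile posts) and would need the same treatment.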
 