Ignore / Filter System

Forsaken

This is from one of my older posts, but that post bundled several different ideas, and I'd rather this one be added than any of the others.

Ignore/Filter System
One thing I've noticed is currently lacking, and that most forums never take beyond the basic concept of a simple user ignore, is a proper ignore function. A rough sketch of how the pieces below could fit together follows the list.

  • Users: Rather than just ignoring their posts, PMs, and such, I'd like to see them completely ignored. More than once, I've been on a forum where there was a person I could barely stand, whose posts would aggravate me.
  • Forums/Categories: The ability to completely hide forums or categories, and filter out any threads or posts in them. I often don't read introduction forums or site showcases, and I'd rather not have to sift through them when I use the "What's New?" link.
  • Threads/Posts: The ability to ignore certain threads and posts. This is mostly because I rarely participate in forum games where you say something about the previous poster.
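To make the scope concrete, here is a minimal sketch of how per-user ignore lists at all three levels could be applied when building something like the "What's New?" feed. This is purely illustrative Python: the class names, fields, and IDs are all invented for this post and don't reflect XenForo's actual data model or API.

```python
from dataclasses import dataclass, field


@dataclass
class IgnorePreferences:
    """Per-user ignore lists at each of the three suggested levels."""
    users: set[int] = field(default_factory=set)    # ignored member IDs
    forums: set[int] = field(default_factory=set)   # hidden forum/category IDs
    threads: set[int] = field(default_factory=set)  # ignored thread IDs


@dataclass
class Post:
    post_id: int
    author_id: int
    thread_id: int
    forum_id: int


def visible_posts(posts: list[Post], prefs: IgnorePreferences) -> list[Post]:
    """Drop anything the viewer has ignored before the feed is rendered."""
    return [
        p for p in posts
        if p.author_id not in prefs.users
        and p.forum_id not in prefs.forums
        and p.thread_id not in prefs.threads
    ]


# Example: hide one member, an introductions forum, and a forum-game thread.
prefs = IgnorePreferences(users={42}, forums={7}, threads={1001})
feed = [
    Post(1, author_id=42, thread_id=500, forum_id=3),   # ignored user
    Post(2, author_id=9, thread_id=600, forum_id=7),    # hidden forum
    Post(3, author_id=9, thread_id=1001, forum_id=3),   # ignored thread
    Post(4, author_id=9, thread_id=700, forum_id=3),    # nothing ignored
]
print([p.post_id for p in visible_posts(feed, prefs)])  # -> [4]
```

The point of the sketch is that all three levels reduce to the same cheap membership test, so a combined filter shouldn't cost much more than the existing user ignore does.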
 
Within the context of the Online Safety Act, it would be nice if this functionality were implemented.

I asked ChatGPT which parts of the Online Safety Act refer to letting users control the content they encounter, and what functionality this calls for. Here is the response:

Sections of the Online Safety Act 2023 Referencing User Control Over Content

The Online Safety Act 2023 lays out several provisions requiring service providers to give users control over the content they encounter, including tools for filtering, blocking, and reporting harmful content.

1. Section 14 - Duty to Protect Users from Harmful Content
  • This section focuses on service providers' duties to ensure users are protected from harmful content. It requires platforms to take steps to allow users to control the content they are exposed to.
Key points:
  • Service providers must implement systems that allow users to avoid harmful content by providing filtering options and user controls for managing exposure to such content.
Relevant Citation:
  • Section 14(1) states: "A provider of a regulated service must take steps to ensure that users of the service are protected from harmful content, including the provision of tools to manage the content users encounter."
2. Section 15 - Empowerment of Users
  • This section emphasizes that service providers must offer users a degree of control over the content they encounter, including providing filters and other user preferences that allow for a safer experience.
Key points:
  • The requirement for service providers to offer user empowerment features, such as content filtering and blocking harmful material, enables users to control the types of content they see.
Relevant Citation:
  • Section 15(2) states: "A service must include features that allow users to control the content they are exposed to, including the ability to block, mute, or filter out certain types of content or users."
3. Section 18 - Enforcement of Content Moderation and Reporting
  • Section 18 outlines that platforms must ensure the provision of tools to report and manage harmful content, further reinforcing the need for functionalities that enable users to control their experience.
Key points:
  • Platforms must include mechanisms for users to report harmful content and block users, empowering them to manage what content they encounter.
Relevant Citation:
  • Section 18(1)(b) highlights that "service providers must provide users with mechanisms to block or report other users."
4. Section 19 - Content Customization and User Preferences
  • This section reinforces the idea that platforms must offer content customization options for users, allowing them to set their preferences and avoid content they do not wish to see.
Key points:
  • Platforms must allow users to personalize their experience by filtering or hiding specific content based on their preferences.
Relevant Citation:
  • Section 19(2) states: "Users must be able to use the service in a way that allows for the customization of content preferences, including the ability to avoid certain content."

Relevant Sections of Ofcom's Website Referencing User Control Over Content

The Ofcom website explains various requirements for user control under the Online Safety Act 2023 and provides further details on the functionalities that platforms must implement to allow users to control the content they encounter.

1. Page: "What does the Online Safety Act do?"
  • Ofcom explains the main goals of the Online Safety Act, including the requirement for platforms to give users control over the content they encounter. This page discusses tools like content filters, blocking, and reporting systems.
Key points:
  • The Act requires platforms to implement systems that allow users to filter harmful content, block users, and manage what content is shown to them.
Relevant Citation:
  • "The Act requires platforms to give users tools to manage and control the content they encounter. This includes setting preferences to filter or hide harmful content, as well as features that enable users to block, report, or mute other users."
Link: What does the Online Safety Act do?

2. Page: "What are the duties of platforms?"
  • This page focuses on the duties of online platforms under the Online Safety Act, including the need for platforms to provide user empowerment features that allow for content control.
Key points:
  • Ofcom details how platforms must provide features like content filters and user controls to help users avoid harmful or unwanted content.
Relevant Citation:
  • "Service providers will be required to implement user-friendly systems that allow users to control what they see, such as content filters, content reporting, and blocking functionalities."
Link: What are the duties of platforms?

3. Page: "What is harmful content?"
  • This page outlines the definition of harmful content and underscores the need for platforms to provide functionalities that allow users to control their exposure to harmful material.
Key points:
  • The page emphasizes that platforms must give users the ability to filter out harmful content, thereby allowing them to manage their online experience.
Relevant Citation:
  • "Platforms must ensure that users can control the harmful content they encounter, including the ability to filter out certain topics or block content providers. These tools give users control over what appears in their feeds."
Link: What is harmful content?

4. Page: "What are user empowerment tools?"
  • This page offers an in-depth look at the empowerment tools platforms must provide, such as content filtering, blocking users, and reporting mechanisms.
Key points:
  • Ofcom discusses how platforms are required to implement easy-to-use systems for users to block harmful content or interactions and adjust their content preferences.
Relevant Citation:
  • "Empowerment tools will include filtering options, allowing users to block content, mute accounts, and set preferences to avoid harmful interactions. These features should be easy to find and use, providing a safer online experience."
Link: What are user empowerment tools?


Summary

Both the Online Safety Act 2023 and Ofcom's website emphasize that platforms must provide functionalities that allow users to control the content they encounter. These include tools for filtering content, blocking or muting users, and customizing preferences to avoid harmful material. Below is a summary of the relevant sections:

From the Online Safety Act 2023:
  • Section 14: Service providers must protect users from harmful content by offering tools to manage exposure.
  • Section 15: Platforms must include features that allow users to control the content they encounter.
  • Section 18: Service providers must implement reporting and blocking mechanisms.
  • Section 19: Platforms must allow for content customization and user preference settings.
From Ofcom's Website:
  • "What does the Online Safety Act do?": Describes the requirement for platforms to offer tools like content filters and blocking options. Link
  • "What are the duties of platforms?": Details the duties to provide user empowerment tools such as filters and blocking mechanisms. Link
  • "What is harmful content?": Emphasizes the need for platforms to provide tools for filtering harmful content. Link
  • "What are user empowerment tools?": Discusses the various features that empower users to control their online experiences. Link
These sections and pages clarify that both the Act and the regulatory guidance from Ofcom prioritize user control, helping ensure that online experiences are safer and more tailored to individual preferences.
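As a rough illustration of how the tools enumerated above (blocking, muting, filtering) might hang together in a forum's preference layer, here is a small Python sketch. Every name in it is hypothetical; it implements no particular platform's API and is only meant to show that the block/mute/filter vocabulary used in these summaries maps onto a simple per-user decision function.

```python
from dataclasses import dataclass, field
from enum import Enum


class Action(Enum):
    SHOW = "show"
    HIDE = "hide"          # blocked or filtered: removed from the feed
    COLLAPSE = "collapse"  # muted: shown behind a "show anyway" control


@dataclass
class EmpowermentSettings:
    """One user's content controls, loosely mirroring block/mute/filter."""
    blocked_users: set[int] = field(default_factory=set)
    muted_users: set[int] = field(default_factory=set)
    filtered_keywords: set[str] = field(default_factory=set)


def classify(author_id: int, text: str, s: EmpowermentSettings) -> Action:
    """Decide how a piece of content should be presented to this user."""
    if author_id in s.blocked_users:
        return Action.HIDE
    if any(kw in text.lower() for kw in s.filtered_keywords):
        return Action.HIDE
    if author_id in s.muted_users:
        return Action.COLLAPSE
    return Action.SHOW


settings = EmpowermentSettings(
    blocked_users={1}, muted_users={2}, filtered_keywords={"spoiler"},
)
print(classify(1, "hello", settings))             # Action.HIDE (blocked)
print(classify(2, "hello", settings))             # Action.COLLAPSE (muted)
print(classify(3, "big SPOILER here", settings))  # Action.HIDE (filtered)
print(classify(3, "hello", settings))             # Action.SHOW
```

A reporting mechanism, as described in the Section 18 summary above, would sit alongside this as a separate write path rather than inside the display decision, so it is omitted from the sketch.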
 