Most efficient way to ban Australian under-16s from accessing a XF forum?

Growlithe

New member
Hi everyone. It looks like Australia's social media ban for people under 16 years old comes into effect on December 10.


It appears that web forums count as social media, and there don't seem to be any exemptions that I can see. How can we best take reasonable steps to prevent Australian users under 16 from having an account? I appreciate the help.
 
Whether your site is subject to the restrictions will depend specifically on the nature of your site and what the primary content and function of your site is.


Classes of excluded services
3. Services that have the sole or primary purpose of enabling end-users to share information (such as reviews, technical support or advice) about products or services (section 5(1)(c))
  • The Explanatory Statement provides that features of these services typically include discussion forums that enable users to post technical support, advice and reviews about a specific product or service. For example, a service may primarily feature forums where representatives from hardware vendors provide technical support on how to use a product from that vendor, and is therefore excluded under this section.
  • However, if a service features discussion forums that primarily enable users to discuss news, entertainment and other types of content in addition to sharing information about products or services, the service is intended to be subject to the social media minimum age obligation, and is not excluded under this section.

4. Services that have the sole or primary purpose of enabling end-users to engage in professional networking or professional development (section 5(1)(d))​
  • Features of these services typically include facilitating connections between professionals and/or mentors that offer professional insights, including a focus on collaboration, sharing knowledge, career development and/or growth.
  • For example, a service may enable end-users to create a profile that outlines their professional background and career goals, allowing them to connect with potential employers or professional connections.
  • These services are primarily used to build professional networks and the posting of materials generally does not take place anonymously.

5. Services that have the sole or primary purpose of supporting the education of end-users (section 5(1)(e))​
  • Features of these services typically enable educators to distribute course materials, manage and track assignments and facilitate communication through announcements and discussion forums. Children and young people may also be able to use these services to access resources, submit work, collaborate with peers, and receive feedback on their work.
  • While these services are often integrated with other tools such as video conferencing, messaging and the ability to post material on the service, if their sole or primary purpose is to support the education of end-users, they are not intended to be captured as an age-restricted social media platform.
  • However, it is not intended that a service would be excluded merely because it contains some educative content.
  • For example, supporting the education of end-users is unlikely to be the sole or primary purpose of a video platform that hosts an array of content, but also includes tutorial-style videos covering history, science and mathematics. A service should not consider itself excluded merely because some content available to end-users on the service is educational.

... I run several sites that are for industry associations where users are professionals and members of an industry body - these are clearly excluded services.

But for other sites, it's not especially clear - and there will likely need to be a lot more guidance provided by the government about where they will be drawing the line.

My main site is not intended to be used by kids under the age of 16 in any case - and the user demographic is almost exclusively adult (it's about property investing). Interestingly, my second site is often used by kids under 16, but I'd rather block them from accessing it if I could. The younger kids frequently cause a lot more moderation work and I wouldn't have an issue blocking them.

The larger concern is the availability of age assurance technology suitable for use in software like XenForo - it's a pretty new area, the technology is not widely deployed, and there has been no discussion about how small operators are expected to implement such solutions given that the focus has squarely been on large social media platforms.
 
Make your site 18+
Go into your ACP > Options > User Registration > Change minimum age > 18 > save
 
I appreciate the help, seems like I had completely missed this. Unfortunately our website counts as social media under these definitions. Thank you for the input!
 
IMO, if you're not one of the name brands - fb, insta, snapchat, reddit, etc - you'll likely never be in trouble. They will follow the deep pockets.

If you collected DOB and country, scrub your db and maybe just ban those accounts. That's 'reasonable' given the data you have on them.
Going forward, collect dob and just reject everyone under 16.

/admin.php?options/groups/usersAndRegistration/
 
You might find some useful bits and pieces in the UK Online Safety thread if you can plough through it (90 pages); there are a few links and posts regarding age verification if it's something you need to implement. It depends somewhat on how rigid these laws are - you might find the existing "give your DoB" or "tick if over 16" type approaches are adequate.

Out of interest, are these Australian laws global like the UK ones attempt to be? As in, might all of us have to block <16 Australians, or does it only apply to sites based in Australia (along, presumably, with the big social media players)? Can't say I'm looking forward to wading through another 10,000 pages of government guidance!
 
Out of interest, are these Australian laws global like the UK ones attempt to be? As in, might all of us have to block <16 Australians, or does it only apply to sites based in Australia (along, presumably, with the big social media players)? Can't say I'm looking forward to wading through another 10,000 pages of government guidance!

It applies to anyone who can access the site from Australia.


Page 9, section 1.2: What is an 'age-restricted social media platform'? (emphasis mine):

An electronic service is not an ‘age-restricted social media platform’ if:​
  • None of the material on the service is accessible to, or delivered to, one or more end-users in Australia, or
  • The service is excluded in any legislative rules made by the Minister for Communications.

Page 9-10, section 1.3 Approaches to determining location:

Providers will need to consider and employ methods to determine whether an end-user is ordinarily resident in Australia to ensure that only children under the age of 16 who are ordinarily resident in Australia are prevented from having an account on their service. There are several ways this can be done by providers, including the use of location information.​
(Notes: eSafety would not expect providers to take action on account holders who are not ordinarily resident in Australia, such as those temporarily visiting Australia.)​

Page 16, section 1.5 Other related measures

Age gates and self-declaration are generally not seen as sufficient for regulated contexts when used in isolation. Accordingly, eSafety does not consider the use of self-declaration, on its own without supporting validation mechanisms, to be reasonable for purposes of complying with the SMMA obligation​

Page 20, section 2.1 Reasonable steps guidelines - Overview

eSafety considers the following would not constitute reasonable steps as their effect would be inconsistent with the objectives of the SMMA:​
  • Implementation that relies entirely on self-declaration to determine the age of existing or prospective account holders
  • Implementation where measures rely on age-restricted users holding an account for an unreasonable period of time before detection. Measures that require end-users to engage with a platform for an extended period of time, including to collect sufficient data to assess their age, would allow age-restricted users to be exposed to the harms that the SMMA seeks to address. What is reasonable will depend on the nature of the platform and other verification measures the platform has implemented as part of any layering approach
  • Implementation where measures do not reasonably prevent age-restricted users who have accounts deactivated or removed from immediately reactivating or creating a new account and regaining access to the age-restricted social media platform

Page 27, section 2.3.5 Proportionate

Proportionality and consideration of risk and harm are key components of determining what constitutes reasonable steps. Providers should consider the balance of the measures they implement having regard to their purpose, the risk of harm they mitigate and the impact they have on end-users​
Risk
What constitutes reasonable steps will depend on the risk profile of the service. Services may have a higher risk profile where they have comparatively higher:​
  • existing numbers of children and young people holding accounts
  • prevalence of features associated with harm to children and young people (such as algorithmic content recommendation, ‘likes’, persistent notifications and endless scroll)
  • prevalence of content associated with harm to children and young people (such as violent material and material that promotes unsafe eating habits).

... I've summarised a few of the more important points from that document here. The eSafety Social Media Minimum Age (SMMA) Regulatory Guidance document seems to be the most comprehensive description of the expectations that I can find.

The key messages I've read are that it is intended to be proportionate - so a small forum with low or no budget will certainly not be held to the same standards as a large for-profit social media platform. How that will be enforced in practice is impossible to know at this point - the legislation hasn't even come into force yet (next month).
 
Going forward, collect dob and just reject everyone under 16.
It depends somewhat on how rigid these laws are, you might find the existing "give your DoB" or "tick if over 16" type approaches are adequate.

As per my previous post, the Regulatory Guidance makes it clear that self-declaration of age on its own is considered insufficient for the purposes of the SMMA regulations.

However, given the nature (and budget) of the sites we typically operate, it's not very clear what reasonable (and budget appropriate) steps we could take beyond this.

On ZooChat, where we do have issues with under-age users, we take an active moderation approach and proactively ban users who we deem are likely to be too young for the site. It's an entirely manual process based on "gut feel" and observed user behaviour. I'm not sure if that would be considered sufficient - but it's the best we have, given there doesn't seem to be much else we can implement for a reasonable cost at this point.
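The guidance quoted earlier also flags re-registration: removed under-age users shouldn't be able to immediately create a new account. A minimal sketch of one low-budget approach, assuming you keep a denylist of hashed, normalised emails from accounts you've already actioned (all class and function names here are illustrative, not part of XenForo):

```python
import hashlib


def normalise_email(email: str) -> str:
    """Lowercase and strip Gmail-style plus-tags so trivial variants match."""
    local, _, domain = email.strip().lower().partition("@")
    local = local.split("+", 1)[0]
    return f"{local}@{domain}"


class ReRegistrationGuard:
    """Denylist of hashed emails from previously removed under-age accounts."""

    def __init__(self) -> None:
        self._denied: set[str] = set()

    def record_removal(self, email: str) -> None:
        """Call when removing an under-age account."""
        self._denied.add(self._digest(email))

    def allows(self, email: str) -> bool:
        """Check at registration time whether this email may sign up again."""
        return self._digest(email) not in self._denied

    @staticmethod
    def _digest(email: str) -> str:
        # Store a hash rather than the raw address of a minor.
        return hashlib.sha256(normalise_email(email).encode()).hexdigest()
```

This obviously won't stop someone using a fresh address, so it's only one layer; the denylist would need to be persisted (e.g. a small table) and combined with whatever other signals you already use.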
 