Government tells digital platforms the right way to ban under-16s

The Federal government and the eSafety Commissioner have released their guidance for social media companies on how to implement the impending under-16 age ban — and they want to make sure older Australians don’t get caught up having to prove their age.

The age ban takes effect from 10 December, and until now the government has not said how it will judge whether platforms have complied.

The new guidance declares that social media platforms will need technology to determine which accounts are held by those under the age of 16, and to deactivate or remove them “with kindness, care and clear communication”. They will then need to prevent age-restricted users from creating new accounts, and take steps to detect and counter attempts to circumvent the ban.

The 55-page document makes clear the onus will be on the platforms themselves to determine what steps to take to make the rules work, rather than the government laying out specific frameworks. It will also be on the platforms to prove the steps they take are “reasonable”.

The regulatory guidance was released on Tuesday (eSafety)

The guidance suggests such steps may include “systems, technologies, people, processes, policies and communications”.

Relying on self-declaration of age, for example, would not be considered reasonable, the document noted. It also flagged that it does not want users who are of age to be mistakenly removed or blocked from platforms as a result of poorly implemented measures.

“Providers should avoid unreasonable practices that risk over-blocking access or infringing on the rights of Australians,” the document said.

“For example, requiring all existing Australian account holders to prove their age using an age verification system may be unreasonable and is not necessary for compliance – particularly in circumstances where the provider could use existing data to infer with reasonable confidence that certain end users are over 16.”

Communications Minister Anika Wells also released some social media tiles to explain the guidelines (Facebook)

It also said government-issued ID could not be the sole method offered for age verification.

The document does put forward some alternative options for determining users’ ages, such as the age of the account itself, activity patterns consistent with school schedules, flagging accounts with predominantly young connections, and visual and audio content analysis (age estimation based on photos, video and voice files uploaded to the platform).
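To illustrate how such signals might be combined, the sketch below shows one hypothetical way a provider could weigh them to flag accounts for review. The signal names, weights and threshold are invented for illustration only; the guidance does not prescribe any particular method.

```python
# Hypothetical sketch only: one way a provider might combine the kinds of
# signals the guidance mentions (account age, activity patterns, connections,
# content-based age estimation). All names, weights and thresholds here are
# invented for illustration and are not prescribed by eSafety or the guidance.
from dataclasses import dataclass
from typing import Optional


@dataclass
class AccountSignals:
    account_age_years: float             # how long the account has existed
    school_pattern_score: float          # 0-1: how closely activity tracks a school schedule
    young_connection_ratio: float        # 0-1: share of connections estimated to be under 16
    media_age_estimate: Optional[float]  # age estimated from uploaded photos/video/voice, if any


def flag_for_review(signals: AccountSignals, threshold: float = 0.6) -> bool:
    """Return True if the combined signals suggest the account holder may be under 16."""
    score = 0.0
    # An account older than a child could plausibly hold lowers the score.
    if signals.account_age_years >= 10:
        score -= 0.4
    # Activity that closely follows school hours and term dates raises it.
    score += 0.3 * signals.school_pattern_score
    # A network made up largely of other likely-underage users raises it.
    score += 0.3 * signals.young_connection_ratio
    # Content-based age estimation, where available, carries the most weight.
    if signals.media_age_estimate is not None and signals.media_age_estimate < 16:
        score += 0.5
    return score >= threshold


# Example: a two-year-old account with little school-pattern activity, mostly
# adult connections and an estimated media age of 19 would not be flagged.
print(flag_for_review(AccountSignals(2.0, 0.2, 0.1, 19.0)))  # False
```

As the guidance itself stresses, any such check would sit alongside broader systems, processes, policies and communications rather than serve as a single test.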

It concedes, however, that implementing these measures will be a moving target.

Step one for the social media platforms (Anika Wells Facebook)

“There is no one-size-fits-all approach for what constitutes the taking of reasonable steps,” the document said.

“Accordingly, while this document provides guidance and examples to assist providers, providers are required to make their own determination of what steps to take, and, if asked, to demonstrate to eSafety that those steps were reasonable in the circumstances.”

The guidance also noted the need to continuously evaluate and improve processes and approaches over time.

The maximum penalty for online service providers who don’t comply is currently $49.5 million.

eSafety Commissioner Julie Inman Grant said the guidance reflects extensive consultation with industry and stakeholders as platforms prepare for the social media minimum age obligation to take effect.

Step two for the social media platforms (Anika Wells Facebook)

“As we work towards implementing this world-first legislation, we remain deeply engaged with industry to ensure they have all of the information they need to comply,” Inman Grant said.

“Our principles-based guidance recognises that there is no one-size-fits-all solution for industry, given the diversity of platforms and technology and to help technology companies meet their obligations in a way that is effective, privacy-preserving and fair.

“We have encouraged platforms to take a layered approach across the user journey, implementing a combination of systems, technologies, people, processes, policies and communications to support compliance.”

Step three for the social media platforms (Anika Wells Facebook)

She added: “Children, parents and carers are counting on services to deliver on their obligations and prepare their young users for this monumental change.

“This legislation puts the onus on platforms, not parents, carers or young people. To that end, eSafety will be assessing a platform’s compliance based on implementation of systems and processes, not based on individual user accounts”.
