Major tech platforms including Meta, TikTok and Snapchat will not be forced to be part of the federal government's $6.5 million age assurance trial, with the department relying on the "goodwill" of platforms to test the new technology.
But in a separate move, the online regulator, the eSafety Commissioner, will require tech companies to come up with codes to prevent children from viewing online pornography and other harmful content.
Staff in the Department of Infrastructure, Transport, Regional Development, Communications and the Arts fronted the Joint Select Committee on Social Media and Australian Society on Tuesday morning, where they outlined details of the $6.5 million trial funded in the most recent budget.
The department is yet to release a tender for an independent review of age assurance technologies, which enable social media companies and other tech platforms to block young people from viewing harmful content, but said it was having informal discussions with global giants about how the trial would work.
Liberal senator Sarah Henderson said this amounted to little more than a desktop "research project", a characterisation disputed by the department.
Liberal MP Andrew Wallace questioned whether tech platforms would be required to participate in the trial, to which the department responded it did not have the statutory powers to compel companies to participate.
"Are we relying on the goodwill of the platforms to take part in the trial?" he said.
"It seems a dangerous assumption to assume platforms who to date have been fairly recalcitrant in their approach to online safety, to do the right thing and jump on board."
Representatives of tech companies including Snapchat, Google, TikTok and Meta - which owns Facebook and Instagram - gave evidence to the same parliamentary inquiry last week, insisting they were taking measures to control harmful content and prevent it from appearing for young people.
Meta vice-president and global head of safety Antigone Davis said social media was, on the contrary, a benefit for young people.
"I think that it is our responsibility as a company to ensure that teens can take advantage of those benefits of social media in a safe and positive environment."
Bridget Gannon, acting first assistant secretary at the Department of Infrastructure, Transport, Regional Development, Communications and the Arts, said it would be in the tech companies' interests to cooperate with the age assurance trial, given that other regulatory moves were afoot.
The committee will present its interim report by August 15, with a final report in November.
eSafety Commissioner begins content countdown
The eSafety Commissioner has started the clock on tech companies ensuring they protect children from graphic pornography.
Unlike the age assurance trial, the codes required by the eSafety Commissioner will be enforced, and could include age checks, safety measures and parental controls.
These could be imposed on the device itself - a phone, tablet or computer - or when children access an app or website.
"Kids' exposure to violent and extreme pornography is a major concern for many parents and carers, and they have a key role to play both from a protective and educative standpoint," eSafety Commissioner Julie Inman Grant said.
"But it can't all be on them, we also need industry to play their part by putting in some effective barriers to protect children."
This is the second round of codes imposed on the industry, with "phase one" covering terrorism content and child pornography.
A draft of the codes, covering content including pornography, disordered eating and self-harm, is due by October, with the tech platforms having until the end of the year to finalise them.