Australia’s online safety watchdog has welcomed moves by the world’s largest technology companies to create a framework for reporting on child sexual abuse material, but questioned whether they truly have the will to tackle the problem properly.
eSafety Commissioner Julie Inman Grant warned the technology giants she will use new legal powers to demand the companies reveal what systems they are using to check if child sexual abuse is on their platforms and how long it has been online.
eSafety Commissioner Julie Inman Grant is skeptical of technology companies’ will to tackle child sexual abuse online. Credit: Edwina Pickles
Speaking to The Sydney Morning Herald and The Age from an anti-abuse summit in Brussels, Inman Grant said a newly unveiled abuse reporting framework from the Tech Coalition, which sets out what kind of information firms should disclose, was a positive step.
“We’ve seen a lot of selective transparency in the past and you can’t have accountability if you don’t have full transparency,” Inman Grant said.
The new reporting framework is being unveiled on Thursday (AEST) at the WeProtect Global Alliance Summit in Brussels by the Tech Coalition, which counts Amazon, Apple, Meta, Microsoft, Snap Inc, Twitter and TikTok among its members.
It suggests companies report on how much child sexual abuse material they have removed or blocked from their services, a breakdown of the format of that material and how long removal took, among other metrics.
But the framework is voluntary and does not require companies to report using the same metrics, meaning firms can evade comparisons of their work.
The Tech Coalition’s executive director, Sean Litton, said the code was designed to be broad to work for companies with very different business models, whether a streaming service that had to take down live videos of abuse, a social network or a private messaging app.
“The challenge is that each platform operates differently,” Litton said. “They collect different information and different information is more relevant based on their platform and how it works and how children are engaged on that platform.”
He flagged the possibility of future conversations about making the code mandatory for Coalition members, though the majority already comply with the framework.
Inman Grant said her dealings with tech giants had not revealed any “paragons of virtue” in tackling child sexual abuse. She recalled one company, which she did not name, that refused to hand over information and said: “If we tell you, you might use it against us.”
“There have been many ways [of tackling child sexual abuse material online], but we’ve lacked will from the companies,” Inman Grant said.
Inman Grant had asked that company for the information because it was reporting far fewer instances of abuse than its competitors. Since its refusal, the eSafety Commissioner has been handed new powers: from August she will be able to issue lists of questions that companies must answer on pain of substantial fines.
Most companies involved in the coalition, such as TikTok, offered only broad on-record statements supporting its work. Microsoft, which has made a tool called PhotoDNA that identifies known images of abuse available to other firms, made its chief digital safety officer, Courtney Gregoire, available for interview.
Gregoire said the framework stood alongside government regulation, rather than being intended to replace it, in addressing a problem that is “globally growing”. She defended the framework, describing it as a way for companies, especially newer ones unfamiliar with dealing with abuse, to make themselves accountable.
“That’s why it starts with the concept of ‘acknowledge what you have committed to publicly’, help the public and all civil society and stakeholders understand what you have stated, [what] your goals and objectives are in detecting [child sexual abuse material], then describe how you do it, and then be transparent about the outcomes.”