The federal government’s online safety report was published yesterday, and in addition to calling for algorithmic transparency, it also takes aim at end-to-end encryption.
Chair of the Social Media and Online Safety Committee Lucy Wicks wrote in the report (PDF) that platforms have to “bear the ultimate burden of providing safety for their users”, rather than being able to set their own rules.
As things now stand, she wrote, platforms like Facebook and Twitter have been “enabling the proliferation of online abuse on their spaces.”
However, the report doesn’t level criticism only at platforms. Wicks added: “there is also a need to focus on the conduct and behaviour of individuals who use technology in ways that harm others.”
A key recommendation among the 26 made in the report is that platforms address the way their algorithms can be gamed to amplify harms.
The report noted the opacity surrounding how platforms’ algorithms work, adding that platforms “do not provide publicly available detail in relation to how their algorithms work and whether the platforms are doing anything to address potential harms caused through their algorithms”.
With so little known, the report recommends a study by the eSafety Commissioner and the departments of Infrastructure, Transport, Regional Development and Communications, and Home Affairs.
That study would, the report stated, examine how digital platforms’ algorithms operate, what kinds of harms they may cause, algorithmic transparency, and regulatory options.
The report also says the Digital Safety Review should make recommendations to the government “on potential proposals for mandating platform transparency”.
The report acknowledges that the government isn’t yet equipped to legislate for greater transparency in how platforms deploy the algorithms that drive engagement with content. Instead, it recommends the government create a roadmap to “build skills, expertise and methods for the next generation of technological regulation”.
That work could, the report suggests, be carried out by the eSafety Commissioner, and the departments of Infrastructure, Transport, Regional Development and Communications, and Home Affairs.
The report also recommends that the platforms report to the eSafety Commissioner “detailing evidence of harm reduction tools and techniques to address online harm caused by algorithms”.
The report also opened up another front in Australia’s “encryption wars”, with eSafety, Infrastructure, and Home Affairs also tasked with examining “the need for potential regulation of end-to-end encryption technology in the context of harm prevention”.
Both eSafety and Home Affairs offered submissions critical of platforms’ use of end-to-end encryption.
eSafety’s submission, cited in the report, noted that encryption prevents inspection of traffic for harmful material, and therefore makes it difficult for law enforcement agencies to identify child abuse material.
Home Affairs said encryption “on digital platforms, including social media, is bringing Dark Web functionality to the mainstream”.
It said encryption “particularly on platforms used by children, brings with it important public safety risks”, and that the anonymity provided by end-to-end encryption is an enabler of predators.