‘Less safe’: Privacy groups warn against Australian ‘online safety’ law
Digital rights groups are warning against a proposed law in Australia which they say would force tech companies to spy on users’ messages.
Australia’s eSafety Commissioner Julie Inman Grant published draft regulations last month which would extend the country’s Online Safety Act to private messages.
The Online Safety Act, which took effect in January 2022, requires social media companies to create safeguards against certain content. This includes what the act calls “Class 1 material,” covering child sexual exploitation, pro-terror, and extreme crime and violence content, as well as “Class 2 material,” covering crime and violence and drug-related content.
But now Grant wants to hold tech companies responsible for such content in users’ private messages. According to the proposed regulations, platforms will be held accountable if Class 1 or Class 2 material is found in “email, instant messaging, short messages services (SMS), multimedia message services (MMS) and chat, as well as services that enable people to play online games with each other, and dating services.”
The eSafety Commissioner is not telling companies what safeguards to implement, but privacy watchdogs are warning that they would need to deploy surveillance tools to comply with the law.
Currently, messaging services such as WhatsApp, Telegram and Signal use end-to-end encryption (E2EE) to keep messages private from anyone but the sender and recipient, in some cases even from the companies themselves.
Now messaging platforms might be forced to use a controversial technology called client-side scanning (CSS), which scans messages for objectionable content on the user’s device before they are sent to the recipient. This would necessitate installing hidden surveillance software on users’ phones.
In a joint letter last week, a coalition of digital rights groups known as the Global Encryption Coalition called on Grant to protect the privacy and safety that E2EE provides users.
“[A]s these standards have no specific safeguards for end-to-end encrypted services that people rely on for privacy and safety, end-to-end encrypted services will be forced to undermine the security and privacy of their services in order to comply. Contrary to the goals of the standards, this will leave everyone less safe online,” wrote the coalition, which includes the Center for Democracy & Technology, Global Partners Digital, the Internet Freedom Foundation, the Internet Society, Mozilla, Access Now, and Digital Rights Watch.
The proposed regulations follow the recent passage of the Online Safety Bill by the British Parliament. The bill would force tech companies to create “backdoor access” to encrypted messages and scan them for “harmful communications offences.” Harmful communications are sweepingly defined by the bill as communications that cause “psychological harm amounting to at least serious distress.” UK officials claim the legislation is necessary to crack down on child trafficking and pornography.
In June, Australia’s government also published draft legislation that would force social media platforms to suppress information considered false by authorities.
According to the proposed law, the Australian Communications and Media Authority (ACMA) would be granted new powers to fine digital platforms millions of dollars for not sufficiently censoring “misinformation.”
“Misinformation and disinformation pose a threat to the safety and wellbeing of Australians, as well as to our democracy, society and economy,” says a government web page on the legislation.
Misinformation is defined as “online content that is false, misleading or deceptive, that is shared or created without an intent to deceive but can cause and contribute to serious harm.”
Disinformation, on the other hand, is defined as content spread with an intent to cause and contribute to serious harm, though the practical distinction between the two categories is unclear.
“Serious harm is harm that affects a significant portion of the Australian population, economy or environment, or undermines the integrity of an Australian democratic process,” a government fact sheet explains.
To “combat misinformation and disinformation,” the ACMA will be empowered to require digital platform providers “to keep certain records about matters regarding misinformation and disinformation.”
The ACMA will also be able to compel the industry as a whole to develop a set of practices for censoring “misinformation and disinformation.” Noncompliance with the resulting code would carry a penalty of A$2.75 million ($1.8 million) or 2% of global turnover, whichever is greater, for any digital platform.
If the regulator finds this set of practices insufficient, however, it will be able to enforce its own “stronger form of regulation” on the industry.
Platforms that would be affected include social media platforms, search engines, news aggregators, podcasting services and instant messaging services, though private messages will, for now, not fall under the scope of the ACMA’s new powers.
Australia’s Communications Minister Michelle Rowland expressed support for the legislation, saying the amendment would “essentially mean that the regulator is able to look under the hood of what the platforms are doing and what measures they are taking to ensure compliance.”