
Australia has moved to tighten control over the digital environment with the introduction of three new online safety codes, measures that raise pressing privacy and censorship concerns.
These codes, formalized on June 27 under the Online Safety Act, go beyond introducing digital ID checks for adult websites; they also place substantial obligations on tech companies, from search engines and internet service providers (ISPs) to hosting platforms.
Businesses that fail to comply face significant financial penalties, with fines of up to AU$49.5 million (about US$32.5 million).
The codes seek to restrict Australian users’ exposure to material classified under two categories: Class 1C and Class 2.
Class 1C encompasses “online pornography – material that describes or depicts specific fetish practices or fantasies.”
Class 2 covers a broader range of content. Class 2A captures “online pornography – other sexually explicit material that depicts actual (not simulated) sex between consenting adults.” Class 2B takes in “online pornography – material which includes realistically simulated sexual activity between adults. Material which includes high-impact nudity,” as well as “other high-impact material which includes high-impact sex, nudity, violence, drug use, language and themes. ‘Themes’ includes social issues such as crime, suicide, drug and alcohol dependency, death, serious illness, family breakdown, and racism.”
Under Schedule 1 – Hosting Services Online Safety Code, companies that provide hosting services within Australia, including social media platforms and web hosts, are compelled to implement six compliance measures.
A core requirement obliges these services to manage the risks posed by significant changes to their platforms that could make Class 1C or Class 2 material more accessible to Australian children.
Schedule 2 – Internet Carriage Services Online Safety Code targets ISPs. It mandates the provision of filtering tools and safety guidance to users and empowers the eSafety Commissioner to order the blocking of material deemed to promote or depict abhorrent violent conduct.
The Commissioner has previously exercised similar powers, as in the directive to block footage of a stabbing circulated on X.
Schedule 3 – Internet Search Engine Services Online Safety Code directs search engine providers to roll out age verification for account creation within six months.
These platforms are also instructed to develop systems capable of detecting and filtering out online pornography and violent material by default, where technically feasible and practicable.
Additional stipulations include offering parental controls, preventing sexually explicit autocomplete suggestions, supplying crisis prevention resources, and mitigating accidental exposure to restricted content.
The measures extend to AI-powered search functions, although standalone AI applications are not covered.
These regulations significantly expand the surveillance and censorship capabilities of the Australian government, raising alarm among privacy advocates concerned about the erosion of digital freedoms under the push for online digital ID.