In an effort to target online child sexual abuse and pro-terror content, Australia may cause global changes in how tech companies handle data.
Australia has decided to aggressively target online child sexual abuse material and pro-terror content. To do so, it plans to force all technology companies to actively scan content for such material, a move that could drive sweeping global changes in how those companies handle data.
The Australian eSafety Commissioner has adopted these new regulations as its policy of choice. Under them, any tech company doing business with Australians will be required to actively scan users' emails, online photo libraries, cloud storage accounts and dating sites for illegal content.
SEE: This mandate arises at the same time Australia considers AI regulations.
This includes services such as Apple iCloud, Google Drive and Microsoft OneDrive. It will also include content shared via online games and instant messaging.
The penalty for non-compliance is $700,000 per day.
Why the eSafety Commissioner is cracking down hard
In 2021, Australia passed the Online Safety Act. The objectives of that 200-page Act were simple:
- Improve online safety for Australians.
- Promote the online safety of Australians.
The first consequence of the Act was the establishment of the eSafety Commissioner. Part of the Commissioner's role has been to create and enforce a framework under which illegal or damaging material can be removed at the Commissioner's request.
This has meant that the government may now determine basic online safety expectations for social media, electronic and internet services. It also means a technology provider may be required to block access to material that promotes, incites, instructs or depicts “abhorrent violent conduct.”
To help facilitate this, the eSafety Commissioner tasked the Australian IT industry with developing a proposal to combat illegal content. It was submitted in February; however, the eSafety Commissioner rejected it, specifically because it didn't meet the Commissioner's minimum expectations with regard to detecting and flagging known child sexual abuse material in file and photo storage services, email and encrypted messaging services.
The eSafety Commissioner has also cited a 285% year-on-year increase in reports of child sexual exploitation and abuse material during the first quarter of this year as justification for this dramatic action.
What steps come next?
These regulations will apply equally to both Australian service providers and overseas vendors that supply services to Australians. They will take effect six months from the day the regulations are officially registered.
Once that happens, Australians will be able to lodge complaints for non-compliance with the eSafety Commissioner, who will be empowered to investigate and impose injunctions, enforceable undertakings and financial penalties.
The scope and universality of these requirements, unsurprisingly, will be of concern to privacy advocates. The fundamental expectation of privacy when sending an email will immediately be compromised if each one needs to be scanned.
This opens up new data security concerns, and following the Optus, Latitude Finance and Medibank data breaches in 2022 — which, combined, affected just about every Australian at least once — Australians are sensitive about anything that will make their data even less secure.
There are also concerns about how this content will be scanned. Tech companies will not be expected to manually scan each piece of content. Rather, the eSafety Commissioner expects them to build automated tools, including AI "trained" on known examples of illegal material, that flag similar content as it is created and shared.
However, this solution is imperfect; several companies have tried it, and it has yet to work as intended. Meta and Google have already developed automated tools to detect and flag illegal material. Apple was a forerunner, announcing plans back in 2021 to automatically detect child abuse material sent to and from its devices. Despite serving an unambiguously noble cause, the plan proved so unworkable that Apple abandoned it within a year.
The reality is that this automation, known in the industry as "hashing," is imperfect: it can be tricked, and it can raise false flags. Being tricked undermines the entire intent of these systems. Criminals are famously good at adapting to the internet, so while these techniques might help identify individuals sharing images, the syndicates that are the real problem will not be affected.
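To make those failure modes concrete, here is a minimal sketch of how perceptual-hash matching works, written in Python. The 8x8 average hash, the random stand-in "images" and the toy database below are illustrative assumptions only; production systems use proprietary perceptual hashes such as Microsoft's PhotoDNA, but the trade-off is the same.

```python
import numpy as np

def average_hash(pixels: np.ndarray, hash_size: int = 8) -> np.ndarray:
    """Shrink a grayscale image to hash_size x hash_size block means,
    then set each bit to whether its block is brighter than the mean."""
    h, w = pixels.shape
    h, w = h - h % hash_size, w - w % hash_size  # crop to a clean multiple
    blocks = pixels[:h, :w].reshape(
        hash_size, h // hash_size, hash_size, w // hash_size
    ).mean(axis=(1, 3))
    return (blocks > blocks.mean()).flatten()

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    """Number of differing bits between two hashes."""
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(0)

# Stand-in for a database of hashes of known illegal images.
known_image = rng.integers(0, 256, size=(64, 64)).astype(float)
database = [average_hash(known_image)]

# A lightly re-encoded copy stays close to the stored hash, so it is flagged.
noisy_copy = np.clip(known_image + rng.normal(0, 4, known_image.shape), 0, 255)
print(hamming(average_hash(noisy_copy), database[0]))  # small distance

# An unrelated image typically differs in roughly half of its 64 bits.
unrelated = rng.integers(0, 256, size=(64, 64)).astype(float)
print(hamming(average_hash(unrelated), database[0]))   # large distance

# The article's two failure modes in hash terms: a deliberate crop or heavy
# edit (evasion) can push a true match above any flagging threshold, while
# an innocent image can occasionally land below it (a false flag).
```

The flagging threshold is where both weaknesses live: set it tight and deliberately edited copies slip through; set it loose and innocent images get caught, which is exactly the risk the following case illustrates.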
Meanwhile, given the damage that even an accusation of distributing child abuse material can do to a person, there is real concern about what tech companies passing flagged content to the authorities could do to innocent people. There is already one case of Google "catching" a father who had photographed his son's groin at a doctor's request to treat a condition.
Will the international community accept the mandates?
The eSafety Commissioner has expressed the hope that these new regulations will push the rest of the world toward compliance. Whether the rest of the world finds that acceptable remains to be seen.
While the eSafety Commissioner can only regulate the interaction of technology with Australian citizens, these laws may force global companies to change their approach at a systemic level, and this could cause a new wave of debate around digital rights globally.
Alternatively, platform holders and service providers may simply close off their services to Australia. That happened back in 2021, when Facebook blocked news content in protest of the Australian government's attempt to make it pay royalties to news media organizations. For the short period that decision was in effect, Australian businesses of all sizes, as well as Australian consumers, were deeply affected.
Despite these concerns, the eSafety Commissioner remains firm on this approach. Anyone in the tech industry involved in storing and sharing data will need to prepare for substantial shifts in how data is handled and shared.