From compliance sweeps to quantum threats: Australia’s evolving privacy risk landscape

Steven Klimt, Sam Fiddian
Time to read: 5 minutes

The privacy and data protection landscape in Australia will continue to experience rapid change and heightened scrutiny. Over the next 12 to 18 months, organisations will face a range of legal risks as regulators ramp up enforcement, new technologies challenge established norms, and legislative reforms create new compliance obligations. We've identified six privacy and data protection risks on the horizon.


Time to revisit the risk register: increased enforcement activity

2025 saw the Privacy Commissioner beginning to flex her regulatory muscle:

  • issuing her high-profile decision concerning Kmart's use of facial recognition technology (FRT);

  • contesting Bunnings' appeal against the Commissioner's decision on its use of FRT;

  • obtaining multi-million dollar civil penalty orders against Australian Clinical Labs (ACL); and

  • commencing Federal Court proceedings against Optus.

The Commissioner has continued this trajectory in 2026, kicking the year off with a compliance sweep of privacy policies targeting businesses that collect personal information face-to-face – including real estate agents, chemists, car dealerships, and licensed venues. This involves systematic reviews of organisations’ privacy documentation and practices, with a focus on transparency, adequacy, and alignment with the Australian Privacy Principles (APPs).

The Commissioner has signalled a shift in enforcement focus towards "systemic harms and market practices." As part of this shift, the Office of the Australian Information Commissioner (OAIC) has indicated not all complaints will proceed to investigation in an effort to enhance efficiency.

The Commissioner's words and actions paint a clear picture: she is prepared to use the options available under current legislation to promote privacy compliance and hold organisations to account, including by pursuing civil penalties. She views the existing legislation as a "principles-based framework", and is likely to rely on APPs 3.5, 6.2 and 10.2 to argue that a "fairness" obligation is already built into the Privacy Act.

Action: The civil penalties of $5.8 million against ACL and $2.5 million against FIIG Securities for cybersecurity failures suggest that the days when risk registers, and the business behaviours they drive, consigned privacy compliance to low-risk status are long gone. Regulatory action is an increasingly real prospect, justifying a recalibration of privacy compliance's place on the risk register and greater investment in privacy and cybersecurity compliance.

New technology vs privacy: lessons from the Bunnings appeal

The tension between technological innovation and privacy protection is a recurring theme in recent regulatory and judicial decisions. Bunnings' recent "success" in its appeal against the Commissioner's determination is a case in point. Bunnings’ use of FRT in its stores was challenged by the Commissioner, who found that the collection of biometric information breached the APPs imposing transparency, collection and notification obligations.

High-profile data breaches in recent years have prompted a focus on the collection and retention of sensitive personal information, with data minimisation seen as the new norm. While that focus should remain, the Administrative Review Tribunal's decision demonstrates that business interests may still prevail over privacy interests: new, privacy-invasive technologies may be deployed, but only after careful consideration of the privacy implications and where their use is sufficiently justified.

It also underscores the importance of:

  • conducting thorough and early privacy impact assessments;

  • being able to demonstrate that the use of technology is proportionate to the risks it is intended to address; and

  • ensuring detailed information about the exact type of personal information being collected, the method and technologies being used and the specific purposes for which the information is collected is included in privacy collection statements and Privacy Policies.

Action: The Tribunal's decision provides organisations with a clear blueprint for the assessment and adoption of new, privacy-intrusive technologies, alongside a clear warning that organisations must have the correct policies, practices and notices in place to permit that use. If they don't, they may face regulatory action, even if their use of the technology is ultimately justified.

Risk Landscape 2026
FY26/27 privacy–cyber risk heat map: likelihood of regulatory action vs severity of consequence

  • High likelihood / High severity: cyber incident with no tested response plan; ADM non-disclosure after December 2026

  • Medium likelihood / High severity: M&A with inherited cyber vulnerability; major breach with inadequate response

  • Low likelihood / High severity: quantum decryption of archived data; catastrophic breach of critical infrastructure

  • High likelihood / Medium severity: OAIC compliance sweep finding an outdated privacy policy; systemic data minimisation failures

  • Medium likelihood / Medium severity: FRT/biometric deployment without a privacy impact assessment; AI training on personal data without consent

  • Low likelihood / Medium severity: third-party AI vendor data misuse

  • High likelihood / Low severity: privacy policy not reviewed in 12+ months

  • Medium likelihood / Low severity: incomplete supplier data processing terms

  • Low likelihood / Low severity: minor collection notice deficiency

(Overall risk ratings range from Low through Moderate, Elevated and Critical to Extreme.)

Focus on automated decision-making (ADM)

By December 2026, organisations will need to have updated their privacy policies to address the use of ADM technology, which automates a decision-making process and can assist or replace the judgment of human decision-makers.

While ADM technology could increase the efficiency, accuracy and consistency of decisions, it also raises complex ethical and legal issues. Personal information can be used to train, test or deploy ADM, which often results in more significant data-collection, sharing and combining.

For decisions that may significantly affect an individual, privacy policies will need to address the use of ADM, including the kinds of personal information used and whether the decision is made solely, or substantially and directly, by the operation of the ADM process.

Action: While December 2026 may seem a long way off, an audit of ADM use will be required to identify how it uses personal information, both to ensure that use can be adequately addressed in the privacy policy and to check that it otherwise accords with the APPs.

Cybersecurity and data breach response

The recent decisions involving ACL and FIIG Securities highlight the importance of implementing, and regularly reviewing and testing, cybersecurity policies and practices. ACL had acquired another business which, it was found, suffered from a litany of cybersecurity shortcomings. FIIG Securities was found to have failed to implement common measures such as multi-factor authentication. Helpfully, the decisions clarify that an organisation's cybersecurity obligations will vary depending on the complexity of its business, and need to be assessed holistically from the perspective of what steps a reasonable person would take.

The decisions also shine a light on data breach response readiness, with FIIG Securities found to be lacking an appropriate and tested cyber incident response plan and ACL criticised for its initial response to the cyber incident.

Action: If you don't have appropriate cybersecurity practices, including an effective data breach response plan, or fail to act promptly and transparently when a breach occurs, you face increased risk of regulatory action, litigation, and reputational damage.

A focus on M&A

The ACL decision is a cautionary tale for those involved in M&A transactions. ACL did not identify relevant vulnerabilities in MedLab's IT systems prior to its acquisition. Those vulnerabilities remained for a significant period after the acquisition, and were then compounded by shortcomings in ACL's initial data breach response.

Action: Notwithstanding the associated costs, M&A players need to move beyond traditional IT due diligence to scrutinise the security posture of acquisition targets. Weaknesses identified need not necessarily scupper a deal, but if weaknesses are found, buyers should act promptly and not wait for lengthy ICT integration projects to complete before addressing issues. Take the quick wins provided by MFA, encryption and application whitelisting. Avoid integrating systems until you know what might lurk within.

From action on AI to quantum preparedness: the next frontier

Unsurprisingly, Artificial Intelligence (AI) remains top of mind for Australian organisations on a number of fronts. From a privacy and data protection perspective, we anticipate a particular focus on how AI systems collect, process and, in some cases, generate sensitive information.

The recent outcome in the Bunnings case underscores that the threshold for "collection" under the Privacy Act is a low one, confirming that even fleeting or transient handling of personal information (in this case, for 4.17 milliseconds per image) may constitute "collection."

Against this backdrop, organisations should review all uses of AI, with reference to the OAIC’s Guidance on privacy and the use of commercially available AI products. Organisations should consider whether certain uses of AI constitute a collection, use and/or disclosure of personal or sensitive information and if so, ensure that appropriate compliance measures are put in place.

And while quantum computing may seem a distant threat, boards and risk committees are beginning to address the long-term implications for data security. The potential for quantum computers to break current cryptographic standards means that “post-quantum cryptography” is moving up the agenda for 2026 and beyond.

Action: If you don't assess your readiness for quantum-era threats, you may find yourself exposed as the technology matures. Proactive planning and engagement with emerging standards will be essential to future-proof data protection strategies.

Key takeaways

The next 12 to 18 months will be a period of significant change and heightened risk in privacy and data protection for Australian organisations. Increased enforcement activity, the deployment of new technologies, evolving regulatory obligations, and emerging threats such as quantum computing all demand a proactive and strategic approach to compliance.

You should:

  • Regularly review and update privacy policies and practices;

  • Conduct and document privacy impact assessments for new technologies and high-risk activities;

  • Ensure robust data breach response plans are in place and tested;

  • Carefully examine, and seek to balance, data minimisation with regulatory obligations; and

  • Begin planning for the long-term implications of quantum computing.

By taking these steps, organisations can not only reduce their legal risk, but also build trust with customers, regulators, and the broader community.

Disclaimer

Clayton Utz communications are intended to provide commentary and general information. They should not be relied upon as legal advice. Formal legal advice should be sought in particular transactions or on matters of interest arising from this communication. Persons listed may not be admitted in all States and Territories.