The perils of big data without strong privacy management processes

By John Carroll, Amanda Ryan and Sophie Utz

12 Apr 2018

Given the increasing use of big data, organisations and agencies should take proactive steps to ensure their big data processes are privacy-compliant.

As the trend towards big data technologies continues, and with data sharing breaches increasingly on the radar of citizens, regulators and legislators around the world, rigorous compliance with core privacy principles is essential when handling and sharing personal information.

There's something about "big data"

Big data is a term used to describe data sets that are so large and complex that their management exceeds the capabilities of traditional data processing systems and requires the use of faster processing and smarter (autonomous and semi-autonomous) analytics.

While there is a growing demand in Australian business and government to leverage the efficiency and data management benefits of big data tool sets, their success is heavily dependent upon managing and safeguarding personal information in a way that is compliant with core privacy principles.

If the data within a big data set contains "personal information" within the meaning of the Privacy Act 1988 (Cth), then, as with any other data set, that data must be managed and used in accordance with the requirements of the Privacy Act and, more specifically, the Australian Privacy Principles (APPs).

That said, the very nature of "big data" carries with it an extra level of privacy complexity (discussed in more detail below).

Legal and reputational risks

The unique requirements for managing big data have been identified by the Office of the Australian Information Commissioner (OAIC), which has consequently released a consultation draft "Guide to big data and the Australian Privacy Principles". The draft Guide outlines key privacy management risks specific to dealing with big data sets.

First, because big data often involves storing larger amounts of data for longer periods and for multiple purposes, the risks associated with securing and storing that data in compliance with APP 11 are increased. It may also increase the likelihood of breaches, triggering obligations under the Privacy Amendment (Notifiable Data Breaches) Act 2017, which came into effect on 22 February 2018.

Secondly, because big data is often collected from a range of sources, risks arise from:

  • collection and specifically APP 3.6, which requires collection directly from the individual (except in certain circumstances);
  • ensuring the quality of the information and specifically APP 10, which includes taking steps to ensure that the information is accurate, up-to-date and complete;
  • complying with the de-identification requirements (in APP 6.4 and 11.2), as doing so may diminish the value of the data and undermine the purpose of the big data set in the first place.

In addition to ensuring compliance with privacy laws when managing big data, organisations and agencies must also manage community expectations and the reputational risks that come from big data projects.

De-identification as a tool for mitigating legal and reputational risks

Since the Privacy Act only regulates dealings with personal information, de-identification can be an appropriate response to the privacy law and reputational challenges of big data. Fundamentally, de-identification safeguards privacy and confidentiality by using technology to strip data sets of their personal identification potential, while retaining the research utility of the information assets. This is particularly beneficial for agencies that continually collect and retain data, including personal information.
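To make that idea more concrete, below is a minimal, purely illustrative sketch (in Python) of the most basic de-identification step: direct identifiers such as names and email addresses are dropped or replaced with a salted one-way hash, so that records can still be linked for analysis without exposing the underlying values. The record fields, the de_identify helper and the choice of SHA-256 are assumptions made for the example only, not a statement of what the Privacy Act or the OAIC guidance requires.

    import hashlib
    import secrets

    # Hypothetical records; the field names are illustrative only.
    records = [
        {"name": "Jane Citizen", "email": "jane@example.com",
         "postcode": "2000", "condition": "asthma"},
        {"name": "John Smith", "email": "john@example.com",
         "postcode": "2600", "condition": "diabetes"},
    ]

    DIRECT_IDENTIFIERS = {"name", "email"}   # fields removed outright
    SALT = secrets.token_hex(16)             # kept separate from any released data

    def pseudonym(value: str) -> str:
        """Replace a direct identifier with a salted one-way hash so that
        records can still be linked without exposing the original value."""
        return hashlib.sha256((SALT + value).encode()).hexdigest()[:12]

    def de_identify(record: dict) -> dict:
        """Strip direct identifiers and add a stable linkage key."""
        out = {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}
        out["person_key"] = pseudonym(record["email"])
        return out

    de_identified = [de_identify(r) for r in records]

Even this step does not by itself make a data set "de-identified" for Privacy Act purposes: quasi-identifiers such as postcode or date of service can still allow individuals to be re-identified, which is where confidentialisation (discussed below) becomes relevant.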

Data custodians should also consider applying the obligations under APPs 6, 8 and 11 to de-identified data, not just as a matter of law at that time but also as a risk-management strategy. This is particularly relevant in anticipation of de-identified data being transferred to another environment where it could once again become personal information, at which point the APPs would apply.

That said, de-identification alone may not be enough; data should be confidentialised if it is to be made publicly available. This involves ensuring that the personal information cannot be re-identified in the future.
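As a rough sketch of what confidentialisation can involve (again using the hypothetical records above, and an illustrative minimum group size rather than any threshold drawn from the OAIC or CSIRO materials), quasi-identifiers are generalised and records left in very small groups are suppressed before release:

    from collections import Counter

    MIN_GROUP_SIZE = 5   # illustrative threshold only

    def generalise(record: dict) -> dict:
        """Coarsen quasi-identifiers and drop linkage keys before public release."""
        out = dict(record)
        out["postcode"] = record["postcode"][:2] + "xx"   # full postcode -> broad region
        out.pop("person_key", None)                       # linkage keys are not published
        return out

    def confidentialise(records: list[dict]) -> list[dict]:
        generalised = [generalise(r) for r in records]
        # Count how many records share each generalised quasi-identifier value ...
        group_sizes = Counter(r["postcode"] for r in generalised)
        # ... and suppress any record that would sit in a group too small to hide in.
        return [r for r in generalised if group_sizes[r["postcode"]] >= MIN_GROUP_SIZE]

    # public_release = confidentialise(de_identified)   # using the output of the earlier sketch

Real confidentialisation techniques (aggregation, perturbation, suppression and the like) are considerably more sophisticated than this, and the framework referred to below is a better guide to choosing among them.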

The OAIC has released a joint paper with the CSIRO, "The De-Identification Decision-Making Framework", which provides useful guidance on adopting appropriate de-identification and confidentialisation steps as a tool that decreases the risk of privacy breaches arising.

Making sure your big data project isn't a big privacy problem

To ensure that your policies and processes for managing personal information do not fall foul of privacy obligations or cause reputational damage, now is the time to review your data systems and processes for privacy compliance.

Particular attention needs to be paid to big data projects to ensure that the additional layer of privacy complexity is managed.

We recommend that all organisations and agencies who deal with data, and in particular big data, take this opportunity to conduct an eight-step review of their data management processes:

  1. conduct an audit to gain an understanding of the nature of the personal information in your organisation or agency's custody and the basis upon which it has been collected;
  2. conduct an audit of your current information security processes and procedures to determine if they are adequate;
  3. seek advice to address any deficiencies in the current processes and procedures, and identify the legal obligations that impact on your data management systems;
  4. assess whether de-identification is required or appropriate to mitigate risks of data breach;
  5. develop (or update) a data breach response plan, to ensure that adequate processes are in place to mitigate a breach and address any legal obligations arising;
  6. prepare a privacy policy, supported by systems and procedures, that effectively manages the privacy risks associated with your data management practices (and, for Australian Government agencies, accords with the Privacy (Australian Government Agencies - Governance) APP Code 2017, which comes into effect on 1 July 2018);
  7. when considering proposals for the collection, use or disclosure of personal information, ensure that a thorough Privacy Impact Assessment is undertaken; and
  8. repeat the above steps on a regular basis.

Disclaimer
Clayton Utz communications are intended to provide commentary and general information. They should not be relied upon as legal advice. Formal legal advice should be sought in particular transactions or on matters of interest arising from this communication. Persons listed may not be admitted in all States and Territories.