
Australia - Data Protection Overview

October 2023

1. Governing Texts

On November 28, 2022, the Australian Government ('the Government') passed the Privacy Legislation Amendment (Enforcement and Other Measures) Act 2022, which came into effect on December 13, 2022 ('December 2022 Changes') and which provides, among other things, for increased fines under the Privacy Act 1988 (Cth) No. 119, 1988 (as amended) ('the Privacy Act'), bringing them into line with other recent changes to administrative fines in other areas. The maximum fine for a serious invasion or repeated invasions of privacy (i.e. breaches of the privacy law) has increased to the greater of AUD 50 million (approx. $32.1 million), three times the value of any benefit obtained from the breach, and 30% of Australian turnover for the period of the breach or 12 months (whichever is longer).
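Expressed as a formula (our own notation, for illustration only), the new maximum penalty for a body corporate can be written as:

\[ P_{\max} = \max\left(\text{AUD } 50\text{m},\; 3 \times B,\; 0.30 \times T\right) \]

where B is the value of any benefit obtained from the breach and T is the Australian turnover for the period of the breach or 12 months (whichever is longer). On hypothetical figures of B = AUD 40 million and T = AUD 500 million, for example, the cap would be max(AUD 50m, AUD 120m, AUD 150m) = AUD 150 million.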

This significant increase in potential fines (almost twenty-three-fold), together with the increased budget given to the Office of the Australian Information Commissioner ('OAIC') this year, is likely to lead to more own-motion investigations by the OAIC and foreshadows significant enforcement activity over the next 12 to 18 months.

This is just one part of a wider review of the Privacy Act conducted by the Attorney-General's Department ('AG'), which resulted in the Privacy Act Review Report published on February 16, 2023. The Privacy Act Review Report made 116 proposals to change the Privacy Act and bring Australian privacy law more in line with the General Data Protection Regulation (Regulation (EU) 2016/679) ('GDPR'). On September 28, 2023, the Australian Government released its response to the AG's Privacy Act Review Report, in which 106 of the 116 proposals were 'agreed' or 'agreed in principle', with further development of legislative proposals and targeted consultations on these adopted proposals to follow during 2024.

Another key recent development has been the increasing role of the Australian Competition and Consumer Commission ('ACCC') in enforcing 'consumer privacy', including recently obtaining an AUD 60 million (approx. $38.5 million) fine against Google LLC. The ACCC's recent enforcement activity demonstrates its heavy-handed enforcement approach to protecting consumers' privacy interests.

1.1. Key acts, regulations, directives, bills

The key legislation in Australia affecting private-sector organizations (and Federal Government agencies) Australia-wide is the Privacy Act and its Australian Privacy Principles ('APPs'). In addition to the Privacy Act/APPs, there are the Privacy Regulation 2013, the legally binding Privacy (Credit Reporting) Code, and various rules and guidelines (for example, in relation to privacy in the conduct of medical research and Tax File Numbers ('TFNs')) which have the force of law and apply in specific areas or to specific types of information.

From July 1, 2020, the consumer data right ('CDR'), introduced by amendments to the Competition and Consumer Act 2010 (Cth) and the Privacy Act, went live for limited data sharing in relation to the four major banks (as the first part of the so-called 'open banking regime'). The rest of the banking data subject to the CDR became available for sharing by the big four banks from November 1, 2020, and the rest of the banking sector joined the CDR regime in 2021. The CDR is now being rolled out progressively to the retail energy, insurance, other financial services, and retail telecoms sectors and will ultimately extend to all other sectors where there is significant consumer interaction and thus resulting 'consumer data'.

The key privacy-related 'legislation' overseen by the OAIC resulting from the introduction of the CDR regime is the CDR Privacy Safeguard Guidelines, legally binding statutory provisions that set out the privacy rights and obligations for participants in the CDR regime, effectively a CDR-specific version of the APPs.

Also, on an Australia-wide basis, additional sector- or information-specific laws, such as those relating to TFNs, personal electronic health records, credit information, and the CDR regime, apply in addition to the Privacy Act/APPs. A number of Australian States also have their own privacy laws that regulate State Government agencies and private enterprise contractors to the State Governments and, in three States/Territories, health records processed by private sector organizations. Even where private sector organizations are governed by a State privacy law (as contractors to State Government agencies or because they process health records), this generally applies in addition to their obligations under the Privacy Act/APPs (except for activities specifically required by their State contracts that are contrary to the Privacy Act/APPs).

1.2. Guidelines

Key non-binding Guidelines and Guides are issued by the OAIC and are available on the OAIC website.

1.3. Case law

Noteworthy recent decisions, determinations, and undertakings obtained by the Privacy Commissioner include the following.

The investigation (and subsequent findings) of the OAIC against Clearview AI in relation to the scraping of the personal information and sensitive information of millions of individuals (including Australian individuals) indicates the increasing scrutiny by the OAIC of personal information collection practices, including those of offshore companies. The OAIC's somewhat controversial position in the Uber Decision on the extraterritorial reach of the Privacy Act/APPs was given legislative effect in the December 2022 Changes (see the section on territorial scope below).

As noted above, a key indicator of the ACCC's recent activity in relation to 'consumer privacy' enforcement is Australian Competition and Consumer Commission v Google LLC (No 2) [2021] FCA 367, where Google was fined AUD 60 million (approx. $38.4 million) for making misleading representations to consumers about the collection and use of their location data on Android phones between January 2017 and December 2018.

2. Scope of Application

2.1. Personal scope

In addition to all Federal government agencies, the Privacy Act/APPs apply to all private sector organizations (collectively 'APP entities') other than:

  • those organizations (including all their related bodies corporate) with less than AUD 3 million (approx. $1.9 million) annual turnover at any time (unless they use or disclose personal information for a benefit or collect and use health information);
  • registered political parties; and
  • State or Territory Authorities or Instrumentalities, although the notifiable data breaches ('NDB') provisions apply to all eligible data breaches involving TFNs (including in respect of the above).

2.2. Territorial scope

In the Uber Decision, the OAIC made clear its position on questions of extraterritorial scope. In this decision, the OAIC's interpretation of 'carrying on business in Australia' took into account the statutory object of the Privacy Act of 'protecting the privacy of individuals and the responsible handling of personal information collected from individuals in Australia'. The effect was that, even where an offshore entity (e.g. a cloud hosting provider located outside Australia) does not have direct engagement with individuals in Australia, is not involved in facilitating the transactions with or between those individuals and does not directly collect personal information from those individuals, the entity may nonetheless be carrying on business in Australia by reason of it being a vendor of services to an APP entity (even a related entity). At the time of the Uber Decision (before the December 2022 Changes) this was controversial as it was thought by many not to be supported by the existing law.

Following the enactment of the December 2022 Changes, however, there is no longer any controversy. There is no need for an entity to process personal information in Australia to be ‘carrying on a business’ in Australia and therefore be subject to the Privacy Act.

The Privacy Act/APPs will apply to all foreign organizations deemed to be 'carrying on business' in Australia, whether or not this includes actively collecting personal information in Australia. Any foreign entity (including a group entity) that provides products or services to an organization or individuals located in Australia (even if only one) will be 'carrying on business in Australia' and subject to the Privacy Act/APPs. Of course, until that entity 'collects' (i.e. receives) or processes Australian-related personal information (including offshore or indirectly), these obligations lie dormant, coming to life once it processes any Australian-related personal information (even years later and even if unrelated to the activity in Australia).

2.3. Material scope

All processing (i.e. collection, use, and disclosure) of personal information by APP entities is covered by the Privacy Act/APPs. However, the processing of de-identified or anonymous data (if it cannot reasonably be re-identified) is not covered by the Privacy Act/APPs.

In addition, all persons and entities (including usually excluded entities, e.g. State government agencies) dealing with TFNs are covered by the legally binding TFN rules referred to above.

Processing exempted from the Privacy Act/APPs includes:

  • purely personal/domestic processing of personal information (i.e. by individuals in a non-business capacity);
  • employee records once held by the employer;
  • political acts and practices (e.g. related to Members of Parliament);
  • processing by small businesses (i.e. those under the AUD 3 million (approx. $1.9 million) turnover threshold that are not otherwise subject to the Privacy Act/APPs, for example, by being engaged under a Commonwealth contract); and
  • processing by media organizations in the course of journalism.

3. Data Protection Authority | Regulatory Authority 

3.1. Main regulator for data protection

The Privacy Commissioner, who sits within the OAIC, is the relevant regulator under the Privacy Act/APPs.

3.2. Main powers, duties and responsibilities

The Privacy Commissioner is charged with enforcing the Privacy Act/APPs, including receiving and resolving complaints, undertaking own motion investigations and, as a result of any relevant determination, seeking an enforceable undertaking, publishing determinations/decisions, and issuing guidance in respect of the interpretation and enforcement of the Privacy Act/APPs.

The Privacy Commissioner can also seek the imposition of a fine for a serious invasion of privacy (i.e. a serious breach of the APPs) or repeated invasions of privacy (i.e. repeated breaches of the APPs); the maximum fine was recently increased, as noted in the section on governing texts above.

Please see the section on penalties below for further information.

4. Key Definitions

In Australia data protection is generally known as 'privacy' and, for the purposes of this Guidance Note unless otherwise specifically noted, we limit our comments to the privacy law under the Privacy Act and APPs. The Privacy Act/APPs regulate the collection, use, holding, and disclosure of the personal information of living individuals by APP entities.

Data controller: Unlike European law, there is no concept of 'data controller' under Australian privacy law. Each APP entity that obtains/receives personal information (even in the role of what may be considered a 'data processor' under the GDPR) will effectively be considered a data controller under Australian law and has its own primary and separate privacy obligations under the Privacy Act/APPs.

Data processor: Unlike European law, there is no concept of a 'data processor' under Australian privacy law. Each APP entity that obtains/receives personal information (even in the role of what may be considered a 'data processor' under the GDPR) will effectively be considered a data controller under Australian law and has its own primary and separate privacy obligations under the Privacy Act/APPs.

Personal data: Referred to as 'personal information' in the Privacy Act/APPs, personal data is defined to mean information or an opinion about an identified individual or an individual who is reasonably identifiable:

  • whether the information or opinion is true or not; and
  • whether the information or opinion is recorded in a material form or not.

The information or opinion itself does not have to identify the individual, and the individual does not need to be reasonably identifiable from that information or opinion alone; it is enough that the individual is reasonably identifiable by other means or from other information reasonably obtainable when used with the information in question.

Sensitive data: A sub-set of personal information is 'sensitive information', which is defined to mean personal information that is or includes information or an opinion about an individual's racial or ethnic origin, political opinions, membership of a political association, religious beliefs or affiliations, philosophical beliefs, membership of a professional or trade association, membership of a trade union, sexual orientation or practices, or criminal record, as well as health information, genetic information, and/or biometric information used for automated biometric verification or biometric identification.

Health data: 'Health information' is part of 'sensitive information' (see above under 'sensitive data') and is defined to include information or opinion about the health (including an illness, disability, or injury) of an individual, health services provided or to be provided to an individual and an individual's expressed wishes about future provision of health services that, in all cases, is also 'personal information'. It also includes other personal information collected to provide (or in providing) a health service, collected in connection with the donation or intended donation by an individual of his or her body parts, organs, or body substances, and genetic information about an individual in a form that is or could be predictive of the health of that individual or genetic relatives.

Biometric data: 'Biometric data' is not a term defined in Australian privacy law but the equivalent 'biometric information' (undefined) is included in the definition of 'sensitive information' (see 'sensitive data' above). As 'biometric information' is not defined under the Privacy Act, it is to be given its ordinary dictionary meaning, not dissimilar to the definition of 'biometric data' under the GDPR.

Pseudonymization: This term is not defined in Australian privacy law but the concept is used in APP 2. It is an obligation under APP 2, where practicable, for APP entities to provide individuals with an option of using a pseudonym. 'Pseudonym' and 'pseudonymization', absent a specific definition in the Privacy Act, are given their ordinary dictionary definitions which, in practice, will be similar to the definition in the GDPR.

5. Legal Bases

There are no set GDPR-style 'legal bases' for processing under the Privacy Act/APPs. Personal information which is not sensitive information (or other restricted information) and which is reasonably necessary for one or more of the entity's functions or activities can be collected directly from the individual with an appropriate notice (e.g. a privacy statement or policy) provided at or prior to collection. This notice must detail the matters prescribed by APP 5. No additional 'legal basis', in the GDPR sense, needs to be established to collect such personal information.

5.1. Consent

'Consent' (meaning express or implied consent) is required under APP 3.3 for the collection of sensitive information, including health information, from an individual. Again, even with consent the sensitive information can only be collected if it is also reasonably necessary for one or more of the entity's functions or activities.

5.2. Contract with the data subject

While a contract is not itself a 'legal basis' for collection, subject to meeting the requirements of APP 3, where there is a contract between the entity and the individual it will usually provide any required consent for the collection.

5.3. Legal obligations

'Legal obligations' (e.g. a requirement or authorization by or under Australian law or a court/tribunal order) are exceptions to the requirement to obtain consent to collect relevant sensitive information and, in some cases, to provide notice of the collection of personal information. However, this does not remove the general obligation under APP 5 to notify individuals of the prescribed matters (APP 5.2) at or before the time of collection or, if that is not practicable, as soon as practicable after the collection of that information.

5.4. Interests of the data subject

Again, similar to 'legal obligations' noted above, an entity can dispense with obtaining consent from an individual for the collection of sensitive information where such information is reasonably necessary to assist in locating a person who has been reported missing, or where collection is necessary to lessen or prevent a serious threat to the life, health, or safety of any individual or to public health or safety.

5.5. Public interest

Consent for the collection of sensitive information may also be dispensed with where the collection is reasonably necessary to lessen or prevent a serious threat to public health or safety, to locate a missing person, where unlawful activity or misconduct of a serious nature is suspected, or where the collection is reasonably necessary for an entity's diplomatic or consular functions or activities.

5.6. Legitimate interests of the data controller

The entity is able to collect sensitive information without consent where it does so in relation to suspected unlawful activity or misconduct of a serious nature, for the establishment, exercise, or defense of a legal claim, or for the purposes of a confidential alternative dispute resolution process.

5.7. Legal bases in other instances

As noted above, the main precondition to (or 'legal basis' for) collecting any personal information (including sensitive/health information) is to ensure that the information collected is reasonably necessary for one or more of the entity's functions or activities. Even where an exception permits the collection of sensitive information without consent, the entity is still obliged to meet this precondition to the collection.

Where a law or court order expressly requires an entity to collect the specified information, that will be sufficient to establish that the precondition has been met. However, where the law or court order only permits the collection of such information, then, arguably, in some cases it must be separately established that the precondition is met before the entity is entitled to collect that information.

6. Principles

The key obligations of all APP entities (whether they would be considered data controllers or data processors under the GDPR) under the Privacy Act/APPs include:

  • to take reasonable steps to implement practices, procedures, and systems that will ensure compliance with the APPs (APP 1.2);
  • only collect personal information that is reasonably necessary for one or more of the APP entity's functions or activities (APP 3.2), by lawful and fair means (APP 3.5), and directly from the individual, unless it is unreasonable or impracticable to do so (APP 3.6);
  • at all times seek to minimize the personal information/sensitive information collected, exploring other ways to meet business purposes; in other words, APP entities should not assume that collecting personal information is always required to meet their requirements;
  • at or before the time or, if that is not practicable, as soon as practicable after an APP entity collects personal information about an individual, take such steps as are reasonable in the circumstances to notify the individual of the matters in APP 5.2, or otherwise ensure that the individual is aware of such matters (APP 5.1);
  • only use the personal information collected for the notified purpose(s) for collection, unless a secondary purpose is permitted by the APPs (but exercise extra caution with secondary purposes) or consented to by the individual (APP 6.1);
  • to take reasonable steps to ensure that the personal information that the APP entity collects, uses, or discloses is accurate, up-to-date, and complete (APP 10);
  • to take reasonable steps in the circumstances to protect the personal information held by the APP entity from misuse, interference, and loss and from unauthorized access, modification, or disclosure (APP 11.1);
  • take reasonable steps to delete or de-identify personal information when it is no longer required for the notified purposes for which it was collected;
  • to notify all eligible data breaches as soon as practicable to the OAIC and all affected individuals; and
  • develop and implement:
    • a data breach response plan; and
    • a data destruction policy.

As regards the information security obligations in APP 11.1, it is important to note that this is not a fixed or static obligation (i.e. it is not 'one size fits all'). The larger the entity, the more personal information it collects, the more sensitive that information is, and the more centralized its data holdings are, the greater the security obligations (i.e. the security measures that need to be taken to satisfy them).

A helpful starting point for understanding one's information security obligations under APP 11.1 is the Privacy Commissioner's guide to securing personal information, together with the recent Uber Decision. The latter also notes that APP entities without the relevant expertise in-house need to engage appropriate external experts to assist with the preparation and implementation of policies and information security, and with data breach assessment and response.

7. Controller and Processor Obligations

7.1. Data processing notification

No registration with or notification to the OAIC is generally required. However, at or prior to the first collection of personal information about an individual, an APP entity is required to notify that individual of certain mandatory matters (as set out in APP 5.2) either by a privacy collection statement or by including the relevant matters in, and notifying, the privacy policy of the APP entity to that individual. Also, all eligible data breaches must be notified to the OAIC and all affected individuals.

7.2. Data transfers

As regards the obligations and requirements attached to the offshore disclosure (including transfer) of personal information, please see our separate Australia – Data Transfers Guidance Note.

7.3. Data processing records

'Data processing records' are not specifically provided for in or required by Australian privacy law. While APP 1 requires an APP entity to take such steps as are reasonable in the circumstances to implement practices, procedures, and systems relating to the entity's functions and activities that ensure compliance with the APPs (APP 1.2), the concept and keeping of 'data processing records' (or records of processing activities ('RoPA')) is not common under Australian privacy law.

7.4. Data protection impact assessment

A Privacy Impact Assessment ('PIA') is contemplated by Australian privacy law but, other than for government agencies, is not mandated. However, a PIA is arguably, if not required, highly recommended to fulfil one's obligations under APP 1.2. The guidance and recommendations of the OAIC are that a PIA should be used for any new, changed, or altered process, method, or technology that processes any personal information. In addition, the OAIC may, in writing, direct an agency to conduct a PIA within a specified period if the agency proposes to engage in an activity or function involving the handling of personal information about individuals and the OAIC considers that the activity or function might have a significant impact on the privacy of individuals (Section 33D(1) of the Privacy Act).

If personal information is intended to be collected, stored, used, or disclosed in a project, a PIA is usually recommended as part of the project planning process under the 10 Steps to Undertaking a Privacy Impact Assessment ('the 10-Step Checklist'). According to the 10-Step Checklist, a PIA entails the following stages:

  • a threshold assessment;
  • planning, including assessing the project's scope, identifying who will conduct the PIA, and who will be consulted;
  • developing a description of the project;
  • identifying and consulting with the project stakeholders;
  • mapping information flows, including matters of collection, use, disclosure, information quality, security, retention, and access;
  • conducting a privacy impact analysis and compliance check with reference to the APPs;
  • considering options for removing, minimizing, or mitigating any privacy risks identified through the privacy impact analysis;
  • making recommendations to remove or mitigate the identified risks;
  • preparing a report that sets out all of the PIA information; and
  • monitoring the implementation of the PIA recommendations.

In turn, the key elements of a PIA report include (Guide to Undertaking PIAs (September 2021) ('the PIA Guide')):

  • project description;
  • PIA methodology;
  • description of information flows;
  • the outcome of privacy impact analysis and compliance checks, including positive privacy impacts and privacy risks that have been identified, and strategies already in place to protect privacy;
  • recommendations to avoid or mitigate privacy risks;
  • description of any privacy risks that cannot be mitigated, the likely community response to these risks, and whether these risks are outweighed by the public benefit that will be delivered by the project; and
  • if necessary, more detailed information (e.g. about consultation processes and outcomes) can be provided in appendices.

In particular, the OAIC highlights that undertaking a PIA can assist APP entities to (the PIA Guide):

  • describe how personal information flows in a project;
  • analyze the possible impacts on individuals' privacy;
  • identify and recommend options for avoiding, minimizing, or mitigating negative privacy impacts;
  • build privacy considerations into the design of a project; and
  • achieve the project's goals, while minimizing the negative and enhancing the positive privacy impacts.

According to the OAIC, a PIA is useful for a full range of activities and initiatives that may have privacy implications, including (the PIA Guide):

  • policy proposals;
  • new or amended legislation;
  • new or amended programs, activities, systems, or databases;
  • new methods or procedures for service delivery or information handling; and
  • changes to how information is stored.

In order to determine whether a PIA is indeed recommended, APP entities should consider routinely conducting 'threshold assessments' for each individual project. Regardless of whether an APP entity proceeds to a PIA, it should keep a record of its threshold assessments with the following information (the PIA Guide):

  • brief description of the project;
  • consideration of whether the project involves the collection, storage, use, or disclosure of personal information, including key privacy elements;
  • whether, based on the threshold assessment, a PIA is required; and
  • details of the person or team responsible for completing the threshold assessment.

Furthermore, the Privacy Impact Assessment Tool (May 2020) ('the PIA Tool') is intended to assist APP entities to conduct PIAs, report their findings, and respond to recommendations. It provides a template that can be adapted to suit the size, complexity, and risk level of the project.

Role of the OAIC

The OAIC does not have a role in the development, review, endorsement, or approval of the PIA process for private-sector organizations, and the OAIC's power to direct agencies to undertake a PIA does not apply to private-sector organizations (the PIA Guide). However, there are potential benefits for organizations by conducting a PIA, such as demonstrating a commitment to good privacy practice, as well as compliance with privacy legislation. The OAIC encourages organizations to undertake PIAs for projects that involve the handling of personal information and share their findings publicly (the PIA Guide).

Remote working

Entities should consider the following in assessing personal information handling in remote working arrangements (the Remote Working Guidance):

  • governance, culture, and training;
  • ICT security;
  • access security;
  • data breaches; and
  • physical security.

7.5. Data protection officer appointment

A data protection officer ('DPO') (or rather, in Australian terminology, a privacy officer) is not mandated by law in Australia but it is recommended by the Privacy Commissioner and, arguably, recommended (if not essential) in practice to comply with APP 1.2.

In practice, we are seeing more and more privacy officer roles where a substantial part of the job description is privacy compliance (and, for large APP entities, some chief privacy officers whose sole responsibility it is). We are fast approaching the point where, other than for the smallest APP entities holding limited personal information, it will be difficult to establish that reasonable steps have been taken to ensure compliance with the Privacy Act/APPs (APP 1.2) without having a privacy officer.

As a DPO is not compulsory under Australian privacy law, there are no stated/legislative requirements for the position. In practice, a privacy officer usually comes from the risk or in-house legal functions, but it is recommended that they also have some IT and business knowledge/experience. In addition, it is recommended that the privacy officer be responsible for handling internal and external privacy enquiries, complaints, and access and correction requests, as well as being part of the data breach response team (Step 1 of the Management Framework and Part 2 of the Data Breach Preparation and Response ('the Data Breach Guide')).

7.6. Data breach notification

Australia has mandatory notification of all 'eligible data breaches'. Unless a specific limited exemption applies, all eligible data breaches must be notified to the OAIC and all affected individuals as soon as practicable after the entity:

  • becomes aware of the eligible data breach;
  • becomes aware of reasonable grounds to believe an eligible data breach has occurred; or
  • is directed to do so by the Privacy Commissioner.

An 'eligible data breach' occurs if:

  • there is unauthorized access to, unauthorized disclosure, or loss of personal information held by an APP entity (i.e. a data breach); and
  • a reasonable person would believe that such a data breach is likely to result in serious harm to any of the individuals to whom the information relates.

To assist with assessing what a reasonable person might think, a non-exhaustive list of relevant matters to be considered has been included in the Privacy Act (Section 26WG).

Where there are no reasonable grounds to believe an eligible data breach has actually occurred but there are reasonable grounds to suspect that there may have been one, the entity must take all reasonable steps to complete an assessment within 30 days (after the entity becomes aware of those grounds for suspicion) to determine whether or not an eligible data breach has occurred. Once that assessment is completed, if there are reasonable grounds to believe an eligible data breach has actually occurred, the entity must notify the breach as soon as practicable. However, this provision should not be treated as automatically allowing 30 days to decide what to do in the case of an eligible data breach.
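To illustrate the assessment and notification sequence described above, the following minimal sketch (in Python) encodes the decision flow. The function and parameter names are our own and are not drawn from the Privacy Act or any OAIC tool; the sketch omits the separate trigger where the Privacy Commissioner directs notification, and it assumes the entity has already formed the relevant factual views about suspicion, belief, and likely serious harm.

from datetime import date, timedelta
from typing import Optional

ASSESSMENT_WINDOW_DAYS = 30  # statutory outer limit for the suspicion-stage assessment

def next_step(suspects_breach: bool,
              believes_breach: bool,
              likely_serious_harm: bool,
              suspicion_date: Optional[date] = None) -> str:
    """Illustrative sketch of the NDB decision flow; not legal advice."""
    if believes_breach and likely_serious_harm:
        # Eligible data breach: notify the OAIC and affected individuals
        # as soon as practicable (unless a specific limited exemption applies).
        return "Notify the OAIC and all affected individuals as soon as practicable"
    if suspects_breach:
        # Reasonable grounds to suspect only: carry out a reasonable and
        # expeditious assessment, completed within 30 days at the latest.
        deadline = (suspicion_date or date.today()) + timedelta(days=ASSESSMENT_WINDOW_DAYS)
        return f"Assess whether an eligible data breach has occurred by {deadline.isoformat()}"
    return "No NDB obligation triggered on the current facts"

# Example: suspicion arises on March 1, so the assessment must conclude by March 31.
print(next_step(suspects_breach=True, believes_breach=False,
                likely_serious_harm=False, suspicion_date=date(2023, 3, 1)))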

APP entities may use the usual means by which they communicate with the relevant affected individuals, if practicable, to notify them of the eligible data breach. If that is not practicable, the APP entity must consider other means of notification; the impracticability of notifying each individual personally does not obviate the need for notification, and other appropriate means must be devised to notify the affected persons. As a deterrent to doing nothing, the provisions require, at a minimum, that the required notice be prominently published on the entity's website or otherwise widely publicized.

The recent Uber Decision has made it clear (and eliminated any ambiguity) that having (and implementing) an appropriate data breach response plan that details, at least, certain key issues is required in order to comply with APP 1.2.

7.7. Data retention

In addition to the security obligations noted above, APP 11.2 requires APP entities to delete or de-identify all personal information in their possession once all legal requirements to keep it in an identified form have passed, it is not required for threatened or current litigation, and it has been used for the notified purpose(s) for which it was collected. That is, personal information cannot be kept indefinitely, and all document/records/data retention policies must include appropriate provisions requiring the deletion or de-identification of personal information in those records in accordance with APP 11.2. The Uber Decision has also made it clear that having (and implementing) an appropriate data destruction and retention policy is required in order to comply with APP 1.2.

Data analytics

The de-identification/deletion obligation raises significant issues for those APP entities that wish to keep personal information beyond the time limits permitted by the Privacy Act/APPs for data analytics purposes (including the training of artificial intelligence/machine learning algorithms), especially if data analytics was not an original stated purpose for the collection.

7.8. Children's data

There are no specific provisions in Australian privacy law dealing with children's personal information. However, under the general law, the age of majority in Australia is 18 years. While this is appropriate for contracting (e.g. online T&Cs), the OAIC has given guidance that, subject to a consideration of the capacity of each relevant individual, a person at least 15 years old can generally be notified of a privacy collection statement and/or consent to the collection of their sensitive information.

7.9. Special categories of personal data

Under Australian privacy law, the 'special categories of personal information' are, subject to our comment below, mostly captured under 'sensitive information' and, while there are no significant separate sensitive information-specific provisions, in practice the obligations are applied more rigorously with respect to sensitive information. For example, the obligation to take reasonable steps to secure personal information against unauthorized disclosure, use, and/or loss is more rigorously applied in respect of holdings of 'sensitive information'. That is, more information security measures are expected as reasonable where one holds sensitive information.

Additional specific requirements (more onerous than for sensitive information) are included in or incorporated into Australian privacy law for 'TFN information' and 'credit information'.

7.10. Controller and processor contracts

As previously noted, there is no distinction under Australian privacy law between data controllers and data processors. That is, data processors have the same primary obligations and responsibilities as data controllers under the Privacy Act/APPs. As there is no separation between controllers and processors in Australia, there are no mandated agreement requirements or obligations (e.g. like the Standard Contractual Clauses ('SCCs')). However, it is recommended that any third-party service provider arrangement be documented (i.e. by agreement), especially where the processor is outside Australia, and include purpose limitations, compliance with the Privacy Act/APPs (for offshore providers in particular), and provisions relating to the notification of and responsibility for NDBs.

8. Data Subject Rights

8.1. Right to be informed

As noted above, there is an obligation to notify all individuals whose personal information an entity collects of certain prescribed matters detailed in APP 5.2 at, or prior to, the collection of that information. If this is impracticable, notification must occur as soon as possible after the collection of that information. This is, in effect, Australian privacy law's 'right to be informed'. APP 5.2 sets out the prescribed matters that must be notified, and these include who is collecting the information, the purpose(s) of the collection, what use will be made of the information, and to whom it may be disclosed (and whether any of those disclosures are to recipients outside of Australia).

8.2. Right to access

The right to access the personal information held by the APP entity about that individual is covered by APP 12.1.

8.3. Right to rectification

The right to seek correction of the personal information held by the APP entity about that individual is covered by APP 13.1 and the right to have any correction notified to third parties to whom the personal information was provided by the APP entity is covered by APP 13.2.

8.4. Right to erasure

There is no specific 'right to erasure' currently given to individuals under Australian privacy law. However, there are obligations imposed on the entity to provide access to and correct personal information, together with an obligation to keep the information collected current. Also as noted above, under APP 11.2 the entity is obliged to delete or de-identify personal information (whether or not requested by the individual to do so) once it has been used for the notified purpose(s) of collection and is no longer required by law to be kept in an identifiable form.

8.5. Right to object/opt-out

The right to request not to receive direct marketing and to not have the individual's personal information disclosed or used for direct marketing is covered under APP 7.6. Also, any personal information collected on the basis of consent (e.g. sensitive information or other personal information collected with consent) will be subject to the individual withdrawing their consent to further processing.

8.6. Right to data portability

Currently, there is no general 'right to data portability' under Australian privacy law, although there is the right to access the personal information held about one by an APP entity. However, the CDR regime does impose a data portability requirement for certain specified 'consumer data'.

8.7. Right not to be subject to automated decision-making

There is currently no right under Australian privacy law to request not to be subject to automated decision-making, unless such decision-making results in discrimination, in which case there are possible actions under legislation other than privacy legislation.

8.8. Other rights

The right to not identify oneself when dealing with an APP entity (i.e. deal anonymously), unless impracticable or required by law, is covered by APP 2.

9. Penalties

The Privacy Commissioner has the ability to impose enforceable undertakings, award compensation/reimburse costs and damages, seek to impose a fine, and/or publish public determinations/decisions specifying full details of the infringement (in the case of a complaint) and the results of the Privacy Commissioner's investigation.

The ultimate sanctions available to the OAIC/Privacy Commissioner are:

  • to apply to the court for a fine of up to the greater of AUD 50 million (approx. $32 million), three times the value of any benefit obtained, and 30% of revenue for the period of the breach or 12 months (whichever is longer) for entities, and up to AUD 2.5 million (approx. $1.6 million) for individuals, imposed for a serious breach or repeated breaches of the APPs; and
  • enforceable undertakings requiring the APP entity to remediate its technology/IT, data handling, and cyber security (whatever it takes) in order to ensure compliance with privacy law/the APPs.

The OAIC may also request information from an entity regarding its compliance with the NDB scheme (see section on data breach notification above) following an actual or suspected data breach of that entity.

9.1. Enforcement decisions

The OAIC's case against Facebook seeking to levy fines under the Privacy Act is the first such 'enforcement' action taken by the OAIC in respect of the penalties that can be sought for a serious invasion or repeated invasions of privacy (i.e. breaches of the APPs). The OAIC sought a fine of up to the then AUD 1.9 million (approx. $1.22 million) in relation to each of the individuals impacted by the alleged serious invasion of privacy resulting from the Cambridge Analytica activities. In Australia, this is a group of some 320,000 individuals, which means that, even if only a token fine per person is applied by the court, the total will be a significant amount of money.

Prevailing 'wisdom' was that the fine would be applied to the activity or incident as a whole (i.e. almost irrespective of the number of individuals impacted). That is, up to AUD 1.9 million (approx. $1.22 million) in total, not up to AUD 1.9 million x 320,000 affected individuals.
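For a rough sense of scale only (this figure is our own and was not asserted in the proceedings), the per-individual reading implies a theoretical maximum of:

\[ \text{AUD } 1.9 \text{ million} \times 320{,}000 \approx \text{AUD } 608 \text{ billion} \]

which is why, even at a token amount per affected individual, the aggregate would be very large.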
