17 June 2025
In recent years, Australia has made significant strides to bolster its privacy regulatory framework in the face of growing complexities and challenges posed by the digital economy. To mark Privacy Awareness Week, we provide an overview of the key recent developments in Australia’s privacy landscape, including the commencement of the new statutory tort for serious invasions of privacy.
The Privacy and Other Legislation Amendment Act 2024 (Cth), passed in November 2024, introduced ‘tranche 1’ of reforms to the Privacy Act 1988 (Cth) (Privacy Act) delivering:
We discuss the reforms in ‘Privacy Act reforms: work to be done, but more to come.’
There is currently no indication from the government regarding the timeline for the next tranche of privacy reforms, including whether it will include the much-anticipated 'fair and reasonable' test, which has attracted significant interest from commentators. In a recent interview, Australian Privacy Commissioner Carly Kind noted that Australia’s privacy law is moving towards a framework more closely aligned with the European GDPR, while the proposed 'fair and reasonable' test would take it a step further by diverging from a consent-based model.
The Commissioner’s recent determinations and commentary signal a continued shift towards greater accountability for businesses and individuals’ control over their personal information.
Against this backdrop, we summarise five recent key developments in Australia’s privacy landscape.
The new statutory tort for serious invasions of privacy commenced on 10 June 2025. This tort operates as a standalone cause of action. This means that individuals now have a direct legal avenue to seek redress for 'serious' privacy breaches, independent of the existing Privacy Act and Australian Privacy Principles (APPs) regulatory framework.
Importantly, in the employment context, the new tort may limit an employer's ability to rely on the employee records exemption in the Privacy Act where its conduct constitutes a serious invasion of privacy.
This significant development underscores the increasing focus on individual privacy rights in Australia. Businesses should review their privacy practices and governance frameworks to account for this new legal risk, particularly concerning how they handle personal information and interact with employees.
As part of the reforms brought by the Privacy and Other Legislation Amendment Act 2024, the OAIC is developing the Children’s Online Privacy Code (Code), which will be registered by 10 December 2026.
The Code will aim to strengthen privacy protections for children (individuals under the age of 18) in digital environments by setting obligations and requirements for prescribed APP entities, including social media services and ‘designated internet services’ (as defined in the Online Safety Act 2021 (Cth)). The OAIC has recently suggested that it may also require education technology service providers and providers of certain wearables or smart toys that are likely to be accessed by children (e.g. baby monitors) to comply with the Code.
The Code is expected to set out how one or more of the APPs will be applied in relation to children's personal information (e.g. age-appropriate disclosure or collection notices). It may also include additional requirements that are not inconsistent with the APPs.
The OAIC is actively progressing the Code’s development. On 16 May 2025, the OAIC released age-specific worksheets and resources to gather feedback directly from children and parents, with responses due by 30 June 2025. On 12 June 2025, the OAIC released its Children’s Online Privacy Code Issues Paper (Issues Paper), which sets out questions designed to elicit feedback that will shape the Code’s development. The OAIC will also conduct a 60-day public consultation on the Code, which is expected to occur in 2026.
Businesses should take stock of their current practices in relation to the personal information of children and consider whether they would like to make a submission to the Issues Paper. Submissions close on 31 July 2025.
The recent amendments to the Privacy Act signal a clear intent to grapple with the privacy implications of artificial intelligence (AI). The aim is to provide individuals with greater control and understanding of how their data is leveraged in increasingly sophisticated automated systems.
Businesses that have arranged for a ‘computer program’ – a broad term encompassing pre-programmed rule-based processes, AI and machine learning processes – to make decisions that could ‘reasonably be expected to significantly affect the rights or interests of an individual’ will be required to disclose this in their privacy policies. This includes detailing the types of personal information used and whether decisions are fully automated or substantially assisted by AI.
Furthermore, the OAIC has issued guidance emphasising ‘privacy by design’ for AI systems, the need to ensure accuracy of AI-generated personal information (even ‘hallucinations’), and the requirement for consent when sensitive information is used to train AI models.
The amendments requiring automated decision-making to be disclosed in privacy policies take effect on 10 December 2026. In the meantime, APP entities should:
Concurrently, the Australian Government is considering introducing a risk-based legislative framework to regulate AI.
Although the recent amendments to the Privacy Act do not directly address the employee records exemption, recent OAIC decisions indicate a narrowing of its scope.
In a case concerning an employer informing other staff about an employee’s life-threatening medical episode at work (ALI and ALJ (Privacy) [2024] AICmr 131), the OAIC adopted a narrow interpretation of the employee records exemption, finding that it applies only to acts with an ‘absolute, exact or precise connection’ to the employment relationship between a private sector employer and the employee. This decision highlights the need for private sector employers to exercise caution when relying on the exemption.
In its response to the Privacy Act Review Report, the federal government indicated that it would require further consultation with regulated entities and employee representatives on how enhanced privacy protections for private sector employees may be implemented. However, the government has not commenced any public consultation process to date. Any reforms to the employee records exemption will need to strike a balance between protecting the personal information of private sector employees and not imposing onerous compliance burdens or administrative and operational challenges on employers.
In the meantime, employers should be prepared to review and amend their privacy practices to ensure compliance with more stringent privacy regulations. This would include conducting audits of employee records and collection practices, completing privacy impact assessments and updating privacy documentation (such as privacy policies, collection notices and data breach plans and policies).
The OAIC has indicated that it is closely monitoring privacy issues around connected vehicles that have been identified in other jurisdictions. The Commissioner hopes to prevent those issues arising in the Australian market by setting market-leading industry standards for privacy practices in connected vehicles.
The Commissioner acknowledged that there are current complexities around whether data collected by Internet of Things (IoT) devices (e.g. vehicle data and driving data collected by a connected vehicle) is data about the device or data about the individual. The former interpretation allows for the argument that the relevant data should not be governed by the Privacy Act.
The upcoming privacy reforms, which propose to expand the definition of personal information to capture information that ‘relates to’ an individual, are likely to impact this analysis.
It is clear from the Commissioner’s remarks that she expects manufacturers of IoT devices, even before the upcoming privacy reforms take effect, to treat data collected by those devices as personal information and to provide robust, transparent privacy notices to consumers whose data is being collected. The Commissioner also expressed concerns about the use of data collected by these devices for secondary purposes (e.g. in connected vehicles, the use of data for insurance purposes) and stated that informed consent should be obtained for those purposes.
These remarks are a timely reminder for manufacturers of IoT devices to audit the information collected by their devices and consider whether they may need to uplift current privacy practices so that the information collected by those devices is treated in accordance with the APPs.
This publication is introductory in nature. Its content is current at the date of publication. It does not constitute legal advice and should not be relied upon as such. You should always obtain legal advice based on your specific circumstances before taking any action relating to matters covered by this publication. Some information may have been obtained from external sources, and we cannot guarantee the accuracy or currency of any such information.