
Upcoming Changes to the Privacy Act (Cth): New Disclosure Requirements for Automated Decision-Making



The Privacy and Other Legislation Amendment Bill 2024 (Cth) introduces significant amendments to the Privacy Act 1988 (Cth) that will strengthen transparency obligations under the Australian Privacy Principles (APPs). These reforms include the introduction of mandatory disclosures about automated decision-making systems.

These reforms will affect most institutional operators of student management systems, human resource platforms and AI-enabled administration tools. Does your organisation use software systems or AI tools to conduct any of the following activities? 

  1. Academic marking and assessments 
  2. Recruitment screening 
  3. Fraud and plagiarism detection 
  4. Billing and accounts 

These amendments come into force on 10 December 2026. We recommend that you start compliance planning immediately, as implementation may require policy amendments, vendor audits, governance reviews, and board approval processes.  

 

Legislative Framework

 

Amendments to APP 1.3 

The Bill amends APP 1.3 of the Privacy Act 1988 (Cth) to require APP entities to include in their privacy policies: 

  • A statement that the entity uses automated systems to make, or substantially assist in making, decisions that may have a significant effect on individuals; 
  • The kinds of personal information used in such automated decision-making processes; 
  • The types of decisions made using automated systems; and 
  • The categories of decisions that may significantly affect individuals’ rights or interests. 

These obligations apply where an organisation uses technology to make decisions solely or substantially through automated means, without meaningful human intervention at the decisional stage. 

 

What Is Changing?

 

APP entities will be required to update their privacy policies to include additional information where they use automated systems to make, or substantially assist in making, decisions that could significantly affect individuals. 

Organisations will need to disclose: 

  • That automated decision-making is used 
  • The kinds of personal information used in that process 
  • The kinds of decisions made using those systems 
  • The types of decisions that may significantly affect individuals’ rights or interests 

The requirement focuses on decisions made solely or substantially by automated systems. If your organisation uses automated systems, you should consider whether this must be disclosed in your organisation’s privacy policy. 

 

What Is Automated Decision-Making?

 

“Automated decision-making” encompasses algorithmic processing, artificial intelligence tools, and software systems that determine or materially influence outcomes affecting individuals. This includes, but is not limited to: 

  • Academic assessment tools (automated marking, grade calculation, progression decisions) 
  • Human resources systems (recruitment screening, applicant ranking, performance evaluation) 
  • Risk management platforms (fraud detection, plagiarism identification, safeguarding alerts) 
  • Financial systems (billing automation, fee calculation, debt recovery triggers) 
  • Enrolment and admissions (eligibility assessments, waitlist prioritisation) 

The threshold question is whether the system makes or substantially assists in making a decision—not merely whether it collects or processes data. 

 

What Does “Significantly Affect” Mean?

 

Decisions are likely to be considered “significant” where they impact a person’s: 

  • Legal rights (contractual entitlements, regulatory standing) 
  • Financial interests (fees, debts, entitlements, employment) 
  • Educational outcomes (grades, progression, awards, admissions) 
  • Reputation or opportunities (professional references, certifications, disciplinary records) 
  • Access to services (enrolment, participation, facility use).  

The test is effects-based, not process-based.  Both adverse and beneficial effects may be significant if they materially alter an individual’s position. 

Some practical examples: 

Likely to require disclosure – schools that use automated marking software to produce a “pass” or “fail” mark for students, calculate ATAR-equivalent scores, rank scholarship applicants, screen job candidates, or trigger debt collection processes. 

May require disclosure – timetabling software that allocates class placements, room booking systems with priority algorithms, or CRM platforms that segment stakeholder communications based on predictive scoring. 
 

What This Means for You

 

Many organisations already use software platforms that incorporate some level of automation or algorithmic processing. Organisations rely on these tools to increase efficiency; however, these new amendments create particular compliance obligations. 

Even where the system is provided by a third-party vendor, your organisation remains responsible for compliance with the APPs. 

Key implications include: 

1. Privacy policy updates
Privacy policies will need to be reviewed and updated to include the required disclosures. We recommend:  

  • Clearly identify which systems involve automated decision-making 
  • Describe the categories of personal information processed by those systems 
  • Explain the nature of the decisions made (e.g. academic results, employment suitability) 
  • Specify which decision types may significantly affect individuals. 

We note that generic or boilerplate language is unlikely to satisfy these specific statutory requirements. Policies should be tailored to the organisation’s actual systems and processes. 

2. Vendor Assessment

Organisations should understand whether their student management systems, CRM platforms, fundraising tools, HR software, or AI tools involve automated decision-making within scope. We recommend:  

  • Conduct vendor audits by obtaining detailed technical documentation on how the vendor’s systems process personal information and generate decisions. 
  • Review contracts to ensure they include APP compliance warranties, audit rights and indemnification provisions. 
  • Confirm that the vendor’s system aligns with the disclosed privacy practices and that data flows are properly documented. 

Reliance on vendor assurances alone will likely be insufficient.  Organisations should independently verify the nature and scope of automated processing. 

3. Internal governance

Boards and leadership teams should be aware of where automated tools are being used in decision-making processes and assess risk exposure. Given the nature and scope of the significant enforcement powers of the Office of the Australian Information Commissioner (OAIC), we recommend: 

  • Comprehensively map all systems that may involve automated decision-making across academic, administrative, HR and safeguarding functions; 
  • Perform appropriate risk assessments based on decision significance and data sensitivity, and incorporate the results into your risk framework; 
  • Establish internal protocols for implementing any AI tools or algorithmic systems, including privacy impact assessments and legal review; and 
  • Educate staff responsible for system administration, vendor management and privacy compliance. 

We note that the enforcement powers of the OAIC include issuing determinations requiring corrective action, imposing civil penalties (up to $50 million for serious or repeated interferences with privacy), and publishing findings and requiring public reporting for compliance failures, which carries a risk of reputational damage. 

 

Practical Steps to Take Now 

 

We recommend that your organisation conduct a health check of its internal governance systems to ensure it can meet its compliance obligations. 

At a minimum, you should: 

  1. Perform a system audit across departments to identify all automated decision-making systems. 
  2. Engage with your vendors through initial dialogue, contract renegotiation or amendment, and ensure you understand the technical architecture of their automated processes. 
  3. Seek legal review – engage Vocare Law to assist with a privacy compliance assessment specific to your organisation’s technology environment. 

 

For medium-term planning, we suggest you consider policy drafting and governance frameworks – in particular, incorporating these obligations into a broader information governance framework – and ensure there are appropriate channels to brief governance committees on compliance obligations, risks and exposure. 

 

Vocare Law can assist with: 

  • Privacy compliance audits 
  • Policy drafting and review 
  • Framework drafting 
  • Vendor contract negotiation 
  • Board and executive training 
  • Representation for enforcement matters. 

 

Have a Question?

 

If you need assistance or legal advice regarding the above, Vocare Law can assist. Contact our office on 1300-VOC-LAW / 1300-862-529 or email: enquiry@vocarelaw.com.au

This article was written by Timothy Whincop and Lukas Lim from Vocare Law’s Sydney office.

**The information contained herein does not, and is not intended to, constitute legal advice and is for general informational purposes only. 


Leading our Sydney and NSW team, Timothy Whincop has an extensive background spanning many years as an in-house legal practitioner in a global not-for-profit organisation. Tim was previously the Principal Director of House Counsel Advisory, a Sydney-based firm focused on commercial law, charity law and dispute resolution. Tim has cultivated expertise in handling complex legal and business matters across various global jurisdictions.

His ability to tackle challenges and offer sound advice is complemented by his unwavering commitment to upholding the highest standards of integrity in his work.

Tim’s unique proficiency lies in his extensive work with clients in the NGO and NFP sectors, where he consistently provides valuable insights and candid, forthright legal advice.
