APP 1.7 in force since 10 June 2025
OAIC enforcement begins 10 December 2026
Civil penalties up to $50 million
OAIC reviewed 23 organisations in January 2026 — not one was compliant
~80,000 new APP entities from AUSTRAC enrolment July 2026
APP 1.7 · AUSTRAC · Privacy Act compliance

Governance claims are everywhere.
Proof is nowhere.

APP 1.7 is already law. Most organisations still cannot name every AI system processing personal information, say where automated decisions happen, or show what should already appear in their privacy policy. Attesta is the fixed-fee path from uncertainty to a complete governance record — built on the evidence base that governments use, delivered for the mid-market.

0/23
Organisations compliant in OAIC's January 2026 review
$50M
Maximum civil penalty from 10 December 2026
48h
Fixed delivery. Five documents. Complete governance record.
~80k
New APP entities from AUSTRAC enrolment from July 2026
The accountability moments

Six situations. One gap. Attesta closes it.

Every day, someone asks an Australian organisation to prove what it governs. The answer is almost always the same: we don't have the evidence.

The Regulator

"Name every AI system that processes personal information. Show me the ADM Declaration. Prove you took reasonable steps."

OAIC · AUSTRAC · APRA
The Board

"Post-Robodebt, our NEDs face personal liability for AI decisions they cannot document. What can we actually produce?"

s.180 · Corporations Act
Enterprise Procurement

"Your enterprise buyer sent an AI governance DDQ. The deal stalls until you can answer it. You don't have a methodology."

SaaS · B2B · Procurement
The Acquirer

"Due diligence has surfaced AI systems with no governance documentation. That's a valuation risk we need quantified."

M&A · Capital Raise · IPO
The Insurer

"PI renewal now includes AI governance questions. Without documentation, your coverage is at risk or your premium goes up."

PI Insurance · Aon · Marsh
The Privacy Officer

"The OAIC reviewed 23 organisations in January 2026. Zero were compliant. We need documentation before December."

APP 1.7 · Dec 2026
THE RESPONSE TO THIS PROBLEM
Why Attesta exists

Built because compliance only works when it is practical enough to use and strong enough to defend.

Most organisations that face these situations reach for point compliance solutions. They start with the legislation and stop there.

Everyone else is selling compliance. We're building capability. The compliance comes with it — faster, cheaper, and grounded in evidence that holds.

APP 1.7 is where we start. It is not where we stop.
How we built it

An AI governance evidence platform. Not a framework. Not a consultancy. Evidence.

Four independent evidence bases. No other product in Australia has operationalised all four.

Every Attesta product is grounded in the same four independent evidence bases — not legal opinion, not vendor frameworks, not best-practice guidelines. The documentation changes. The evidentiary standard does not. Rae D spent 40 hours operationalising the MIT AI Risk Repository — turning 1,612 classified risks across 65 taxonomies into a product framework no vendor has replicated.

"A policy document is not evidence. A committee meeting is not a control. The question is not whether governance exists. It is whether governance can be proved — on demand, at the moment of accountability."

— Sam Banerjee · Beyond Probabilistic Governance · UTS DSI · 2026

Pillar 01 · Privacy Act
APP
What must be done

The Privacy Act and APP framework define the legal obligation and the minimum disclosure standard. Every Attesta assessment satisfies this floor as its baseline — not as its ceiling.

Pillar 02 · MIT
1,612
What might be missed

The MIT AI Risk Repository gives Attesta a structured way to identify categories of risk that simple legal review can overlook. 1,612 risk classifications mean the assessment goes beyond what the legislation names.

Pillar 03 · CSET
200
What goes wrong in practice

200 annotated incidents · 127 confirmed harms · 94% post-deployment. CSET incident research brings consequence, patterns and practical risk framing. This is how governments think about AI risk — not how law firms do.

Pillar 04 · AIID
AIID
Why this is not theoretical

The AI Incident Database anchors governance decisions in documented real-world failures, not abstract compliance language. When Attesta identifies a risk, it is a risk that has already materialised somewhere.

Sam B
Founder · Attesta · AI Decoded Pty Ltd · PhD Candidate · UTS Data Science Institute · Human Centred AI Lab

Sam's research at UTS identifies the Evidentiary Gap — the distance between what a board says it governs and what it can prove. That gap is what Attesta was built to close.

His published research, featured by the UTS Human Centred AI Lab in April 2026, examines why current governance frameworks will not survive regulatory scrutiny. Supervised by Distinguished Prof. Fang Chen, UTS DSI.

Four-time founder. 28 years in technology, risk and organisational strategy.

PhD · UTS DSI HCAI Lab · AI Governance Research · s.180 Corporations Act · Director Liability · Four-time founder
Rae D
Co-founder · Compliance Practitioner
Attesta · AI Decoded Pty Ltd

Every Attesta assessment is personally delivered by Rae D — 18 years as a compliance practitioner inside Australian financial services and professional services firms.

Not a generalist. Not a law firm associate. A practitioner who has spent two decades inside the organisations Attesta serves. Rae spent 40 hours operationalising the MIT AI Risk Repository — turning 1,612 classified risks into a product framework no vendor has replicated.

When you receive your Attesta deliverable, Rae's name is on it.

18 years compliance · Australian financial services · MIT taxonomy (40 hours) · APP 1.7 · OAIC framework
The compliance gap

This is not a policy update. It is a discovery exercise.

Copilot is in every Microsoft 365 licence. Gemini is in Gmail and Workspace. Zoom transcribes and summarises every meeting by default. Xero, MYOB, LEAP, Karbon and Practice Ignition all process client data with AI-enabled features you activated when you accepted their terms of service.

None of this is typically disclosed. None of it appears in most privacy policies. The issue is not whether you use AI. The issue is whether you can identify every relevant system — and produce a record that shows you govern it.

The OAIC reviewed 23 organisations in January 2026. Not one was compliant. This is not a minority problem. It is an industry-wide gap — and enforcement begins 10 December 2026.

ADM Declaration trigger

If any of your systems make or influence decisions about individuals — scoring a client, routing a workflow, ranking a candidate — your organisation must also publish an ADM Statement of Declaration on your website. This is a separate obligation from the privacy policy disclosure.

AI systems active in your organisation · Disclosure required
Microsoft Copilot
M365 · email, documents, meetings
Required · Active if M365
Copilot is active by default in all Microsoft 365 Business and Enterprise plans. It processes emails, Teams conversations, documents and calendar data. Every user in your organisation is subject to Copilot processing unless explicitly disabled at the tenant level.
Zoom AI Companion
Transcripts, summaries, meeting records
Required · Active by default
Zoom AI Companion generates meeting summaries, action items and transcripts by default for all Zoom Pro, Business and Enterprise accounts. Client conversations are being processed and stored.
Google Gemini
Gmail, Docs, Sheets, Drive, Workspace
Required · Workspace exposure
Gemini for Google Workspace processes email content, drafts documents and analyses files in Drive. Active across Gmail, Docs, Sheets and Slides for all Workspace accounts.
Xero AI features
Transaction categorisation, reconciliation
Required · Financial data
Xero uses machine learning to auto-categorise transactions, suggest reconciliation matches and generate financial insights. It processes client financial data including names, ABNs and transaction histories.
LEAP / practice management
Client matters, documents, workflow
Likely · Professional privilege
LEAP's AI features assist with document drafting, precedent matching and matter management. They process confidential client matter data. Disclosure is likely required and legal professional privilege considerations apply.
Karbon / Practice Ignition
Practice management, client onboarding
Likely · Workflow data
Karbon uses AI for task prioritisation and email triage. Practice Ignition processes client personal information through proposal generation. Both create disclosure obligations for professional services practices.
ChatGPT / Claude (staff use)
Unstructured usage, no formal controls
Likely · Shadow AI risk
Staff use of consumer AI tools creates Shadow AI risk. If staff input client names, matter details or personal information into ChatGPT, Claude or Gemini, disclosure obligations may be triggered and data governance controls are almost certainly absent.
Each row is a separate disclosure obligation under APP 1.7. Most organisations have six or more active systems — none disclosed.
Two triggers. One obligation.

Different entry points. The same governance problem.

AUSTRAC enrolment and the December 2026 enforcement date bring the same obligation through very different doors.

Managing partners at professional services firms

Law firms, accounting practices, financial planners, conveyancers and mortgage brokers who enrolled under AUSTRAC from July 2026 became APP entities under the Privacy Act. Most were not told this chain exists.

"I just became an APP entity and nobody told me. I don't have the internal resource to deal with this — and I need it fixed before a client or a regulator asks."
Law firms · Accounting practices · Financial planners · Mortgage brokers · Conveyancers

Company secretaries and general counsel at ASX mid-caps

Mid-cap ASX companies face board accountability, NED personal liability and the December 2026 enforcement date simultaneously. A director who cannot show a defensible AI governance record is personally exposed.

"I don't know which AI systems we actually use — and I cannot answer the board's questions. I need a defensible record before the next audit committee meeting."
ASX mid-cap · Company Secretaries · General Counsel · Audit committees · NEDs
AUSTRAC chain
Approximately 80,000 new entities enter scope from July 2026.

AUSTRAC enrolment automatically makes an organisation an APP entity under the Privacy Act 1988 (Cth) — which triggers APP 1.7 compliance. Law firms, accounting firms, conveyancers and financial planners affected are typically not aware this chain exists.

The APP 1.7 path

Action-plan clarity. Not consulting fog.

Fixed fee. Fixed output. Fixed delivery. Three steps from uncertainty to a complete governance record.

Step 01
Free

Eligibility Checker

5 questions · 3 minutes · immediate result

Confirms whether APP 1.7 applies to your organisation. No account required.

  • APP entity status confirmed
  • Primary obligation identified
  • Next step recommended
Start the checker →
Step 03
$2,497

Full Assessment

5 documents · 48-hour delivery

Five publication-ready documents. Ready to insert into your privacy policy, file with your board, and show to the OAIC.

  • AI System Inventory
  • Privacy Policy Paragraph (insert-ready)
  • ADM Statement of Declaration
  • Website ADM Declaration
  • Insertion Guide
Get the Full Assessment →

A law firm engagement for equivalent work costs $8,000–$15,000 and takes three to four weeks — often without a guaranteed Governance Declaration or Website ADM Declaration.

ATTESTA · $2,497 · 48 HOURS

Attesta delivers documents. This is not legal advice. Clients should have their existing solicitor review the output.

The Attesta platform

APP 1.7 is the entry point. The platform goes wider.

The total AI governance gap extends across eight obligations. APP 1.7 is the most urgent. The full platform addresses each — fixed fee, fixed output, practitioner-delivered.

APP 1.7 Assessment

Fixed-fee assessment. Five documents in 48 hours. The complete APP 1.7 governance record.

AI Risk Register

Recurring subscription for continuous AI risk monitoring and governance register maintenance.

GET IN TOUCH →

Board Governance Briefing

High-seniority liability protection for NEDs and board members. Defensible record for audit committees.

GET IN TOUCH →

Shadow AI Audit

Operational audit identifying unsanctioned AI tool usage and unmanaged workflow risk.

GET IN TOUCH →

Vendor Due Diligence

AI vendor risk assessment framework. Know what your vendors are doing with your data before you sign.

GET IN TOUCH →

EU AI Act / ISO 42001

Gap assessment for international obligations and AI management standards.

GET IN TOUCH →

DDQ Response Framework

High-urgency sales enablement for growth-stage companies facing AI governance questionnaires.

GET IN TOUCH →

Sector Risk Libraries

Granular sector-specific AI risk intelligence. 187 healthcare risks. 143 financial services risks.

GET IN TOUCH →
Start where the urgency is

Three minutes to find out where you stand.

The Eligibility Checker confirms whether APP 1.7 applies to your organisation. Free. No account required. Immediate result.

If your Attesta deliverable does not meet the APP 1.7 disclosure standard, we will revise it until it does — at no additional cost.

Attesta delivers documents. This is not legal advice. Clients should have their existing solicitor review the output. AI Decoded Pty Ltd.