
AI Data Privacy in New Zealand: What Kiwis Need to Know

When you paste a client email into ChatGPT, where does it go? A plain-English guide for New Zealand professionals.

📅 February 2026 · ⏱️ 8 min read · By Caelan Huntress

Every day, New Zealand professionals are pasting client information, contract details, medical notes, and business strategy into ChatGPT. Most haven't stopped to ask where that data actually goes. This guide answers that question — plainly, without the technical jargon.

⚠️ This is not legal advice

Privacy law is complex and situation-dependent. If you're in a regulated industry (law, medicine, finance), consult your professional body or legal counsel for guidance specific to your practice.

What Happens When You Use ChatGPT?

When you type into ChatGPT, your message travels from your browser to OpenAI's servers — located primarily in the United States. OpenAI processes your prompt, generates a response, and (depending on your settings) may retain your conversation.

By default, OpenAI's terms allow them to:

  • Store your conversations
  • Use your conversations to improve their models
  • Share data with third parties under certain conditions
  • Retain data even after you delete it (for some period)

You can opt out of some of this by disabling chat history in settings. ChatGPT Enterprise offers stronger privacy controls. But in all cases, your data is being processed on overseas servers, subject to US law.

Why This Matters in New Zealand

The Privacy Act 2020

New Zealand's Privacy Act 2020 governs how personal information is collected, stored, and used. The key principles relevant to AI use:

  • Purpose limitation: Personal information should only be used for the purpose for which it was collected
  • Security: You must take reasonable steps to protect personal information you hold
  • Overseas disclosure: Before sending personal information overseas, you should check the recipient offers comparable protections and tell people their information may be shared with overseas parties

Sending a client's personal information to a US-based AI company raises questions about compliance — particularly the disclosure and security principles. The Office of the Privacy Commissioner (OPC) has published initial expectations for AI use but no definitive AI-specific rules, and in the meantime the existing framework applies.

Professional Obligations

Many New Zealand professions have additional confidentiality requirements that go beyond the Privacy Act:

| Profession | Relevant Obligation | Risk Level |
| --- | --- | --- |
| Lawyers | Legal professional privilege, NZLS rules | 🔴 High |
| Accountants | Client confidentiality, CPA/CA obligations | 🔴 High |
| Doctors / healthcare | Health Information Privacy Code | 🔴 High |
| Financial advisers | FMA requirements, client information rules | 🔴 High |
| HR professionals | Employee information, Employment Relations Act | 🟡 Medium |
| Teachers / educators | Student data, school board obligations | 🟡 Medium |
| Government employees | Official Information Act, government data policy | 🟡 Medium |
| Business consultants | Contractual confidentiality (varies) | 🟢 Lower |

Common Privacy Mistakes with AI

1. Pasting Client Emails Verbatim

When you copy an email from a client and paste it into ChatGPT to help draft a response, you've just sent that client's personal information to an American company. If the email contains identifying details, complaints, or sensitive context, this may breach your confidentiality obligations.

Better approach: Anonymise or paraphrase before using AI, or use a locally hosted AI that never transmits data externally.

2. Uploading Confidential Documents

ChatGPT allows document uploads. Uploading a legal brief, financial report, or medical record for AI analysis sends that document to OpenAI's servers. Even if their privacy controls are robust, your client didn't consent to this.

Better approach: Use an enterprise plan with explicit data handling agreements, or use a local AI tool.

3. Strategy and IP Discussions

Asking AI to help you plan a client acquisition strategy, develop a new product, or refine a proprietary process exposes business intelligence to a third party. For many businesses this is an acceptable trade-off, but for competitively sensitive work it deserves a deliberate decision rather than a default.

The Local AI Alternative

This is the primary reason privacy-conscious New Zealand professionals choose a personal AI assistant over cloud-only tools.

When your AI runs on dedicated hardware in your office, everything changes:

  • ✅ Your data never leaves your premises
  • ✅ No third-party servers involved (for local processing)
  • ✅ You control exactly what data is retained
  • ✅ Client conversations stay in New Zealand
  • ✅ You can demonstrate data handling controls to clients and regulators

WHAT LOCAL PROCESSING MEANS

OpenClaw runs AI models locally on your Mac Mini. When you ask it to summarise a client email, the email never leaves your hardware. The AI processes it right there.

When you ask it to search the web or use cloud models for complex reasoning, only the specific query goes out — not the underlying client data.

Practical Privacy Guidelines for AI Use

Regardless of what AI tools you use, these practices reduce your risk:

  1. Anonymise before you AI-ify. Replace names, addresses, and identifying details with placeholders before passing anything to a cloud AI tool.
  2. Read the terms. Know what OpenAI, Anthropic, and Google do with your data. They update their policies periodically.
  3. Use enterprise plans for sensitive work. ChatGPT Enterprise and Claude for Enterprise offer stronger data protection than consumer plans.
  4. Have a data handling policy. Know what types of information you will and won't put into AI tools. Document it.
  5. Tell clients. If AI is material to how you handle their information, disclosure is both ethical and, in many professions, required.
  6. Consider local AI for the sensitive stuff. A dedicated AI installation gives you the capability of cloud AI with the privacy of on-premises data handling.
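Step 1 above — swapping identifying details for placeholders before anything reaches a cloud AI tool — is easy to automate. Below is a minimal sketch in Python. The patterns (emails and NZ-style phone numbers) are illustrative assumptions, not a complete redaction list: real client data will need patterns tuned to what your practice actually handles, and names usually need manual review.

```python
import re

# Illustrative patterns only — extend for addresses, IRD numbers,
# client names, or whatever your data actually contains.
PATTERNS = {
    "EMAIL": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "PHONE": re.compile(r"\b(?:\+64|0)[\s-]?\d[\d\s-]{6,}\b"),
}

def anonymise(text: str) -> tuple[str, dict[str, str]]:
    """Replace identifying details with numbered placeholders.

    Returns the cleaned text plus a local mapping so the original
    details never leave your machine.
    """
    mapping: dict[str, str] = {}
    for label, pattern in PATTERNS.items():
        for match in pattern.findall(text):
            if match not in mapping.values():
                placeholder = f"[{label}_{len(mapping) + 1}]"
                mapping[placeholder] = match
                text = text.replace(match, placeholder)
    return text, mapping

def restore(text: str, mapping: dict[str, str]) -> str:
    """Put the original details back into the AI's response."""
    for placeholder, original in mapping.items():
        text = text.replace(placeholder, original)
    return text
```

The key design point is that the placeholder-to-detail mapping stays on your hardware: you send the anonymised text to the cloud tool, then run `restore` on the response locally.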

The NZ Regulatory Landscape is Evolving

The Office of the Privacy Commissioner released initial AI guidance in 2024, and professional bodies are beginning to develop AI-specific policies. The landscape is moving fast.

Our recommendation: get ahead of the regulations rather than wait for them. Professionals who establish good AI data hygiene practices now will be well positioned when formal rules arrive — and will have the client trust that comes from demonstrating they took privacy seriously early.

For structured guidance on responsible AI use in your professional context, GenAI Training NZ offers workshops that include privacy and compliance considerations tailored for NZ organisations.

Frequently Asked Questions

Does ChatGPT store my conversations?

By default, OpenAI stores your conversations and may use them to improve their models. You can disable chat history in settings, but this varies by plan. Even with history off, your data is still processed on their servers.

Is using ChatGPT for client information a privacy risk?

Potentially, yes — especially in regulated industries. If you paste a client's personal information, contract details, or medical records into ChatGPT, that data is transmitted to and processed on OpenAI's servers in the US. This may conflict with your professional privacy obligations.

Does the Privacy Act 2020 apply to AI use?

Yes. The New Zealand Privacy Act 2020 requires that personal information be handled securely and not used for purposes beyond those for which it was collected. Sending client data to overseas AI servers raises compliance questions that many professionals haven't yet considered.

How does a local AI assistant protect my data?

A local AI assistant (like OpenClaw) runs on hardware in your own home or office. Your data never leaves your premises unless you explicitly send it. The AI processes everything locally, so client information, business strategy, and private conversations stay private.

What industries have the highest AI privacy risk?

Legal, medical, accounting, financial services, and government are highest risk — these professions handle sensitive personal data that's subject to specific confidentiality rules. Education and HR carry medium risk due to student and employee privacy requirements.

Keep Your Client Data in New Zealand

A dedicated AI assistant runs on your hardware. Your data stays on your premises.