Guide March 2026 · 10 min read

AI & Data Privacy: What Finance Teams Need to Know

Your data is your responsibility. This guide covers what every finance team should check before using AI tools – from data processing agreements to GDPR compliance and the EU AI Act.

The #1 Rule: Free Tier = No Business Data

Never use free or consumer AI plans for business data

Every major AI provider uses free-tier data for model training by default. The pattern is the same everywhere: consumer plans train on your data; business plans do not. This is the single most important thing to get right.

The good news: all four major AI providers – OpenAI (ChatGPT), Anthropic (Claude), Google (Gemini), and Microsoft (Copilot) – offer business plans with proper Data Processing Agreements (DPAs) and no-training guarantees. The bad news: their free tiers offer none of these protections.

Who Trains on Your Data?

This is the question that matters most. Here is the answer, broken down by plan type:

Provider | Free / Consumer | Business / Enterprise
OpenAI | Yes – Free, Plus, Pro all train by default | No – Team, Enterprise, API
Anthropic | Yes – consumer Claude trains since Sept 2025 | No – Team, Enterprise, API
Google | Yes – free Gemini trains by default | No – Workspace paid editions
Microsoft | Yes – consumer Copilot | No – M365 Copilot

All four providers let consumer users opt out of training, but opting out is not retroactive and is hard to enforce across a team. The only safe approach for business use is a paid plan with a DPA.

Provider Comparison: DPAs, Residency & Retention

Beyond training, here are the three things that matter for GDPR compliance: whether a DPA exists, where your data is stored, and how long it is kept.

OpenAI (ChatGPT)
DPA: Updated Jan 2026. Covers Team, Enterprise, API.
EU Data Residency: Available since Feb 2025. In-region GPU inference since Jan 2026. No extra cost.
Data Retention: Enterprise: custom (min 90 days). API: 30 days by default; Zero-Data-Retention available.

Anthropic (Claude)
DPA: Auto-incorporated with the Commercial Terms. Includes Standard Contractual Clauses.
EU Data Residency: API: EU regions since Aug 2025 (AWS). Consumer claude.ai: US only.
Data Retention: API: 7 days (reduced Sept 2025). Zero-Data-Retention addendum available.

Google (Gemini for Workspace)
DPA: Cloud Data Processing Addendum (CDPA). Applies automatically for EEA users.
EU Data Residency: Follows existing Workspace data region settings. Admin-controlled.
Data Retention: Session-only. No prompt or response storage beyond the user session.

Microsoft (M365 Copilot)
DPA: Microsoft Products and Services DPA. Enterprise-grade.
EU Data Residency: EU Data Boundary service. In-country processing for Germany expanding in 2026.
Data Retention: Governed by M365 data lifecycle policies. Inherits your existing compliance setup.

Watch out: Subprocessors

Microsoft Copilot added Anthropic as a subprocessor in January 2026. For EU/EFTA/UK tenants, this is disabled by default because Anthropic data is excluded from EU data residency. Check your admin settings.

Data Classification: What Can Go Into AI?

Before any data enters an AI tool, your team needs to know what level of sensitivity it has. Here is a practical four-tier model:

1. Public
Published financials, press releases, public pricing. Safe to use with any AI tool.

2. Internal
Internal budgets, org charts, general policies. Use only with business-tier AI tools with a DPA.

3. Confidential
Financial forecasts, unpublished results, customer lists, vendor contracts, strategic plans. Enterprise AI tools only – with a DPA, EU residency, and no-training guarantees.

4. Restricted
Bank account details, payment card data, employee PII (salaries, tax IDs), credentials, trade secrets. Do not use with external AI tools – on-premise or Zero-Data-Retention deployments only.
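The four tiers can be encoded so that checking a tool's data ceiling becomes mechanical rather than a judgment call at paste time. A minimal Python sketch – the tier-to-tool mapping below is an illustrative policy of our own, not a vendor-defined rule:

```python
from enum import IntEnum

class DataTier(IntEnum):
    """Four-tier sensitivity model from this guide."""
    PUBLIC = 1        # published financials, press releases
    INTERNAL = 2      # budgets, org charts, general policies
    CONFIDENTIAL = 3  # forecasts, customer lists, contracts
    RESTRICTED = 4    # bank details, PII, credentials

# Highest tier each tool class may receive (illustrative policy,
# mirroring the tiers above; adapt to your own approvals).
TOOL_CEILING = {
    "consumer": DataTier.PUBLIC,
    "business_with_dpa": DataTier.INTERNAL,
    "enterprise_eu_residency": DataTier.CONFIDENTIAL,
    "on_prem_or_zdr": DataTier.RESTRICTED,
}

def allowed(tool_class: str, tier: DataTier) -> bool:
    """True if data of this tier may enter the given tool class."""
    return tier <= TOOL_CEILING[tool_class]
```

Because `IntEnum` members compare as integers, a single `<=` enforces the whole ladder: anything cleared for a restricted-capable deployment is automatically cleared for lower tiers too.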

GDPR Compliance Checklist

If your finance team operates in the EU, here is what you need to verify before rolling out any AI tool:

Signed Data Processing Agreement (DPA)

Covers your specific use case, data types, and jurisdictional requirements.

Confirmed no-training policy

Written confirmation that your data is never used for model training.

EU data residency enabled

Data stored and processed within the EU. Check provider-specific availability above.

Data Protection Impact Assessment (DPIA)

Required when processing personal data (employee info, customer data, supplier details).

Lawful basis identified

Consent, legitimate interest, or contractual necessity documented for each use case.

Internal AI usage policy

Clear guidelines on which tools are approved, what data can be used, and who can authorize exceptions.

Sub-processor review

Know which third parties handle your data. Update your Records of Processing Activities (ROPA).

AI literacy training

Required since February 2025 under the EU AI Act. Staff must understand privacy implications.

The EU AI Act: What You Need to Know

The EU AI Act entered into force on August 1, 2024, with phased enforcement. Here is what is already in effect and what is coming:

Already in effect (since Feb 2025)

  • Prohibited AI practices banned (social scoring, manipulative AI)
  • AI literacy obligations: staff using AI must have sufficient understanding of the tools

Coming August 2026

  • Full enforcement for high-risk AI systems (credit scoring, employment decisions)
  • Transparency requirements: users must be informed when interacting with AI
  • Risk management systems, technical documentation, fundamental rights impact assessments

What counts as high-risk for finance?

Most general-purpose AI use (drafting, summarizing, analyzing) falls under minimal risk, with no specific obligations. However, AI used for creditworthiness assessment or credit scoring is classified as high-risk, and AI used for automated decision-making that affects individuals may also qualify. Penalties run up to 35 million EUR or 7% of global annual turnover, whichever is higher.
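Under the Act's wording, the applicable ceiling is the higher of the two figures, so the cap is a simple maximum. As arithmetic (a sanity-check sketch, not legal advice):

```python
def max_ai_act_penalty(global_turnover_eur: int) -> float:
    """Upper bound of the EU AI Act fine for the most serious
    violations: the higher of EUR 35 million or 7% of worldwide
    annual turnover."""
    return max(35_000_000, global_turnover_eur * 7 / 100)
```

For a company with 100M EUR turnover the fixed 35M EUR floor dominates; for a 2B EUR company the 7% figure (140M EUR) takes over.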

Your Action Plan

Here is what to do this week:

1. Audit your current AI tools
List every AI tool your team uses. For each one, check: is it a free or a paid plan? Is there a DPA? Where is data stored?

2. Upgrade to business plans
Move all team members off free/consumer tiers immediately. The cost is minimal compared to the compliance risk.

3. Classify your data
Use the four-tier model above. Make sure everyone on the team knows what level of data they are working with.

4. Write an AI usage policy
One page is enough: which tools are approved, what data levels are allowed per tool, and who authorizes exceptions.

5. Enable EU data residency
Check your admin settings for each provider. Most offer EU storage at no extra cost on business plans.

6. Train your team
AI literacy has been legally required since February 2025 under the EU AI Act. Cover the basics: what data goes where, what to avoid, when to ask.
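The audit in step 1 pairs naturally with the policy in step 4: once the inventory is structured data, the upgrade decision in step 2 becomes a mechanical filter. A minimal sketch – all tool names and record fields here are hypothetical, not a vendor API:

```python
from dataclasses import dataclass

@dataclass
class AIToolRecord:
    """One row of the AI tool inventory (hypothetical fields)."""
    name: str
    plan: str            # e.g. "free", "team", "enterprise", "api"
    dpa_signed: bool
    eu_residency: bool

# Example inventory from a step-1 audit (made-up entries).
inventory = [
    AIToolRecord("Chat tool A", plan="free", dpa_signed=False, eu_residency=False),
    AIToolRecord("Copilot suite B", plan="enterprise", dpa_signed=True, eu_residency=True),
]

# Step-2 filter: anything on a free tier or without a DPA must move.
needs_upgrade = [t.name for t in inventory
                 if t.plan == "free" or not t.dpa_signed]
```

Keeping the inventory in version control alongside the one-page policy also gives you a ready-made artifact for the ROPA and sub-processor reviews in the GDPR checklist above.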

The bottom line

Using AI in finance is not risky – using it carelessly is. With a business-tier plan, a signed DPA, EU data residency, and clear internal policies, your team can leverage AI confidently and stay fully compliant. The setup takes a day. The productivity gains last forever.

Stay compliant, stay informed

Subscribe to our newsletter for updates on AI privacy, new regulations, and practical guides for finance teams.