Consent capture in AI voice calls: GDPR and PECR guide

Consent capture in AI voice calls governs your entire outbound programme under GDPR and PECR. Get the lawful basis framework UK enterprises use pre-launch.

DILR.AI Compliance Intelligence: Consent capture in AI voice calls
Frameworks: GDPR · PECR 2003 · Data (Use and Access) Act 2025 · EU AI Act 2024/1689
  • £17.5m: maximum PECR fine from June 2025 (previously £500,000)
  • £550k: ICO AI voice avatar fine, September 2025
  • 119: PECR fines issued 2019–2025
Source: ICO enforcement register · Data (Use and Access) Act 2025 · PECR Regulation 19

Most enterprise operations teams treat consent as a pre-launch checkbox. Review the form language, get legal to sign off, go live. The problem is that consent architecture under UK GDPR and PECR is not a document — it is a live operational system, and the cost of getting it wrong has fundamentally changed.

The Data (Use and Access) Act 2025, which received Royal Assent on 19 June 2025, aligned PECR maximum penalties with UK GDPR. Maximum fines for automated calling violations can now reach £17.5 million or 4% of global annual turnover — up from the previous ceiling of £500,000. That is not an incremental change in enforcement risk. It is existential.

That legislative shift arrived alongside the ICO's September 2025 enforcement against two energy firms using AI-powered voice-avatar technology. Green Spark Energy (GSE) and Home Improvement Marketing (HIM) made a combined 11.9 million automated marketing calls without prior consent, presenting AI-driven systems as human callers named "Jo," "Helen," and "Ian." Combined fine: £550,000 — issued under the old cap. ICO Head of Investigations Andy Curry was direct: "Advances in technology may make detection harder, but the rules remain the same."

The message for enterprises considering or scaling AI voice outbound programmes is specific: the ICO classifies AI-generated conversational voice systems as automated calls under PECR Regulation 19. There is no "conversational AI exemption." Legitimate interests — available as a lawful basis for live human-agent marketing calls — does not apply to AI voice outbound marketing. Prior explicit consent, from a named organisation, covering the specific automated call type, is the only lawful route. This guide explains the regulatory framework and how to build a consent architecture that holds under scrutiny.

Key takeaway

Before your first AI voice campaign goes live, consent architecture must satisfy three simultaneous tests: PECR Regulation 19 prior consent (automated-call-specific, not generic), UK GDPR Article 7 conditions (specific, granular, auditable), and TPS/CTPS screening compliance. All three are mandatory. None substitutes for the others. Review your GDPR and compliance documentation requirements before configuring any outbound campaign.

Four Requirements for Valid Consent Under PECR and UK GDPR
  1. Named consent: names your organisation specifically
  2. Type disclosed: covers AI/automated calls explicitly
  3. Purpose specific: marketing scope defined, not bundled
  4. Auditable record: timestamped, retrievable on ICO request

The UK regulatory framework for outbound AI voice is a two-layer structure. PECR governs the electronic communication itself — whether you are permitted to make the call at all. UK GDPR governs the personal data processed during and after the call. Both apply simultaneously, and neither substitutes for the other.

PECR Regulation 19 is unambiguous: you must not use an automated calling system for direct marketing purposes unless the called party has previously consented to receive such calls from you. The ICO's published guidance and its September 2025 enforcement make clear that AI-generated voice systems — including LLM-driven conversational agents — meet the definition of an automated calling system. They are not "live speech" from a human agent, and no degree of conversational sophistication changes that classification.

The enforcement record confirms the severity. Between 2019 and September 2025, the ICO issued 119 PECR monetary penalty notices totalling approximately £10.5 million under the old £500,000 cap. Those cases are now precedent — under DUAA 2025 penalty alignment, equivalent violations carry potential exposure up to £17.5 million.

The GSE / HIM energy case is the clearest AI-specific enforcement on record. Both firms deployed voice-avatar technology: pre-recorded scripts voiced by actors, played by automated systems presenting as named human callers. The ICO executed a search warrant at the director's residence in March 2024. The September 2025 fines — £250,000 for GSE, £300,000 for HIM — were issued under the old cap. The same conduct under current rules would trigger exposure in the millions.

Two Regulation 19 requirements trip enterprises that know the basics but miss the detail. First, consent must specifically name your organisation. Data subjects who opted in to receive "marketing from our partners" have not consented to receive calls from you. Generic consent from a third-party data broker creates liability for both the broker and the caller.

Second, consent for live agent calls does not transfer to AI voice calls. If your CRM holds consent records from an existing live-call programme and you switch to AI voice delivery, those records are insufficient under Regulation 19 without explicit disclosure that calls will be automated or AI-generated.

There is also a widespread enterprise misconception that B2B automated calls to corporate subscribers carry lower risk because PECR's electronic mail rules do not apply to corporate subscribers. This is legally incorrect. The ICO's direct marketing guidance is explicit: the corporate subscriber exemption covers electronic mail only, not automated voice calls. Regulation 19 applies equally to business contacts. Enterprises running AI voice outbound to corporate contacts without prior consent are not in a safer regulatory position than B2C operators.

Lawful basis under UK GDPR: what applies to AI voice marketing

PECR governs whether the call is permitted. UK GDPR Article 6 governs the lawful basis for processing the personal data involved. Both must be satisfied. The practical mapping across call types is:

| Call type | PECR requirement | UK GDPR Article 6 basis |
| --- | --- | --- |
| AI voice outbound marketing | Prior explicit consent (Reg 19) | Consent — Article 6(1)(a) |
| Live human agent marketing call | No prior consent required — screen TPS/CTPS | Legitimate interests — Article 6(1)(f) |
| AI voice inbound (customer service) | PECR Reg 19 not applicable | Contract or legitimate interests — Article 6(1)(b)/(f) |
| AI voice appointment reminders (existing customers) | Not direct marketing | Legitimate interests or contract |
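
For campaign configuration, this mapping can be encoded directly so tooling refuses to launch a campaign whose required checks are not configured. A minimal sketch; the keys, values, and function names here are illustrative, not from any regulator's schema:

```python
# Hypothetical encoding of the call-type / lawful-basis mapping above.
LAWFUL_BASIS = {
    "ai_voice_outbound_marketing": {
        "pecr": "prior_explicit_consent_reg19",
        "gdpr_art6": "consent_6_1_a",
    },
    "live_agent_marketing_call": {
        "pecr": "tps_ctps_screening",
        "gdpr_art6": "legitimate_interests_6_1_f",
    },
    "ai_voice_inbound_service": {
        "pecr": "not_applicable",
        "gdpr_art6": "contract_or_li_6_1_b_f",
    },
}

def required_checks(call_type: str) -> dict:
    """Look up the PECR and Article 6 requirements for a campaign's call type."""
    return LAWFUL_BASIS[call_type]
```

A campaign builder can then refuse a launch when, for example, `required_checks(...)["pecr"]` demands Regulation 19 consent but no consent source is attached.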

For consent to meet UK GDPR Article 7 conditions it must be: freely given, specific, informed, and demonstrated by unambiguous positive action. Crucially, the burden of proof rests with the controller — not the regulator. If you cannot produce evidence that a data subject gave consent, at what point, via what mechanism, and to which specific organisation and call type, that consent does not exist in law.

The HelloFresh case — a £140,000 ICO fine issued January 2024 — illustrates what happens when consent fails the granularity test at enterprise scale. HelloFresh bundled consent inside an age-confirmation checkbox that did not mention SMS, and retained data for marketing purposes 24 months post-cancellation without disclosure. For AI voice programmes, the equivalent failure is obtaining consent to "receive marketing communications" without disclosing that delivery will be by an automated AI voice system. You will have consent records. You will not have valid consent. That distinction is what the ICO enforces.

This is also where your enterprise AI voice business case must account for compliance infrastructure costs — not just the per-call economics. Consent architecture, suppression list management, and audit logging are implementation requirements, not optional enhancements.

  • £17.5m: max PECR fine from June 2025
  • 11.9m: unconsented AI voice calls made by GSE and HIM (2023)
  • £550k: combined ICO fine, AI voice avatar enforcement, September 2025
  • 119: PECR monetary penalties issued 2019–2025
See it in action

DILR.AI's outbound voice agents include built-in consent validation, TPS/CTPS screening logic, and audit-ready call logging — the compliance infrastructure every enterprise programme needs before campaign launch, explored in detail on our outbound solutions page or live in the Dilr Voice platform.

Consent compliance is not a point-in-time decision — it is a continuous operational system that must run on every campaign, for every contact, every time. The enterprises that avoid ICO enforcement are those that treat consent as infrastructure: automated, auditable, and integrated with campaign flow before the first call is made.

Valid consent for outbound AI voice marketing requires four components operating simultaneously. The four requirements listed earlier are not a checklist to complete once; they are standards your consent capture mechanism must meet on every opt-in event.

Named and specific consent statement. The consent request must identify your organisation by name, state that communications will be delivered by an automated AI voice system (not just "by phone" or "by marketing communications"), and list the specific categories covered. Bundled consent — combined with unrelated terms, tucked inside a general privacy notice, or implied by pre-ticked boxes — fails Article 7's specificity requirement regardless of how many contacts are technically on your list.

Unambiguous affirmative action. Passive behaviour — scrolling past terms, completing a form that includes implied consent, or silence on a previous call — does not constitute consent. The data subject must take a clear, active step. This distinction matters particularly for enterprises sourcing prospect data from third-party brokers, where the consent capture mechanism is outside your control. If you cannot verify what the data subject was shown and what action they took, you cannot demonstrate consent.

Timestamped, auditable consent record. The ICO requests consent audit trails during investigations. Records must capture: who consented, when, via which mechanism, to which specific organisation, and for which call type. Systems that store consent as a binary "opted-in: yes/no" flag without underlying evidence fail this standard. For AI voice in financial services specifically, FCA expectations on record-keeping add a further layer of documentation obligation.
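
A consent store that meets this standard captures evidence, not a flag. A minimal sketch of what "per-contact retrievable" looks like in practice; the field and function names are hypothetical, not drawn from ICO guidance:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class ConsentRecord:
    subject_id: str        # who consented
    organisation: str      # which named organisation the consent covers
    call_type: str         # e.g. "ai_voice_outbound": the specific automated call type
    mechanism: str         # e.g. "web_form_v3": how consent was captured
    statement_shown: str   # exact wording the data subject saw
    captured_at: datetime  # timezone-aware timestamp of the affirmative action

def is_auditable(rec: ConsentRecord) -> bool:
    """A bare opted-in flag fails; every evidential field must be present."""
    return all([
        rec.subject_id,
        rec.organisation,
        rec.call_type,
        rec.mechanism,
        rec.statement_shown,
        rec.captured_at.tzinfo is not None,
    ])
```

Storing the exact statement shown alongside the timestamp is what lets you answer an ICO audit question years later without reconstructing what a long-retired form said.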

Real-time opt-out with suppression synchronisation. Withdrawal must be as easy to execute as consent was to give. In practice: every AI voice call must offer an in-call opt-out option; suppression list update must be immediate and permanent; and opt-outs received through any channel — email unsubscribe, web preference centre, direct request — must prevent AI voice outreach within the same campaign cycle. Failing to honour opt-outs promptly contributed to enforcement in both the GSE/HIM and HelloFresh cases.
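
One way to meet the immediacy and cross-channel requirements is a single suppression store that every channel writes to and the dialler reads before each call. A minimal sketch under that assumption, with hypothetical names:

```python
import time

class SuppressionList:
    """Single suppression store shared by every channel; writes take effect immediately."""

    def __init__(self) -> None:
        self._entries: dict[str, dict] = {}

    def opt_out(self, number: str, channel: str) -> None:
        # Permanent entry, recorded the moment the opt-out arrives from any channel
        # (in-call keypress, email unsubscribe, web preference centre, direct request).
        self._entries[number] = {"channel": channel, "at": time.time()}

    def is_suppressed(self, number: str) -> bool:
        # The dialler calls this immediately before each outbound attempt.
        return number in self._entries
```

The design choice that matters is that there is one store, not one per channel: an email unsubscribe blocks the next AI voice call in the same campaign cycle because both read and write the same entries.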

You can see how compliant enterprise deployments handle this end-to-end — from consent capture at the data collection stage through to call-level suppression logic.

Every AI voice call generates personal data in two distinct forms: the call recording and the AI-generated transcript. Both are subject to UK GDPR Article 5(1)(e)'s storage limitation principle — retained only as long as necessary for the stated purpose, then deleted. Retention obligations vary materially by sector:

| Sector | Use case | Retention period | Regulatory basis |
| --- | --- | --- | --- |
| Financial services | Regulated calls | Minimum 5 years (up to 7) | FCA SYSC 9.1 |
| General enterprise | Quality assurance | 90 days | UK GDPR Art 5(1)(e) |
| Customer service | Dispute resolution | 6 months | ICO guidance |
| Healthcare | Patient interaction records | NHS Records Management Schedule | NHS Code of Practice |
| Collections / debt | Evidence of agreements | 6 years | Limitation Act 1980 |

AI-generated transcripts are derived personal data — not anonymised outputs — and carry identical retention and deletion obligations to the source recording. Enterprises must define and enforce documented retention policies covering both formats, with automated deletion at policy expiry. The absence of a documented retention policy, or failure to enforce it operationally, constitutes a UK GDPR Article 5 violation independent of the consent question.

A further compliance layer arrives on 2 August 2026: EU AI Act Article 50 requires that persons interacting with AI systems be informed they are doing so, unless it is obvious from context. For UK enterprises with EU-facing operations, this is a disclosure obligation that sits above and alongside PECR consent. The ICO's September 2025 enforcement — which specifically called out voice-avatar systems designed to make callers believe they were speaking with a human — signals that UK regulators are already applying equivalent reasoning under existing domestic law, with or without the EU Act's formal entry into force.

Before every contact enters an outbound AI voice campaign, a decision flow should run: does the consent name your organisation, does it cover automated calls, is the number clear of TPS/CTPS, and is it absent from the suppression list? Every "No" path leads to suppression. How consent records and suppression data are protected is part of your enterprise security posture and is increasingly evaluated during enterprise procurement.
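
That decision logic can be expressed as a simple gate run per contact before dialling. A minimal sketch; the field and function names are illustrative, not from any specific platform:

```python
from dataclasses import dataclass

@dataclass
class Contact:
    number: str
    consent_names_us: bool         # consent names our organisation specifically
    consent_covers_ai_calls: bool  # disclosure covered automated/AI voice delivery
    on_tps_or_ctps: bool           # appears on the TPS/CTPS register
    on_suppression_list: bool      # opted out via any channel

def may_dial(c: Contact) -> tuple[bool, str]:
    """Every 'No' answer routes the contact to suppression, never to the dialler."""
    if c.on_suppression_list:
        return False, "suppressed: prior opt-out"
    if not c.consent_names_us:
        return False, "suppressed: consent does not name our organisation"
    if not c.consent_covers_ai_calls:
        return False, "suppressed: consent does not cover automated/AI calls"
    if c.on_tps_or_ctps:
        return False, "suppressed: TPS/CTPS registered"
    return True, "eligible"
```

Returning the reason alongside the decision gives the audit trail a record of why each contact was or was not dialled, which is exactly what an ICO investigation asks for.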

Pre-launch compliance checklist
  • PECR Reg 19 consent: specific to your organisation and the automated call type
  • UK GDPR Article 7: freely given, specific, informed, unambiguous
  • Consent records: timestamped, auditable, per-contact retrievable
  • TPS/CTPS screening: current lists, applied at campaign configuration
  • Suppression list: integrated, real-time, cross-channel synchronised
  • Retention policy: documented, covers recordings and transcripts separately
Next step

Deploy AI voice outbound with compliance built in from day one

DILR.AI's outbound voice platform includes consent validation, TPS/CTPS screening, audit-ready call logging, and configurable suppression logic — the infrastructure your legal and operations teams need before the first campaign goes live. Built for UK enterprise compliance, not bolted on after the fact.

