
DPDP Act vs GDPR vs IT Act: Complete Comparison for Indian AI Teams

Side-by-side comparison of India's DPDP Act 2023, EU's GDPR, and IT Act 2000. Understand differences in consent, penalties, scope, and what they mean for AI governance.

12 min read · Updated 2026-05-04

Origins and legislative intent

The DPDP Act 2023 is India's third major attempt at data protection legislation after the Personal Data Protection Bills of 2018 and 2021 were withdrawn. Unlike the EU's GDPR, which emerged from a strong civil rights tradition treating data protection as a fundamental right, the DPDP Act reflects a more pragmatic approach balancing individual protections with India's digital economy growth ambitions.

The GDPR (Regulation 2016/679) has applied since May 2018 and has been shaped by years of enforcement decisions, case law, and regulatory guidance from the European Data Protection Board. The DPDP Act, by contrast, is nascent: its Rules have not yet been published and enforcement has not begun. Indian companies must comply with an Act whose implementation details are still being worked out.

The IT Act 2000 (Section 43A, read with the SPDI Rules 2011) provided India's previous data protection framework. It covered 'sensitive personal data or information' (SPDI) held by body corporates, required reasonable security practices, and allowed affected individuals to claim compensation where negligent handling caused wrongful loss or wrongful gain. The DPDP Act supersedes the SPDI Rules but does not repeal the IT Act entirely: sections relating to cybercrime, intermediary liability, and electronic contracts remain in force.

Penalty structures: ₹250 crore vs €20 million vs compensation

GDPR's maximum fine is €20 million or 4% of global annual turnover, whichever is higher. For large multinationals, this translates to penalties in the billions — Meta was fined €1.2 billion by the Irish DPC in 2023. DPDP penalties are capped in absolute terms (₹250 crore maximum) rather than as a percentage of revenue. For large Indian conglomerates or multinationals operating in India, the DPDP penalty cap may be less consequential than GDPR's turnover-based ceiling.

For Indian startups and SMEs, however, ₹250 crore is far more punishing relative to their scale than the equivalent GDPR fine would be for their EU revenue. A ₹250 crore penalty would be terminal for most Indian startups. This asymmetry means smaller Indian companies face proportionally higher penalty exposure under DPDP than smaller EU companies face under GDPR.
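
To make the asymmetry concrete, here is a minimal sketch comparing the two ceilings for companies of different sizes. The exchange rate and the revenue figures are illustrative assumptions, not values from either law.

```python
# Illustrative comparison of GDPR and DPDP penalty ceilings.
# Assumptions (not from either law): 1 EUR = 90 INR; revenues are examples.
EUR_TO_INR = 90
CRORE = 10_000_000  # 1 crore = 10 million rupees

def gdpr_ceiling_inr(global_turnover_inr: float) -> float:
    """GDPR Art. 83(5): EUR 20 million or 4% of global turnover, whichever is higher."""
    return max(20_000_000 * EUR_TO_INR, 0.04 * global_turnover_inr)

def dpdp_ceiling_inr() -> float:
    """DPDP Schedule: absolute cap of Rs 250 crore, regardless of revenue."""
    return 250 * CRORE

for label, turnover in [("startup", 50 * CRORE), ("conglomerate", 200_000 * CRORE)]:
    gdpr, dpdp = gdpr_ceiling_inr(turnover), dpdp_ceiling_inr()
    print(f"{label}: GDPR ceiling {gdpr / CRORE:,.0f} cr, DPDP ceiling {dpdp / CRORE:,.0f} cr, "
          f"DPDP cap vs revenue: {dpdp / turnover:.1%}")
```

For the hypothetical startup the DPDP cap is five times its annual revenue; for the conglomerate it is roughly a tenth of one percent.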

The IT Act's compensation mechanism was largely civil — Section 43A allowed individuals to claim compensation from negligent body corporates. DPDP's penalties are regulatory and do not automatically translate into individual compensation. However, affected individuals may pursue civil remedies alongside DPDP penalties, creating cumulative financial exposure.

Cross-border transfers: adequacy vs trusted countries

GDPR's cross-border transfer mechanism (Chapter V) requires that transfers to non-EEA countries occur only under an adequacy decision, standard contractual clauses, binding corporate rules, or specific derogations. This is a well-developed framework with a substantial body of case law, including the landmark Schrems II ruling that invalidated the EU-US Privacy Shield.

DPDP's cross-border transfer provision (Section 16) takes a different approach: instead of an adequacy-style whitelist, transfers are permitted to any country except those the Central Government restricts by notification. No restricted-country list has yet been published, and any processing abroad must still rest on the Act's general grounds of consent (Section 6) or certain legitimate uses (Section 7). This creates uncertainty for cloud-based AI products that process Indian data in US or EU data centres.

The IT Act had no cross-border transfer restriction — India was an extremely permissive data transfer environment. The DPDP Act changes this significantly. Companies accustomed to routing Indian customer data freely to US-based AI providers will need to reassess this practice once transfer rules are established.

What to do if you serve both EU and Indian customers

For companies serving both EU and Indian customers with AI products, the pragmatic approach is to comply with the stricter standard on each dimension. Use GDPR-level consent mechanisms (which generally exceed DPDP's requirements) as your baseline, add DPDP-specific Indian PII detection and redaction on top of your existing GDPR controls, and design a single audit trail that satisfies both frameworks at once.

One important difference: GDPR requires appointing a Data Protection Officer (DPO) for certain organisations (Article 37). DPDP requires appointment of a DPO only for Significant Data Fiduciaries (SDFs) under Section 10. If you have already appointed a GDPR DPO, this individual can take on DPDP responsibilities, but ensure their mandate explicitly covers DPDP obligations and Indian-specific requirements.

Consider the IP address question. Under GDPR, IP addresses are personal data (as confirmed by CJEU in Breyer). Under DPDP, IP addresses are arguably personal data when they can be linked to an individual. For consistency, treat IP addresses as personal data under both frameworks and ensure your AI systems do not log or process raw IP addresses in prompts sent to model providers.
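
As a minimal illustration of that last point, the sketch below masks IPv4 addresses before a prompt leaves the application. The regex and placeholder are assumptions for illustration; a production system would also need IPv6 handling and a proper PII detector.

```python
import re

# Matches dotted-quad IPv4 addresses; IPv6 would need a separate pattern.
IPV4_RE = re.compile(r"\b(?:\d{1,3}\.){3}\d{1,3}\b")

def redact_ips(prompt: str) -> str:
    """Replace raw IPv4 addresses with a placeholder before the prompt is sent out."""
    return IPV4_RE.sub("[REDACTED_IP]", prompt)

print(redact_ips("User 203.0.113.42 reported a login failure"))
# -> User [REDACTED_IP] reported a login failure
```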

DPDP Act pillar implementation addendum

A pillar page should also connect the legal idea to a concrete implementation path. Start with ownership: name the product owner, engineering owner, security reviewer, and compliance reviewer for this topic. Then map the systems that can create, store, transform, or transmit the relevant personal data. The map should include frontend forms, backend APIs, queues, warehouses, LLM prompts, embedding stores, admin exports, vendor dashboards, and customer-success tooling.
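
One way to hold that map is a small machine-readable inventory that reviewers and scanners can both consume. Everything below (the workflow name, owners, and field names) is a hypothetical example, not a prescribed schema.

```python
# Hypothetical ownership and data-flow inventory for one AI workflow.
DPDP_INVENTORY = {
    "workflow": "support-chat-summarisation",
    "owners": {
        "product": "a.sharma",
        "engineering": "r.iyer",
        "security_review": "sec-team",
        "compliance_review": "privacy-office",
    },
    "systems": [
        {"name": "web-form",         "role": "creates",   "data": ["name", "email", "phone"]},
        {"name": "tickets-api",      "role": "stores",    "data": ["email", "phone", "free_text"]},
        {"name": "llm-prompts",      "role": "transmits", "data": ["free_text"]},
        {"name": "embedding-store",  "role": "stores",    "data": ["free_text"]},
        {"name": "vendor-dashboard", "role": "exports",   "data": ["email"]},
    ],
}
```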

Next, document the lawful purpose and the user-facing notice. The notice should be clear enough that a data principal understands what is processed, why AI may be involved, what categories of personal data are affected, and how consent or withdrawal works. If the workflow supports children, healthcare, financial services, employment, or government delivery, treat that context as higher risk and add stricter review before allowing personal data into model calls.
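
That documentation can live next to the inventory as a structured record, with the higher-risk contexts driving a review tier. The field names and the risk rule below are assumptions for illustration.

```python
# Contexts the text above treats as higher risk.
HIGH_RISK_CONTEXTS = {"children", "healthcare", "financial", "employment", "government"}

def risk_tier(contexts: set[str]) -> str:
    """Any overlap with a high-risk context triggers stricter review."""
    return "high" if contexts & HIGH_RISK_CONTEXTS else "standard"

NOTICE = {
    "purpose": "Summarise support tickets to speed up responses",
    "ai_involved": True,
    "data_categories": ["name", "email", "ticket free text"],
    "consent_mechanism": "checkbox at ticket submission",
    "withdrawal": "account settings > privacy > withdraw consent",
    "risk_tier": risk_tier({"financial"}),  # example: a fintech support workflow
}
```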

The engineering control should run before data leaves the application boundary. Scan the full prompt package, not just the user's message. That means system instructions, retrieved snippets, tool outputs, attachments, OCR text, chat history, and structured JSON all need inspection. When a high-confidence identifier is found, redact, tokenise, block, or route to a safer model depending on the policy. Keep the original sensitive value out of general logs unless a protected exception is approved.
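
A sketch of that boundary control, under stated assumptions: the patterns are illustrative rather than validated, the policy table maps each detected type to an action, and a real deployment would swap in a tested detection engine.

```python
import re
from dataclasses import dataclass

# Hypothetical policy: detected data type -> action.
POLICY = {"aadhaar": "block", "phone": "redact", "email": "redact"}

# Illustrative patterns only; production detection needs validated rules.
PATTERNS = {
    "aadhaar": re.compile(r"\b\d{4}\s?\d{4}\s?\d{4}\b"),
    "phone":   re.compile(r"\b[6-9]\d{9}\b"),
    "email":   re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

@dataclass
class Finding:
    part: str        # which piece of the package: system prompt, history, tool output...
    data_type: str
    value: str

def scan_prompt_package(parts: dict[str, str]) -> tuple[dict[str, str], list[Finding]]:
    """Scan every part of the outgoing package, apply the policy, and return the
    sanitised parts plus the findings for the audit trail."""
    findings = []
    for name, text in parts.items():
        for dtype, pattern in PATTERNS.items():
            for match in pattern.findall(text):
                findings.append(Finding(name, dtype, match))
                if POLICY.get(dtype, "redact") == "block":
                    raise PermissionError(f"{dtype} in {name}: request blocked by policy")
                text = text.replace(match, f"[{dtype.upper()}_REDACTED]")
        parts[name] = text
    return parts, findings
```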

Audit evidence should be designed for reconstruction. A reviewer should be able to answer: when did the request happen, which application sent it, which data type was detected, which rule fired, what action was taken, which provider received the final payload, and who approved any exception. Without that trail, teams are left with policy claims rather than proof. With it, they can respond faster to buyer diligence, internal audits, breach triage, and regulator questions.
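
Concretely, one record per outbound request can carry exactly those fields. The schema below is an illustrative assumption, not a required format.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AuditRecord:
    """One record per outbound AI request, sufficient to reconstruct the event."""
    timestamp: str
    application: str            # which application sent the request
    detected_types: list[str]   # e.g. ["phone", "email"]
    rule_fired: str             # policy rule identifier
    action_taken: str           # redact / tokenise / block / reroute
    provider: str               # which provider received the final payload
    exception_approver: str | None = None  # who approved an exception, if any

record = AuditRecord(
    timestamp=datetime.now(timezone.utc).isoformat(),
    application="support-chat-summarisation",
    detected_types=["phone"],
    rule_fired="POLICY.phone",
    action_taken="redact",
    provider="example-llm-provider",
)
```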

Finally, make the process repeatable. Add sample payloads to tests, run scheduled scans against logs and representative documents, check sitemap and page health for public guidance, and keep the DPDP scanner linked from the page so readers can move from learning to action. The goal is not to freeze the system; it is to make every future AI workflow easier to review, safer to launch, and easier to explain.
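
Pinning sample payloads in tests is the cheapest way to make that repeatable, since regressions then surface in CI rather than in production. A minimal pytest-style sketch, assuming the scan_prompt_package function from the earlier example:

```python
# Minimal regression tests; assumes scan_prompt_package from the earlier sketch.
def test_phone_number_is_redacted():
    parts = {"user_message": "Call me at 9876543210 about my refund"}
    sanitised, findings = scan_prompt_package(parts)
    assert "[PHONE_REDACTED]" in sanitised["user_message"]
    assert any(f.data_type == "phone" for f in findings)

def test_clean_payload_passes_through_unchanged():
    parts = {"user_message": "The export job failed at step three"}
    sanitised, findings = scan_prompt_package(parts)
    assert sanitised["user_message"] == "The export job failed at step three"
    assert findings == []
```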

#dpdp #gdpr #it-act #comparison #global-compliance

Check your own workflow

Run a free DPDP scan before this risk reaches production.

Scan prompts, logs, documents, and API payloads for Indian PII exposure, missing redaction, and audit gaps.