EdTech AI in India: DPDP Compliance and Student Data Protection

How Indian EdTech platforms build DPDP-compliant AI tutoring and personalisation systems — children's data rules, parental consent, and LLM governance.

12 min read · Updated 2026-05-04

EdTech's DPDP Challenge: Children at Scale

Indian EdTech platforms like BYJU'S, Unacademy, Vedantu, and thousands of smaller platforms serve tens of millions of students, many under 18. DPDP Section 9 applies to all of them — verifiable parental consent is required before processing a child's personal data, and behavioural tracking and targeted advertising to children are prohibited.

The scale challenge: getting verifiable parental consent for 50 million students is a significant operational problem. OTP-based parent verification (sending OTP to parent mobile) is the pragmatic approach — high coverage, reasonable assurance. Biometric verification is too high-friction for mass market.
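The OTP-based flow above can be sketched as follows. This is a minimal, hypothetical illustration, assuming an in-memory store and an SMS channel that lives outside the snippet; the class and method names are ours, not a real API.

```python
import hashlib
import hmac
import secrets
import time

OTP_TTL_SECONDS = 300  # OTP valid for 5 minutes

def generate_otp() -> str:
    """6-digit OTP from a cryptographically secure source."""
    return f"{secrets.randbelow(1_000_000):06d}"

def hash_otp(otp: str, salt: str) -> str:
    """Store only a salted hash of the OTP, never the OTP itself."""
    return hashlib.sha256((salt + otp).encode()).hexdigest()

class ParentalConsentFlow:
    def __init__(self):
        self._pending = {}   # student_id -> (otp_hash, salt, expires_at, parent_mobile)
        self.consents = {}   # student_id -> verifiable consent record

    def start(self, student_id: str, parent_mobile: str) -> str:
        otp = generate_otp()
        salt = secrets.token_hex(8)
        self._pending[student_id] = (
            hash_otp(otp, salt), salt, time.time() + OTP_TTL_SECONDS, parent_mobile
        )
        return otp  # in production this goes to the parent via SMS, never to the client

    def verify(self, student_id: str, submitted_otp: str) -> bool:
        entry = self._pending.get(student_id)
        if entry is None:
            return False
        otp_hash, salt, expires_at, parent_mobile = entry
        if time.time() > expires_at:
            del self._pending[student_id]
            return False
        if not hmac.compare_digest(hash_otp(submitted_otp, salt), otp_hash):
            return False
        # Record verifiable consent: who verified, when, and via which channel.
        self.consents[student_id] = {
            "parent_mobile": parent_mobile,
            "verified_at": time.time(),
            "method": "parent_otp",
        }
        del self._pending[student_id]
        return True
```

The consent record, not the OTP, is what matters for DPDP evidence: it ties a student account to a verified parent channel and a timestamp.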

AI Personalisation and Student Data

AI personalisation in EdTech processes learning pace data, quiz performance, time spent per topic, engagement patterns, and difficulty-level adjustments. All of this is personal data under DPDP. For minor students, each personalisation feature requires parental consent.

The tension: AI personalisation is the core value proposition of modern EdTech. You can't get consent for every ML feature individually without destroying UX. Pragmatic approach: a single 'personalised learning' consent that covers all data-driven personalisation features, clearly described in a student-friendly (and parent-friendly) notice.
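One way to model the single-consent approach is a consent record that names the purpose, the notice version the parent saw, and the declared feature set. This is an illustrative sketch with made-up names, assuming the feature list below matches what the notice actually describes.

```python
import time
from dataclasses import dataclass, field

# Features declared in the 'personalised learning' notice (illustrative).
PERSONALISATION_FEATURES = {
    "adaptive_difficulty",
    "topic_recommendations",
    "pace_tracking",
    "revision_reminders",
}

@dataclass
class ConsentRecord:
    student_id: str
    purpose: str
    features: frozenset          # the feature set described in the notice
    notice_version: str          # the exact notice text the parent agreed to
    granted_at: float = field(default_factory=time.time)
    withdrawn: bool = False

def may_personalise(record, feature: str) -> bool:
    """A feature is allowed only if consent is live and the feature
    was declared in the notice the parent agreed to."""
    return (
        record is not None
        and not record.withdrawn
        and record.purpose == "personalised_learning"
        and feature in record.features
    )
```

The key design choice: a new ML feature ships under the existing consent only if it fits the declared set; anything outside it (behavioural ads, for instance) fails the check and needs a fresh notice.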

LLM AI Tutors: DPDP Compliance Checklist

For LLM-powered AI tutors:

1. All conversation logs are personal data — 30-day retention maximum unless there is explicit consent for longer retention.
2. Student names, school names, and performance data must be redacted before LLM API calls.
3. The AI tutor must not retain information about a student across sessions unless explicitly consented to, and the session linkage must be stored in your system, not in the LLM.
4. Parental access — parents must be able to see what their child's AI tutor has discussed.
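The redaction and retention points above can be sketched as below. The regex patterns are illustrative only — production systems need NER on top of pattern matching, and the function names are our own.

```python
import re
from datetime import datetime, timedelta, timezone

# Illustrative patterns for common Indian identifiers (not exhaustive).
REDACTIONS = [
    (re.compile(r"\b[6-9]\d{9}\b"), "[PHONE]"),              # Indian mobile numbers
    (re.compile(r"\b\d{4}\s?\d{4}\s?\d{4}\b"), "[AADHAAR]"),
    (re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"), "[PAN]"),
]

def redact(text: str, known_names: list[str]) -> str:
    """Strip known student/school names and common identifiers
    before the text leaves for an LLM API."""
    for name in known_names:
        text = re.sub(re.escape(name), "[REDACTED]", text, flags=re.IGNORECASE)
    for pattern, token in REDACTIONS:
        text = pattern.sub(token, text)
    return text

def is_expired(logged_at: datetime, consented_longer: bool) -> bool:
    """30-day default retention for conversation logs unless the parent
    explicitly consented to longer retention."""
    if consented_longer:
        return False
    return datetime.now(timezone.utc) - logged_at > timedelta(days=30)
```

Known names come from your own enrolment records; the point is that the redaction map lives in your system, so the LLM never sees the linkage between a session and a real student.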

One specific risk: AI tutors that ask students personal questions to build rapport ('what's your name? where do you go to school?') are collecting personal data. This must be disclosed and consented to.

Industry operational checklist

This guidance should be treated as an operating control, not only a reference article. The minimum checklist: a data inventory, a stated processing purpose, owner approval, PII detection at the AI boundary, redaction or tokenisation where possible, retention limits, vendor transfer records, and a tested user-rights workflow. This checklist gives engineering and compliance teams a shared language for deciding what must be blocked, what can run in shadow mode, and what needs human review before production release.

For AI systems, the review should include prompts, retrieved context, tool call arguments, model responses, logs, traces, analytics events, exports, and support attachments. Many incidents happen because teams scan only the visible form field while sensitive data moves through background context or observability tooling. CrewCheck's recommended pattern is to place the scanner at the request boundary, record the policy version, and keep audit evidence that shows which identifiers were detected and what action was taken.
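The boundary-scan pattern can be sketched as a small wrapper that records the policy version and an audit entry for every decision. This is a minimal illustration: `detect_pii` is a stub standing in for a real detector, and all names here are assumptions, not a CrewCheck API.

```python
import hashlib
import json
import time

POLICY_VERSION = "dpdp-edtech-2026.05"  # illustrative version string

def detect_pii(payload: str) -> list[str]:
    """Stub detector: flags any digits as a possible identifier.
    A real scanner would run pattern matching plus NER."""
    return ["possible_identifier"] if any(ch.isdigit() for ch in payload) else []

AUDIT_LOG = []

def scan_at_boundary(channel: str, payload: str) -> dict:
    """Scan one unit of traffic (prompt, log line, tool argument, export)
    at the request boundary and append audit evidence."""
    findings = detect_pii(payload)
    decision = "block" if findings else "allow"
    entry = {
        "channel": channel,                  # prompt / log / tool_call / export
        "policy_version": POLICY_VERSION,    # which policy made this decision
        "findings": findings,                # which identifiers were detected
        "decision": decision,                # what action was taken
        "payload_sha256": hashlib.sha256(payload.encode()).hexdigest(),
        "at": time.time(),
    }
    AUDIT_LOG.append(entry)
    return entry
```

Hashing the payload rather than storing it keeps the audit trail itself from becoming a new PII store, while still letting you prove which request a decision applied to.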

A practical rollout starts with representative samples from production-like traffic. Run a DPDP scan, sort findings by identifier sensitivity and blast radius, fix Aadhaar, PAN, financial, health, children's, and precise-location exposure first, then move to consent wording, retention, deletion, and vendor review. Use shadow mode when false positives could disrupt users, and promote to enforcement only after the exceptions have owners and expiry dates.
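The triage ordering above can be expressed as a sort by identifier sensitivity and blast radius. The weights here are illustrative assumptions for the sketch, not values defined by DPDP.

```python
from dataclasses import dataclass

# Illustrative sensitivity weights: children's data and Aadhaar first,
# then PAN, financial, health, and precise location.
SENSITIVITY = {
    "aadhaar": 5, "child_data": 5,
    "pan": 4, "financial": 4, "health": 4, "precise_location": 4,
    "email": 2, "name": 1,
}

@dataclass
class Finding:
    identifier: str
    surfaces: int  # blast radius: how many systems the value reached

def triage(findings):
    """Order findings so the most sensitive identifiers with the
    widest blast radius are fixed first."""
    return sorted(
        findings,
        key=lambda f: (SENSITIVITY.get(f.identifier, 0), f.surfaces),
        reverse=True,
    )
```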

This page is educational and should be paired with legal review for final policy interpretation. The operational proof should still come from repeatable evidence: scanner results, audit exports, pull-request checks, policy configuration, and a documented owner for the workflow. That combination is what makes the content useful during buyer diligence, board review, regulatory questions, or an incident investigation.

#EdTech #education #DPDP #children #AI #India

Check your own workflow

Run a free DPDP scan before this risk reaches production.

Scan prompts, logs, documents, and API payloads for Indian PII exposure, missing redaction, and audit gaps.