Context Window

The maximum amount of text (measured in tokens) that a language model can process in a single request, including both the prompt and the generated response.
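The definition implies a simple budget: the prompt and the tokens reserved for the response must together fit inside the window. A minimal sketch, with made-up token counts and an assumed 8,192-token window (not any specific vendor's limit):

```python
def fits_in_context(prompt_tokens: int, max_output_tokens: int,
                    context_window: int) -> bool:
    """A request fits only if prompt plus reserved output stay within the window."""
    return prompt_tokens + max_output_tokens <= context_window

# Hypothetical 8,192-token model: a 7,000-token prompt leaves room
# for at most 1,192 generated tokens.
print(fits_in_context(7000, 1000, 8192))  # True
print(fits_in_context(7000, 2000, 8192))  # False
```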

Why It Matters for AI Governance

Larger context windows increase the risk of personal data exposure because more information can be packed into a single prompt. Governance controls must therefore scan the entire context window for personally identifiable information (PII), not just the user's most recent input.

How CrewCheck Handles This

CrewCheck's LLM gateway applies context-window controls at the request boundary. Every AI call passes through detection, policy evaluation, and audit logging, ensuring that context-window risks are addressed consistently across all teams and providers.
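The three-stage flow described above can be sketched as a request pipeline. This is a hypothetical illustration: the `Gateway` class, its methods, and the toy digit-based detector are assumptions for the sketch, not CrewCheck's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class Gateway:
    audit_log: list = field(default_factory=list)

    def handle(self, prompt: str) -> str:
        findings = self.detect(prompt)            # 1. detection
        allowed = self.evaluate_policy(findings)  # 2. policy evaluation
        self.audit_log.append({                   # 3. audit logging, on every call
            "prompt_len": len(prompt),
            "findings": findings,
            "allowed": allowed,
        })
        return "FORWARDED" if allowed else "BLOCKED"

    def detect(self, prompt: str) -> list:
        # Toy stand-in for real PII detection: flag any prompt with digits.
        return ["possible-pii"] if any(ch.isdigit() for ch in prompt) else []

    def evaluate_policy(self, findings: list) -> bool:
        # Toy policy: block any request with findings.
        return not findings

gw = Gateway()
print(gw.handle("Summarize this memo"))  # FORWARDED
print(gw.handle("SSN is 123-45-6789"))   # BLOCKED
print(len(gw.audit_log))                 # 2, since every call is logged
```

The key design point is that logging happens unconditionally, so blocked and forwarded requests alike leave an audit trail.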

The governance dashboard provides real-time visibility into context window events, with drill-down capabilities for compliance officers and exportable evidence for auditors.

#context-window #glossary #ai-governance

Ready to govern your AI workflows?

Try CrewCheck's live demo — no sign-up required.