# Tokenization
One-sentence definition: Replacing sensitive data with non-sensitive surrogate tokens, with the mapping to the original value held in a secure token vault or derived by a vaultless algorithm.
## Key Facts
- Reduces compliance and audit scope (e.g., PCI DSS) by confining original values to one controlled system.
- Reversible only through the tokenization service; vault access and key/algorithm management must be tightly controlled (see the sketch after this list).
- Format-preserving tokens keep the original length and character set, so existing schemas and validation rules still work.
- Monitor access to detokenization; log every reveal.
- Plan for high availability and disaster recovery (HA/DR) of the token service; it is a single point of failure.
- **Verify:** check official (ISC)² CBK and current exam outline.
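
A minimal vault-based sketch, assuming an in-memory dict stands in for the hardened vault datastore and a single role string stands in for real RBAC; the `TokenVault` class, the `payment-processor` role, and the `tok_` prefix are illustrative names, not from any product or standard.

```python
import secrets
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("token-vault")


class TokenVault:
    """Illustrative vault-based tokenization: the token -> original mapping
    lives only inside this controlled component."""

    def __init__(self):
        self._vault = {}  # token -> original value (stand-in for a hardened datastore)

    def tokenize(self, value: str) -> str:
        # Random surrogate with no mathematical relationship to the original.
        token = "tok_" + secrets.token_hex(16)
        self._vault[token] = value
        return token

    def detokenize(self, token: str, caller_role: str) -> str:
        # Every reveal is authorized and logged (monitor access, log every reveal).
        if caller_role != "payment-processor":
            log.warning("Denied detokenization attempt by role %s", caller_role)
            raise PermissionError("role not authorized to detokenize")
        log.info("Detokenization performed by role %s", caller_role)
        return self._vault[token]


if __name__ == "__main__":
    vault = TokenVault()
    token = vault.tokenize("4111111111111111")
    print(token)                                          # safe to pass between services
    print(vault.detokenize(token, "payment-processor"))   # controlled, logged reveal
```

The design point is that the token-to-original mapping never leaves the vault component; every other system handles only the surrogate, and each reveal is both authorized and logged.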
## Exam Relevance
- Choose tokenization when the goal is to minimize exposure of sensitive values shared across distributed systems.
**Mnemonic:** “Swap and safeguard.”
## Mini Scenario
Q: You need to reference PANs (primary account numbers) across many microservices. What approach limits exposure?
A: Tokenize the PANs and route detokenization through a centralized service with strict RBAC and audit logging; a format-preserving token (sketched below) lets existing PAN fields hold the surrogate.
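
A hedged illustration of the format-preserving idea from the answer: the surrogate keeps the PAN's length and digit-only alphabet, retaining the first six (BIN) and last four digits so existing fields and display logic still validate. The digit split and helper name are assumptions for illustration; real products add collision handling and stronger guarantees.

```python
import secrets


def format_preserving_token(pan: str) -> str:
    """Illustrative format-preserving token: same length, digits only,
    first six (BIN) and last four kept, middle digits randomized."""
    middle_len = len(pan) - 10
    middle = "".join(str(secrets.randbelow(10)) for _ in range(middle_len))
    return pan[:6] + middle + pan[-4:]


print(format_preserving_token("4111111111111111"))  # same shape as a PAN, random middle digits
```

Keeping the BIN and last four leaves only six random digits on a 16-digit PAN, which is why production systems pair this with collision checks in the vault.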
## Revision Checklist
- Define tokenization vs encryption.
- Name two control requirements for the token vault (e.g., access control, audit logging).
- Explain how the token service's availability is addressed (HA/DR).
## Related
[[Key Management Basics (Asset Security)]] · [[Data Encryption Overview (Asset Security)]] · [[Pseudonymization vs Anonymization]] · [[Cloud Data Protection (SaaS, PaaS, IaaS)]] · [[Data Loss Prevention (DLP)]] · [[Domain 2 - Index]]