# Data Quality and Integrity Controls
One-sentence definition: Processes and checks that ensure data is accurate, complete, consistent, and reliable across its lifecycle.
## Key Facts
- Dimensions: accuracy, completeness, consistency, timeliness, uniqueness.
- Controls: validation rules, referential integrity, constraints, deduplication.
- Reconciliation between sources; exception queues and stewardship.
- Integrity verification: hashes, digital signatures, checksums (end-to-end).
- Data quality metrics & SLAs tied to business impact.
- **Verify:** check official (ISC)² CBK and current exam outline.
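The integrity-verification bullet above can be sketched with a hash comparison. A minimal example using Python's `hashlib` (the payload bytes are illustrative, not from the source):

```python
import hashlib

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a byte payload."""
    return hashlib.sha256(data).hexdigest()

# Sender computes a digest before transmission...
payload = b"customer_id,balance\n1001,250.00\n"
sent_digest = sha256_of(payload)

# ...receiver recomputes over the bytes it actually got and compares.
received = payload  # in practice, the bytes read on the other end
assert sha256_of(received) == sent_digest  # mismatch signals corruption

# Any change, even one bit, produces a different digest.
tampered = b"customer_id,balance\n1001,950.00\n"
assert sha256_of(tampered) != sent_digest
```

This is a detective control: it cannot stop corruption, but it reliably exposes it end to end. Digital signatures extend the same idea by also authenticating who produced the digest.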
## Exam Relevance
- Pick controls that prevent or detect corruption and stale data.
**Mnemonic:** “**ACCT-U**” → Accurate, Complete, Consistent, Timely, Unique.
## Mini Scenario
Q: Reports show duplicate customer records. Which control applies?
A: Deduplication rules and unique key constraints.
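The scenario's answer can be demonstrated with a unique key constraint. A minimal sketch using SQLite in memory (table and column names are illustrative assumptions):

```python
import sqlite3

# Unique key on the natural customer identifier acts as a
# preventive control: duplicates are rejected at insert time.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (email TEXT PRIMARY KEY, name TEXT)")

rows = [
    ("a@example.com", "Ana"),
    ("b@example.com", "Ben"),
    ("a@example.com", "Ana (duplicate)"),
]

for email, name in rows:
    try:
        conn.execute("INSERT INTO customers VALUES (?, ?)", (email, name))
    except sqlite3.IntegrityError:
        pass  # in practice, route the rejected row to an exception queue

count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
print(count)  # 2 -- the duplicate never enters the table
```

Deduplication rules (fuzzy matching, merge/survivorship) handle duplicates already present; the constraint keeps new ones out.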
## Revision Checklist
- Name 3 data quality dimensions.
- Give one preventive and one detective control.
- Tie a metric to business impact.
## Related
[[Hashing and Checksums for Data Integrity]] · [[Data Catalogs and Metadata Management]] · [[Master Data Management (MDM)]] · [[Data Warehouse and Data Lake Security]] · [[Logs and Telemetry as Sensitive Data]] · [[Domain 2 - Index]]