AI Governance Framework Template

A practical framework for UK businesses to audit AI systems, classify risk levels, and document their compliance approach — aligned to the EU AI Act and UK domestic requirements.

Organisation Details — complete before use

v1.0

Template provided by Nerdster.ai — nerdster.ai  |  [email protected]

For UK businesses. Not legal advice. Seek qualified counsel for compliance decisions.

Section 1 — AI Systems Inventory

List every AI tool, model, or system your organisation uses or is evaluating. Include third-party tools (ChatGPT, Copilot, etc.), internally built systems, and AI features embedded in existing software (e.g., AI in your CRM). If unsure, ask department heads to submit a list.
System Name | Vendor / In-house | Primary Purpose | Data Processed | EU AI Act Risk | In Production? | Owner / Dept
e.g. ChatGPT | OpenAI | Drafting, research | Staff queries | Limited | Yes | Marketing
EU AI Act Risk Level Reference
Unacceptable
Banned. Social scoring, real-time biometric surveillance, manipulation of vulnerable groups.
High Risk
Strict obligations. CV screening, credit scoring, medical devices, law enforcement, critical infrastructure.
Limited Risk
Transparency required. Chatbots, deepfakes, emotion recognition. Must disclose AI interaction.
Minimal Risk
Voluntary codes. Spam filters, AI in video games, recommendation engines, productivity tools.
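For organisations that keep the inventory in code rather than a spreadsheet, the columns of the Section 1 table and the four risk tiers above map naturally onto a small data structure. This is an illustrative sketch only — the class and field names are ours, not the Act's:

```python
from dataclasses import dataclass
from enum import Enum

class RiskLevel(Enum):
    """The four EU AI Act risk tiers from the reference above."""
    UNACCEPTABLE = "Unacceptable"
    HIGH = "High Risk"
    LIMITED = "Limited Risk"
    MINIMAL = "Minimal Risk"

@dataclass
class AISystem:
    """One row of the AI systems inventory (Section 1)."""
    name: str
    vendor: str            # vendor name, or "In-house"
    purpose: str
    data_processed: str
    risk: RiskLevel
    in_production: bool
    owner: str             # owning department or role

# The example row from the inventory table:
example = AISystem(
    name="ChatGPT",
    vendor="OpenAI",
    purpose="Drafting, research",
    data_processed="Staff queries",
    risk=RiskLevel.LIMITED,
    in_production=True,
    owner="Marketing",
)
```

A structure like this makes it easy to filter the register — for example, listing every High Risk system still in production ahead of a quarterly review.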
Section 2 — System Risk Assessments

Complete a block for each AI system classified as High Risk or Limited Risk. For Minimal Risk systems, a brief entry in the inventory (Section 1) is sufficient unless required by your sector regulator.

System Assessment — duplicate this block for each system

System name
 
Vendor / provider
 
Risk classification
 
Business purpose
 
Data inputs (types of data processed)
 
Does it process personal data?   Yes / No / Partially
Does it process special category data?   Yes / No
Does it produce decisions affecting individuals?   Yes / No / Assists only
Is there a human review step before decisions are enacted?   Yes / No / Partial
Is the system explainable / auditable?   Yes / No / Partially
DPIA completed? (if processing personal data at scale)   Yes / No / In progress
System disclosed to affected individuals?   Yes / No / Not applicable
Bias / accuracy testing completed?   Yes / No / Planned for:
Responsible owner   Name / role:
Tip: For High Risk systems used in HR, credit, healthcare, or law enforcement, a full technical conformity assessment and registration in the EU AI Act database may be required from 2 August 2026, when the Act's high-risk obligations begin to apply.
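The yes/no fields in the assessment block lend themselves to a simple automated gap check. The sketch below is purely illustrative — the field keys and the rules are our own choices for demonstration, not an official conformity tool:

```python
def assessment_gaps(assessment: dict) -> list[str]:
    """Flag common follow-ups for a completed System Assessment block.

    `assessment` maps question keys to the answers recorded in the
    template ("Yes" / "No" / "Partially" / "In progress" / ...).
    """
    gaps = []
    # Personal data without a completed DPIA
    if assessment.get("personal_data") in ("Yes", "Partially"):
        if assessment.get("dpia_completed") != "Yes":
            gaps.append("DPIA not completed for a system processing personal data")
    # High Risk systems need human review and bias/accuracy testing
    if assessment.get("risk") == "High Risk":
        if assessment.get("human_review") != "Yes":
            gaps.append("High Risk system lacks a full human review step")
        if assessment.get("bias_testing") != "Yes":
            gaps.append("High Risk system has no completed bias/accuracy testing")
    # Decisions affecting individuals should be disclosed
    if (assessment.get("affects_individuals") == "Yes"
            and assessment.get("disclosed") == "No"):
        gaps.append("Affected individuals have not been told the system is in use")
    return gaps

# Example: a High Risk CV-screening tool with an in-progress DPIA
gaps = assessment_gaps({
    "risk": "High Risk",
    "personal_data": "Yes",
    "dpia_completed": "In progress",
    "human_review": "Yes",
    "affects_individuals": "Yes",
    "disclosed": "Yes",
    "bias_testing": "Planned",
})
```

Here `gaps` would flag the unfinished DPIA and the missing bias testing, giving the responsible owner a concrete to-do list before the next review.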
Section 3 — Compliance Checklist

EU AI Act — General Obligations
GDPR / UK GDPR
Sector-Specific (mark applicable)
Section 4 — Review Schedule
Review Type | Frequency | Owner | Last Completed | Next Due
AI Systems Inventory | Quarterly |  |  |
Risk Assessments | Annually / on change |  |  |
Compliance Checklist | Quarterly |  |  |
Vendor DPA review | Annually |  |  |
Staff AI awareness training | Annually |  |  |
Need help completing this framework? Nerdster offers AI governance audits for UK businesses — we map your AI exposure, classify risk, and produce a compliance-ready report. nerdster.ai/services or email [email protected]