Understanding Nonconformities
A nonconformity occurs when an organization fails to meet a requirement of ISO 42001. Nonconformities can be classified as:
- Major: Absence of or complete failure to implement a required element, or situations that raise significant doubt about AIMS effectiveness
- Minor: Single observed lapse in meeting a requirement that does not indicate systemic failure
Major nonconformities must be addressed before certification can be granted. Minor nonconformities allow certification but require corrective action plans with evidence of closure.
Top 10 ISO 42001 Nonconformities
1. Incomplete AI System Inventory
Finding: Organization has not identified all AI systems in scope, particularly third-party AI services or embedded AI components.
Clause: 4.3, 8.1
Fix: Conduct comprehensive AI system discovery covering procurement records, vendor contracts, and technical architecture reviews. Document the organization's role for each system (developer, provider, or user).
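As a minimal sketch of what a structured inventory record might capture (the field names below are illustrative, not prescribed by ISO 42001):

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class AISystemRecord:
    """Illustrative inventory entry for one AI system in AIMS scope."""
    name: str
    description: str
    org_role: str                      # e.g. "developer", "provider", or "user"
    vendor: Optional[str] = None       # third-party supplier, if any
    embedded_in: Optional[str] = None  # parent product, for embedded AI components
    data_sources: list = field(default_factory=list)
    owner: str = ""                    # accountable business owner

# Example: a vendor-hosted service surfaced through procurement records
inventory = [
    AISystemRecord(
        name="Resume screening API",
        description="Vendor-hosted model that ranks job applications",
        org_role="user",
        vendor="Example Vendor Inc.",
        data_sources=["applicant CVs"],
        owner="HR Operations",
    ),
]
```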
2. Generic AI Risk Assessment
Finding: Risk assessments use generic IT risk methodology without addressing AI-specific risks such as bias, explainability, or human oversight.
Clause: 6.1.2
Fix: Develop AI-specific risk categories covering fairness, transparency, accountability, safety, and societal impact. Ensure risk criteria reflect AI system characteristics.
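One way to make the AI-specific dimension explicit is to treat the risk categories as data the assessment must cover. A rough sketch, with category names mirroring the fix above and prompts that are illustrative rather than quoted from the standard:

```python
# Illustrative AI-specific risk categories with prompts for assessors.
AI_RISK_CATEGORIES = {
    "fairness": "Could the system produce biased or discriminatory outcomes?",
    "transparency": "Can the system's decisions be explained to affected individuals?",
    "accountability": "Is a named owner responsible for the system's outcomes?",
    "safety": "Could failures cause physical, financial, or psychological harm?",
    "societal_impact": "Does the system affect groups beyond its direct users?",
}

def covers_ai_specific_risks(assessed_categories):
    """True only if an assessment addresses every AI-specific category above."""
    return set(AI_RISK_CATEGORIES) <= set(assessed_categories)
```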
3. Missing or Incomplete AI Impact Assessments
Finding: Impact assessments not conducted for all in-scope AI systems, or assessments do not address impacts on affected individuals.
Clause: 6.1.4, 8.4
Fix: Establish systematic impact assessment process triggered by new AI deployments and significant changes. Include stakeholder impact analysis and mitigation measures.
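A sketch of how trigger logic might be encoded so assessments are not missed; the event types and review interval below are assumptions, not requirements from the standard:

```python
# Hypothetical triggers for running or refreshing an AI impact assessment.
TRIGGER_EVENTS = {
    "new_deployment", "model_retrained", "new_use_case",
    "new_data_source", "vendor_change",
}

def impact_assessment_required(event, days_since_last_assessment,
                               review_interval_days=365):
    """Require an assessment on significant change or when the periodic review lapses."""
    return event in TRIGGER_EVENTS or days_since_last_assessment > review_interval_days

# Example: a retrained model triggers a fresh assessment regardless of schedule
assert impact_assessment_required("model_retrained", days_since_last_assessment=90)
```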
4. AI Policy Lacks Responsible AI Commitments
Finding: AI policy is generic IT policy with AI terminology added, without substantive commitment to responsible AI principles.
Clause: 5.2
Fix: Revise AI policy to explicitly address fairness, transparency, human oversight, accountability, and safety. Ensure top management approval and communication.
5. Insufficient Competence Evidence
Finding: Personnel responsible for AI governance lack documented competence in AI-specific topics.
Clause: 7.2
Fix: Define competence requirements for AI governance roles. Provide training on ISO 42001, AI ethics, and technical AI concepts. Maintain training records.
Competence requirements should be proportionate to responsibility. Those approving AI systems for deployment need different competencies than technical developers do, but all need documented evidence.
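A minimal sketch of a role-to-competence matrix that makes that proportionality auditable; the roles and competencies listed are illustrative assumptions:

```python
# Illustrative competence matrix: requirements scale with responsibility.
COMPETENCE_MATRIX = {
    "ai_system_approver": ["ISO 42001 awareness", "AI risk and impact assessment",
                           "regulatory context"],
    "ai_developer": ["ISO 42001 awareness", "bias testing", "model documentation"],
    "internal_auditor": ["ISO 42001 requirements", "audit technique", "basic AI concepts"],
}

def competence_gaps(role, evidenced_competencies):
    """Required competencies for a role that lack documented evidence."""
    return set(COMPETENCE_MATRIX.get(role, [])) - set(evidenced_competencies)
```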
6. Statement of Applicability Gaps
Finding: Statement of Applicability does not address all Annex A controls, or exclusions lack adequate justification.
Clause: 6.1.3
Fix: Review each Annex A control and document inclusion rationale or exclusion justification. Ensure excluded controls are genuinely non-applicable based on risk assessment.
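One way to avoid gaps is to track every Annex A control in a single structure and check completeness programmatically. A sketch under assumed field names (control text is not reproduced):

```python
from dataclasses import dataclass

@dataclass
class SoAEntry:
    """Illustrative Statement of Applicability record for one Annex A control."""
    control_id: str               # identifier only; control text is not reproduced here
    applicable: bool
    justification: str            # inclusion rationale or exclusion justification
    implementation_ref: str = ""  # link to the policy, procedure, or evidence

def soa_gaps(entries, all_control_ids):
    """Controls missing from the SoA, and exclusions lacking justification."""
    covered = {e.control_id for e in entries}
    return {
        "missing_controls": set(all_control_ids) - covered,
        "unjustified_exclusions": [e.control_id for e in entries
                                   if not e.applicable and not e.justification.strip()],
    }
```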
7. No Evidence of Management Review
Finding: Management review not conducted, or meeting did not address all required inputs and outputs.
Clause: 9.3
Fix: Schedule management reviews at defined intervals. Use structured agenda covering all Clause 9.3 inputs. Document decisions and actions as meeting minutes.
8. Internal Audit Deficiencies
Finding: Internal audits not conducted against ISO 42001 requirements, auditors not independent, or findings not addressed.
Clause: 9.2
Fix: Establish audit program with trained auditors independent of audited activities. Use ISO 42001 audit criteria. Track findings to closure.
9. Third-Party AI Not Governed
Finding: AI services from third parties (APIs, SaaS) not included in risk assessment or supplier management processes.
Clause: 8.1, Annex A
Fix: Inventory all third-party AI services. Conduct risk assessments for each. Establish contractual requirements and ongoing monitoring for AI suppliers.
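A sketch of a supplier register entry that ties each third-party AI service to its risk assessment and contractual controls; the fields are illustrative assumptions:

```python
from dataclasses import dataclass, field
from typing import Optional

@dataclass
class ThirdPartyAIService:
    """Illustrative register entry for an externally supplied AI service."""
    service: str
    supplier: str
    usage: str
    risk_assessment_id: Optional[str] = None   # None means not yet assessed
    contract_clauses: list = field(default_factory=list)  # e.g. data use, audit rights
    last_monitored: Optional[str] = None       # date of last performance/incident review

def ungoverned_services(register):
    """Services with no risk assessment or no contractual AI requirements."""
    return [s.service for s in register
            if s.risk_assessment_id is None or not s.contract_clauses]
```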
10. Nonconformity Process Not Followed
Finding: Previous nonconformities or AI incidents not documented, or corrective actions not verified for effectiveness.
Clause: 10.2
Fix: Implement systematic nonconformity tracking. Document root cause analysis, corrective actions, and effectiveness verification. Learn from AI incidents.
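A minimal sketch of a tracking record that keeps root cause, corrective action, and effectiveness verification together; the field and status names are assumptions:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class NonconformityRecord:
    """Illustrative tracking record for a nonconformity or AI incident."""
    nc_id: str
    description: str
    root_cause: str = ""
    corrective_action: str = ""
    action_due: Optional[str] = None    # target completion date
    effectiveness_check: str = ""       # how closure will be verified
    verified_effective: bool = False

def not_yet_closed(records):
    """Records lacking a root cause, a corrective action, or verified effectiveness."""
    return [r.nc_id for r in records
            if not (r.root_cause and r.corrective_action and r.verified_effective)]
```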
Common Root Causes
Implementation Too Rushed
Organizations pursuing aggressive certification timelines often skip foundational activities such as a comprehensive AI inventory or a thorough risk assessment, leading to surface-level compliance that auditors readily identify.
IT-Centric Approach
Treating ISO 42001 as an extension of ISO 27001 without recognizing AI-specific dimensions results in generic risk assessments and policies that miss the standard's core purpose.
Lack of AI Expertise
Implementation teams without an AI background may not understand how to assess AI-specific risks or which controls are appropriate for different AI system types.
Siloed Implementation
When AIMS implementation is isolated in compliance or IT functions without involvement from AI development teams, documentation may not reflect actual practices.
Prevention Strategies
Comprehensive Gap Assessment
Before implementation, conduct thorough gap analysis against all ISO 42001 requirements. Identify areas requiring the most effort and allocate resources accordingly.
AI-Specific Training
Ensure implementation team understands AI concepts, common AI risks, and how ISO 42001 requirements apply specifically to AI systems.
Cross-Functional Involvement
Include AI developers, data scientists, product managers, legal, and compliance in AIMS development. Each brings an essential perspective.
Pre-Certification Readiness Review
Conduct mock audit or readiness assessment before Stage 1 audit. Identify and address gaps before they become formal findings.
The organizations that achieve certification with minimal findings are those that view ISO 42001 as an opportunity to genuinely improve AI governance, not just a compliance checkbox.
Corrective Action Best Practices
Root Cause Analysis
For each nonconformity, go beyond the immediate symptom to identify the root cause. Ask "why" repeatedly until you reach the underlying issue; for example, an incomplete AI inventory (why?) may trace back to procurement never flagging AI-enabled purchases (why?) because the purchasing process has no criteria for identifying AI.
Proportionate Response
Corrective actions should be proportionate to the finding. Minor documentation gaps need different responses than systemic failures in risk assessment.
Effectiveness Verification
Plan how you will verify that corrective action actually resolved the issue. This might include follow-up audits, monitoring, or management review.
Timeline Realism
Set realistic timelines for corrective action completion. Auditors prefer realistic plans that are met over ambitious plans that slip.