In This Guide
- A gap assessment is the essential first step before ISO 20000-1 certification — it reveals where you stand, what you need to fix, and how much effort certification will require.
- Use a clause-by-clause methodology covering Clauses 4–10 and every service management process in Clause 8 to ensure nothing is missed.
- A 4-level scoring scale (Not Addressed, Initial, Defined but Inconsistent, Conforming) provides enough granularity for prioritisation without over-complicating the analysis.
- The evidence pack should be organised by clause with an evidence matrix that maps each requirement to specific documents, records, and artefacts.
- How you present evidence matters: well-organised, clearly labelled evidence accelerates audits and builds auditor confidence in your SMS maturity.
What Is a Gap Assessment?
An ISO 20000-1 gap assessment is a structured evaluation of your current IT service management practices against the requirements of ISO/IEC 20000-1:2018. Its purpose is to identify where your organisation already conforms, where gaps exist, and what work is needed to achieve certification readiness.
Why Gap Assessments Matter
Pursuing certification without a gap assessment is like navigating without a map. You might eventually arrive, but the journey will be longer, costlier, and more frustrating than necessary. A well-conducted gap assessment delivers:
- Visibility: A clear picture of current maturity across all clauses and processes.
- Prioritisation: Understanding which gaps are critical (likely to result in nonconformities) versus those that represent opportunities for improvement.
- Resource planning: Realistic estimates of the time, people, and budget needed to reach certification readiness.
- Baseline: A documented starting point against which progress can be measured.
- Risk reduction: Identifying potential nonconformities before the auditor does, giving you time to remediate.
Gap Assessment vs. Internal Audit
A gap assessment is not the same as an internal audit, though both involve evaluating conformance. The key differences are:
- Timing: Gap assessments are conducted before the SMS is fully implemented. Internal audits are conducted on an operating SMS.
- Purpose: Gap assessments identify what needs to be built or improved. Internal audits verify that what has been built is working.
- Output: Gap assessments produce a remediation plan. Internal audits produce audit findings and corrective actions.
- Formality: Gap assessments can be less formal. Internal audits must follow the requirements of Clause 9.2.
Conduct your gap assessment at least 6–9 months before your target certification date. This gives adequate time to remediate gaps, implement new processes, allow processes to generate records and evidence, and conduct the required internal audit and management review before the certification audit.
Planning the Assessment
A successful gap assessment requires thoughtful planning. Rushing into clause-by-clause review without preparation leads to incomplete findings and unreliable conclusions.
Step 1: Define Objectives
Clarify what the gap assessment should deliver:
- Clause-by-clause conformance status
- Identification of major, minor, and observation-level gaps
- Prioritised remediation plan with effort estimates
- Evidence inventory (what exists, what is missing)
- Timeline estimate for certification readiness
Step 2: Assemble the Assessment Team
The assessment team should include:
- Assessment lead: Someone with ISO 20000-1 knowledge and audit experience. This could be an internal resource or an external consultant.
- Process owners: Individuals responsible for incident management, change management, service level management, and other SMS processes. They provide the most accurate view of current practices.
- IT management representative: Someone with authority to access records, approve interview schedules, and escalate issues.
- Documentation coordinator: A person responsible for collecting and organising evidence during and after the assessment.
Step 3: Confirm Scope Alignment
Before starting the assessment, confirm that the scope is clearly defined. The gap assessment must evaluate conformance within the defined SMS scope. Refer to your scope statement to ensure you know which services, locations, and organisational units will be assessed.
Step 4: Gather Existing Documentation
Collect all existing documentation before starting interviews and reviews:
- Service management policy and objectives
- Process documentation (procedures, work instructions, flowcharts)
- Service catalogue
- SLAs, OLAs, and underpinning contracts
- ITSM tool records (incidents, changes, problems, releases)
- KPI reports and dashboards
- Meeting minutes (CAB, service reviews, management reviews)
- Previous audit reports (internal or external)
- Risk register and risk treatment plans
- Continual improvement register
Step 5: Create the Assessment Schedule
Plan interviews and document reviews across the assessment period. A typical schedule for a medium-sized organisation:
- Week 1: Document review — policies, procedures, and process documentation.
- Week 2: Interviews with process owners and team leads — understanding actual practices.
- Week 3: Evidence sampling — reviewing records, reports, and ITSM tool data.
- Week 4: Analysis, scoring, and report writing.
Gap Analysis Methodology
The core of the assessment is a systematic, clause-by-clause review of ISO 20000-1:2018 requirements against your current practices. The methodology should be rigorous enough to identify real gaps but practical enough to complete within a reasonable timeframe.
Clause-by-Clause Approach
Work through each clause of the standard methodically. For each requirement:
- Identify the requirement: What does the clause specifically require? Break down compound requirements into individual assessment points.
- Review documentation: Is there a documented policy, procedure, or process that addresses this requirement?
- Verify implementation: Is the documented process actually followed in practice? Review records, interview staff, and sample evidence.
- Assess effectiveness: Is the process achieving its intended outcomes? Are KPIs defined and met? Are outputs used?
- Record the finding: Document the current state, identify the gap (if any), and rate the severity.
Clauses to Assess
ISO 20000-1:2018 has the following high-level structure:
- Clause 4 — Context of the Organisation: Understanding the organisation, interested parties, SMS scope, and the SMS itself.
- Clause 5 — Leadership: Management commitment, policy, roles, and responsibilities.
- Clause 6 — Planning: Risk and opportunity assessment, SMS objectives, and planning changes.
- Clause 7 — Support: Resources, competence, awareness, communication, documented information, and knowledge.
- Clause 8 — Operation: Operational planning, service portfolio (8.2), relationship and agreement (8.3), supply and demand (8.4), service design, build and transition (8.5), resolution and fulfilment (8.6), and service assurance (8.7).
- Clause 9 — Performance Evaluation: Monitoring, measurement, analysis, internal audit, management review, and service reporting.
- Clause 10 — Improvement: Nonconformity, corrective action, and continual improvement.
Assessment Techniques
Use a combination of techniques for a comprehensive assessment:
| Technique | What It Reveals | Best For |
|---|---|---|
| Document Review | Whether policies, procedures, and processes exist and are current | Clauses 4, 5, 7 (documented information) |
| Staff Interviews | Whether documented processes are understood and followed | All clauses; especially Clause 7 (competence, awareness) |
| Record Sampling | Whether processes generate expected records and evidence | Clause 8 (all operational processes), Clause 9 (reporting) |
| ITSM Tool Analysis | Volume, quality, and completeness of incident, change, and problem records | Clause 8.6 (resolution), 8.5 (change management) |
| Process Walkthroughs | End-to-end process execution from trigger to closure | Complex processes like change management, release management |
Scoring & Rating Gaps
A consistent scoring methodology transforms subjective observations into actionable data. It enables comparison across clauses, progress tracking over time, and objective prioritisation of remediation effort.
Recommended Scoring Scale
A 4-level scale provides adequate granularity without over-complicating the assessment:
| Score | Level | Description | Audit Risk |
|---|---|---|---|
| 0 | Not Addressed | No documented process, no evidence of activity. The requirement is not being met in any form. | Major nonconformity likely |
| 1 | Initial / Ad Hoc | Some activity occurs but it is inconsistent, undocumented, or person-dependent. Results vary. | Major or minor nonconformity likely |
| 2 | Defined but Inconsistent | A documented process exists and is partially followed. Gaps in execution, records, or coverage remain. | Minor nonconformity or observation likely |
| 3 | Conforming | Process is documented, implemented consistently, generating records, and achieving intended outcomes. | Conformance expected |
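The 0–3 scale above rolls up naturally into a clause-level readiness percentage. A minimal sketch, assuming per-requirement scores have been recorded in a simple mapping (the requirement labels below are illustrative, not the standard's full requirement list):

```python
# Roll up per-requirement maturity scores (0-3 scale above) into a
# clause-level readiness percentage: average score as a % of the
# maximum score (3 = Conforming).

def clause_readiness(scores):
    """Return average maturity as a percentage of the max score (3)."""
    return round(100 * sum(scores.values()) / (3 * len(scores)), 1)

# Illustrative scores for Clause 8.6 requirements
clause_8_6 = {
    "8.6.1 incident management": 3,
    "8.6.2 service request management": 2,
    "8.6.3 problem management": 1,
}
print(clause_readiness(clause_8_6))  # 66.7
```

A percentage makes progress tracking straightforward: re-score after each remediation phase and compare against the baseline.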
Gap Severity Classification
Beyond the maturity score, classify each gap by its likely audit impact:
- Critical Gap (Red): Requirement completely unmet. Will result in major nonconformity. Must be remediated before certification audit.
- Significant Gap (Amber): Partial conformance with material weaknesses. Likely minor nonconformity or could escalate to major. High priority for remediation.
- Minor Gap (Yellow): Process exists but evidence is thin or inconsistent. Likely observation or minor finding. Remediate to strengthen position.
- Opportunity (Green): Requirement is met but improvement is possible. Not a certification risk but represents maturity growth opportunity.
Producing a Gap Heat Map
Visualise assessment results in a heat map that shows each clause and its overall readiness score. This provides leadership with an at-a-glance view of certification readiness and helps focus investment where it is most needed. Group clauses into categories:
- Management System (Clauses 4–7, 9–10): Governance, planning, support, and performance evaluation.
- Service Portfolio (Clause 8.2): Service catalogue, asset management, and configuration management.
- Relationship & Agreement (Clause 8.3): Business relationship, SLM, and supplier management.
- Supply & Demand (Clause 8.4): Budgeting, demand management, capacity management.
- Design & Transition (Clause 8.5): Change management, service design, build, transition, release and deployment.
- Resolution & Fulfilment (Clause 8.6): Incident management, service request management, problem management.
- Service Assurance (Clause 8.7): Service availability, continuity, information security.
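The category groupings above translate directly into a RAG (red/amber/green) heat map. A minimal sketch, assuming each category has an average maturity score on the 0–3 scale; the thresholds and example scores are illustrative, not prescribed:

```python
# Map each heat-map category's average maturity (0-3 scale) to a RAG
# colour for the leadership view. Thresholds here are illustrative
# choices, not defined by ISO 20000-1.

def rag(avg_score):
    """Return a RAG status for an average maturity score."""
    if avg_score >= 2.5:
        return "Green"
    if avg_score >= 1.5:
        return "Amber"
    return "Red"

categories = {
    "Management System (4-7, 9-10)": 2.6,
    "Resolution & Fulfilment (8.6)": 2.0,
    "Service Assurance (8.7)": 1.2,
}
for name, score in categories.items():
    print(f"{name}: {rag(score)}")
```

Red categories are where foundational work (Phase 1 remediation) should be concentrated.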
Building a Remediation Plan
The gap assessment is only valuable if it drives action. The remediation plan translates assessment findings into a structured programme of work that moves the organisation from its current state to certification readiness.
Priority Framework
Prioritise remediation actions using a two-factor model:
- Audit risk: How likely is this gap to result in a nonconformity during certification audit? Critical and significant gaps are highest priority.
- Effort to remediate: How much time, people, and investment does this gap require? Quick wins (high impact, low effort) should be tackled first to build momentum.
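The two-factor model above yields four quadrants. A minimal sketch of that classification, with illustrative quadrant labels (the standard does not prescribe any particular naming):

```python
# Classify a remediation action on the two-factor model
# (audit risk x effort to remediate). Quadrant labels are
# illustrative, not taken from ISO 20000-1.

def priority_quadrant(audit_risk, effort):
    """audit_risk and effort are each 'high' or 'low'."""
    if audit_risk == "high" and effort == "low":
        return "Quick win - do first"
    if audit_risk == "high":
        return "Major project - plan early"
    if effort == "low":
        return "Fill-in - schedule opportunistically"
    return "Reconsider - low return on effort"

print(priority_quadrant("high", "low"))   # Quick win - do first
print(priority_quadrant("high", "high"))  # Major project - plan early
```

Sorting the remediation backlog by quadrant keeps the critical, low-effort fixes at the top and builds the early momentum the text describes.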
Remediation Action Structure
For each gap, document a remediation action with:
- Gap reference: Clause number and specific requirement.
- Current state: Brief description of what exists today.
- Target state: What conformance looks like for this requirement.
- Actions required: Specific tasks to close the gap (e.g., “Draft capacity management procedure,” “Configure ITSM tool for problem categorisation”).
- Owner: Named individual accountable for delivery.
- Deadline: Target completion date aligned with certification timeline.
- Dependencies: Other actions or decisions this depends on.
- Evidence deliverable: What evidence will be produced to demonstrate conformance.
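The fields above map cleanly onto a simple record type, which makes the remediation plan filterable and trackable in a script or spreadsheet export. A sketch using a Python dataclass; the example values are illustrative, not prescribed content:

```python
# The remediation action fields listed above, as a dataclass so the
# plan can be sorted, filtered, and tracked programmatically.
from dataclasses import dataclass, field

@dataclass
class RemediationAction:
    gap_reference: str              # clause number and requirement
    current_state: str
    target_state: str
    actions_required: list
    owner: str                      # named accountable individual
    deadline: str                   # date aligned to certification timeline
    dependencies: list = field(default_factory=list)
    evidence_deliverable: str = ""

# Illustrative example action
action = RemediationAction(
    gap_reference="8.4.3 Capacity management",
    current_state="No documented capacity process",
    target_state="Documented, implemented capacity management procedure",
    actions_required=["Draft capacity management procedure"],
    owner="Head of IT Operations",
    deadline="2025-03-31",
    evidence_deliverable="Approved procedure plus capacity plan",
)
print(action.gap_reference)  # 8.4.3 Capacity management
```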
Phased Remediation
Organise remediation into phases:
- Phase 1 (Months 1–2): Address critical gaps — missing policies, undefined processes, no service catalogue. These are the foundations.
- Phase 2 (Months 2–4): Address significant gaps — incomplete procedures, weak supplier governance, insufficient KPIs. Build on the foundations.
- Phase 3 (Months 4–6): Address minor gaps and polish — evidence consistency, record completeness, management review effectiveness. Generate operational evidence.
- Phase 4 (Months 5–6): Internal audit and management review — verify remediation effectiveness and generate the final required evidence.
Don’t attempt to remediate everything simultaneously. A phased approach ensures that foundational elements (policies, processes, tools) are in place before you try to generate operational records and evidence that depend on them.
Evidence Pack Structure
The evidence pack is your audit-ready documentation set. It demonstrates to the certification body that every requirement of ISO 20000-1:2018 is being met through documented, implemented, and effective processes. A well-structured evidence pack significantly accelerates the audit and builds auditor confidence.
Categories of Evidence
Evidence falls into five main categories:
- Policies: High-level statements of intent and direction (e.g., Service Management Policy, Information Security Policy). Typically approved by top management.
- Procedures: Documented processes describing how activities are performed (e.g., Incident Management Procedure, Change Management Procedure). Detail steps, roles, inputs, and outputs.
- Records: Evidence that processes have been executed (e.g., incident tickets, change records, problem investigation reports, CAB meeting minutes). Records are the most important evidence category for auditors.
- KPIs and Reports: Metrics that demonstrate process performance and effectiveness (e.g., SLA achievement reports, incident resolution time trends, change success rates).
- Service Reports: Reports delivered to customers and management demonstrating service performance, trends, and improvement actions (Clause 9.1).
Organising the Evidence Pack
Structure the evidence pack by clause for easy auditor navigation:
- Folder per clause: Create a top-level folder for each clause (Clause 4, Clause 5, etc.) and sub-folders for sub-clauses.
- Naming convention: Use consistent file naming: [Clause]-[SubClause]-[Description]-[Version/Date]. For example: 8.5.1-Change-Management-Procedure-v2.1.pdf.
- Evidence matrix: Include an evidence matrix document at the top level that maps every requirement to its corresponding evidence file(s).
- Index document: Provide a brief overview document listing all evidence items with their clause mapping and location within the pack.
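The naming convention above can be enforced automatically before the pack is submitted. A minimal sketch, where the regular expression is one illustrative interpretation of the `[Clause]-[SubClause]-[Description]-[Version/Date]` pattern, not a standard rule:

```python
# Flag evidence file names that do not follow the convention
# [Clause]-[SubClause]-[Description]-[Version/Date], e.g.
# 8.5.1-Change-Management-Procedure-v2.1.pdf
import re

NAME_PATTERN = re.compile(
    r"^\d+(\.\d+)*"          # clause / sub-clause, e.g. 8.5.1
    r"-[A-Za-z][A-Za-z-]*"   # hyphenated description
    r"-v?[\d.]+"             # version or date
    r"\.\w+$"                # file extension
)

def check_names(filenames):
    """Return the file names that do not match the convention."""
    return [name for name in filenames if not NAME_PATTERN.match(name)]

files = ["8.5.1-Change-Management-Procedure-v2.1.pdf", "notes final.docx"]
print(check_names(files))  # ['notes final.docx']
```

Running a check like this as part of the pre-audit evidence review catches stragglers before the auditor sees them.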
Most certification bodies now accept digital evidence packs. Use a shared folder structure (SharePoint, Google Drive, or similar) with controlled access. Ensure links to ITSM tool records are stable and accessible. If providing screenshots, include the date, time, and context. Avoid dumping raw data — annotate evidence to show how it demonstrates conformance.
Evidence Matrix by Clause
The evidence matrix is the master reference document for your evidence pack. It maps every clause of ISO 20000-1:2018 to the specific evidence that demonstrates conformance. Auditors use this as their primary navigation tool.
Evidence Matrix Structure
| Clause | Requirement Summary | Policy / Procedure | Records / Evidence | KPIs / Reports |
|---|---|---|---|---|
| 4.1 | Understanding the organisation and its context | SMS Scope Document; Context Analysis | PESTLE/SWOT analysis records; stakeholder register | — |
| 4.2 | Understanding needs and expectations of interested parties | Interested Parties Register | Requirements documentation; customer feedback records | — |
| 5.1 | Leadership and commitment | Service Management Policy | Management review minutes; resource allocation records | — |
| 5.2 | Policy | Service Management Policy (approved, communicated) | Communication records; policy acknowledgements | — |
| 5.3 | Roles, responsibilities and authorities | RACI matrix; role descriptions | Appointment records; org charts | — |
| 7.1 | Resources | Resource management plan | Staffing records; budget approvals | Resource utilisation reports |
| 7.2 | Competence | Training and competence procedure | Training records; certifications; competence assessments | Training completion rates |
| 8.2 | Service portfolio / Service catalogue | Service Catalogue Management procedure | Service catalogue (current version); change history | Catalogue accuracy metrics |
| 8.3.2 | Service level management | SLM procedure; SLA templates | Signed SLAs/OLAs; service review meeting minutes | SLA achievement reports |
| 8.3.4 | Supplier management | Supplier management procedure | Supplier register; contracts; performance reviews | Supplier performance scorecards |
| 8.5.1 | Change management | Change management procedure; change categorisation | Change records; CAB minutes; post-implementation reviews | Change success rate; emergency change % |
| 8.6.1 | Incident management | Incident management procedure; priority matrix | Incident tickets (samples); major incident reports | MTTR; SLA compliance; incident volumes |
| 8.6.3 | Problem management | Problem management procedure; RCA methodology | Problem records; RCA reports; known error database | Problem resolution rates; recurring incident reduction |
| 8.7.1 | Service availability management | Availability management plan | Availability monitoring records; outage logs | Availability %; downtime trends |
| 8.7.2 | Service continuity management | Service continuity plan; BIA | Continuity test records; test results | RTO/RPO compliance; test frequency |
| 9.1 | Monitoring, measurement, analysis and evaluation | Measurement and reporting procedure | Service reports; dashboard exports | All process KPIs; service performance reports |
| 9.2 | Internal audit | Internal audit procedure; audit programme | Audit reports; finding tracker; auditor competence records | Audit completion rates; finding closure rates |
| 9.3 | Management review | Management review agenda template | Management review minutes; action tracker | Action completion rates |
| 10.1 | Nonconformity and corrective action | Corrective action procedure | NCR register; corrective action records; root cause analyses | NCR closure rates; recurrence rates |
| 10.2 | Continual improvement | Continual improvement procedure / CSI register | Improvement register; improvement initiative records | Improvement completion rates; benefit realisation |
This matrix is not exhaustive — your specific scope and services may require additional evidence. Use it as a starting framework and extend it based on your gap assessment findings.
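The matrix also lends itself to an automated coverage check: every clause row should map to at least one evidence item. A minimal sketch, where the data structure and entries are illustrative mirrors of the table above:

```python
# Check that every requirement in the evidence matrix has at least
# one mapped evidence item. Entries are illustrative examples
# mirroring the matrix table.

matrix = {
    "4.1": ["SMS Scope Document", "PESTLE/SWOT analysis records"],
    "8.5.1": ["Change management procedure", "CAB minutes"],
    "9.2": [],  # example of a requirement with no evidence yet
}

def uncovered(matrix):
    """Return the clauses that have no mapped evidence item."""
    return sorted(c for c, items in matrix.items() if not items)

print(uncovered(matrix))  # ['9.2']
```

Any clause this check flags is a gap to close (or evidence to locate) before the pack goes to the certification body.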
Tips for Presenting Evidence
How you present evidence to auditors can be as important as the evidence itself. Well-organised, clearly labelled evidence creates a positive impression of SMS maturity and accelerates the audit process.
Before the Audit
- Pre-audit evidence review: Walk through the evidence pack clause by clause before submitting it. Check for gaps, broken links, outdated documents, and missing records.
- Cross-reference check: Verify that every requirement in the evidence matrix has at least one corresponding evidence item. Highlight any items where evidence is thin.
- Version control: Ensure all policies and procedures show current versions, approval dates, and approvers. Remove draft or superseded documents from the evidence pack.
- Record currency: Evidence records should be recent (within the last 12 months for most processes). Historical records are fine for demonstrating trend, but auditors want to see current operational evidence.
During the Audit
- Guide, don’t dump: When the auditor requests evidence for a specific clause, navigate them to the relevant section. Don’t hand over an unstructured folder and say “it’s in there somewhere.”
- Explain the context: Briefly explain what the auditor is looking at and how it demonstrates conformance. For example: “This is our monthly SLA achievement report for the Managed Desktop service. It shows we achieved 99.2% availability against a 99% target.”
- Be honest about gaps: If the auditor identifies an area where evidence is thin, acknowledge it honestly. Explain what you are doing to improve. Transparency builds trust; defensiveness raises flags.
- Have process owners available: Auditors may want to interview the people who execute processes. Ensure process owners are available and briefed on what the auditor may ask about their area.
Common Presentation Pitfalls
- Over-documenting: Producing excessive documentation that nobody uses in practice. Auditors value lean, effective documentation over voluminous binders that gather dust.
- Screenshot overload: Dozens of screenshots without context or annotation. Each screenshot should have a caption explaining what it shows and how it relates to the requirement.
- Missing operational evidence: Having policies and procedures but no records showing they are followed. This is the single most common issue — the SMS looks good on paper but lacks evidence of execution.
- Stale evidence: Using records from 18 months ago when recent records are available. Auditors want to see the current state of the SMS, not historical artefacts.
Auditors form an impression of your SMS maturity within the first hour of Stage 2. A well-organised evidence pack, confident process owners, and honest acknowledgement of improvement areas create a positive audit experience. Disorganised evidence, defensive responses, and inability to locate records create the opposite impression — and often trigger deeper sampling.
Common Gaps by Clause
Based on experience across hundreds of ISO 20000-1 assessments, certain gaps appear with predictable frequency. Knowing where organisations typically struggle helps you focus your gap assessment and remediation effort.
Clause 4 — Context of the Organisation
Common gap: The organisation has not formally documented its context, interested parties, or how these influence the SMS. Scope statements are vague.
Remediation: Conduct a formal context analysis, document interested parties and their requirements, and ensure the scope statement is specific and service-centric.
Clause 5 — Leadership
Common gap: Management commitment is assumed but not evidenced. The service management policy exists but is not communicated or reviewed.
Remediation: Ensure the policy is approved by top management, communicated to relevant parties, and reviewed at least annually. Document leadership participation in management reviews and resource allocation decisions.
Clause 7 — Support
Common gap: Competence records are incomplete. Training plans exist but lack evidence of execution. Knowledge management is informal.
Remediation: Build a competence matrix linking roles to required competencies. Maintain training records with evidence of completion. Establish a knowledge management approach (knowledge base, wiki, documented procedures).
Clause 8.2 — Service Catalogue
Common gap: The service catalogue exists but is incomplete, outdated, or lacks detail on supporting services and dependencies. No process for maintaining it.
Remediation: Review and update the catalogue. Include supporting services, dependencies, and owners. Establish a maintenance process linked to change management.
Clause 8.3.4 — Supplier Management
Common gap: Supplier governance is informal. Contracts exist but performance is not monitored against defined requirements. No regular supplier reviews.
Remediation: Establish a supplier register, define performance requirements, conduct regular reviews, and document outcomes. Focus on critical suppliers first.
Clause 8.5 — Service Design, Build and Transition
Common gap: Change management exists but lacks formal CAB process, change categorisation, or post-implementation review. Release management is ad hoc.
Remediation: Formalise the change process with categorisation (standard, normal, emergency), CAB governance, and PIR. Establish release management with defined approval gates.
Clause 8.6 — Resolution and Fulfilment
Common gap: Incident management is reasonably mature but problem management is reactive and inconsistent. Root cause analysis is not systematic. Known error database is absent.
Remediation: Implement a proactive problem management process with defined triggers, RCA methodology, and known error recording. Link problem resolution to incident trend analysis.
Clause 8.7 — Service Assurance
Common gap: Service continuity plans exist but have never been tested. Availability management is reactive (respond to outages) rather than proactive (plan for availability targets).
Remediation: Conduct continuity testing at least annually and document results. Establish availability targets, monitoring, and reporting for in-scope services.
Clause 9 — Performance Evaluation
Common gap: Service reporting exists but does not cover all required topics. Internal audit programme is immature or not yet executed. Management review does not address all required inputs.
Remediation: Define a service reporting framework covering all Clause 9.1 requirements. Plan and execute the internal audit programme before the certification audit. Ensure management review follows the required agenda from Clause 9.3.
Clause 10 — Improvement
Common gap: Corrective actions are raised but not tracked to closure. Continual improvement is discussed but not systematically recorded or measured.
Remediation: Establish a corrective action register with closure tracking. Implement a CSI register capturing improvement initiatives, owners, status, and outcomes.
Frequently Asked Questions
How long does an ISO 20000-1 gap assessment take?
A thorough gap assessment typically takes 2–4 weeks depending on organisational size, scope complexity, and the number of services covered. Small organisations with a focused scope can complete the analysis in 1–2 weeks, while larger organisations with multiple services and locations may require 4–6 weeks. The timeline includes document review, interviews, evidence sampling, and report writing.
What is an evidence pack for ISO 20000-1?
An evidence pack is a structured collection of documents, records, and artefacts that demonstrate conformance with each clause of ISO 20000-1. It typically includes policies, procedures, process records, KPIs, service reports, meeting minutes, and audit trails organised by clause with an evidence matrix mapping requirements to specific evidence items. The evidence pack is submitted to the certification body and used throughout the Stage 1 and Stage 2 audits.
What scoring method should I use for gap analysis?
A 4-level maturity scale works well: 0 (Not Addressed) for no documented process or evidence, 1 (Initial/Ad Hoc) for some activity but inconsistent, 2 (Defined but Inconsistent) for documented processes with gaps in execution, and 3 (Conforming) for fully implemented and evidenced. This provides enough granularity to prioritise remediation without over-complicating the assessment.
Which clauses have the most common gaps?
The most common gaps are found in Clause 8.2 (Service Catalogue Management) where catalogues are incomplete or lack supporting-service detail, Clause 8.5.1 (Change Management) where CAB processes and change categorisation are weak, Clause 8.7.2 (Service Continuity Management) where plans lack testing, and Clause 8.3.4 (Supplier Management) where governance of external providers is informal. Clause 10.2 (Continual Improvement) is also frequently weak, with improvement activities discussed but not systematically recorded.
Should I hire a consultant for the gap assessment?
It depends on internal capability. If your team has experience with ISO management system audits and understands ISO 20000-1 requirements, an internal assessment can be effective and builds internal capability. A consultant brings objectivity, benchmarking from other organisations, and often accelerates the process. Many organisations use a hybrid approach: the internal team conducts the assessment with consultant review and validation. This combination provides both internal ownership and external perspective.