Key Takeaways
  • Data quality issues — incomplete data, manual errors, inconsistent units — are the most frequent ESG assurance finding category, accounting for roughly 40% of all observations
  • Boundary and scope inconsistencies between ESG and financial reporting are flagged in nearly every first-time engagement
  • Scope 3 estimation findings are the fastest-growing category as assurance providers scrutinise value-chain emissions claims
  • Most findings stem from treating ESG data collection as a year-end exercise rather than embedding it in operational processes
  • Organizations that invest in pre-assurance readiness assessments reduce formal engagement findings by 60-70%

Understanding ESG Assurance Findings

An ESG assurance finding is an observation made by the assurance practitioner where the organization's sustainability data, processes, or disclosures do not meet the requirements of the applicable criteria or where evidence is insufficient to support the reported information. Findings range from minor observations that do not affect the assurance conclusion to material issues that may result in a qualified opinion or scope limitation.

Unlike financial audits where accounting standards and control frameworks are well-established, ESG assurance operates across a diverse landscape of reporting frameworks (GRI, ESRS, ISSB, CDP), measurement methodologies (GHG Protocol, industry-specific standards), and data types (environmental metrics, social indicators, governance disclosures). This diversity means findings often relate to inconsistencies between the criteria selected and the measurement approach applied, rather than outright errors.

Understanding the most common findings — and their root causes — allows organizations to proactively address gaps before the formal assurance engagement begins. This article draws on patterns observed across ESG assurance engagements to categorise, explain, and provide practical fixes for the findings practitioners most frequently raise.

Finding Severity Classification

Assurance findings are typically classified into three severity levels:

  • Material finding: An issue that could cause a material misstatement in the reported ESG data. May result in a qualified conclusion or scope limitation. Requires remediation before the assurance statement can be finalised.
  • Significant finding: An issue that, while not individually material, indicates a systemic weakness that could become material if uncorrected. Reported in the management letter with a recommended timeline for remediation.
  • Observation: A minor issue or improvement recommendation that does not affect the assurance conclusion. Documented for management's consideration and continuous improvement.

Finding Category 1: Data Quality Issues

Data quality findings are the single largest category in ESG assurance, representing approximately 40% of all observations across engagements. They manifest in several distinct forms, each with different root causes and remediation paths.

Incomplete Data Sets

Finding: ESG metrics are not complete for all entities, sites, or periods within the reporting boundary. For example, energy consumption data is missing for 3 of 15 operational facilities, or water withdrawal data covers only 9 months of the 12-month reporting period.

Root cause: Decentralised data collection where individual sites are responsible for submitting data without a centralised tracking mechanism. Sites with staff turnover, operational disruptions, or less mature sustainability programmes fail to submit data on time — or at all.

Severity: Material if the missing data represents a significant proportion of the total metric (typically >5% for quantitative environmental data).

Fix:

  • Implement a centralised ESG data management platform with automated collection reminders and submission tracking
  • Establish data submission deadlines by site with escalation procedures for non-compliance
  • Define and document estimation protocols for unavoidable gaps — the criteria for when estimation is acceptable, the method to be used, and the maximum proportion of estimated data allowed
  • Run quarterly data completeness checks rather than waiting for year-end
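The quarterly completeness check in the last bullet reduces to a set comparison between expected and received submissions. A minimal Python sketch, using hypothetical site and month names:

```python
from itertools import product

def completeness_check(submissions, sites, months):
    """Return the missing (site, month) pairs and the coverage ratio
    for the reporting boundary defined by sites x months."""
    expected = set(product(sites, months))
    missing = sorted(expected - set(submissions))
    coverage = 1 - len(missing) / len(expected)
    return missing, coverage

# Illustrative data: three sites, three months, one missing submission
sites = ["Plant A", "Plant B", "Plant C"]
months = ["2024-01", "2024-02", "2024-03"]
submissions = {(s, m): 100.0 for s, m in product(sites, months)}
del submissions[("Plant C", "2024-02")]

missing, coverage = completeness_check(submissions, sites, months)
# missing identifies the gap; coverage feeds the >5% materiality test above
```

Run against the full entity list each quarter, the same check also produces the evidence that gaps were caught and escalated before year-end.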

Manual Transcription Errors

Finding: Numerical errors introduced during manual data transfer from source documents (utility invoices, meter readings, HR systems) into sustainability reporting spreadsheets. Typical errors include decimal point misplacements, unit transpositions, and copy-paste errors across tabs.

Root cause: Reliance on spreadsheet-based data collection where values are manually typed from source documents. No automated validation checks, range checks, or reconciliation processes exist for ESG data — unlike the three-way matching and reconciliation controls standard in financial accounting.

Severity: Ranges from observation (isolated error within tolerance) to material (systematic pattern affecting reported totals by more than materiality threshold).

Fix:

  • Automate data feeds from source systems (utility provider portals, building management systems, HR information systems) directly into the ESG data platform
  • Implement automated validation rules: range checks (flag values outside 2 standard deviations from historical mean), completeness checks (flag missing months), and unit consistency checks
  • Introduce a maker-checker process where one person enters data and a second independently verifies against source documents
  • Perform monthly variance analysis to catch anomalies early — a 300% increase in water consumption at a site should trigger investigation, not a year-end surprise
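The validation rules above (range checks against historical variation, period-over-period variance analysis) can be sketched in a few lines. The history values and thresholds below are illustrative:

```python
from statistics import mean, stdev

def range_check(value, history, k=2.0):
    """Flag a submission outside k standard deviations of the
    site's historical mean (the screening threshold named above)."""
    mu, sigma = mean(history), stdev(history)
    return abs(value - mu) > k * sigma

def variance_flag(current, prior, threshold=0.5):
    """Flag a period-over-period change above the threshold (50% here)."""
    if prior == 0:
        return current != 0
    return abs(current - prior) / prior > threshold

history = [1020, 980, 1005, 995, 1010, 990]  # monthly kWh, illustrative
assert not range_check(1000, history)        # normal reading passes
assert range_check(10000, history)           # decimal-point slip is caught
assert variance_flag(4000, 1000)             # 300% jump triggers review
```

The point is not the specific thresholds but that the checks run automatically on every submission, rather than depending on someone noticing.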

Inconsistent Measurement Units

Finding: Different sites or business units report the same metric in different units without proper conversion. For example, one facility reports energy in kilowatt-hours (kWh), another in gigajoules (GJ), and a third in British thermal units (BTU) — and these are aggregated without conversion, or with incorrect conversion factors.

Root cause: Absence of a standardised data collection template that prescribes the required unit for each metric. Sites default to the unit provided by their local utility or measurement system. The consolidation team either does not notice the inconsistency or applies incorrect conversion factors.

Severity: Significant to material, depending on the magnitude of the conversion error and the proportion of data affected.

Fix:

  • Issue a group-wide ESG Basis of Reporting document that specifies the required unit for every metric (e.g., "all energy must be reported in MWh")
  • Build unit validation into the data collection platform — reject submissions in non-standard units or auto-convert with logged conversion factors
  • Maintain a conversion factor register with sources and version dates, reviewed annually for accuracy
  • Train site-level data providers on the required units and provide pre-formatted templates
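A unit validation and auto-conversion step might look like the sketch below. The kWh/GJ/MWh factors are fixed physical relationships; the source and version fields illustrate the provenance tracking the register should carry:

```python
# Conversion factors into the group-standard unit (MWh), with provenance
# recorded so the register can be reviewed annually as recommended above.
TO_MWH = {
    "kWh": {"factor": 0.001,   "source": "SI definition",  "version": "2024-01"},
    "GJ":  {"factor": 1 / 3.6, "source": "SI definition",  "version": "2024-01"},
    "MWh": {"factor": 1.0,     "source": "group standard", "version": "2024-01"},
}

def to_mwh(value, unit):
    """Convert a site submission to MWh; reject non-registered units."""
    if unit not in TO_MWH:
        raise ValueError(f"Unit {unit!r} is not in the approved register")
    return value * TO_MWH[unit]["factor"]

# 5,000 kWh and 36 GJ both land in the same unit before aggregation
site_a = to_mwh(5000, "kWh")   # 5.0 MWh
site_b = to_mwh(36, "GJ")      # 10.0 MWh
```

Rejecting unknown units outright (rather than silently passing them through) is what turns the template from guidance into a control.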

Practitioner Insight

Data quality findings are almost always traceable to the absence of ESG-specific data governance. Organizations invest heavily in financial data controls but treat sustainability data as a side project. The single most impactful remediation action is establishing formal data governance over ESG metrics — ownership, definitions, validation rules, and reconciliation processes — parallel to what exists for financial reporting.

Finding Category 2: Boundary Inconsistencies

Finding: The ESG reporting boundary does not align with the financial reporting boundary, or entities are included/excluded without documented rationale. Common manifestations include:

  • Joint ventures consolidated in financial statements but excluded from ESG metrics
  • Acquired entities included in financial results from acquisition date but ESG data only captured from the following year
  • Leased facilities excluded from energy and emissions data despite being within operational control
  • Divested entities included in sustainability data beyond the divestiture date

Root cause: ESG reporting boundaries are set independently of financial reporting, often by the sustainability team without consulting finance or group reporting. The GHG Protocol's operational control, financial control, and equity share approaches add complexity — and organizations frequently fail to document which approach they have selected and to apply it consistently.

Severity: Significant to material. Boundary misalignment can systematically overstate or understate metrics. A missing joint venture contributing 20% of total emissions, for instance, represents a material boundary gap.

Fix:

  • Document the organisational boundary approach in the Basis of Reporting and explicitly state whether operational control, financial control, or equity share is used
  • Reconcile the ESG entity list to the financial consolidation schedule annually, explaining every inclusion or exclusion
  • Establish a trigger mechanism: whenever corporate transactions (acquisitions, disposals, joint ventures) occur, the sustainability team is notified to assess boundary implications
  • Include boundary alignment as a standing agenda item in quarterly sustainability governance meetings
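The annual reconciliation of the ESG entity list to the financial consolidation schedule is, at its core, a set comparison. A sketch with hypothetical entity names:

```python
def reconcile_boundary(financial_entities, esg_entities, documented_exclusions):
    """Compare the ESG entity list to the financial consolidation schedule.

    Returns entities consolidated in the financials but absent from ESG
    data with no documented exclusion rationale (unexplained gaps), and
    entities still in ESG data but outside the financial boundary
    (e.g. reported beyond a divestiture date)."""
    fin, esg = set(financial_entities), set(esg_entities)
    unexplained = sorted(fin - esg - set(documented_exclusions))
    orphaned = sorted(esg - fin)
    return unexplained, orphaned

gaps, orphans = reconcile_boundary(
    financial_entities=["HoldCo", "OpCo A", "OpCo B", "JV Delta"],
    esg_entities=["HoldCo", "OpCo A", "Divested OpCo"],
    documented_exclusions=["JV Delta"],  # rationale on file, e.g. equity share
)
# gaps flag OpCo B; orphans flag the divested entity still being reported
```

An empty result from both lists, re-run each year, is exactly the evidence the assurance provider will ask for.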

Finding Category 3: Methodology Gaps

Finding: The calculation methodologies used for ESG metrics are not documented, inconsistently applied, or diverge from the criteria stated in the report. Examples include:

  • GHG emissions calculated using outdated emission factors (e.g., 2015 grid factors when the report references the GHG Protocol's requirement for "the most recent available data")
  • Different facilities using different emission factor sources for the same activity (e.g., one site using national grid factors, another using IPCC defaults, a third using supplier-specific factors) without documented rationale
  • Water stress assessments not conducted using a recognised tool (e.g., WRI Aqueduct) despite the report claiming alignment with GRI 303
  • Employee headcount calculated differently for gender diversity (FTE) versus safety metrics (total workforce including contractors) without disclosure

Root cause: Absence of a documented Basis of Reporting that specifies the exact methodology, data sources, emission factors, and calculation approach for each metric. When methodology is not documented, different people make different choices in different years, and consistency erodes.

Severity: Significant to material, depending on the impact of the methodology choice on reported values.

Fix:

  • Develop a comprehensive Basis of Reporting document — a stand-alone reference that specifies for each metric: definition, boundary, data source, calculation methodology, emission factor source and version, assumptions, estimation protocols, and responsible person
  • Review and update the Basis of Reporting annually, documenting any methodology changes and their impact on comparability
  • Centralise emission factor management — maintain a single, version-controlled register of all factors used, with sources and dates
  • Where methodology choices exist (e.g., location-based vs market-based Scope 2), document the rationale for the approach selected and report both where required by the criteria
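A centralised, version-keyed emission factor register can be as simple as the structure below. Factor values and source names are placeholders, not real published factors:

```python
# Minimal version-controlled factor register sketch. Every calculation
# records which factor vintage it used, so a finding about stale factors
# can be traced and corrected in one place.
EMISSION_FACTORS = {
    ("grid_electricity", "2024"): {"kgCO2e_per_kWh": 0.20,
                                   "source": "national grid inventory, 2024 release"},
    ("grid_electricity", "2023"): {"kgCO2e_per_kWh": 0.22,
                                   "source": "national grid inventory, 2023 release"},
}

def scope2_location_based(kwh, activity, vintage):
    """Location-based Scope 2 estimate using a registered factor,
    returning the factor's provenance alongside the result."""
    entry = EMISSION_FACTORS[(activity, vintage)]
    return kwh * entry["kgCO2e_per_kWh"], entry["source"]

emissions, source = scope2_location_based(10_000, "grid_electricity", "2024")
# 10,000 kWh x 0.20 kgCO2e/kWh = 2,000 kgCO2e, with provenance attached
```

Because the register is the only place factors live, updating to a new vintage is a single, auditable change rather than an edit scattered across spreadsheets.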

Finding Category 4: Control Deficiencies

Finding: Internal controls over ESG data collection, aggregation, and reporting are absent, informally applied, or not operating effectively. Common control deficiencies include:

  • No segregation of duties — the same person collects, enters, aggregates, and reports ESG data without independent review
  • No formal review and approval process for ESG data before publication
  • Spreadsheet-based consolidation without version control, access controls, or change tracking
  • No reconciliation between ESG data and operational source systems (e.g., energy data in the sustainability report does not reconcile to utility invoices or building management system outputs)
  • Year-end "data gathering sprint" where 12 months of data is collected in 2-3 weeks, with no time for quality review

Root cause: ESG reporting has historically operated outside formal internal control frameworks. While financial reporting benefits from decades of COSO, SOX, and audit committee oversight, ESG data processes often lack equivalent governance infrastructure. The sustainability team may report to communications or corporate affairs rather than finance, which reduces exposure to control culture.

Severity: Significant. While individual control deficiencies may not cause a specific misstatement, their cumulative effect creates an environment where errors are likely to occur and go undetected.

Fix:

  • Map the end-to-end ESG data flow — from source measurement to published report — and identify control points where validation, review, or approval should occur
  • Implement a maker-checker process at minimum for all material ESG metrics
  • Migrate from spreadsheet-based consolidation to a dedicated ESG data management platform with built-in access controls, audit logging, and version management
  • Integrate ESG data controls into the organization's existing internal control framework (e.g., extend COSO or SOX-like testing to ESG processes)
  • Shift to quarterly data collection with review cycles rather than annual year-end exercises
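The maker-checker control recommended above can be enforced mechanically wherever data is approved. A minimal sketch with hypothetical user names:

```python
def check_entry(entry, checker):
    """Maker-checker control: the reviewer must be a different person
    from the one who entered the data (segregation of duties)."""
    if checker == entry["entered_by"]:
        raise ValueError("Checker must be independent of the maker")
    return {**entry, "status": "approved", "checked_by": checker}

entry = {"metric": "energy_kwh", "value": 1234.0, "entered_by": "site.analyst"}
approved = check_entry(entry, checker="regional.manager")
# approving your own entry raises, which is the control operating
```

A dedicated ESG platform enforces this natively; the point of the sketch is that even a spreadsheet-era process can hard-code the rule rather than rely on convention.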

Finding Category 5: KPI Definition Ambiguity

Finding: ESG key performance indicators (KPIs) are reported without clear, documented definitions, leading to inconsistent measurement across the organization or over time. Common examples include:

  • Employee headcount: Does it include contractors, temporary workers, part-time employees? Is it a point-in-time count or period average? FTE or headcount?
  • Waste diversion rate: Does "diversion from landfill" include energy recovery (incineration with energy capture)? What about waste-to-fuel? Is construction and demolition waste included?
  • Lost-time injury frequency rate (LTIFR): What constitutes a "lost-time" event — any absence beyond the day of injury, or only absences exceeding one calendar day? Are contractors included in the denominator (total hours worked)?
  • Gender pay gap: Mean or median? Base pay only or total remuneration? Full-time equivalent adjusted or actual pay?
  • Water consumption vs withdrawal: Are these terms used interchangeably (incorrectly), or is the distinction between water withdrawn and water consumed (withdrawn minus discharged) clearly defined?

Root cause: Reporting frameworks like GRI provide definitions, but implementation requires translating those definitions into operational measurement rules specific to the organization. When this translation is not documented, different sites interpret definitions differently, and year-over-year comparability is compromised.

Severity: Significant. Ambiguous definitions undermine the reliability and comparability of reported data, even if the underlying measurements are accurately captured.

Fix:

  • Create a KPI Definition Register — a controlled document specifying for each metric: the exact definition, inclusions and exclusions, measurement method, unit, frequency, responsible person, and the reporting framework reference it maps to
  • Circulate the KPI definitions to all data providers at the start of each reporting period with an acknowledgement requirement
  • Include the definitions (or a summary) in the published sustainability report's Basis of Reporting section — transparency about definitions builds assurance provider and reader confidence
  • Test definition understanding during internal ESG data audits — ask site contacts to explain how they calculate a metric and compare their answer to the documented definition
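A KPI Definition Register entry can be modelled as an immutable record so definitions cannot drift silently between periods. Field names follow the register contents listed above; the example values are illustrative:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class KpiDefinition:
    """One controlled entry in a KPI Definition Register."""
    metric: str
    definition: str
    inclusions: tuple
    exclusions: tuple
    unit: str
    frequency: str
    owner: str
    framework_ref: str

HEADCOUNT = KpiDefinition(
    metric="Employee headcount",
    definition="Period-average headcount (not FTE)",
    inclusions=("permanent", "fixed-term", "part-time"),
    exclusions=("contractors", "agency workers"),
    unit="persons",
    frequency="monthly",
    owner="Group HR Director",
    framework_ref="GRI 2-7",
)
```

The ambiguities in the bullet list above (contractors in or out, point-in-time or period average, FTE or headcount) each become an explicit field rather than an unstated assumption.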

Finding Category 6: Lack of Audit Trail

Finding: Reported ESG data cannot be traced back to source evidence. The practitioner requests supporting documentation for a reported metric and the organization cannot produce it, or the link between source data and reported figures involves undocumented manual adjustments, aggregations, or estimates.

Common audit trail breakdowns include:

  • Utility invoices not retained for the reporting period — the sustainability team received data from facilities management but the original invoices were discarded
  • HR system data exports used for headcount metrics were not archived — the system was updated since, and the original data cannot be reproduced
  • Consolidation spreadsheet contains "hardcoded" values without references to source tabs or external documents
  • Adjustments made during consolidation (e.g., pro-rating for acquisitions, currency conversion for financial ESG metrics) are not documented

Root cause: Absence of a document retention policy for ESG evidence. Financial reporting has clear retention requirements (typically 7+ years), but ESG evidence is often not subject to equivalent retention rules. Additionally, when ESG data moves through multiple people and systems before reaching the report, each handoff creates a potential audit trail break.

Severity: Significant to material. If the practitioner cannot verify reported data against source evidence, they cannot complete assurance procedures for those metrics — potentially resulting in a scope limitation.

Fix:

  • Establish an ESG evidence retention policy requiring source documents to be retained for a minimum of the assurance engagement period plus one additional year (to support year-over-year comparisons)
  • Maintain a structured evidence pack for each reporting period — organised by metric, with source documents, calculation workpapers, and reconciliations
  • Ensure the consolidation process is fully referenced — every cell in the consolidation should trace to either a source data submission, a documented calculation, or a clearly described adjustment
  • Use an ESG data management platform with inherent audit trail capabilities — automatic logging of data entries, changes, approvals, and source document attachments
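The "fully referenced consolidation" requirement can be tested programmatically: any consolidated figure without a source reference is a hardcoded value. A sketch with hypothetical rows:

```python
def hardcoded_values(consolidation_rows):
    """Return the metrics whose consolidated figure carries no reference
    to source evidence (the 'hardcoded value' break described above)."""
    return [row["metric"] for row in consolidation_rows
            if not row.get("source_ref")]

rows = [
    {"metric": "energy_mwh", "value": 412.0,
     "source_ref": "evidence/2024/energy/plant-a.pdf"},
    {"metric": "water_m3", "value": 90.0,
     "source_ref": None},  # no audit trail: this is what the check flags
]
unreferenced = hardcoded_values(rows)
```

Run over the full consolidation before the engagement starts, this is a cheap automated version of the documentation dry run described below.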

Pre-Assurance Tip

Before the formal assurance engagement begins, conduct a "documentation dry run" — select 5-10 metrics at random and attempt to trace each from the published report figure back to source evidence. If you cannot do it in under 30 minutes per metric, your audit trail has gaps that the assurance provider will find.

Finding Category 7: Scope 3 Estimation Issues

Scope 3 greenhouse gas emissions are the fastest-growing source of ESG assurance findings. As organizations expand their emissions reporting beyond Scope 1 and 2, the inherent complexity and estimation uncertainty of value-chain emissions create multiple finding opportunities.

Common Scope 3 Findings

  • Incomplete category screening: Organizations report selected Scope 3 categories without documented rationale for excluding others. The GHG Protocol requires a screening assessment of all 15 categories with documented exclusion justification.
  • Undisclosed estimation methods: Spend-based, activity-based, or average-data methods are used without disclosure of which method applies to which category, or the rationale for the method selected.
  • Stale or inappropriate emission factors: Spend-based calculations using generic economy-wide factors when sector-specific factors are available. Or factors from databases that have not been updated for several years.
  • No supplier-specific data: Organizations default to secondary data (industry averages, spend-based estimates) without attempting to collect primary data from key suppliers — even when those suppliers have the data available.
  • Double counting: Emissions counted in more than one category (e.g., upstream transportation counted in both Category 4 and Category 9) or Scope 3 figures overlapping with suppliers' reported Scope 1 and 2.
  • No sensitivity analysis: Significant estimation assumptions (e.g., emission factors for purchased goods, end-of-life treatment assumptions) are not tested for sensitivity — the organization cannot demonstrate the potential impact of alternative assumptions on total Scope 3.

Root cause: Scope 3 reporting is inherently estimation-heavy, and many organizations are still building their value-chain data infrastructure. However, the root cause of findings is usually not the estimation itself (which is expected and acceptable) but the lack of transparency about the estimation process. Practitioners need to verify that the estimation is reasonable and consistently applied — which requires documentation the organization has not produced.

Severity: Significant to material, depending on the proportion of total emissions represented by Scope 3 and the magnitude of the estimation uncertainty.

Fix:

  • Conduct and document a comprehensive Scope 3 category screening using the GHG Protocol's criteria (size, influence, risk, stakeholder concern, outsourcing, sector guidance)
  • For each reported category, document: the calculation method (spend-based, activity-based, supplier-specific, hybrid), the data sources, the emission factors used with their provenance, the key assumptions, and the data quality rating
  • Develop a supplier engagement programme to progressively replace secondary data with primary (supplier-specific) data for the top contributors to Scope 3
  • Perform sensitivity analysis on key assumptions and disclose the results — this demonstrates awareness of uncertainty and strengthens credibility
  • Clearly disclose data quality per category using the GHG Protocol's 1-5 data quality scale, and set improvement targets for moving categories from lower to higher quality over time
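The spend-based calculation and the sensitivity analysis recommended above can be sketched together. Spend figures and emission factors below are placeholders, not real data:

```python
def spend_based_emissions(spend, factors):
    """Spend-based Scope 3 estimate: spend (currency units) x emission
    factor (kgCO2e per currency unit), per category."""
    return {cat: amount * factors[cat] for cat, amount in spend.items()}

def sensitivity(spend, factors, category, delta=0.2):
    """Total Scope 3 under a +/- delta variation of one category's
    factor, to show the impact of the estimation assumption."""
    def total(f):
        return sum(spend_based_emissions(spend, f).values())
    low = {**factors, category: factors[category] * (1 - delta)}
    high = {**factors, category: factors[category] * (1 + delta)}
    return total(low), total(factors), total(high)

spend = {"purchased_goods": 1000.0, "upstream_transport": 500.0}
factors = {"purchased_goods": 0.5, "upstream_transport": 0.3}
lo, base, hi = sensitivity(spend, factors, "purchased_goods")
# base total 650 kgCO2e; a 20% factor swing moves it to 550-750
```

Disclosing that range alongside the point estimate is precisely the transparency about estimation that prevents the finding in the first place.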

Findings Summary Table

The following table summarises the most common ESG assurance findings by category, typical severity, root cause pattern, and recommended remediation timeline.

Finding Category            | Typical Severity | Root Cause Pattern                           | Remediation Timeline
Incomplete data sets        | Material         | Decentralised collection, no tracking        | 3-6 months
Manual transcription errors | Significant      | Spreadsheet reliance, no validation          | 1-3 months
Inconsistent units          | Significant      | No standard templates or guidance            | 1-2 months
Boundary inconsistencies    | Material         | ESG/financial boundary disconnect            | 2-4 months
Methodology gaps            | Significant      | No Basis of Reporting document               | 2-3 months
Control deficiencies        | Significant      | ESG outside control frameworks               | 3-6 months
KPI definition ambiguity    | Significant      | Undocumented operational definitions         | 1-2 months
Lack of audit trail         | Material         | No retention policy for ESG evidence         | 2-4 months
Scope 3 estimation issues   | Material         | Immature value-chain data, no documentation  | 6-12 months

Systematic Remediation Approach

Addressing ESG assurance findings requires a systematic approach rather than ad hoc fixes. The following framework helps organizations prioritise and execute remediation effectively.

Step 1: Triage and Prioritise

Not all findings require immediate action. Triage findings by:

  • Impact on assurance conclusion: Material findings that affect the conclusion must be addressed first — ideally before the statement is finalised
  • Effort to remediate: Some fixes (e.g., documenting KPI definitions) are low-effort but high-impact. Prioritise quick wins that demonstrate commitment
  • Systemic vs isolated: Systemic issues (e.g., no data governance framework) should take priority over isolated errors, as they prevent recurrence of multiple finding types

Step 2: Assign Ownership and Timelines

Every finding needs a named owner (not a department — a person), a specific remediation action, a target completion date, and defined evidence of completion. Track remediation in a formal action log that is reviewed at regular governance meetings.

Step 3: Address Root Causes, Not Symptoms

Many organizations fix the specific error identified in the finding without addressing the underlying cause. If the finding is "energy data missing for three sites," the symptom fix is to obtain the missing data. The root cause fix is to implement a data collection platform with automated reminders, submission tracking, and escalation. Without root cause remediation, the same finding will recur next year.

Step 4: Conduct Pre-Assurance Readiness Review

Before the next formal assurance engagement, conduct an internal readiness review (or engage an external provider for a pre-assurance assessment) that specifically tests whether prior-year findings have been effectively remediated. This catches any gaps before the formal engagement and demonstrates to the assurance provider that findings are taken seriously.

Step 5: Build Maturity Progressively

ESG data maturity is a multi-year journey. Set realistic expectations:

  • Year 1: Establish basic data governance, document methodologies, implement manual controls
  • Year 2: Automate data collection, implement validation checks, achieve limited assurance over key metrics
  • Year 3: Extend assurance scope, move toward reasonable assurance for critical metrics, implement continuous monitoring

The organizations that achieve clean assurance opinions are not those with perfect data — they are those with documented processes, transparent limitations, and demonstrated year-over-year improvement in data quality and controls.

Frequently Asked Questions

What are the most common ESG assurance findings?

The most common findings relate to data quality issues (incomplete data, manual transcription errors, inconsistent units), boundary and scope inconsistencies, methodology gaps in emission calculations, weak internal controls over ESG data, ambiguous KPI definitions, lack of audit trail, and Scope 3 estimation issues. Data quality findings alone account for roughly 40% of all observations.

How many findings are typical in an ESG assurance engagement?

For a first-time limited assurance engagement, 8-15 findings of varying severity are typical. Organizations with mature sustainability reporting processes may see 3-6 findings. The number depends on scope breadth, data maturity, and the number of metrics assured.

What causes ESG data quality failures during assurance?

ESG data quality failures typically stem from manual data collection in spreadsheets, lack of automated validation checks, inconsistent measurement units across sites, missing source documentation, and absence of formal data governance over sustainability metrics.

How do you fix Scope 3 estimation issues flagged during assurance?

Fix Scope 3 findings by documenting category screening and materiality rationale, disclosing estimation methods and data sources for each category, using activity-based data where available, maintaining source evidence, performing sensitivity analysis on key assumptions, and clearly disclosing data quality ratings per category.

Can ESG assurance findings prevent issuance of an assurance statement?

In extreme cases, yes. If findings are so pervasive that the practitioner cannot form a conclusion, they may issue a disclaimer or decline to issue a statement. More commonly, material findings result in a qualified conclusion or scope limitation. Organizations should use the pre-assurance readiness phase to identify and fix critical issues before the formal engagement.