Mastering Forward-looking Data Challenges in 2024 and Beyond
Financial institutions and other companies that apply IFRS 9 need accurate, fully compliant models and reports for Expected Credit Loss (ECL) calculations, and that means incorporating forward‑looking data into PD, LGD and EAD Models. This article explains the core forward‑looking data challenges, shows how they affect model outputs and regulatory disclosures (including IFRS 7 Disclosures), and gives practical, step‑by‑step guidance, tools and checklists to reduce model risk, improve Governance and deliver transparent Risk Committee Reports and Model Validation artifacts.
Why forward‑looking data challenges matter for IFRS 9 practitioners
IFRS 9 requires expected credit losses to reflect reasonable and supportable forward‑looking information — macroeconomic scenarios, management overlays and observable market signals. For credit risk managers, model owners, Finance and Audit, shortcomings in forward‑looking inputs lead to materially different ECL balances, more work during Model Validation, and increased scrutiny in IFRS 7 Disclosures. The interaction between business judgment, model mechanics and data availability creates a tight governance problem: you must document assumptions, defend scenario selection and produce evidence for the Risk Committee and external auditors.
Many institutions face implementation friction because of technical constraints; see how these are described in the wider context of IFRS 9 technical challenges. Regulatory expectations also amplify the need for defensible and traceable forward‑looking inputs; for a discussion on that subject see IFRS 9 regulatory challenges.
Core concept: definition, components and clear examples
Definition — what we mean by forward‑looking information
Forward‑looking information consists of inputs into ECL that represent expected future economic conditions or borrower behaviour. Typical components include: a set of macroeconomic scenarios (base, upside, downside), scenario probabilities, stress adjusters, management overlays, and indicator series for PD, LGD and EAD Models.
Key components and how they feed models
- Macroeconomic scenarios: GDP, unemployment, house prices, oil price paths used directly or via macro‑to‑risk mappings.
- Scenario weights: probabilities assigned to each scenario to produce weighted ECL outcomes.
- Model mappings: explicit functions or look‑ups mapping macro variables to PD/LGD/EAD changes.
- Management overlays: expert adjustments with supporting rationale and forward evidence.
- Trigger indicators: timely signals to move accounts between Three‑Stage Classification categories.
Example — applying one macro variable to PDs
Example: a retail unsecured portfolio with a baseline 12‑month PD of 2.5%. Historical regression indicates a 0.4 percentage point increase in PD for every 1 percentage point increase in unemployment. If the downside scenario forecasts unemployment rising by 3 percentage points vs baseline, the adjusted PD in that scenario = 2.5% + (0.4pp × 3) = 3.7%. Repeat for LGD and EAD paths and weight outcomes across scenarios to compute lifetime ECL for Stage 2 accounts.
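The arithmetic above can be sketched in a few lines of Python. Only the PD figures come from the worked example; the LGD, EAD and scenario weights are hypothetical placeholders you would replace with portfolio-specific values.

```python
# Hypothetical figures for illustration; the sensitivity (0.4pp PD per 1pp
# unemployment) mirrors the worked example in the text.
BASE_PD = 0.025          # baseline 12-month PD (2.5%)
PD_SENSITIVITY = 0.004   # PD uplift per 1pp rise in unemployment
LGD = 0.60               # assumed loss given default
EAD = 10_000_000         # assumed exposure at default

# Scenario name -> (unemployment delta vs baseline in pp, probability weight)
scenarios = {
    "base":     (0.0, 0.50),
    "upside":   (-1.0, 0.20),
    "downside": (3.0, 0.30),
}

def adjusted_pd(unemployment_delta_pp: float) -> float:
    """Shift the baseline PD by the regression-implied sensitivity."""
    return max(BASE_PD + PD_SENSITIVITY * unemployment_delta_pp, 0.0)

# Probability-weighted ECL across the scenario set
weighted_ecl = sum(
    weight * adjusted_pd(delta) * LGD * EAD
    for delta, weight in scenarios.values()
)
print(f"Downside PD: {adjusted_pd(3.0):.1%}")   # 3.7%, as in the example
print(f"Probability-weighted ECL: {weighted_ecl:,.0f}")
```

In practice the same pattern is repeated per scenario for LGD and EAD paths before weighting, rather than holding them constant as here.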
Where institutions lack internal statistical strength, augmenting with external indicators and careful judgement is common practice; see practical considerations around Forward-looking ECL data to maintain auditability.
Practical use cases and recurring scenarios
Use case 1 — Quarterly Risk Committee report
Situation: the committee expects a narrative on how forward‑looking inputs changed since last quarter and the sensitivity of the ECL to macro scenarios. Deliverables: scenario charts, delta analysis (ECL vs prior), and a sensitivity table showing % change in ECL per 1% GDP shock. Use the table to explain whether movements are model‑driven (PD mapping) or portfolio‑driven (mix changes).
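A delta-and-sensitivity table of this kind can be produced with a short script. All portfolio figures below are hypothetical; in practice the three ECL numbers per portfolio would come from the prior-quarter run, the current run, and a shocked rerun of the models.

```python
# Hypothetical portfolio-level ECL figures for a quarterly committee pack:
# name -> (ECL prior quarter, ECL current quarter, ECL under -1pp GDP shock)
portfolios = {
    "retail_unsecured": (1_200_000, 1_350_000, 1_480_000),
    "mortgages":        (3_400_000, 3_300_000, 3_520_000),
    "sme":              (2_100_000, 2_250_000, 2_600_000),
}

print(f"{'Portfolio':<18}{'Delta vs prior':>16}{'% per -1pp GDP':>16}")
for name, (prior, current, shocked) in portfolios.items():
    delta = current - prior                        # quarter-on-quarter movement
    shock_pct = (shocked - current) / current * 100  # sensitivity to the GDP shock
    print(f"{name:<18}{delta:>16,}{shock_pct:>15.1f}%")
```

The committee narrative then explains each delta: a large shock sensitivity with a small quarter-on-quarter delta suggests a model-driven exposure, while a delta out of line with the sensitivity points to portfolio mix changes.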
Use case 2 — Model Validation and Audit
Situation: the validation team needs reproducible mapping files and evidence for scenario selection. Provide versioned code or spreadsheets, back‑tests of scenario weighting, and a sensitivity testing exercise demonstrating robustness — particularly for PD, LGD and EAD Models.
Use case 3 — Stress testing and capital planning
Situation: regulators ask for ECL under regulatory stress. Link your forward‑looking scenarios to stress test severities and show the correlated impact on capital metrics and provisioning. Teams often use scenario overlays and scenario probability shifts in adverse conditions; document governance and escalation paths for ad hoc scenario changes.
Data workflows
Forward‑looking workflows often fail at data collection and integration. Address known Data collection challenges early: create an inventory of external feeds, internal ledgers, and derived indicators and document feed frequency and quality checks.
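A feed inventory with automated quality and freshness checks can be as simple as the sketch below. The feed names, owners, scores and thresholds are all illustrative assumptions.

```python
from dataclasses import dataclass

@dataclass
class DataFeed:
    """One entry in the forward-looking data inventory (illustrative fields)."""
    name: str
    owner: str
    frequency: str         # e.g. "monthly", "quarterly"
    quality_score: float   # 0-100, from automated checks
    last_refreshed_days: int

# Hypothetical inventory entries
inventory = [
    DataFeed("unemployment_rate", "Economics", "monthly", 98.0, 12),
    DataFeed("house_price_index", "Risk Data", "quarterly", 91.5, 95),
    DataFeed("obligor_ratings", "Credit Risk", "monthly", 99.2, 8),
]

def flag_stale_or_poor(feeds, min_quality=95.0, max_age_days=45):
    """Return feeds breaching quality or freshness thresholds for escalation."""
    return [f.name for f in feeds
            if f.quality_score < min_quality or f.last_refreshed_days > max_age_days]

print(flag_stale_or_poor(inventory))  # house_price_index breaches both rules
```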
Impact on decisions, performance and financial reporting
Forward‑looking information affects multiple outcomes: reported ECL provisions, volatility in profit and loss, capital adequacy, and stakeholder confidence. Clear, defensible forward‑looking inputs provide:
- More accurate provisioning timing — reducing unexpected provisioning shocks to the income statement.
- Improved capital planning — earlier detection of deteriorating scenarios helps preserve the CET1 buffer.
- Better governance — transparent scenario logic helps Model Validation and the Risk Committee evaluate judgemental overlays.
- Clearer disclosures — IFRS 7 Disclosures require explanation of forward‑looking assumptions and sensitivity testing; well structured inputs improve disclosure quality.
Advanced techniques such as ensemble modelling and alternative data sources can reduce bias and improve calibration; learn when to adopt them in our note on Using big data in ECL.
Common mistakes and how to avoid them
Mistake 1 — Over‑reliance on a single scenario
Some teams report only the base case. Avoid this by enforcing multi‑scenario reporting with transparent weights and running sensitivity testing across reasonable alternative paths.
Mistake 2 — Weak documentation of judgemental overlays
Judgement without evidence is a red flag during Model Validation. Create an overlay register: rationale, supporting indicators, owner, effective date, and rollback criteria.
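The overlay register can be enforced as a typed record so that an entry missing any field simply cannot be created. The schema mirrors the fields listed above; the example row and reporting date are hypothetical.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class OverlayEntry:
    """One row of the overlay register described above (illustrative schema)."""
    overlay_id: str
    rationale: str
    supporting_indicators: list
    owner: str
    effective_date: date
    rollback_criteria: str
    reassessment_date: date

register = [
    OverlayEntry(
        overlay_id="OVL-2024-003",
        rationale="Model under-predicts PD for energy-intensive SMEs after gas price shock",
        supporting_indicators=["wholesale gas prices", "SME arrears early-warning index"],
        owner="Head of Credit Risk",
        effective_date=date(2024, 3, 31),
        rollback_criteria="Gas prices within 10% of long-run average for two quarters",
        reassessment_date=date(2024, 9, 30),
    )
]

# Simple governance check: surface overlays whose re-assessment date has passed
REPORTING_DATE = date(2025, 1, 1)
overdue = [o.overlay_id for o in register if o.reassessment_date < REPORTING_DATE]
print("Overdue re-assessments:", overdue)
```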
Mistake 3 — Ignoring data shortage or poor quality
Data scarcity is a root cause of unreliable forward‑looking inputs — consider methods covered in IFRS 9 data shortage for acceptable proxies and conservative adjustments. Maintain a data quality score for each feed and threshold rules for conservative treatment.
Mistake 4 — Not performing sensitivity and scenario testing
Build a routine Sensitivity Testing program to quantify how changes in each macro vector affect PD, LGD and EAD. Sensitivity testing is a core component of IFRS 7 Disclosures and a focus area for auditors.
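A basic one-factor-at-a-time routine illustrates the idea: shock each macro driver by one unit and record the ECL delta. The linear sensitivities and baseline figures below are hypothetical placeholders for model-derived values.

```python
# One-factor-at-a-time sensitivity sketch; all figures are illustrative.
base_macros = {"gdp_growth": 1.5, "unemployment": 4.0, "house_prices": 2.0}

# Assumed linear ECL response per unit move in each driver
ecl_sensitivity = {"gdp_growth": -90_000, "unemployment": 140_000, "house_prices": -60_000}
BASE_ECL = 2_000_000

def ecl_under(macros):
    """Linear approximation of ECL given shocked macro inputs."""
    return BASE_ECL + sum(
        ecl_sensitivity[k] * (macros[k] - base_macros[k]) for k in base_macros
    )

for driver in base_macros:
    shocked = dict(base_macros)
    shocked[driver] += 1.0  # +1 unit shock to this driver only
    delta = ecl_under(shocked) - BASE_ECL
    print(f"{driver:<14} +1 unit -> ECL delta {delta:+,.0f}")
```

Real programs replace the linear approximation with full model reruns, but the output (a delta per driver) is the same artefact auditors expect to see.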
Mistake 5 — Siloed governance
Forward‑looking inputs must be a cross‑functional activity — Risk, Finance, Treasury and Business lines. Establish a simple change‑control board with clear escalation to the Risk Committee for material changes.
If you want to understand more of the upstream causes of these mistakes and remedial patterns, review our article on IFRS 9 technical challenges.
Practical, actionable tips and a checklist
Use the checklist below as a minimum starting point to make forward‑looking inputs robust and auditable.
- Inventory: catalog all external and internal data feeds, their owners and frequency.
- Scenario library: maintain versioned macro scenarios (base/up/down) and explicit probabilities.
- Mapping rules: document functional forms linking macros to PD, LGD and EAD, and keep back‑test evidence.
- Sensitivity Testing: publish a sensitivity matrix for the Risk Committee each quarter.
- Model Validation pack: include data lineage, code, parameter bridges and stress test outputs.
- Governance: establish owners for scenario design, sign‑off thresholds and communication plans.
- Fallbacks: decide pre‑approved fallback rules for missing data or abrupt feed failures — see our considerations on ECL data.
- Automation: automate scenario refreshing and delta reporting to reduce manual error and speed up month‑end close.
- Training: run quarterly workshops with business heads and auditors so they understand the mapping logic and limitations.
- Tooling: evaluate purpose‑built IFRS 9 solutions to reduce manual steps — a practical starting point is to compare options in IFRS 9 solutions.
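The "Fallbacks" item in the checklist can be made concrete with pre-approved rules coded ahead of time, so that a missing or stale feed triggers a documented conservative treatment rather than an ad hoc fix. The feed names, uplift factors and carried-forward value below are all illustrative assumptions.

```python
# Pre-approved fallback rules for missing or stale feeds (illustrative values).
FALLBACKS = {
    # feed name -> (fallback strategy, conservative adjustment multiplier)
    "unemployment_rate": ("carry_forward_last_value", 1.10),
    "house_price_index": ("use_regulatory_scenario_path", 1.00),
}

LAST_KNOWN = {"unemployment_rate": 4.2}  # would come from the versioned archive

def resolve_input(feed: str, latest_value, is_stale: bool):
    """Return a usable value plus its provenance, applying fallbacks if needed."""
    if latest_value is not None and not is_stale:
        return latest_value, "live"
    strategy, uplift = FALLBACKS[feed]
    if strategy == "carry_forward_last_value":
        # Reuse the last known value with a conservative uplift
        return LAST_KNOWN[feed] * uplift, strategy
    # Anything not automatable escalates to the change-control board
    raise NotImplementedError(f"Escalate to change-control board: {strategy}")

value, source = resolve_input("unemployment_rate", latest_value=None, is_stale=True)
print(value, source)
```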
Quick process for a scenario update (step‑by‑step)
- Trigger: Economics team issues a revised macro baseline.
- Impact assessment: data team runs mappings to PD/LGD/EAD and produces delta ECL for each portfolio.
- Validation: Model Validation reviews methodology changes; run Sensitivity Testing and materiality checks.
- Governance: submit summary to Risk Committee with recommended scenario weights and disclosure text.
- Publication: update IFRS 7 Disclosures, finalize closing entries and archive versioned artefacts.
KPIs / success metrics
- Percentage variance in ECL explained by forward‑looking inputs (target: >90% traceability)
- Number of scenario runs automated per quarter (target: ≥3 — base/up/down)
- Time to produce Risk Committee delta pack from scenario release (target: ≤5 business days)
- Model Validation findings related to forward‑looking inputs per year (target: zero major findings)
- Data feed availability (uptime) and data quality score (target: 99% uptime, quality >95%)
- Disclosure completeness score for IFRS 7 (internal audit metric)
FAQ
How do we choose scenario probabilities?
Scenario probabilities should reflect management’s best estimate of likelihood and be supported by observable indicators where possible (market-implied probabilities, economic forecasts). Document the rationale and perform sensitivity testing to show how different weights affect ECL.
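One way to evidence that sensitivity is to recompute the weighted ECL under alternative weight sets and show the spread. The per-scenario ECL figures and weight sets below are hypothetical.

```python
# Show how alternative scenario weights move the weighted ECL (figures illustrative).
scenario_ecl = {"base": 1_500_000, "upside": 1_260_000, "downside": 2_220_000}

weight_sets = {
    "management_best_estimate": {"base": 0.50, "upside": 0.20, "downside": 0.30},
    "downside_tilted":          {"base": 0.40, "upside": 0.10, "downside": 0.50},
}

results = {}
for label, weights in weight_sets.items():
    assert abs(sum(weights.values()) - 1.0) < 1e-9  # weights must sum to 1
    results[label] = sum(weights[s] * scenario_ecl[s] for s in scenario_ecl)
    print(f"{label:<26} weighted ECL = {results[label]:,.0f}")
```

Presenting both figures side by side lets the Risk Committee see directly how much of the provision rests on the weighting judgement.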
When should we apply management overlays?
Use overlays when model output is demonstrably biased due to non‑stationary conditions, data shortage or sudden structural breaks (e.g., pandemic). Every overlay must have documented triggers, evidence, an owner and a planned re‑assessment date.
What level of granularity is required for PD/LGD/EAD mappings?
Granularity should balance predictive power and data availability. Retail portfolios often require cohort or segment mappings; wholesale may require facility‑level or obligor‑level mappings. Always provide back‑test evidence and a stability analysis.
How often should sensitivity testing be performed?
At minimum quarterly, and ad hoc when macro forecasts change materially or a significant model update occurs. Sensitivity Testing is an input to IFRS 7 Disclosures and supports Model Validation.
Next steps — practical call to action
Start by running a light‑touch review this quarter: produce a one‑page scenario map, run one sensitivity test for each material portfolio, and prepare a short pack for your Risk Committee. For institutions that need an integrated technical solution to automate scenario inputs, data lineage and reporting, consider trying eclreport to speed up model governance and produce compliant Risk Committee Reports and audit‑ready artefacts.
Action plan (30/60/90 days):
- 30 days: inventory data feeds, publish base/up/down scenarios and run a delta ECL report.
- 60 days: implement sensitivity testing templates, register overlays and assign owners.
- 90 days: automate scenario refresh pipelines and submit updated IFRS 7 Disclosures to Audit.
Contact eclreport for a trial or demo focused on your PD, LGD and EAD Models and governance workflows.
Reference pillar article
This article is part of a content cluster addressing implementation challenges for IFRS 9. For the full context and broader obstacles institutions commonly face when adopting IFRS 9, see the pillar article The Ultimate Guide: Key challenges institutions face when implementing IFRS 9 – an overview of the difficulties and why implementation is complex.