Interpretation and Reporting
Learn how to analyze LCA results, draw valid conclusions, and communicate findings effectively to different audiences.
The interpretation phase is where everything comes together. You've collected inventory data, calculated impact scores, and now you need to make sense of it all. This final phase of LCA ensures your conclusions are robust, your limitations are acknowledged, and your findings reach the right people in the right way.
The Purpose of Interpretation
Interpretation isn't just summarizing your results—it's a systematic process to:
- Identify significant findings from your inventory and impact assessment
- Evaluate the robustness and reliability of those findings
- Draw conclusions that are consistent with your goal and scope
- Make recommendations that are appropriate given the study's limitations
This phase runs iteratively alongside the other phases. As you interpret preliminary results, you may discover data gaps that send you back to inventory collection, or scope issues that require revisiting your goal definition.
The Three Elements of Interpretation
ISO 14044 defines three key elements for interpretation:
1. Identification of Significant Issues
Start by identifying what matters most in your results:
Contribution analysis breaks down total impacts by life cycle stage, process, or flow. This reveals hotspots—the processes or stages driving the majority of environmental burden.
A typical finding might be: "Raw material production accounts for 65% of the product's Global Warming Potential, with aluminum extraction alone contributing 40%."
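The mechanics of contribution analysis are simple: divide each stage's score by the total and flag stages above a chosen hotspot threshold. A minimal sketch (all stage values and the 20% threshold are hypothetical, chosen to mirror the finding quoted above):

```python
# Hypothetical stage-level GWP results (kg CO2 eq), for illustration only;
# real values come from your LCIA calculation.
stage_gwp = {
    "raw materials": 29.3,
    "manufacturing": 8.0,
    "transport": 3.7,
    "use": 1.5,
    "end of life": 2.5,
}

def contribution_analysis(impacts, hotspot_threshold=0.20):
    """Return each stage's share of the total impact and flag hotspots."""
    total = sum(impacts.values())
    shares = {stage: value / total for stage, value in impacts.items()}
    hotspots = [s for s, share in shares.items() if share >= hotspot_threshold]
    return shares, hotspots

shares, hotspots = contribution_analysis(stage_gwp)
for stage, share in sorted(shares.items(), key=lambda kv: -kv[1]):
    print(f"{stage:15s} {share:6.1%}")
print("Hotspots:", hotspots)
```

The same shares feed directly into the stacked bar charts or Sankey diagrams discussed below.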
Dominance analysis identifies which inventory flows contribute most to each impact category. Often, a small number of flows drive the majority of impacts:
| Impact Category | Dominant Flows |
|---|---|
| Climate change | CO₂ from energy, CH₄ from processes |
| Acidification | SO₂ from coal, NOₓ from transport |
| Eutrophication | Phosphate from agriculture, nitrate from fertilizers |
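Dominance analysis is the same arithmetic applied per flow rather than per stage: multiply each inventory flow by its characterization factor and rank the results. A sketch for climate change, using illustrative flow amounts and factors approximating IPCC AR6 GWP100 values:

```python
# Hypothetical inventory flow amounts (kg); illustrative only.
inventory = {"CO2": 40.0, "CH4": 0.12, "N2O": 0.002}
# Characterization factors (kg CO2 eq per kg), approximating IPCC AR6 GWP100.
gwp100_factors = {"CO2": 1.0, "CH4": 29.8, "N2O": 273.0}

def dominance(inventory, factors):
    """Rank flows by their contribution to one impact category."""
    scores = {flow: amount * factors[flow] for flow, amount in inventory.items()}
    total = sum(scores.values())
    return sorted(((flow, s, s / total) for flow, s in scores.items()),
                  key=lambda t: -t[1])

for flow, score, share in dominance(inventory, gwp100_factors):
    print(f"{flow}: {score:.2f} kg CO2 eq ({share:.0%})")
```

Repeating this per impact category produces the kind of dominant-flow summary shown in the table above.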
Anomaly detection checks for results that seem unexpected or inconsistent. If a minor process shows outsized impacts, investigate whether it reflects reality or a data error.
Create visual representations of contribution analysis—stacked bar charts or Sankey diagrams make hotspots immediately apparent and aid communication with stakeholders.
2. Evaluation
Once you've identified significant issues, evaluate whether your conclusions can be trusted:
Completeness check verifies that all relevant information is available:
- Are all processes within the system boundary accounted for?
- Have all relevant impact categories been addressed?
- Are there data gaps that could affect conclusions?
If you find gaps, either collect additional data or assess whether the gaps could change your conclusions.
Sensitivity analysis tests how results change when you vary key assumptions or data:
- Scenario 1 (baseline electricity mix): GWP = 45 kg CO₂ eq
- Scenario 2 (100% renewable electricity): GWP = 28 kg CO₂ eq
- Scenario 3 (100% coal electricity): GWP = 78 kg CO₂ eq
This tells you that the electricity source has high sensitivity—your conclusions about this product's climate impact depend significantly on this assumption.
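One common way to express this is as a signed relative change from the baseline. A minimal sketch using the electricity-mix numbers above (the scenario values themselves are illustrative):

```python
# Baseline and scenario GWP results (kg CO2 eq), mirroring the example above.
baseline_gwp = 45.0
scenarios = {"100% renewable electricity": 28.0, "100% coal electricity": 78.0}

def relative_change(scenario_value, baseline):
    """Signed change of a scenario result relative to the baseline."""
    return (scenario_value - baseline) / baseline

for name, value in scenarios.items():
    print(f"{name}: {relative_change(value, baseline_gwp):+.0%}")
```

A parameter whose plausible range shifts the result by tens of percent, as here, is one your conclusions must explicitly hedge against.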
Key parameters to test in sensitivity analysis:
- Energy sources and grid mixes
- Transport distances and modes
- Allocation methods for multi-output processes
- End-of-life scenarios
- Data choices where alternatives exist
Consistency check ensures methodological choices were applied uniformly:
- Were the same data quality requirements used throughout?
- Were system boundaries applied consistently across compared options?
- Were allocation methods consistent?
For comparative studies, consistency is critical. An apparent difference between products might actually reflect different methodological choices rather than real environmental performance differences.
Uncertainty analysis, while not explicitly required by ISO 14044, strengthens interpretation significantly. Monte Carlo simulation can quantify the probability that one option outperforms another.
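To make the Monte Carlo idea concrete, here is a minimal sketch comparing two options whose scores are modeled as lognormal (the medians, geometric standard deviation, and run count are all hypothetical):

```python
import math
import random

random.seed(0)  # fixed seed so the sketch is reproducible

def simulate_gwp(median, gsd, runs=10_000):
    """Draw lognormal samples; median and geometric SD parameterize the distribution."""
    mu, sigma = math.log(median), math.log(gsd)
    return [random.lognormvariate(mu, sigma) for _ in range(runs)]

# Hypothetical medians (kg CO2 eq) and a shared geometric SD of 1.2.
option_a = simulate_gwp(15.0, 1.2)
option_b = simulate_gwp(20.0, 1.2)

# Fraction of paired draws in which option A has the lower GWP.
p_a_better = sum(a < b for a, b in zip(option_a, option_b)) / len(option_a)
print(f"P(A < B) = {p_a_better:.2f}")
```

A statement like "option A has a lower GWP in roughly 9 out of 10 simulations" is far more defensible than a bare point-value comparison.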
3. Conclusions, Limitations, and Recommendations
This is where you synthesize everything into actionable findings:
Conclusions should:
- Answer the questions posed in the goal definition
- Be supported by the data and analysis
- Acknowledge the limitations of the study
- Distinguish between findings with high vs. low confidence
Limitations should honestly describe:
- Data gaps and their potential influence
- Assumptions that could affect conclusions
- Impact categories not covered
- Geographic or temporal restrictions on applicability
Recommendations should:
- Follow logically from the conclusions
- Be appropriate given the study's limitations
- Consider the needs of the intended audience
- Distinguish recommendations from conclusions
Reporting Your LCA
The way you report depends on your audience and the study's purpose. ISO 14044 distinguishes between third-party reports (for external communication) and internal reports.
Third-Party Report Requirements
For studies disclosed to the public, ISO 14044 requires comprehensive documentation including:
| Section | Content Required |
|---|---|
| Goal and scope | Intended application, audience, functional unit, system boundary, assumptions, limitations |
| Inventory analysis | Data collection procedures, calculation procedures, allocation methods |
| Impact assessment | LCIA methods used, impact categories, limitations of the assessment |
| Interpretation | Significant issues, sensitivity results, conclusions, recommendations |
| Critical review | Reviewer qualifications, responses to reviewer comments, review statement |
Tailoring Reports to Audiences
Different audiences need different information:
Technical audience (LCA practitioners, researchers)
- Full methodological details
- Complete data documentation
- Uncertainty and sensitivity results
- Appendices with inventory data
Business decision-makers
- Executive summary with key findings
- Visual representation of hotspots
- Clear recommendations
- Cost-benefit context where relevant
General public
- Plain language explanations
- Infographics and simple visuals
- Comparison to relatable benchmarks
- Avoidance of technical jargon
Be especially careful when communicating LCA results to non-technical audiences. Oversimplification can lead to misinterpretation, while excessive caveats may obscure genuine insights.
The Critical Review Process
ISO 14044 requires independent critical review for studies used for comparative assertions disclosed to the public. Even for other studies, critical review adds credibility.
Types of critical review:
- Internal expert review: review by someone independent of the study but within the same organization
- External expert review: review by an independent LCA expert
- Panel review: review by a committee including LCA, subject matter, and stakeholder experts (required for comparative assertions)
Critical reviewers assess:
- Compliance with ISO standards
- Scientific and technical validity of methods
- Appropriateness of data
- Transparency and consistency
- Alignment of conclusions with findings
Common Interpretation Mistakes
Avoid these pitfalls:
Overinterpreting small differences: A 5% difference between options may not be meaningful given data uncertainty. Report results as ranges or with confidence intervals where possible.
Ignoring trade-offs: Declaring a product "better" based on one category while ignoring worse performance in others misleads readers.
Extrapolating beyond scope: Results for one geographic region, time period, or application don't automatically apply elsewhere.
Confusing potential and actual impacts: LCA estimates potential environmental impacts, not verified damages. Language should reflect this distinction.
Hiding unfavorable results: Selective reporting undermines credibility. Present complete results, including those that may not support desired conclusions.
Communicating Uncertainty
Every LCA involves uncertainty. Transparent communication about uncertainty builds trust:
Qualitative approaches:
- Describe data quality using pedigree matrices
- Explain which parameters are well-characterized vs. uncertain
- Discuss the direction of potential bias
Quantitative approaches:
- Report confidence intervals where data permits
- Show sensitivity ranges for key parameters
- Present Monte Carlo results showing probability distributions
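A simple way to turn Monte Carlo samples into a reportable interval is the empirical percentile interval. A sketch with a small hypothetical sample (real studies would use thousands of runs):

```python
def percentile_interval(samples, lower=2.5, upper=97.5):
    """Empirical percentile interval from a list of Monte Carlo samples."""
    s = sorted(samples)
    def pct(p):
        # Nearest-rank style index into the sorted samples.
        k = max(0, min(len(s) - 1, round(p / 100 * (len(s) - 1))))
        return s[k]
    return pct(lower), pct(upper)

# Hypothetical GWP samples (kg CO2 eq) for illustration.
samples = [44.1, 45.3, 46.8, 43.2, 45.9, 44.7, 47.5, 42.8, 45.0, 46.1]
lo, hi = percentile_interval(samples)
print(f"95% interval: {lo:.1f} to {hi:.1f} kg CO₂ eq")
```

Reporting "43 to 48 kg CO₂ eq" instead of a single 45 honestly reflects what the data support.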
Visual approaches:
- Error bars on comparison charts
- Probability distributions for key outcomes
- Traffic light indicators for data quality
Key Takeaways
- Interpretation identifies significant issues, evaluates robustness, and draws conclusions
- Contribution and sensitivity analysis reveal what matters and how confident you can be
- Completeness and consistency checks ensure your findings are trustworthy
- Tailor reports to your audience while maintaining technical rigor
- Critical review adds credibility, especially for public comparative assertions
- Transparent communication of uncertainty and limitations builds trust
Practice Exercise
You've completed an LCA comparing two packaging options. Option A has lower GWP (15 vs. 20 kg CO₂ eq) but higher water use (50 vs. 30 L eq). Sensitivity analysis shows that transport distance assumptions could change GWP results by ±25%.
Draft a one-paragraph conclusion that:
- Accurately reflects the findings
- Acknowledges the trade-off
- Addresses the sensitivity of the GWP conclusion
- Avoids declaring an overall "winner"
Congratulations!
You've completed the LCA Foundations track. You now understand the complete LCA methodology from goal definition through interpretation. The next step is putting this knowledge into practice with real tools and data.
Explore these options:
- Software Tutorial Track: Learn to conduct LCAs using openLCA
- EPD Track: Understand how LCAs support Environmental Product Declarations
- Advanced Methods Track: Dive deeper into uncertainty, consequential LCA, and more
Further Reading
- ISO 14044:2006 Section 4.5 (Life Cycle Interpretation)
- Weidema, B.P., & Wesnæs, M.S. (1996). Data quality management for life cycle inventories. Journal of Cleaner Production 4(3-4), 167-174.
- Heijungs, R. (2020). On the number of Monte Carlo runs in comparative probabilistic LCA. International Journal of Life Cycle Assessment 25, 394-402.