ClimateCheck 2026: Expanding Scientific Fact-Checking with Disinformation Narrative Classification
ClimateCheck is back for 2026, this time at the NSLP Workshop (LREC 2026) in Palma de Mallorca, Spain on May 12. The new edition expands beyond fact-checking to include disinformation narrative classification using the CARDS taxonomy—a structured framework for understanding not just whether claims are false, but how and why they mislead.
Participants will work with substantially more training data and compete on two tasks: the original scientific fact-checking challenge plus the new narrative classification task. The competition emphasizes sustainable, real-world NLP systems that balance accuracy with computational efficiency.
The Tasks
Task 1: Scientific Fact-Checking — The same abstract retrieval and claim verification challenge from 2025, now with more training data and refined evaluation metrics.
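One simple way to frame the verification half of this task is as natural language inference between a retrieved abstract and the claim. The sketch below is illustrative only: it assumes an off-the-shelf MNLI checkpoint (microsoft/deberta-large-mnli) and its entailment/neutral/contradiction labels, which may not match the official ClimateCheck label scheme or data format.

```python
# Minimal sketch: claim verification as NLI over (abstract, claim) pairs.
# The model name and its label set are assumptions; the task's own labels
# and evaluation format come from the official ClimateCheck data.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

MODEL_NAME = "microsoft/deberta-large-mnli"  # assumed off-the-shelf NLI checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_NAME)

def verify(claim: str, abstract: str) -> str:
    """Map an (abstract, claim) pair to the model's NLI label."""
    inputs = tokenizer(abstract, claim, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return model.config.id2label[int(logits.argmax(dim=-1))]

print(verify(
    "Global mean surface temperature has increased by about 1.1 °C since pre-industrial times.",
    "The planet has not warmed over the past century.",
))
```

In practice, participants would fine-tune such a model on the released training data rather than relying on generic MNLI labels, but the pair-encoding interface stays the same.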
Task 2: Disinformation Narrative Classification (New) — Classify climate claims according to the CARDS taxonomy. Beyond determining if a claim is false, identify which type of disinformation narrative it represents—enabling more targeted counter-strategies and deeper understanding of how climate misinformation spreads.
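A natural baseline for the new task is multi-class sequence classification over claims. The sketch below uses Hugging Face transformers with an illustrative label set loosely inspired by the CARDS super-claims; the official labels, their granularity, and the data format will be defined by the released dataset, and the classification head here is untrained until fine-tuned on that data.

```python
# Minimal sketch: narrative classification as multi-class sequence classification.
# LABELS is an illustrative, coarse approximation of CARDS-style super-claims,
# not the official task label set.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

LABELS = [
    "warming_not_happening",
    "humans_not_the_cause",
    "impacts_not_bad",
    "solutions_wont_work",
    "science_or_movement_unreliable",
]

MODEL_NAME = "distilroberta-base"  # any encoder checkpoint works as a starting point

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_NAME,
    num_labels=len(LABELS),
    id2label=dict(enumerate(LABELS)),
)
# NOTE: the classification head above is randomly initialized; it only becomes
# useful after fine-tuning on the released ClimateCheck 2026 training data.

def classify_narrative(claim: str) -> str:
    """Predict a narrative label for a single claim (after fine-tuning)."""
    inputs = tokenizer(claim, truncation=True, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return model.config.id2label[int(logits.argmax(dim=-1))]
```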
Timeline: Dataset release December 2025, submissions due February 2026, workshop May 12, 2026 in Palma de Mallorca. System papers encouraged.
Building on 2025
The 2025 competition demonstrated that fine-tuned BERT models can match LLM accuracy while running 11-380x faster, consuming dramatically less energy, and offering better explainability—advantages that matter for both sustainability and transparency. Hybrid retrieval also proved essential, significantly outperforming single-method approaches.
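For readers new to the retrieval side, a hybrid retriever of the kind that performed well in 2025 typically fuses a sparse ranking with a dense one. The sketch below is a minimal illustration using rank_bm25 and sentence-transformers combined via reciprocal rank fusion; the encoder choice, the fusion constant, and the toy corpus are assumptions for demonstration, not the setup of any 2025 system.

```python
# Minimal sketch: hybrid retrieval (sparse BM25 + dense embeddings) fused with
# reciprocal rank fusion. Corpus, model, and constants are illustrative only.
from rank_bm25 import BM25Okapi
from sentence_transformers import SentenceTransformer, util

corpus = [
    "Observed sea level rise has accelerated since the early 1990s.",
    "Arctic sea ice extent shows a strong declining trend in satellite records.",
    "Solar irradiance changes cannot explain recent warming.",
]
claim = "Sea levels are rising faster than they used to."

# Sparse ranking: BM25 over whitespace-tokenized abstracts.
bm25 = BM25Okapi([doc.lower().split() for doc in corpus])
sparse_scores = bm25.get_scores(claim.lower().split())
sparse_rank = sorted(range(len(corpus)), key=lambda i: -sparse_scores[i])

# Dense ranking: cosine similarity between bi-encoder embeddings.
encoder = SentenceTransformer("all-MiniLM-L6-v2")
doc_emb = encoder.encode(corpus, convert_to_tensor=True)
claim_emb = encoder.encode(claim, convert_to_tensor=True)
dense_scores = util.cos_sim(claim_emb, doc_emb)[0]
dense_rank = sorted(range(len(corpus)), key=lambda i: -float(dense_scores[i]))

# Reciprocal rank fusion: documents ranked highly by either method float to the top.
k = 60
fused = {
    i: 1 / (k + sparse_rank.index(i) + 1) + 1 / (k + dense_rank.index(i) + 1)
    for i in range(len(corpus))
}
for i in sorted(fused, key=fused.get, reverse=True):
    print(f"{fused[i]:.4f}  {corpus[i]}")
```

Rank fusion is only one way to combine the two signals; weighted score interpolation or a dense retriever followed by a cross-encoder reranker are common alternatives.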
The 2026 edition builds on these insights while pushing the field forward with narrative classification—moving from “is this claim false?” to “what type of disinformation strategy is being used?” The continued emphasis on sustainable systems reflects the reality that climate fact-checking tools should minimize their own environmental impact.
The competition is organized by Raia Abu Ahmad, Aida Usmanova, Max Upravitelev, and Georg Rehm as part of the NFDI4DS consortium. Students, early-career researchers, and interdisciplinary teams are encouraged to participate. More details at the competition website.