
Announcing the First Round of Grants for Platform Accountability in Europe

30/03/2026

We are pleased to announce the recipients of the first round of grants supporting innovative projects that address emerging challenges in Europe’s information ecosystem. 

This round’s selected initiatives reflect the growing complexity of the digital information environment, particularly the role of generative AI systems and large online platforms in shaping public understanding of key policy issues. The two selected projects have been granted a total of 25,000 EUR to investigate how online platforms and AI systems shape the production and spread of disinformation, and how they comply with regulatory and self-regulatory commitments.

Generative AI and the Transformation of Disinformation Narratives
Led by BIC Media

Generative AI systems are increasingly acting as intermediaries of information, which creates a critical challenge for the European fact-checking community. When users ask AI tools to explain topics such as EU sanctions, migration policy, or support for Ukraine, these systems do more than retrieve facts – they summarise, contextualise, and sometimes reshape narratives originating from broader media ecosystems.

The project will investigate how Belarusian and Russian media construct disinformation narratives about the European Union and how leading generative AI systems interpret and reproduce these narratives in response to user queries.

The project contributes directly to discussions under the Digital Services Act (DSA) and the EU AI Act by providing empirical evidence on how AI systems handle propaganda and disinformation, and whether they may contribute to systemic risks.

Platform Enforcement of Climate Disinformation Policies
Led by Maldita.es and Science Feedback

This project examines how YouTube and TikTok enforce their policies on climate disinformation in France and Spain, and whether these practices comply with obligations under the Digital Services Act and commitments under the EU Code of Practice on Disinformation.

Given the influence of video platforms, particularly among younger audiences, and the role of algorithmic amplification and monetisation, this project provides critical insights into how disinformation is managed in audiovisual environments.

The project tests compliance with key DSA provisions on terms and conditions, complaint handling, dispute resolution, and systemic risk mitigation. It also evaluates adherence to voluntary commitments under the Code of Practice on Disinformation, particularly regarding demonetisation and reduced amplification of misleading content.

Both projects aim to provide actionable insights for policymakers, platforms, and fact-checkers, contributing to a more transparent and resilient European information environment.
