Rethinking Media Literacy 2025
4.5 Post-consumption (influence and impact)

The final stage addresses how disinformation can affect individuals, groups and society at large. Disinformers are adaptive and will continue to exploit loopholes in policy or prevention efforts. It is therefore important to plan for scenarios where such content continues to spread or take up oxygen in public discourse. However, the existence or reach of disinformation should not be mistaken for a set of foregone outcomes. Interventions must focus on creating a stronger feedback loop, so that lessons from previous episodes can inform mitigation and response. This includes measures to undermine disinformation already in circulation and safeguard those affected.

Marketplace
Strengthen in-platform safety features and user controls – for example, to limit screen time or
order newsfeeds chronologically. Allow consumers to withdraw consent for targeted
advertising or the sale of their personal data to third parties. Ensure recommender algorithms
do not produce a “rabbit hole” effect, whereby engaging with disinformation funnels users
towards more content of this nature.
As sources of information – including local news outlets and independent journalism – continue
to shrink or disappear, the information ecosystem grows more vulnerable: just as competition in
a market fuels better products, the loss of competing sources degrades the quality of information
on offer. This erosion not only weakens the marketplace of ideas
but also undermines the context and credibility needed for media literacy efforts to succeed.
Recognizing this reality, media literacy interventions must adapt by equipping users to
critically evaluate the information landscape as it is, while also supporting efforts to revitalize or
reimagine public-interest information infrastructure in both digital and offline spaces.
Supply
Publish profiles on the actors behind disinformation campaigns, including their known
tactics and suspected motivations. Invest in public databases of fact-checks and debunking
mechanisms, alongside enhanced tools for reporting and “trusted flagger” schemes. Develop
stronger legal frameworks that tackle disinformation in a fair and proportionate manner,
grounded in human rights and a nuanced assessment of harm. Improve support and redress
mechanisms for those victimized by disinformation, including digital safety and security
training. Arm high-trust communicators with the knowledge and resources to compete in a
saturated information space.
Demand
Conduct in-depth research to assess how people encounter disinformation in their everyday
lives and its corresponding effects (e.g. on perceptions, attitudes and behaviours). Produce
robust studies that quantify how such content can cause observable harm (e.g. mobilization
to violence, threats to public health, disruption to emergency response or democratic
processes). Adapt education materials and provision in line with these trends, in particular the
emergence of new technologies.
Rethinking Media Literacy: A New Ecosystem Model for Information Integrity