Rethinking Media Literacy 2025
7.2 Scaling up effective interventions
While progress has been made, MIL initiatives
remain fragmented and underfunded. To build a
more resilient society, stakeholders should:
–Integrate MIL into AI governance:
Regulate synthetic media by requiring clear
labelling of AI-generated content (AIGC)
and introducing platform-level friction (e.g.
interstitial warnings or slowed sharing).
Example: A single platform’s new policies
on AI content labelling are a useful first
step, but standards should be harmonized
across platforms.
–Embed MIL in everyday digital experiences:
Users should encounter MIL nudges – such
as credibility warnings, source reminders
or contextual explanations – during routine
engagement online.
Example: X’s Community Notes allow users
to collaboratively add context to misleading
posts, crowd-sourcing fact-checking in a
way that encourages active participation
and reflection.
–Educate users on the why and the how:
While users often learn how to fact-check
or spot bias, they are rarely taught why they
see certain content. AI-driven algorithms in
combination with user preferences, platform
features and supply-and-demand dynamics
amplify content that captures attention.58
Example: Integrate the political economy
of social media into broader MIL curricula.
–Support lifelong-learning initiatives:
Expand MIL access beyond schools by
incorporating modules into workforce
development, civic education programmes
and digital onboarding.
Example: BC4D includes MIL training for
adult employees, not just students.
–Incentivize private-sector participation:
While technology companies have taken
steps to support MIL, these efforts need to
be built on and scaled across all industries.
Companies not yet doing so should be
encouraged to co-create MIL experiences –
beyond content moderation – with educators,
fact-checkers and civil society.
Example: TikTok’s election hubs that
direct users to authoritative information (for
example, national election commission sites)
demonstrate how platforms can embed
media literacy interventions at scale.59
–Enhance evaluation mechanisms:
Impact measurement is essential for
scaling what works. Invest in independent,
longitudinal evaluations that track behavioural
change across populations and platforms.
Example: The UK regulator Ofcom
conducts regular Digital Literacy Tracker
surveys on adults’ attitudes.60
–Promote long-term investment strategies:
Shift from one-off grants to sustained,
multi-year support for MIL ecosystems. This
ensures consistency, local ownership and
better adaptation to new threats.
Example: Sustained, multi-year commitments
by funding organizations to national MIL
strategies and infrastructure.
–Develop “pre-bunking” initiatives:
Rather than only correcting misinformation
after exposure, invest in pre-bunking
strategies that teach people to recognize
common disinformation tactics.
Example: Google and Jigsaw’s YouTube
ads based on inoculation theory –
which teach users about disinformation
techniques such as scapegoating – have
shown measurable improvements in users’
resistance to manipulation.61
–Adapt to emerging threats:
Disinformation evolves with technology,
especially with GenAI, deepfakes and algorithmic
manipulation. MIL must remain agile and
incorporate the latest insights on digital deception.
Example: Initiatives can equip journalists,
educators and others with tools to identify
AIGC and teach others to do the same.