The Intervention Journey: A Roadmap to Effective Digital Safety Measures 2025
overseeing the programme’s day-to-day operations.
To become a Lantern participant, companies
must undergo a rigorous application process and
compliance review before being legally admitted
into the programme.
Signals shared through Lantern are not decisions
or conclusions but pieces of information requiring
further investigation by receiving companies. Yet, a
single signal could be the missing piece that helps
safeguard a child.
Importantly, Lantern does not facilitate automated
actions on a platform. Participating companies
independently assess signals before sharing them
in Lantern – likewise, companies must confirm that
signals downloaded from Lantern correspond to
a policy violation on their platform prior to taking
enforcement actions. Companies must also provide
additional safeguards, such as a user appeals process.
Signals in Lantern are broadly categorized as
content-based or incident-based.
– Content-based signals focus on the shared
content related to OCSEA, such as CSAM
images or videos, manuals or other illegal
content. These signals are typically shared as
hashes, URLs or keywords.
– Incident-based signals address violations of
OCSEA policies where content may or may not
be shared, including minor sextortion, sexual
grooming, contact offences or trafficking.
These signals are usually shared as account
information, critical for identifying cross-platform
actors evading detection.
After investigating signals, companies can provide
feedback to Lantern on its use and outcomes,
in line with their policies and applicable law.

Feedback, measurement and transparency
The Tech Coalition uses several metrics to
measure the success of the Lantern programme
so far. In terms of participant growth, there has
been a steady increase since the launch in August
2023, with 25 companies currently participating.
Through December 2023, participating companies
identified, confirmed and took action on 30,989
accounts for violations of policies prohibiting CSEA.
In addition, 1,293 individual uploads of CSEA
material were removed, and 389 URLs/bulk uploads
(meaning a given URL could host numerous pieces
of content) of CSEA material were removed. This
is in addition to enforcement actions by individual
companies for terms of service violations.
Additionally, 768,044 signals have been uploaded
into Lantern.
All information is provided in aggregate, and the
outcomes were reported directly by participating
companies to the Tech Coalition. A key aspect
of this programme is that sharing is voluntary,
therefore not all participating companies have
shared signals or outcomes. As the programme
matures, the Tech Coalition plans to implement
ways to increase signal contributions and outcome
reporting from participating companies while
continuing to consider evolving risk factors.
The Tech Coalition and participating companies
will continue to refine the programme and uphold
privacy and human rights alongside its mission of
protecting young people online.