Earning Trust for AI in Health 2025
FIGURE 1: Proposed framework for regulating AI in health

– Regulatory provisions: set the guardrails. Under the responsibility of governments or dedicated regulatory bodies, supported by independent expertise, notably from the innovation and academic communities.
– Independent guidelines setting: provide non-binding guidance for all stakeholders. Regulatory provisions need well-structured guidelines to have real-world effects, which can be developed using industry standards and experience.
– Operationalization: transform guidelines and regulations into actionable procedures to support the private sector in ensuring real-world compliance and effectiveness.
– Independent testing: adapt post-market surveillance and monitoring. Ensure market access guidelines and protocols consider the capabilities of AI technologies to evolve post-deployment.
Source: World Economic Forum and Boston Consulting Group analysis
Private-sector involvement should be carefully
designed to preserve regulatory integrity and
independence, while taking advantage of the
sector’s unique skills and capabilities. Thus, it is
essential to mobilize the private sector at the right
steps of the regulatory process (see Figure 1):
– First, the private sector should be consulted in the upstream phases of the regulatory and guideline development processes. It can help the ecosystem produce non-binding guidance that will, over time, inform legislation on AI in health.
– Second, private-sector involvement should extend to the translational aspects of legislation. A legislative framework sets out a high-level vision for the roles of AI technologies in society, paired with appropriate boundaries and guardrails. The development and implementation of guidelines that realize this vision can benefit greatly from industry input, which offers insights into how it can be achieved through purposeful, public value-driven innovation. For instance, the world’s first international standard dedicated to AI management systems (ISO/IEC 42001:2023) was developed through international collaboration involving diverse stakeholders.31
– Third, the private sector is ideally placed to develop and scale pre- and post-market testing and monitoring approaches that detect deviations in the performance of AI technologies and correct them, and to provide the technical expertise needed to build real-time monitoring capabilities. For example, US company Galileo has developed a platform that embeds accurate evaluations directly into AI development workflows.32
Regulators already provide frameworks for private-sector engagement. However, most of
the companies interviewed for this paper reported
challenges in making consistent and meaningful
contributions. Appropriately involving private-sector
actors in the policy process and implementing
feedback loops can help ensure that guidelines for
AI in health keep pace with technological advances.