Conclusion
Tokenization can benefit financial markets by establishing transparency, allowing greater ownership control, promoting operational efficiency, granting greater accessibility to investors and enabling multi-asset operations.
The differentiators of this technology
application can help to realize cost and time
savings for financial markets and broaden
investor access to capital markets.
However, achieving these benefits will not
come without deep public–private
collaboration on consistent regulations and
standards, adaptation of market structures
and value chains, enhanced collateral
frameworks and safe and sound usage
of open networks.
Market structures need to evolve to harness
tokenization’s advantages. Today’s financial
infrastructure is based on centralized
intermediaries and predefined settlement
cycles, whereas tokenized markets introduce
programmability, atomic settlement and the
potential for “always-on” markets. However,
despite technological advances, certain financial operations – including risk
management, clearing and corporate actions –
require controlled execution windows to
maintain stability and mitigate volatility.
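To make the contrast with predefined settlement cycles concrete, the minimal sketch below illustrates the idea of atomic (delivery-versus-payment) settlement: the security leg and the cash leg transfer together or not at all. The ledger classes, account names and amounts are hypothetical illustrations, not drawn from this report; in practice this atomicity would be enforced by a smart contract or shared ledger rather than application code.

```python
# Minimal illustrative sketch of atomic delivery-versus-payment (DvP) settlement.
# All names and balances are hypothetical.

class Ledger:
    def __init__(self, balances):
        self.balances = dict(balances)  # account -> units held

    def transfer(self, sender, receiver, amount):
        if self.balances.get(sender, 0) < amount:
            raise ValueError(f"{sender} has insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount


def settle_atomically(security_ledger, cash_ledger, seller, buyer, quantity, price):
    """Move the security leg and the cash leg together, or not at all."""
    # Check both legs before moving either, so a failure cannot leave one
    # party having paid (or delivered) without receiving the other leg.
    if security_ledger.balances.get(seller, 0) < quantity:
        raise ValueError("seller cannot deliver the securities")
    if cash_ledger.balances.get(buyer, 0) < quantity * price:
        raise ValueError("buyer cannot fund the purchase")
    security_ledger.transfer(seller, buyer, quantity)
    cash_ledger.transfer(buyer, seller, quantity * price)


# Hypothetical example: 100 tokenized bonds settle against 10,000 cash units.
bonds = Ledger({"seller": 100})
cash = Ledger({"buyer": 10_000})
settle_atomically(bonds, cash, "seller", "buyer", quantity=100, price=100)
print(bonds.balances, cash.balances)
```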
Technological innovation alone does not
eliminate the operational and regulatory
realities that underpin market integrity. There
needs to be a balance between incumbent
institutions and new industry players from a
regulatory and market power perspective.
Ensuring fair market access, open
interoperability and balanced regulatory
influence is critical to preventing any undue
concentration of power while inspiring
sustainable growth.
The lack of global standards and regulatory
fragmentation around tokenization remain leading
challenges, and policy-makers should update
financial regulations based on the principle
of technology neutrality to accommodate
tokenized assets while ensuring enforceability,
investor protection and risk management.
The lack of legal clarity around on-chain
ownership rights and settlement finality can
stifle the value proposition of tokenization,
as observed in potential discrepancies associated
with a shared system of record and unified
consensus on the state of an asset.
When implementing new technologies in
market infrastructure, it is necessary to ensure
interoperability, particularly by defining common
transaction protocols, asset classification frameworks and reference data models.
However, complex financial processes –
such as corporate actions – may require
a phased approach to standardization
to adapt to jurisdictional differences and
evolving industry practices.
For tokenized markets to function effectively,
liquidity providers and market makers should
be encouraged to participate. Without
sufficient secondary market depth, tokenized
assets risk remaining illiquid, limiting their utility
despite technical advances. Policy-makers
and financial institutions should explore
mechanisms to encourage market-making
activities, such as tailored liquidity
programmes, capital treatment incentives and
expanded regulatory sandboxes that spur
institutional participation.
Several stakeholders, including policy-makers,
technology providers and financial institutions,
should make strides to facilitate the use
of tokenization.
By addressing regulatory uncertainty, adapting
market structures and encouraging
competitive, liquid markets, tokenization can
enable an improved global financial
infrastructure. Achieving this vision requires a
balanced, pragmatic approach that spurs
innovation while upholding market integrity,
competition and operational resilience.