Bloomberg and Kaiko said they have integrated licensed Bloomberg market data into blockchain infrastructure to support tokenized markets. The core idea is straightforward: put audit-grade reference and pricing data onchain in a way institutions can actually use without breaking licensing rules. The partnership uses Kaiko's on-ramp to the Canton Network to deliver Bloomberg Data License content into tokenized workflows, targeting the recurring pain points of valuation disputes and reconciliation overhead.
The timing matters because tokenized finance is moving from pilots into production-style plumbing. Bloomberg and partners described the tokenized market as roughly $25 billion within a broader real-world asset (RWA) tokenization market of $34 billion as of early 2026, and cited tokenized U.S. Treasuries alone exceeding $10.8 billion, per Cryptorank and firm statements. In that context, "clean data onchain" is less of a nice-to-have and more of a prerequisite for scaling institutional adoption.
How the data gets onchain without breaking licensing
The technical flow routes Bloomberg Data License content through Kaiko’s data on-ramp service, which Kaiko said has been operational on the Canton Network since August 2025. Canton’s model—permissioned, privacy-enabled, and entitlement-controlled—lets the system preserve traditional licensing constraints while still writing immutable records for auditability. In practical terms, only licensed, authorized participants can read the feeds onchain, which is critical for regulated institutions that cannot rely on informal or “public” price sources for valuation and settlement.
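The shape of that entitlement model can be sketched in a few lines. This is a minimal, hypothetical illustration in Python, not Canton or Daml code: the registry, feed names, and `read_feed` function are invented for this example, and a real deployment would enforce entitlements at the ledger layer rather than in application code.

```python
from dataclasses import dataclass

# Hypothetical entitlement registry: participant ID -> set of licensed feeds.
# In the model described above, only licensed participants may read a feed.
ENTITLEMENTS = {
    "bank-a": {"BBG_USD_RATES"},
    "asset-mgr-b": set(),  # holds no license for this feed
}

@dataclass(frozen=True)
class PriceRecord:
    feed: str
    instrument: str
    price: float
    as_of: str  # ISO-8601 timestamp

# Stand-in for the immutable onchain record store.
LEDGER = [PriceRecord("BBG_USD_RATES", "UST-10Y", 98.42, "2026-02-26T14:00:00Z")]

def read_feed(participant: str, feed: str) -> list[PriceRecord]:
    """Return feed records only if the participant holds an entitlement."""
    if feed not in ENTITLEMENTS.get(participant, set()):
        raise PermissionError(f"{participant} is not licensed for {feed}")
    return [r for r in LEDGER if r.feed == feed]
```

The point of the sketch is the gate itself: the data is written once and is auditable, but reads are refused for any participant without a matching license, which is how traditional licensing constraints survive the move onchain.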
The architecture is designed to feel familiar to enterprise teams. Centralized licensing and entitlement remain intact, but the canonical dataset and its lineage are recorded onchain to reduce timing mismatches and source fragmentation. That is the operational win: fewer “which price did you use?” arguments, fewer reconciliation loops, and a clearer audit trail when a tokenized contract executes.
Where this actually plugs into institutional workflows
The partners highlighted use cases where reliable reference prices are not optional—they are the control layer. Real-time collateral valuation supports lending and margin decisions, and consistent pricing can shorten repo settlement and reduce disputes. They also pointed to tokenized derivatives and structured products, where pricing inputs drive contract behavior and risk models, and where having an authoritative, licensed feed can reduce ambiguity at the point of settlement.
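To see why a single authoritative feed matters at the point of settlement, consider a toy cash-settled forward. Everything here is hypothetical (the contract type, field names, and `settle` function are illustrative, not part of any Bloomberg or Kaiko product): the payoff is a pure function of one reference price, so if both counterparties read the same licensed feed, they compute the same settlement amount by construction.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TokenizedForward:
    instrument: str
    notional: float
    strike: float

def settle(contract: TokenizedForward, reference_price: float) -> float:
    """Cash settlement driven entirely by one reference price.

    When both sides settle against the same authoritative feed, the
    payoff is deterministic and there is no 'which price did you use?'
    dispute to reconcile afterwards.
    """
    return (reference_price - contract.strike) * contract.notional

fwd = TokenizedForward("UST-10Y", notional=1_000_000, strike=98.00)
```

Calling `settle(fwd, 98.42)` with the feed's settlement print gives roughly a $420,000 payoff; the dispute risk the article describes arises only when the two sides plug different prices into that function.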
Ambre Soubiran, Kaiko’s CEO, framed the move as a market-structure shift. She said bringing institutional-grade valuation data onchain represents a fundamental change in how markets operate, reflecting the partnership’s goal of acting as a trust bridge between traditional data practices and blockchain execution. The broader emphasis in the announcement is that auditability and permissioning are not constraints—they are the adoption enablers for banks and asset managers.
Compliance is clearly part of the pitch as well. By keeping access permissioned and creating an auditable trail, the model is designed to match regulatory expectations around control, provenance, and accountability. In institutional terms, it is meant to support “who had access, when, and what exact data was used” in a way that stands up to internal audit and supervisory review.
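That audit requirement reduces to an append-only access log keyed to a hash of the exact data read. The sketch below is again a hypothetical Python illustration, not the partnership's implementation: hashing the payload pins "what exact data was used," while the log entry records who and when.

```python
import hashlib
import json
from datetime import datetime, timezone

AUDIT_LOG: list[dict] = []  # stand-in for an immutable, append-only trail

def record_access(participant: str, feed: str, payload: dict) -> str:
    """Log who read which feed and when; the hash pins the exact data used."""
    digest = hashlib.sha256(
        json.dumps(payload, sort_keys=True).encode()
    ).hexdigest()
    AUDIT_LOG.append({
        "participant": participant,
        "feed": feed,
        "data_hash": digest,
        "at": datetime.now(timezone.utc).isoformat(),
    })
    return digest

def audit(feed: str) -> list[dict]:
    """Answer 'who had access, when, and to what exact data' for one feed."""
    return [entry for entry in AUDIT_LOG if entry["feed"] == feed]
```

Because the payload is serialized with sorted keys before hashing, the same data always yields the same digest, so an auditor can later verify that a valuation was computed from a specific, unaltered record.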
Adoption, though, will be decided downstream. Integration with custodians, trading venues, and tokenization platforms will determine whether participants accept an onchain canonical source as authoritative for valuation and settlement. The partnership is essentially offering a single source of truth, but the market still has to operationalize it across systems and counterparties. The next step after the Feb. 26 announcement is embedding these feeds into live tokenization workflows and client deployments, where the promised reduction in reconciliation overhead will be tested in production.