By Albert An, CEO of Tower Research Capital
At first glance, it seems like a linguistic coincidence: can there be any meaningful overlap between digital tokens in blockchain and the tokens that power deep learning? They share a term, but not a function. The former fuels decentralized economies, while the latter enables machines to process language, images, and meaning. Yet, despite the difference, the convergence of blockchain and AI is accelerating – not through literal interoperability, but through shared challenges and complementary architectures. In that convergence lies profound opportunity.
Two Kinds of Tokens, Two Technological Revolutions
In blockchain, digital tokens are foundational. Native cryptocurrencies like bitcoin and Ether underpin their respective networks, while programmable tokens such as ERC-20s or SPLs extend functionality, representing everything from value to governance rights. Some function as digital cash; others unlock voting rights in decentralized autonomous organizations (DAOs) or access to applications. Increasingly, we are also seeing these tokens represent traditional financial assets – equities, bonds, real estate, and money market funds – in tokenized form.
That trend is accelerating fast. In 2024, the market for tokenized real-world assets (RWAs) grew by 85%, surpassing $15 billion. Analysts project it could grow to $30 trillion by 2030. Traditional financial players are already responding: Franklin Templeton issued the first tokenized U.S. money market fund, and JPMorgan used Chainlink’s interoperability protocol to settle tokenized Treasuries on a public blockchain. Tokenization is no longer theoretical – it’s operational.
In contrast, AI’s “tokens” are fragments of meaning. In deep learning, tokens are the atomic units of input – words, subwords, or even image segments – that make language and vision interpretable to machines as they parse meaning, detect patterns, and generate coherent outputs. Massive language models like GPT-3 (trained on 300 billion tokens), Chinchilla (1.4 trillion), and PaLM 2 (3.6 trillion) have scaled thanks to access to vast corpora of tokenized data. In AI, tokens don’t circulate – they compute. They don’t hold value – they encode it.
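To make the notion of an AI token concrete, here is a toy word-level tokenizer. It is an illustrative sketch only: production models use learned subword vocabularies (e.g. byte-pair encoding), and the tiny corpus and vocabulary here are invented for the example.

```python
# Toy word-level tokenizer: maps each distinct word to an integer id.
# Real LLMs learn subword vocabularies (e.g. byte-pair encoding),
# but the principle is the same: text in, integer tokens out.

def build_vocab(corpus: list[str]) -> dict[str, int]:
    """Assign an id to every distinct word seen in the corpus."""
    vocab: dict[str, int] = {}
    for text in corpus:
        for word in text.lower().split():
            vocab.setdefault(word, len(vocab))
    return vocab

def tokenize(text: str, vocab: dict[str, int]) -> list[int]:
    """Convert text into the integer tokens a model actually consumes."""
    return [vocab[w] for w in text.lower().split() if w in vocab]

corpus = ["tokens encode meaning", "models consume tokens"]
vocab = build_vocab(corpus)
print(tokenize("models consume meaning", vocab))  # -> [3, 4, 2]
```

The integers are meaningless on their own; they only acquire meaning inside a trained model, which is precisely the sense in which AI tokens encode rather than hold value.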
The Convergence: Blockchain as AI’s Next Infrastructure Layer
While blockchain tokens and AI tokens are different in form and function, they share something deeper: both represent foundational units in systems that rely on scale, coordination, and trust. More importantly, both technologies are fundamentally about infrastructure – not just tools, but new operating systems for organizing humans and machines at scale. As AI shifts from general-purpose tools to autonomous agents interacting with sensitive, private data, blockchain’s architecture – decentralized, transparent, and immutable – may offer solutions to some of AI’s most pressing challenges.
Today’s AI ecosystem is largely centralized, governed by opaque model training processes and closed data pipelines. Blockchain introduces a new set of design primitives: verifiable ownership, transparent auditability, and incentive alignment. With it, we can imagine a decentralized data marketplace that breaks away from big-tech monopolies; cryptographic attestations that trace the origin and usage rights of training data; and collaborative networks where contributors are compensated with tokens and system integrity is enforced through protocol, not policy.
Four Use Cases at the Intersection
So what does this convergence look like in practice? Here are four emerging examples that show how blockchain and AI may begin to co-evolve:
● Decentralized AI Training Marketplaces – Instead of relying on centralized entities to collect and control training data, open, permissionless marketplaces let individuals, researchers, and institutions contribute datasets or processing power in exchange for crypto-based compensation. Open standards like the Model Context Protocol, combined with decentralized access to model training, promote global participation – especially from communities historically excluded from centralized innovation hubs. The result: more diverse models, broader access to AI development in underrepresented and emerging regions, and less risk that foundational models are monopolized by a handful of firms.
● AI-Generated NFTs and Creative Provenance – Generative AI is unlocking a new wave of digital creativity – from digital art and music to writing. Blockchain gives these outputs permanence and ownership. By minting AI-generated works as NFTs, creators can establish authorship, monetize their outputs, and track provenance. AI becomes the engine of creation; blockchain becomes the ledger of authenticity. Together, they lay the foundation for a scalable, transparent creator economy.
● Autonomous Economic Agents – Modern AI agents can already interface with APIs, make decisions, and take actions on behalf of users. With crypto wallets and access to smart contracts, these agents become full participants in digital economies – paying for services, purchasing data, or even hiring human freelancers. Blockchain offers the financial infrastructure and trust layer for these transactions to occur autonomously, securely, and continuously – with no need for banks, intermediaries, or working hours.
● On-Chain Data Provenance for AI Training – One of the thorniest problems in AI today is data lineage: who owns training data, what rights were attached, and how models were built. By anchoring hashes of training datasets on-chain, developers can create immutable, timestamped records of data use. This enhances compliance and copyright enforcement, and also lays the groundwork for auditable, explainable models. In a world filled with synthetic content, trust starts with knowing where the data came from.
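The data-provenance idea above can be sketched in a few lines of Python: hash the dataset off-chain, then record only the digest and its metadata. This is an illustrative sketch using the standard-library hashlib; the resulting record is what a developer might anchor on-chain, but no actual blockchain call is made here.

```python
import hashlib
import json
import time

def dataset_fingerprint(records: list[dict]) -> str:
    """Deterministic SHA-256 digest of a training dataset.
    Canonical JSON (sorted keys) so identical data always hashes identically."""
    canonical = json.dumps(records, sort_keys=True).encode("utf-8")
    return hashlib.sha256(canonical).hexdigest()

def provenance_record(records: list[dict], license_tag: str) -> dict:
    """Timestamped record a developer could anchor on-chain.
    Only the hash leaves the building; the data itself stays private."""
    return {
        "sha256": dataset_fingerprint(records),
        "license": license_tag,
        "timestamp": int(time.time()),
    }

data = [{"text": "example training sample", "source": "public-domain"}]
record = provenance_record(data, license_tag="CC0")
print(record["sha256"][:16])
```

Because the digest changes if even one record changes, anyone holding the original data can later prove it matches (or does not match) what was registered – which is exactly the auditability property the bullet above describes.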
The Currency of Intelligent Machines
The convergence doesn’t depend on a unified token standard – it rests on the alignment of two ecosystems built for scale, automation, and autonomy. As AI agents evolve into self-directed actors, they’ll require a native economic interface: a way to transact value, manage permissions, and enforce agreements. Crypto enables this programmability through composable DeFi protocols and on-chain primitives — giving AI agents a plug-and-play financial toolkit they can interact with directly, without relying on human intermediaries or bespoke integrations.
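As a thought experiment, that “plug-and-play financial toolkit” might look something like the following deliberately simplified, in-memory simulation. All names here (Wallet, EscrowContract, Agent) are invented for the sketch; a real agent would sign transactions against an actual chain rather than mutate Python objects.

```python
# Minimal in-memory simulation of an AI agent transacting through an
# escrow-style "smart contract". Purely illustrative: no real chain,
# no real signatures -- the names are invented for this sketch.

class Wallet:
    def __init__(self, balance: float):
        self.balance = balance

    def pay(self, amount: float) -> float:
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount
        return amount

class EscrowContract:
    """Holds payment until the service is delivered, then releases it."""
    def __init__(self):
        self.held = 0.0

    def deposit(self, amount: float) -> None:
        self.held += amount

    def release(self, provider_wallet: Wallet) -> None:
        provider_wallet.balance += self.held
        self.held = 0.0

class Agent:
    """An autonomous agent buying a service with no human in the loop."""
    def __init__(self, wallet: Wallet):
        self.wallet = wallet

    def purchase(self, price: float, escrow: EscrowContract,
                 provider: Wallet, delivered: bool) -> None:
        escrow.deposit(self.wallet.pay(price))
        if delivered:  # in a real system: delivery verified on-chain
            escrow.release(provider)

agent = Agent(Wallet(balance=10.0))
provider = Wallet(balance=0.0)
agent.purchase(price=2.5, escrow=EscrowContract(),
               provider=provider, delivered=True)
print(agent.wallet.balance, provider.balance)  # -> 7.5 2.5
```

The point of the escrow shape is that neither party has to trust the other: funds move only when the contract’s condition is met, which is the property that lets agents transact without banks or intermediaries.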
AI agents won’t fill out wire transfer forms or log in to banking portals – but they will hold wallets, use smart contracts, and operate continuously across borders. In that sense, crypto may not displace fiat in human economies, but it could become the default medium of exchange in machine economies. Blockchain provides the rails, crypto provides the capital, and AI provides the intelligence.
That’s the broader vision: blockchain and AI started as separate technological revolutions, but they are now converging into a shared substrate for how machines – and people – will interact, govern, and exchange. Not because their tokens are interoperable, but because they share a logic of decentralization, programmability, and trust-minimization. The next era of systems won’t just process information – they’ll transact, reason, and evolve. And they’ll need infrastructure built for that.