Crypto x AI Needs Better Benchmarks

Reflections on crypto x AI and a restatement of Archetype’s AI thesis under our third fund

A multi-year arms race of billions of dollars and trillions of tokens has fueled a production line of growth across AI unlike anything we’ve seen before.

At Archetype, we’ve always had deep conviction in the role that crypto can play in that story.

As we’ve run a wide-ranging effort to help catalyze this frontier, we’ve been fortunate to partner with some incredible founders pushing the boundaries of what’s possible when AI is unleashed on open, programmable rails.

But in order to accurately assess the industry’s impact on AI and evaluate the opportunity set ahead, we need a framework focused on honest reflection and better north stars.

It’s time to recalibrate our benchmarks to ensure we’re ruthlessly self-aware in working towards meaningful contributions to the world.

Speculative Assets, Solutions, and Quality Benchmarks

A glance at any number of crypto x AI market maps or fundraising charts gives the impression of a robust ecosystem having developed across different layers of the stack.

The reality is a little more nuanced than that.

There’s an important distinction to be made between speculative assets and solutions in the context of what has emerged in this space.

Speculative Assets 

Speculative assets primarily serve as mechanisms for (liquid) exposure to AI across a wide variety of form factors and narratives.

In a world with immense demand for that exposure, conventional markets are limited in their ability to offer it. For most, the accessible option looks something like buying shares in one of ~7 big tech companies.

So, for several obvious reasons, it makes a ton of sense for crypto to have found early product-market fit in providing a diverse suite of tokenized instruments to serve this need.

The thing is, what makes speculative assets appealing is not just liquidity and granularity (i.e., the ability to get concentrated exposure to specific verticals or ideas), but also the sophistication of the story being sold.

And in solving for this, a large chunk of the crypto x AI stack has (often deliberately) blurred the lines between speculative assets and their counterpart: solutions.

Solutions

Solutions measurably provide some form of utility that attracts adoption without relying exclusively on financial incentives, whether in the form of full-fledged companies, capital markets, standalone tools, or bits of software.

Solutions can be validated across both qualitative and quantitative variables.

Evals differ dramatically across every layer of the AI stack, spanning cost-per-token (itself a function of several underlying inputs) and latency, trust and verification, reliability and uptime, compliance, security, and capital efficiency.

Certain use cases will place a premium on verifiability or privacy at the expense of efficiency, while long-running inference workloads may be relatively insensitive to latency and focused instead on minimizing overall cost.
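To illustrate how the same solution can rank very differently under different use-case weightings, here is a minimal sketch; all metric names, scores, and weights are hypothetical, not an established benchmark:

```python
# Hypothetical scorecard: the same solution, scored under two different
# use-case weightings. Metric names, scores, and weights are illustrative.

def score(metrics: dict, weights: dict) -> float:
    """Weighted sum of normalized metric scores (each in [0, 1])."""
    return sum(weights[k] * metrics[k] for k in weights)

# One solution's normalized performance across a few common eval axes.
solution = {"cost": 0.9, "latency": 0.4, "verifiability": 0.8, "uptime": 0.7}

# A verifiability-sensitive use case vs. a cost-sensitive batch workload.
verifier_heavy = {"cost": 0.1, "latency": 0.1, "verifiability": 0.6, "uptime": 0.2}
batch_workload = {"cost": 0.6, "latency": 0.0, "verifiability": 0.1, "uptime": 0.3}

print(round(score(solution, verifier_heavy), 2))  # 0.75
print(round(score(solution, batch_workload), 2))  # 0.83
```

The point of the sketch is only that "best" is a function of the weighting: a solution that wins for one workload can lose for another without any change in its underlying performance.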

A deeper analysis of evals relevant to AI’s different subsectors goes beyond the scope of this article, but contextualizing our guiding benchmarks should be far more straightforward: 

  1. Democratize exposure to the universe of investments across AI.

  2. Build a competitive suite of infrastructure and tooling that underpins a critical mass of AI’s expansion in the coming years.

  3. Leverage crypto rails to scale a multi-billion dollar ecosystem of net-new intelligent applications.

I would argue that a substantial portion of what has emerged in this space to date falls in the speculative assets category, overindexing on the first benchmark while pretending to solve for the latter two.

Stated bluntly: the billions in network value across crypto x AI are largely indicative of the semi-successful launch of a suite of financial products for investors to satisfy their insatiable demand for proxy exposure to AI.

There’s nothing wrong with building financial products. In fact, a significant edge crypto-native founders bring to the table over their web2 counterparts is their ability to make AI’s upside accessible and investible.

But as with all benchmarks and evaluations, we have to know what we’re measuring in order to accurately track our growth.

Solving for solutions is what takes us from passengers to copilots in the trajectory of AI.

Rethinking the Spectrum of Decentralization in AI

Another thing that needs to be reframed: overindexing on decentralization introduces very different, often contradictory implications when it comes to AI.

More importantly, the concept is just one part of the broader toolkit that crypto brings to the table. The axis of control is as relevant to AI as those of coordination and composability, and it may well be that crypto has far more to offer on the latter two than the former.

In fact, we too often use the terms ‘decentralized’ and ‘distributed’ interchangeably.

Take network topology for execution and consensus, say, in the case of a compute network.

Even there, the answer differs across inference or training use cases, as well as across large enterprise vs. startup vs. consumer needs, with considerations ranging from throughput or latency to cost, fault tolerance, security, compliance, and verifiability.

What about supply side participation?

For a network collecting, say, visual or mapping data at the edge, the operator might be centralized—highly opinionated in the types of data needed to serve different use cases and also handling post-processing and enterprise sales—while the value comes from expanding the supply side of data contributors.

We can imagine a parallel scenario around reinforcement learning.

Horizontal expansion of the network to drive environment diversity can improve generalization and limit overfitting for agents—highly relevant in the search for functional systems in the real world. It can also support verifier diversity, unlocking mastery across complex domains. And yet it’s hard to imagine this playing out without a core team dedicated to environment and verifier curation alongside rigorous evaluations.

Both are cases of distributed networks that can benefit from blockchains as orchestrators and economic substrates, but neither is decentralized in the conventional sense.

Or maybe we’re simply talking about democratized ownership, where the next OpenAI is built and maintained by a single group, but the use of a token lets anyone share in the economic upside (while replacing the monopoly of mega VC funding).

The point is simple: blanket decentralization doesn’t make any sense; it can’t come at the expense of utility, and a lazy over-reliance on it does a disservice to the significantly more comprehensive value proposition blockchains offer to AI.

Carelessly introducing the concept in the hopes that an ideological premium will make up for an ineffective system is how we end up with ghost cities of siloed infrastructure and tooling.

Knowing when we’re talking about a story being sold around a speculative asset, as opposed to a solution evaluated objectively on its ability to perform across a range of relevant metrics, determines whether we build in echo chambers or build for the world.

How We Think About the Future of Crypto x AI

Rather than lose ourselves in narratives and echo chambers, we approach the space with a consistent set of key beliefs and theses.

1. Scaling Laws are Expanding

Even as massive, centralized pre-training efforts continue, a greater reliance on distributed compute, specialized data, and diverse, domain-specific post-training aligns uniquely well with blockchains, which can orchestrate and scale efficient markets for the resources needed to facilitate this shift.

For example, such markets can massively expand the supply of compute.

Not the lazy, tokenized gaming GPU marketplace kind, but a performant, reliable compute layer that rivals its larger Neocloud competitors against the metrics that actually matter.

(Listing 10K GPUs that don’t really exist and issuing tokens in their place isn’t one of them.)

Thankfully, several great teams like Ritual*, Exo*, Odyn*, Yotta, Pluralis, Prime Intellect, Ambient, Gensyn, Nous, and others are helping this become very real, even if significant work remains in the pursuit of the holy grails of decentralized inference and training at global scale.

But if and when these teams succeed, they’ll blow open the doors for scaling laws to expand across existing and new pathways alike.

Pre-training runs that amass huge pools of compute without the overhead of giant, co-located data centers will do to foundation models what Bitcoin did to the financial system.

Swarms of reinforcement learning runs in parallel can accelerate model improvement and refinement across new use cases under a growing collective intelligence.

And bringing resilient, global networks of compute online while managing memory and cost overhead can also unlock inference-time and reasoning scaling at unprecedented scale.

At the same time, onchain data markets may provide a solution to data bottlenecks in AI.

Applications are being developed that incentivize diverse real world data collection at the edge, new economic models from Vana are being introduced around data itself as an asset, and creative mechanisms like Opacity’s* zkTLS allow us to source private user data while rewarding people in return.

Abundant datasets sourced across real life scenarios can also give elite crypto-native researchers an edge in the (re-emerging) exploration of non-transformer-based systems like recurrent architectures or state-space models that can power use cases across robotics, AR/VR, world models, and more.

Meanwhile, networks are emerging that aggregate and compound human contributions towards a variety of tasks, from expert-enhanced queries to data labeling or collaborative reinforcement learning environments in a distributed setting.

We’re rapidly moving out of the dial-up internet phase across these fields, even as many open questions remain.

2. AI Needs More Than a Brain to Operate Like Humans

As AI leaves the lab and proliferates into production, crypto offers an arsenal of instruments it needs to function at the highest levels of impact.

Blockchains can become the lifeblood for AI and enable agents to be programmatically autonomous.

These agents will be able to acquire and expend resources, as well as freely interact with protocols, humans, and other agents, all while becoming investable, productive assets in the process.

Uber ratings or credit scores for agents? Cryptographic reputation primitives for agents can unlock very different forms of credit lines and job markets. 

Permissionless blockchains also serve as incredible sandboxes for AI. A chaotic, open environment of near-infinite assets, markets, and scenario diversity can drive rapid feedback loops that level up onchain agents far more efficiently than their counterparts trapped in unimaginative enterprise contexts.

In turn, a new class of DeFi protocols will give rise to assets and financial instruments tied to every part of the AI supply chain, alongside new capital markets from GAIB, USD.AI and others that become a core source of financing for the buildout (and ownership) of compute. 

In the meantime, thanks to teams like Taceo*, Lagrange*, EZKL, and Aizel, crypto has served as the frontier for some of the deepest R&D on infrastructure for both private and verifiable compute. The technical and economic feasibility of these different primitives varies, but more of them are rapidly approaching prime time.

AI is barely working within the confines of an enterprise, so it’s no surprise that agents aren’t yet running wild on blockchains.

Bots don’t count. 

But crypto is ready with the mechanisms agents will soon need to reach their full potential.

3. Finding Tomorrow’s Killer AI Apps Onchain

We believe crypto has homecourt advantage in its ability to unlock net-new AI applications atop a suite of battle-tested distribution and monetization rails designed for a digitally native world.

GameFi 1.0 didn’t work because the games weren’t great and the economic models were unsustainable. 

Music NFTs felt like a step back to the early iTunes store (not in a good way). And let’s not start on the Metaverse.

But all of these were directionally correct in highlighting the unique impact blockchains can have on the creation, distribution, and monetization of content. AI can be the missing link that empowers us to create stuff good enough that it finally sparks the economic engine we were waiting for in the NFT boom.

To be sure, tomorrow’s killer apps need more than vibecoded front ends.

Building incredible experiences still requires amazing product and systems intuition, like leveraging growing data sets to drive continuous learning and iteration, or perfecting dynamic model orchestration and memory utilization under the hood so that intelligence can scale efficiently.

Breakout use cases will leverage persistent memory to create personalized, powerful experiences atop a well-designed AI stack, rivaling web2 offerings while leveraging onchain features to differentiate on economic models or distribution.

We’re on the verge of this thanks to teams like Pond*, Nava, Remix*, Virtuals, and Giza who are building powerful, abstracted tooling needed to launch and scale AI-native companies and applications onchain.

4. Magical UI/UX, Finally

And finally, AI in turn can accelerate the adoption of blockchains by bringing resilience, creativity, and coherence to the historically messy and fragmented world of onchain infrastructure and applications.

Most onchain interactions can still feel like using an operating system from 2003. 

Thankfully, teams like Privy* have helped usher in crypto’s mobile moment, while a cohort of great startups are building intuitive, natural-language interfaces for using blockchains.

Builders are experimenting with a mix of effective context engineering and different levels of post-training for a more efficient context → planning → execution flow. Seamless model orchestration across agents of varying size and expertise is finding its place within a front end UX that inspires.
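The context → planning → execution flow described above can be sketched as a simple loop; everything here, from the class and function names to the trivial stub planner, is a hypothetical placeholder rather than any real framework’s API:

```python
# Illustrative skeleton of a context -> planning -> execution loop for an
# onchain agent. All names are hypothetical placeholders.

from dataclasses import dataclass, field

@dataclass
class AgentState:
    context: list = field(default_factory=list)  # observations, balances, prompts
    plan: list = field(default_factory=list)     # ordered intended actions

def gather_context(state: AgentState, observation: str) -> AgentState:
    # Context engineering: keep only what the planner needs.
    state.context.append(observation)
    return state

def plan(state: AgentState) -> AgentState:
    # A real planner (e.g., a post-trained model) would turn context into
    # steps; here we stub it with a trivial rule over recent context.
    state.plan = [f"act-on:{c}" for c in state.context[-3:]]
    return state

def execute(state: AgentState) -> list:
    # Execution would sign and submit transactions; here we just drain the plan.
    done, state.plan = state.plan, []
    return done

state = AgentState()
for obs in ["price-update", "user-intent", "gas-estimate"]:
    state = gather_context(state, obs)
state = plan(state)
print(execute(state))  # ['act-on:price-update', 'act-on:user-intent', 'act-on:gas-estimate']
```

The structure, not the stubbed logic, is the point: each stage is a separate, swappable component, which is what lets builders mix different context strategies, planners, and execution backends.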

Once these tools are perfected, they will open a floodgate of users and applications that will take crypto to entirely new heights. Projects like Herd*, Shinkai*, Surf, AskGina, Edgen and HeyElsa are pioneering this future.

In the meantime, Octane*, Sherlock*, Zellic and more are using AI to reinforce security and make blockchains resilient in the face of growing threat vectors.

Great Things Take Time

Despite all of this, we balance our optimism with perspective and discipline under two principles:

  • Avoiding the trap of reactivity to short term market dynamics, constraints, or themes that repeatedly prove to be transitory in this rapidly evolving space.

  • Making an effort to stay focused on the technical practicalities and economic priorities of the market, while rejecting an overreliance on ideology, web3 silos, or unsustainable token subsidies.

There’s much to be proud of in the industry’s journey to date, but it’s far too soon for self-congratulation (as a firm or as an ecosystem). As we enter this new chapter, revising the frameworks that guide us can ensure we remain grounded.

It has taken time for crypto to inspire and earn the trust of AI-native talent, as well as (in several cases) turn homegrown crypto talent into world-class AI operatives.

Some self-inflicted reputational missteps as an industry haven’t helped the cause either.

A related dynamic is that a lot of what blockchains uniquely offer AI is preemptive and therefore early. From training to deployment, many of the constraints that will drive relevance and adoption of the emerging crypto x AI stack do not yet outweigh the practical benefits that the traditional stack offers.

But with AI demanding ever more compute, data, and capital—and in an increasingly distributed setting—a variety of constraints are moving from theory to practice, with wide-ranging implications for crypto to play more of a starring role.

We’re beginning to see the tides shift.

Foundational work from different teams is bringing legitimacy to crypto as a domain where fundamental innovation in AI can credibly take place. In parallel, entertaining experiments are showing signs of evolving into compelling AI-first consumer applications for a new generation of users.

We couldn’t be more excited about the opportunity set ahead, and we continue to anticipate a multi-faceted, reciprocal relationship between crypto and AI.

To the founders exploring any of the fields below and beyond, we’d love to play a role in bringing your vision to life.

For the insightful conversations on foundational AI infrastructure that went into this, thank you to Akilesh Potti, Alex Cheema, Alexander Long, Travis Good, Anne Ouyang, Bidhan Roy and many others.

Likewise to Hersh Patel and Anna Kazlauskas for talking through a reimagined role for data as a resource and asset in AI’s future.

To Aadharsh Pannirselvam, Katie Chiou, Tommy Hang, and Tyler Gehringer here at Archetype for your help and feedback.

And finally to Dylan Zhang, Giovanni Vignone, Jack Sanford, Andrew Hong, and Eskender Abebe on shaping a new era of creative, resilient AI applications.



*denotes an Archetype portfolio company

Disclaimer:

This post is for general information purposes only. It does not constitute investment advice or a recommendation or solicitation to buy or sell any investment and should not be used in the evaluation of the merits of making any investment decision. It should not be relied upon for accounting, legal or tax advice or investment recommendations. You should consult your own advisers as to legal, business, tax, and other related matters concerning any investment or legal matters. Certain information contained in here has been obtained from third-party sources, including from portfolio companies of funds managed by Archetype. This post reflects the current opinions of the authors and is not made on behalf of Archetype or its affiliates and does not necessarily reflect the opinions of Archetype, its affiliates or individuals associated with Archetype. The opinions reflected herein are subject to change without being updated.
