Jack Flintoft

VC @ Dorm Room Fund, undergrad @ UChicago

There’s a common narrative going around right now that AI will become a commodity business. “The cost of intelligence will go towards ZERO!” — they profess like millenarians. In their minds, they conjure an image of an exponential decay curve, sloping violently towards zero.

The word “commodity” here, though, seems like a misnomer. Gold (along with most other commodities) can be melted down and atomized. A gram of gold in Ancient Anatolia is, essentially, the same as a gram of gold today. The mistake is assuming this same atomic quality applies to AI. Unlike gold, AI cannot be measured by weight, let alone physical substance. Sure, we can benchmark model outputs and think objectively about the tasks models can perform, but the reality is that AI is a metamorphosing substance.

This is because AI is a placeholder for intelligence. Therefore, an “atom” of intelligence today will be very different from an “atom” of intelligence one year from now. We only have to look at what we believed to be intelligent then (GPT-4) compared to what is considered intelligent now (Claude Opus 4.6) to see this difference. Goalposts shift with intelligence where they don’t with regular commodities.

What does this mean? It means there will always be a class of “highest intelligence” models that refuses to succumb to market forces. Because intelligence cannot be atomized or permanently fixed, it never fully becomes abundant at the frontier. There is always a bleeding-edge model, and by definition, that model is scarce. As a result, frontier AI is governed by market power (supply and demand) rather than pure commodity economics.

This creates a recurring cycle we have seen play out time and again. It is the exact dynamic of algorithmic trading. A brilliant mathematical model initially generates massive alpha, only to decay as it is reverse-engineered by the broader market into beta — a cheap, baseline commodity. This same pattern governs AI. Brief periods of absolute dominance by a leading model are inevitably followed by rapid consolidation as that specific level of intelligence diffuses. We see this now with Chinese labs like DeepSeek and Kimi aggressively driving the cost of last year’s reasoning down to pennies (for example, DeepSeek-V3 matching OpenAI’s GPT-4o on the MMLU benchmark, but charging just $0.28 per million output tokens vs. GPT-4o’s $10.00). They brutally commoditize the trailing edge, right up until the frontier shifts once again.
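To put the trailing-edge price gap above in concrete terms, here is a quick back-of-the-envelope calculation using the two per-million-output-token prices quoted in the paragraph (the figures are the article’s, not independently verified):

```python
# Prices per 1M output tokens, as quoted above (assumed, not verified here).
deepseek_v3_price = 0.28   # USD, DeepSeek-V3
gpt_4o_price = 10.00       # USD, GPT-4o

# How many times cheaper the trailing-edge offering is at these prices.
ratio = gpt_4o_price / deepseek_v3_price
print(f"GPT-4o output tokens cost ~{ratio:.0f}x more than DeepSeek-V3")
# → GPT-4o output tokens cost ~36x more than DeepSeek-V3
```

At those quoted prices, comparable benchmark performance comes at roughly a 36x discount — the kind of gap that makes “commoditization of the trailing edge” more than a metaphor.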
