When I was just clicking into my teens and living in Maine, the noted academic and author J.R.R. Tolkien was about the age I am now. Besides being an Oxford don (twice over), he had already written The Hobbit, published in 1937, and The Lord of the Rings over the seventeen years that followed. Tolkien’s rise to popularity came in waves. Through the war years The Hobbit was received as a children’s fantasy book. The Lord of the Rings, published in three volumes in 1954–55, was critically controversial and mostly a UK adult phenomenon at the time. The pivotal moment in his popularity came starting in 1965, when the books fed into the U.S. countercultural revolution on American campuses to great success. “Frodo Lives” graffiti appeared across U.S. cities while paperback editions of his novels sold millions of copies and became deeply embedded in hippie and anti-war culture.

At the same time, the 1960s were a golden age for lapel pin-back buttons, another genuine cultural phenomenon. Buttons were one of the most popular and pervasive forms of political messaging of the decade, combining brief text with memorable graphic design. They were inexpensive to produce in mass quantities and easy to distribute, giving any individual a way to voice opinions and potentially reach a broad audience. The craze was driven by politics and protest, pop culture, music, humor, and counterculture. “Make Love Not War” is probably the best-remembered button slogan of the era, along with the ubiquitous smiley face. As psychedelia took hold, buttons went wild with every kind of political and social message in vivid colors. The phenomenon was essentially the social media of its time: a cheap, wearable, instantly readable broadcast of identity and belief.

The intersection of the Tolkien craze and the button craze hit us in Maine, and as an impressionable young person stuck in the hinterlands, I embraced both.
I read The Hobbit and The Lord of the Rings and proudly wore a button that declared, “Tolkien Spoken Here”. Now imagine that little phenom transcending another half century as Peter Jackson brought the epic to the big screen and reignited the craze all over again. Buck Rogers or Middle Earth…fantasy always finds a place in people’s imaginations, just like people always find a way to fly their inner freak flag.
I’ve recently been inundated by a term of art that started with a countercultural phenomenon, cryptocurrency, and has gone mainstream very quickly, just as another initially countercultural movement, artificial intelligence, has gone mainstream even faster. This is not the fantasy of Middle Earth this time; it is quickly becoming the geopolitical and socio-economic reality defining the future of mankind and the world as we know it, all in the blink of Gollum’s eye. “What has it got in its pocketses, my precious?” Both crypto and AI have independently landed on the term “token,” using it to represent the best of crypto and now the essence of AI. No exaggeration…it’s defining the future of the world.
Crypto tokens and AI tokens are two completely unrelated uses of the word “token” that happen to share the same terminology, but given the impact of both phenomena on us these days, that’s WAY too much coincidence to satisfy me. A crypto token is a unit of value or utility on a blockchain. There are a few distinct types: currency tokens (like Bitcoin and ETH), which function as digital money or a store of value; utility tokens, which grant access to a product or service (e.g., Filecoin for storage); governance tokens, which give holders voting rights in a protocol’s decisions; security tokens, which represent ownership in a real-world asset (equity, real estate); and NFTs, unique tokens representing provable ownership of a specific item. The key common characteristics are that they all exist on a ledger, are transferable between wallets, have market prices, and derive value from scarcity, utility, or speculation.
AI tokens are a very different animal, but they are becoming increasingly prevalent in AI culture. In AI/LLMs (large language models), a token is a chunk of text. Think of it as the atomic unit that a language model reads and writes. Roughly speaking, 1 token ≈ ¾ of an English word (so “hamburger” might be 2–3 tokens; “the” is 1). Tokens matter because the power of AI lies in the fact that it takes instruction and gives output in plain language. And context windows (what an AI engine can see and consider at one time) are measured in tokens (e.g., “200K token context”). The token has become the basic unit of measurement of AI usage, and pricing is becoming a very big thing for AI. Pricing is per token (input and output tokens are often priced differently). Speed of processing has always been and will always be important as well, and generation speed is measured in tokens/second. The models themselves don’t see words or characters; they see token IDs from a vocabulary (GPT-4 uses a ~100K-token vocabulary). So tokens in AI are units of text that live in a model’s context window, have monetary value for billing but no value per se, are ephemeral and not transferable (unlike crypto tokens), and are NOT scarce by definition (again, unlike crypto tokens) since they are unlimited. The only real connection between crypto and AI tokens is that both fields borrowed “token” from older computer science usage, where it meant a discrete, meaningful unit. In both cases it means the smallest meaningful piece, but the domains couldn’t be more different.
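The rules of thumb above can be sketched in a few lines. This is a back-of-the-envelope estimator, not a real tokenizer (actual models use learned subword vocabularies); the 4-characters-per-token and 1.3-tokens-per-word ratios are simply the rough averages cited above:

```python
def estimate_tokens_from_chars(text: str) -> int:
    # Rule of thumb: 1 token is roughly 4 characters of English text
    # (equivalently, about 3/4 of an English word).
    return max(1, round(len(text) / 4))

def estimate_tokens_from_words(word_count: int) -> int:
    # The same ratio from the other direction: ~1.3 tokens per word.
    return round(word_count * 1.3)

print(estimate_tokens_from_chars("hamburger"))  # 2 (a real tokenizer gives 2-3)
print(estimate_tokens_from_chars("the"))        # 1
print(estimate_tokens_from_words(1000))         # 1300
```

A real tokenizer (for example, OpenAI’s tiktoken library) gives exact counts; heuristics like these are only good for ballpark budgeting.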
While I know how important everyone thinks crypto tokens are, AI tokens are truly the next big thing. It may well be that the ultimate universal currency is now upon us. Every interaction with a large language model, every prompt submitted and every response generated, is mediated by a structure that most users never see: the token. Understanding tokens is not some bit of trivia reserved for the hardcore members of the tech crowd. It is the prerequisite for understanding how AI is priced, how AI agents consume compute, and why the emerging conversation about tokens as the universal currency of the GPU economy is expanding. In practical terms: one English word averages approximately 1.3 tokens, so a 1,000-word document contains roughly 1,300 tokens. Tokenization applies across all data types, not just text. Image-centric AI models map pixels or image regions into tokens. Audio models convert sound clips into spectrograms that are then tokenized. As multimodal AI becomes the norm, the token becomes the universal language of machine cognition across modalities. Token pricing is the commercial foundation of the AI industry. Nearly every major AI provider (OpenAI, Anthropic, Google, Mistral, Cohere) charges for API access based on the number of tokens consumed, priced separately for input and output. Output tokens typically cost 3 to 5 times more than input tokens, reflecting the fundamental difference in compute intensity between “reading” a prompt and “writing” a response. In the language of AI infrastructure, generating text is “creative labor”; reading a prompt is “information retrieval.” The winners in the AI arms race will not only have the best models on offer but will have optimized their token pricing and usage. This means the digital divide separating those with the means to access AI from those without will no doubt have serious political and economic ramifications in the not-too-distant future.
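Token-metered billing is simple arithmetic once the per-token rates are known. A minimal sketch, using made-up rates of $3 per million input tokens and $15 per million output tokens (placeholder numbers chosen to sit at the 5x end of the 3-to-5x range above, not any vendor’s actual price list):

```python
def api_cost_usd(input_tokens: int, output_tokens: int,
                 in_price_per_m: float = 3.00,
                 out_price_per_m: float = 15.00) -> float:
    # Input and output tokens are metered and priced separately;
    # rates are USD per million tokens (illustrative placeholders only).
    return (input_tokens * in_price_per_m
            + output_tokens * out_price_per_m) / 1_000_000

# A 1,300-token prompt (about a 1,000-word document) with an
# 800-token reply costs about a cent and a half at these rates.
print(round(api_cost_usd(1300, 800), 4))  # 0.0159
```

The asymmetry matters at scale: at these rates, an agent that emits as many tokens as it reads pays five times more for the “writing” than the “reading.”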
The scale of token consumption is the next logical question. The growth of token consumption from 2024 to 2026 represents one of the most dramatic adoption curves in the history of technology infrastructure. In a period of approximately 18 months, global daily AI token consumption has grown by several orders of magnitude. Programming has become the dominant use case, and reasoning models (those that solve problems requiring multiple logical steps) have surpassed 50% of all token usage. The market is now structurally bifurcated: proprietary models for reliability and enterprise workloads; open source for cost efficiency and customization. China’s daily AI token consumption surpassed 140 trillion tokens in March 2026, up from just 100 billion tokens per day at the beginning of 2024. That is a more than 1,000-fold increase in two years. China’s weekly large-model usage has now surpassed that of the United States for several consecutive weeks. We should all take note of that. The shift from human-initiated chatbot interactions to autonomous AI agents is the primary driver of the exponential growth in token consumption. The most consequential development of early 2026 is not the growth of token consumption per se; it is the emergence of tokens as an economic unit of account. The conversation has migrated from technical infrastructure to compensation and pricing strategy, macroeconomic policy, and social philosophy in the space of a few weeks.
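The arithmetic behind that “more than 1,000-fold” figure is easy to check, using the numbers cited above:

```python
# China's daily token consumption, figures as cited above.
start_per_day = 100e9   # ~100 billion tokens/day, early 2024
end_per_day = 140e12    # ~140 trillion tokens/day, March 2026

fold = end_per_day / start_per_day
print(f"{fold:,.0f}-fold increase")  # 1,400-fold increase
```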
Tokens have now suddenly become the newest form of employee compensation for AI superstars. It is said that AI tokens will be the “fourth component” of compensation alongside base salary, bonuses, and equity. How much compute access is given to employees will soon help define tech jobs. In March 2026, Sam Altman of OpenAI stated: “Fundamentally, our business—and the business of every other model provider—is going to look like selling tokens.” Not so long ago, Altman introduced the concept of “Universal Basic Compute,” a social analogue to Universal Basic Income in which every citizen receives an allocation of AI compute (i.e., a slice of a future model’s processing capacity) that they could use personally, resell on an open market, or donate to causes such as medical research. The core insight is that compute access, measured in tokens, is a form of productive capacity that could, in principle, be distributed as a social resource rather than concentrated among those who can afford it. Microsoft CEO Satya Nadella has commented on the obligation that the AI infrastructure build-out (with all its attendant energy and water use) has created, speaking at the World Economic Forum in Davos in January 2026: “We will quickly lose even the social permission to take something like energy, which is a scarce resource, and use it to generate these tokens, if these tokens are not improving health outcomes, education outcomes, public sector efficiency, private sector competitiveness—across all sectors, small and large.” This is a warning to the AI leaders, but it’s not clear that any of the executives (save perhaps at Anthropic) are taking heed. As tokens become the unit by which intelligence is manufactured and exchanged, controlling access to token generation becomes a form of productive capital in its own right.
It’s a brave new world, and it’s all about AI from here on in. Crypto may wedge itself into people’s portfolios, but AI tokens…watch out! So, where can I get a new lapel pin that says, “Token Spoken Here”?

