Levelling Up
Anyone who’s spent time in multiplayer games knows the rhythm. You drop into a new environment with almost no information. You die a lot. You watch what other players are doing. You gradually piece together the logic of the world, acquire the right gear, figure out which skills actually matter. Then, collectively, the player base cracks it and everyone levels up together.
That’s tech adoption. More specifically, that’s what’s been happening with AI over the last few years.
The Levels So Far
The gaming metaphor works because adoption really does move in discrete phases and each phase has its own dominant meme, its own shared “power-up” that the collective decides is the key to progress.
Level 1: Prompting as craft (2022-2023). ChatGPT drops and overnight everyone becomes obsessed with the idea that asking the question correctly is a skill. Prompt engineers emerge as a genuine job category. The power-up is linguistic precision. Know the magic words. The lore is: the model is dumb but can be coaxed.
Level 2: The 10x operator (2023-2024). The meme shifts. It’s not about the perfect prompt anymore, it’s about workflow. Who can stitch tools together fastest? The power-up is system design, automation, chains of thought. The lore is: AI is a multiplier and your value is in how cleverly you deploy it.
Level 3: Vibe coding (2025). Andrej Karpathy’s framing went viral: fully give in to the vibes, embrace exponentials, forget the code even exists. The power-up becomes surrender. Stop trying to control, just direct. The lore: the bottleneck is human perfectionism.
Level 4: Taste (now). The current consensus is crystallising around “taste” as the new differentiator. When the AI can execute almost anything, the person with discernment wins. Knowing what good looks like. The power-up is aesthetic and cultural intelligence. The lore: curation is the new creation.
The Problem With Levelling Up in a Rush
In a game, the levels are designed. Someone architected the progression. "Reality" is a construct. The difficulty is calibrated. The lore is internally consistent. But reality isn't like that. It's much messier.
What's actually happening is that a large group of people who share roughly the same information diet, the same conference circuit, the same LinkedIn feed collectively decide on a new "truth" every 12 months. If enough respected voices start saying "taste is the new moat," it becomes the consensus. And once it's consensus, it functions as truth, even if it's only half right, or context-dependent, or a useful story rather than a durable insight. This isn't new. I've watched this consensus bubble narrow in the same way over 30 years of working in and around tech.
The compressed cycle creates a specific kind of blindness. Each level’s meme feels definitive because it resolves the previous level’s anxiety. Prompting gave way to systems because prompting felt too fiddly and brittle. Systems gave way to vibes because systems felt too rigid. Vibes gave way to taste because vibe coding turned out to produce code with 1.7x more major issues than human-written code. Each power-up is partly a correction, but it’s also a kind of collective forgetting. The previous meme gets abandoned before it was fully understood.
And the self-serving consensus problem is real. The people most loudly proclaiming “taste” as the new differentiator are, almost universally, people with demonstrable taste. Designers. Senior creatives. People who’d benefit from that being true. That doesn’t make them wrong. But it should give us pause.
What The Game Metaphor Misses
In a game, you can reload from a save point. The environment resets. Experimentation is cheap.
In the real world, the things being built during each “level” persist. The vibe-coded apps with security vulnerabilities don’t disappear when the discourse moves on. The institutional habits formed during the prompt engineering phase shape how organisations think about AI for years. The memes don’t just describe adoption, they determine what gets built, which decisions get made, which capabilities get developed and which get ignored.
There's also a much slower, quieter process happening beneath the meme cycle, one that barely registers in the discourse. How people are actually changing the way they think. What happens to expertise that atrophies. What develops in its place. One developer found that, after relying heavily on AI tools, tasks that used to be instinct became manual and cumbersome when he worked without them. It's a transformation.
Every level feels like progress, and in many ways it is. But the game metaphor flatters us. It implies we're solving the environment, cracking the lore, accumulating the right tools. It suggests forward motion.
What it doesn’t account for is everything happening outside the edges of the map. The slow, structural changes. The capabilities we’re quietly trading away. The questions we’ve stopped asking because the current meme makes them seem naïve.
I've seen these swarm patterns again and again since I started working in digital 30 years ago. Technological change prompts interest, which makes a related solution easier to sell; arriving at a consensus makes it easier still. But in that narrowing, so many brilliant opportunities are eclipsed. I always try to be involved in the trend enough to gain fluency, but to remain critical enough not to get entirely sucked in. A vantage point from which to spot the juicy fruits, whether they're a consequence of the new technology or not.