Let the Robot Wars Begin!
Who Owns the Verbs
I’ve been struggling to find my writing path for 2026.
There were multiple breakthrough moments at the end of December, and it seems I am not alone. More came as January started. All of it led to a flood of ideas: some not yet actionable because the technology isn't ready, some immediately implementable, others I designed, tested, and completed in less than a day. In January. Sometimes on a plane with terrible internet.
These breakthroughs are amazing, fun, and dangerous. They lead to unknown consequences and unfinished work — though the net is positive. The things that needed finishing got finished. The value created was real. The acceleration was real.
But the mess was also real. And finding a clean thread through it took longer than I expected.
Here’s where I’ve landed.
The Year of the Verb
2026 is going to be the year of agentic robots, agentic science, and automated new knowledge creation. Up through 2025, this was the domain of scientists, PhDs, and research organizations with timelines measured in decades. Things are changing fast, likely faster than groups of humans can keep up.
My team has been deep in digital agents for some time now, including at least a year in self-modulating AI agents. We've started calling the shift what it is: the move from compiled code to compiled intent. The software doesn't execute your instructions anymore. It executes your goals. That difference sounds subtle until you live inside it for a week, and then you can't unsee it.
Moltbot, Clawdbot, OpenClaw — whatever it will be called tomorrow — these are agentic communities. The first of a set of solutions designed for agents, not humans. We are about to experience the transition from a drizzle to a waterfall of self-modulating, self-improving, and self-regulating technologies — all created at, deployed to, and managed at the edge of the current internet.
This is a pattern not well understood. Not contemplated fully by organizations around the world. And potentially very disruptive to all current and future forms of work.
Pay Attention to Your Own Attention
Journaling this year will matter more than any year in the last twenty.
Not because the changes are bigger — they are — but because it will be difficult to remember and track the shifts in your own ideas, perspectives, beliefs, and opinions as all of this transforms around you. There is no need to write extensively or think hard about the format. When you have an idea that seems startling or novel, jot it down. The date. The stimulus. That’s it. You’ll want the record later.
As robots gain skills at exponential speed, paying attention to your own decision-making becomes critical. When you reach for that drink, or utensil, or snack — something is happening in your mind, your body, that you didn’t fully initiate. Pay attention. Agency erosion is real, and the tools naturally enable it as they anticipate your moves and strip friction from the thinking, building, and testing process. What feels like convenience is often a trade you didn’t agree to.
I’m watching this play out in my own house. My youngest wants the machine to teach her — she wants the path, not the destination. My thirteen-year-old told me last weekend he wants to ask the machine first, then think about what it says. I told him the opposite: always think before you ask. Same tool. Different sequence. Entirely different outcome. This is agency fluctuating in real time, at the kitchen table, over homework.

Compiled Intent and the New CRUD
This compiled intent will also change how we build and discover things. The clearest example is already scaling — in software development.
For fifty years, developers organized their world around a construct called CRUD: Create, Read, Update, Delete. The nouns of all systems. The verbs? Those were executed by humans — on systems, through interfaces designed for human hands and human judgment. The human was the verb layer.
The move to agentic has changed that.
The new Agentic CRUD: Curate, Reason, Update Continuously, Do. The verb layer. Agents will do those jobs. This pushes humans up another layer of the stack — into the compiled intent layer — where you stop telling the system what to do and start telling it what you’re trying to accomplish, and why.
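To make the contrast concrete, here is a minimal sketch in Python. All names here (`TaskStore`, `Agent`, `accomplish`) are hypothetical and illustrate the idea only, not any real framework: in classic CRUD the human issues each verb against the store, while the agentic version accepts a goal and chooses the verbs itself.

```python
# Hypothetical sketch: TaskStore and Agent are illustrative names, not a real library.

# Classic CRUD: the human is the verb layer, issuing each operation by hand.
class TaskStore:
    def __init__(self):
        self.rows = {}
        self._next_id = 1

    def create(self, data):
        row_id = self._next_id
        self._next_id += 1
        self.rows[row_id] = dict(data)
        return row_id

    def read(self, row_id):
        return self.rows[row_id]

    def update(self, row_id, data):
        self.rows[row_id].update(data)

    def delete(self, row_id):
        del self.rows[row_id]


# Agentic CRUD: the human states intent; the agent curates, reasons,
# updates continuously, and does -- selecting the verbs on its own.
class Agent:
    def __init__(self, store):
        self.store = store

    def accomplish(self, goal):
        # Curate: gather the records relevant to the goal.
        stale = [i for i, r in self.store.rows.items()
                 if r.get("status") == "stale"]
        # Reason: decide what the goal implies for each record,
        # then Update continuously / Do: execute the chosen verbs.
        for row_id in stale:
            if goal == "keep records fresh":
                self.store.update(row_id, {"status": "refreshed"})
        return len(stale)


store = TaskStore()
store.create({"status": "stale"})
store.create({"status": "ok"})
agent = Agent(store)
refreshed = agent.accomplish("keep records fresh")
```

The point of the sketch is the shape of the call site: the human-facing surface shrinks from four verbs to one statement of intent, and the verb choices move inside the agent.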
The question this raises is not whether humans will still be needed. Of course they will. The question is what happens to the expertise that was built through the doing — the sorting, weighing, deciding, correcting. The friction that was the teacher. What do you lose when you stop doing the work?
I wrote a more detailed description of this in The Doing Was the Knowing — what CRUD becomes when agents own the verbs.

Further Reading
The Doing Was the Knowing — On CRUD, the verb layer, and what happens to expertise when agents take over the operations.
The Flip — On the perceptual shift that happens when you start working with AI instead of on it.
Total Transformation — On dragonflies, metamorphosis, and the question of what kind of change AI is actually asking of us.

