
Most Companies Are Behind. Even the Ones Who Think They're Ahead.

I've been to a lot of AI conferences. I've delivered sessions and workshops to thousands of people. And I've watched my own show-of-hands question evolve over time.
It started with "who's using AI weekly?" At first, maybe 10% of the room. Then 25%. Then half. Then nearly everyone. The transition happened fast.
I've started adding a new question lately. "How many of you have an AI agent running in the background while you work?" Zero hands, every time. Even in rooms full of people who use tools like ChatGPT daily.
At CES this month, I saw a version of this gap from the other side. A speaker asked the audience who had heard of evals. Nearly every hand in a large ballroom went up. The presenter stumbled a bit through his surprise. "Wow, it's usually much lower than that."
This was an advanced audience. People who flew to Las Vegas to see what we need to plan for in 2026. People who know what evals are. I'd bet that if someone had asked about agentic workflows, a lot of hands would have gone up. But knowing about something and actually running it are different things. The gap between awareness and adoption is the story of 2026.
The hype and the real
CES had two types of AI presentations. The first was the sales pitch. Polished slides and big claims, but no live demos. "We've automated 80% of our org chart." "Our agents run entire departments."
One presenter gave a long talk about agentic systems handling complex workflows across his organization. It sounded impressive, but a little far-fetched to anyone who has built these types of systems. I could see the hesitancy in a few experienced people I'd met at the conference as he spoke. His experience didn't match ours.
Then he ended up on a panel with someone who'd actually built these systems at scale. The moderator asked everyone to share a failure story. The sales-pitch guy couldn't name one.
A few people walked out. It was late in the session, but I think it showed how people felt. We weren't getting an honest take on the industry. Just a story that anyone building these tools could see through. Unfortunately, there's a lot of that right now.
On the other side, Joel Hron, CTO of Thomson Reuters, talked openly about how early the tooling still is. His examples were practical. Using AI to review code before deployment. Exposing their tax engine as a tool for agents to validate calculations. Insights like, "What you thought you wanted out of an agent nine months ago is very different than what you want today." The AI gives options, humans make selections. Nothing about replacing 80% of the workforce.
Here's the thing, though. Even the hype crowd is ahead of most companies. They're using these tools daily. They're experimenting. They're directionally right about where this is going. They're just claiming we're three years further along than we actually are.
Failures
When you use these tools consistently, you run into failures. We've seen our fair share, and they're important to highlight because they teach us where we need to go next.
Our in-house CRM
We tried to build an internal CRM seven or eight times before it worked. We recently launched MSCRM, a fully vibe-coded CRM that replaces our Pipedrive subscription with more of the features we need and fewer we don't. But before that, we failed repeatedly with different coding tools and models. Every time, I knew the tools just weren't ready yet. The models were capable, but they needed systems that let them work without losing context. They needed access to relevant data. They needed permissions to work within file systems. All of this is being actively worked on, but few of these things work consistently today.
That's the thing about trying and failing. If you know where the failure points are, you're ready to plug in new tools the moment they make the leap. If you're not failing, you're either not pushing hard enough or you're going to be behind when the tools improve.
Early adoption challenges
Our initial internal adoption flopped. In late 2022, we introduced ChatGPT to our full team the same month it launched. I expected people to just start using it. They didn't. For months, we had paid accounts that few were using. That failure is why we started building internal training programs, which became the service we now offer clients. Buying seats is not enough. You have to build the muscle.
Poor use cases
We used AI for its weaknesses instead of its strengths. Early on, we tried using AI for outbound sales. It failed. Response rates and conversions were far worse than human outreach. Even at tiny token costs, there was no ROI.
Now we know better. We use AI for research. It generates lists of prospects with background on their business and why we might return value. Then humans handle the outreach. We use AI to draft proposals from RFPs, but the system invites a human in at critical moments to provide strategic direction. The AI runs the process, the human offers context and judgment, the AI produces options, the human refines and delivers.
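If it helps to see the shape of that hand-off, here's a minimal sketch. The function names and the console prompts are illustrative placeholders, not our actual system; the point is the structure, where the AI produces options and a human supplies selection and direction at the checkpoint.

```python
# checkpoint.py -- illustrative sketch of the hand-off pattern above: the AI
# runs the process, and a human is pulled in at the moments that need judgment.
# generate_options and refine stand in for whatever model calls you use.
from typing import Callable

def run_with_checkpoint(
    generate_options: Callable[[str], list[str]],  # AI: produce candidate drafts
    refine: Callable[[str, str], str],             # AI: execute on the direction given
    brief: str,
) -> str:
    options = generate_options(brief)
    for i, option in enumerate(options, 1):        # human reviews the options
        print(f"[{i}] {option[:120]}")
    pick = int(input("Option to develop: ")) - 1   # human selects
    direction = input("Strategic direction: ")     # human adds context and judgment
    return refine(options[pick], direction)        # AI produces the refined output
```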
Trying to replace human strengths with AI while using humans for AI tasks is a way to get nothing out of these tools.
Organizations will look different this year
There's a trend that surfaced at CES that most people aren't talking about yet. Large companies are breaking big teams into small ones. Not (only) trimming headcount. They're restructuring how work gets organized.
You've probably seen the diagram below. Every person you add to a team increases communication complexity. Three people have three connections. Ten people have forty-five. Thirty people have four hundred thirty-five. The coordination cost of large teams is rough.
[Diagram: team size vs. communication channels. 3 people: 3 connections. 10 people: 45 connections. 30 people: 435 connections.]
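Those counts come from the standard pairwise formula: a team of n people has n(n-1)/2 possible one-to-one channels. A quick check:

```python
# connections.py -- the pairwise-communication math behind the figures above:
# every person can pair with every other person exactly once.
def connections(n: int) -> int:
    """Number of one-to-one communication channels in a team of n people."""
    return n * (n - 1) // 2

for team in (3, 10, 30):
    print(f"{team} people -> {connections(team)} connections")
# 3 people -> 3 connections
# 10 people -> 45 connections
# 30 people -> 435 connections
```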
AI changes this. When you have an always-on execution partner handling repetitive work, you don't need as many humans in the loop. You can keep the human layer tight. Three people instead of thirty. You move faster because you're not drowning in meetings and Slack threads and status updates.
This goes well beyond just using AI tools. It's a structural change in how work gets organized. Companies are experimenting with breaking 200-person teams into dozens of small, autonomous pods. Each pod owns an outcome. Each pod has AI handling the execution layer. The humans focus on judgment, decisions, and the things that actually require a human brain.
This will be hard for most organizations to adapt to. Beyond the organizations, this is a significant change for most workers, especially middle managers. It's not a new tool you can roll out with a training session. It's a new operating model. I'm not sure most companies are ready for that conversation yet. But it's here.
People aren't ready for what's coming
This month we started using the Ralph Wiggum loop. It's a (terribly named) technique for running AI coding agents autonomously for hours instead of minutes. The basic problem is that tools like Claude Code work in sessions. They run until they think they're done, then stop. The Ralph Wiggum approach wraps a loop around that. When the AI tries to exit, a hook catches it, feeds the prompt back in, and lets it keep working. Each iteration builds on the last.
At its core, it's a task list the AI works to complete, plus a system that keeps the context window relatively clear so the model doesn't degrade as the session fills up. A fairly simple trick that completely changes the model's performance.
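Here's a minimal sketch of the shape of that loop, under stated assumptions: the agent command and its flag are placeholders for whatever coding agent you run, not any specific tool's documented interface, and the sentinel-file convention is just one way to signal completion.

```python
# ralph_loop.py -- minimal sketch of the autonomous loop described above.
# Assumes a coding agent that can be invoked non-interactively from the
# command line; "agent" and "--print" are hypothetical placeholders.
import subprocess
from pathlib import Path

PROMPT = Path("PROMPT.md")   # the standing task list the agent works from
DONE = Path("DONE")          # sentinel file the prompt tells the agent to
                             # create once every task is checked off
MAX_RUNS = 50                # hard ceiling so the loop can't run unattended forever

for run in range(1, MAX_RUNS + 1):
    if DONE.exists():
        print(f"Task list complete after {run - 1} runs.")
        break
    print(f"--- run {run} ---")
    # Each run is a fresh process, so each run starts with a clean context
    # window. The agent's only memory between runs is what it left on disk:
    # code, commit history, notes, and the task list itself.
    subprocess.run(["agent", "--print", PROMPT.read_text()], check=False)
else:
    print("Hit the run ceiling without finishing; review progress manually.")
```

The task list is the persistence layer. Because each iteration starts with a clean context and reads progress back off disk, the loop can keep working for hours without the slowdown that comes from one long, bloated session.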
We saw this approach gaining traction online in December and started exploring it. After using it to build our CRM, it was clear something was different. Our team did a deep dive. What works, what doesn't, and how to simplify it for our own use (these early approaches are typically over-engineered). Now we're using it to build out large feature sets that senior developers can review and refine much faster. We're running multiple exploratory MVPs at once to see what works. I'm building personal apps to run different parts of my work and life. It's been great for design exploration, feature concepting, and giving our team more ideas to choose from.
For simple web projects, we've built an approach that automates a lot of the execution. Humans stay in charge of strategic direction, content strategy, design oversight, and architecture. Then we hand off execution to AI with enough context to run.
In one month, we've changed aspects of how we work that I'm confident other organizations will still be catching up on in 2027.
The pattern is predictable. Users figure out techniques that push the tools further. The labs catch up and build it in. Then new user techniques emerge at the cutting edge. Right now, autonomous loops are that edge. By the time the major tools adopt this natively, there will be something else. There's always a next thing. And those who aren't experimenting have to wait, while those who do are seeing 10x, 100x, and even 1000x return on investment.
The gap is coming
I talk to a lot of companies about AI adoption. The ones in the lead have done the work. Policies, procedures, hands-on training, dedicated champions, and enterprise subscriptions. They're using these tools daily. That puts them ahead of maybe 80% of companies out there.
But they're still behind the people running autonomous agents. They're still behind the organizations restructuring into small, high-autonomy teams. They're still behind the folks who can tell you exactly how their systems fail and what they've learned from it.
I think the chasm is coming. People who've been reluctant to adopt are going to find themselves starting from zero while their competitors operate at 10x to 1000x their output. People who deny the benefit entirely are going to be left behind in ways that will be hard to recover from.
I work hard not to sell hype. That doesn't help anyone. We don't claim to be three years ahead of where the tools and infrastructure actually are. But we're here. And "here" is still a lot. The tools work. The patterns are emerging. The early adopters have already crossed into territory that most companies don't even know exists.
Whether you work with us or figure it out on your own, the time to move is now. Not next quarter.