AI Raises the Bar, But It Doesn’t Remove the Builders


Every week, there’s a new headline suggesting that AI is about to make large portions of the workforce obsolete. Development cycles are collapsing. Models are writing code. Systems are improving themselves. It’s not unreasonable to ask whether the role of the human professional is shrinking.

But that framing misses something important. AI is not simply reducing labor costs - it is raising expectations. And when expectations rise, the need for human judgment doesn’t disappear. It shifts.

Acceleration Without Elimination

Recently, while preparing a business demo, I leaned heavily on AI to troubleshoot and refine parts of the workflow. It generated SQL, suggested configuration changes, and dramatically accelerated the iteration cycle. But it also produced malformed SQL queries and confidently recommended an incorrect fix for a configuration issue. Each time, the system moved me forward faster, but it didn’t actually solve the problem. I still had to diagnose the root cause, constrain the environment, and make the correct architectural decisions.
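One lightweight guard against the malformed-SQL failure mode described above is to dry-run generated queries before trusting them. Here is a minimal sketch in Python using SQLite's `EXPLAIN`, which parses and plans a query without executing it; the helper name and the toy schema are illustrative, not taken from the actual workflow:

```python
import sqlite3
from typing import Optional

def validate_sql(conn: sqlite3.Connection, query: str) -> Optional[str]:
    """Dry-run a query with EXPLAIN; return an error message, or None if it parses."""
    try:
        conn.execute("EXPLAIN " + query)
        return None
    except sqlite3.Error as exc:
        return str(exc)

# Toy schema for demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, name TEXT)")

good = "SELECT name FROM users WHERE id = 1"
bad = "SELEC name FROM users WHRE id = 1"  # the kind of typo a model can emit

assert validate_sql(conn, good) is None       # parses cleanly
assert validate_sql(conn, bad) is not None    # caught before it reaches anything real
```

A check like this doesn't make the output correct, of course; it only filters the queries that can't even run, which is exactly the category of error the human still has to catch.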

That experience clarified something important. AI compresses execution. It reduces the time required to draft, generate, analyze, or iterate. The friction between intent and output is collapsing.

What hasn’t collapsed is accountability.

When something breaks in production, when a deployment introduces risk, or when a decision carries financial or operational consequences, the AI system isn’t the one answering for it. A person is. Speed is increasing. Ownership is not disappearing.

AI reduces the cost of iteration. It does not eliminate the need for judgment.

Consumers, Prosumers, and Perception

There’s a difference between using a technology and understanding its tradeoffs.

Millions of people stream music on Spotify every day without thinking about compression. The experience is fast, convenient, and sounds good enough. But audiophiles know that streaming sacrifices detail. You gain accessibility and speed, but you lose fidelity. That’s simply how the system works.

AI has a similar divide. Casual users see fluent output and rapid answers. Professionals who work with AI regularly understand that those answers are generated within constraints, with strengths and limitations that aren’t always obvious.

When you understand the tradeoffs, the system feels less mystical and less frightening.

The Acceleration Narrative

If this were only about casual use, the conversation would be simpler. But something else is happening at the leading edge of AI development.

A colleague of mine, Ben Ratkey, recently pointed out that some of the most advanced AI research organizations appear to have crossed a threshold. In many state-of-the-art software engineering workflows, the balance has flipped: what used to be roughly 20% AI-generated code and 80% human coding is now trending in the opposite direction. He also noted that recursive self-improvement loops may be activating, where AI systems meaningfully accelerate their own development cycles. The result, he concluded, could be shorter release cadences and faster capability leaps over the next 24 to 36 months.

That kind of acceleration understandably fuels concern. If AI can write most of the code, improve its own tools, and compress development timelines, what role is left for humans?

It’s a fair question, and it deserves a careful answer.

The “Ship Before Breakfast” Example

A recent TechCrunch report highlighted a striking development at Spotify: during a quarterly earnings call, company leadership disclosed that some of its top engineers “haven’t written a single line of code” since December thanks to internal AI tools. These systems - powered by generative AI and integrated with Slack and their build pipelines - can take a natural-language request for a bug fix or feature, generate code, and deliver an updated build to an engineer’s phone before they even arrive at the office.

At first glance, that sounds like a dramatic rewriting of how software work gets done. And it is fast: Spotify shipped more than 50 new features and changes in the past year through this workflow.

But the key observation is this: the engineers aren’t absent from the process. They still define the intent, evaluate the output, merge it into production, and remain accountable for what goes live. The friction of typing code has diminished, but the responsibility for correctness, quality, and risk has not.

This example isn’t proof of human obsolescence. It’s proof of human re-anchoring, where practitioners shift from writing syntax to supervising outcomes and orchestrating execution.

The Expectation Escalation Effect

When execution becomes dramatically faster, expectations don’t stay the same.

If a feature can be implemented in hours instead of weeks, leadership begins to ask why everything can’t move at that pace. If iteration cycles collapse, tolerance for delay shrinks. If AI can generate drafts instantly, the baseline for productivity quietly shifts upward.

AI doesn’t just reduce cost. It raises the performance bar.

As speed becomes assumed, responsiveness becomes expected. The question moves from “Can this be done?” to “Why isn’t this done yet?” Acceleration doesn’t simplify organizations. It pressures them.

The Oversight Paradox

But the more capable AI becomes, the more expensive its mistakes become.

That risk doesn’t only show up in production systems. It also shows up in how quickly we accept output as authoritative.

Recently, a viral post described a company that had relied on AI-generated financial trend analysis for months, only to discover the numbers had been fabricated by the model. Board presentations had been built on hallucinated data. The lesson seemed obvious: trust without verification scales error.

There was just one problem. The story itself was AI-generated fiction. Thousands of readers amplified it before moderators removed it.

The irony is instructive. The failure wasn’t just technological. It was the absence of oversight, and the human tendency to equate fluency with truth.

In operational environments, the stakes are higher. When AI-assisted systems generate, modify, test, and deploy at machine speed, the potential blast radius expands. As autonomy increases, so does the need for governance.

AI systems operate at scale. They affect revenue, customer experience, and risk posture in minutes. Someone must define constraints. Someone must validate outputs. Someone must decide what is acceptable risk.

Put more succinctly: acceleration without oversight creates instability.
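What that oversight looks like in practice is often unglamorous: an explicit policy deciding which AI-generated changes can flow through automatically and which require a human sign-off. The sketch below is purely illustrative - the field names and the threshold are invented for the example, not drawn from any real pipeline:

```python
from dataclasses import dataclass

@dataclass
class Change:
    description: str
    touches_production: bool
    blast_radius: int  # e.g., a rough count of affected services

def requires_human_review(change: Change, radius_threshold: int = 3) -> bool:
    """Policy sketch: auto-merge only low-risk, non-production changes.

    Anything touching production, or with a wide enough blast radius,
    stops and waits for a person.
    """
    return change.touches_production or change.blast_radius >= radius_threshold

assert requires_human_review(Change("schema migration", True, 1))
assert not requires_human_review(Change("typo fix in docs", False, 0))
```

The point is not the specific rule, which any real organization would tune; it is that someone wrote the rule, and someone answers for where the line sits.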

As AI grows more powerful, the professional role shifts upward: from execution to orchestration, from syntax to systems thinking.

The stronger the system, the more essential the judgment surrounding it.

Work Doesn’t Disappear. It Evolves.

Whenever automation advances, the first question is whether jobs disappear. That fear is understandable. But history suggests something more nuanced: tasks compress faster than responsibility does.

AI is very good at reducing repetitive effort. It can draft, summarize, generate, and iterate at speeds that were previously impossible. What it does not remove is ownership. Organizations still need people who understand the broader system, define objectives, interpret results, and absorb accountability when outcomes matter.

The shift is not from employment to unemployment. It is from execution to leverage.

Typing code may matter less. Designing systems may matter more. Drafting documents may matter less. Validating outputs and shaping direction may matter more. The center of gravity moves upward toward judgment and coordination.

Disruption will occur. But the opportunity is not elimination. It is elevation.

Acceleration Doesn’t Remove Direction

As Ben suggested, some leading AI research organizations may now be entering recursive improvement cycles, which could accelerate capability gains. Release cycles may shorten. Performance leaps may feel sudden. The next 24 to 36 months could look very different from the last.

But acceleration does not eliminate direction.

Even if AI can generate code, refine workflows, or improve its own tools, it does not decide what problems are worth solving or what risks are acceptable. Those remain human responsibilities.

The faster systems improve, the more intentional leadership must become. Acceleration increases complexity. Complexity increases the demand for judgment.

And judgment remains a human function.

Where Value Moves

In summary, AI is changing how work gets done. It is compressing execution, accelerating iteration, and raising the baseline of what’s possible. That much is clear.

What is less obvious (and more important) is where value moves as a result.

As friction drops, expectations rise. As capability increases, complexity expands. As systems grow more autonomous, the cost of mistakes grows with them. In that environment, the premium shifts toward those who can define direction, impose constraints, evaluate tradeoffs, and absorb accountability.

The future isn’t humans versus AI.

It’s humans operating at higher leverage alongside increasingly powerful systems.

AI doesn’t remove the builders. It changes what building requires.
