
In 5 Years, Knowing How to Code Will Be as Special as Knowing How to Type

Code is becoming a commodity. So what will actually set builders apart in the future?


There was a time when typing fast was a valuable skill. Companies hired people specifically because they could type. It was on resumes. It was a differentiator.

Then it became the baseline. Then it became invisible.

I'm not saying coding will disappear. I'm asking a harder question: what happens when writing code becomes as unremarkable as typing it?

Because the signals are already there.


The signals that point this way

AI tools are closing the gap fast. GitHub Copilot, Cursor, Claude Code — these aren't autocomplete anymore. They read entire codebases, plan changes across multiple files, run tests, and iterate on failures autonomously. In 2026, 92% of US developers are using AI coding tools daily. Gartner forecasts 60% of new code will be AI-generated by end of year.

The bottleneck is shifting. Ask any engineering team where they slow down. It's rarely "we don't have enough people who can write code." It's "we built the wrong thing." It's "we don't know what the user actually needs." It's "we can't agree on what to prioritize." These are not technical problems. They never were.

New roles are emerging without a clear name. Startups in San Francisco are posting jobs for "Builders" — roles filled by both technical and non-technical employees. Walmart created "agent builder" positions and filled them entirely from within. The job description isn't about coding. It's about going from problem to working product.

The value of code as an output is compressing. What used to take a senior engineer two weeks can now take a junior with good AI tooling two days. That compression doesn't stop there. The economic value isn't in the code anymore — it's in the judgment that precedes it.


But let's be honest — it could go another way

This is not a prediction. It's a hypothesis. And there are real signals pointing in a different direction too.

45% of AI-generated code contains security vulnerabilities. Teams are reporting higher code churn and decreased delivery stability when they lean too hard on AI without strong review. The productivity gains are real — but only when paired with technical depth.

Deep specialization still commands massive premiums. LLM fine-tuning, multimodal systems, edge AI, agentic architectures — specialists in these areas earn 30–50% more than generalists. The market isn't devaluing technical skill. It's devaluing generic technical skill.

The "vibe coding" ceiling is real. You can build a prototype with natural language prompts. You cannot debug a production incident at 2am, design a system that scales to 10 million users, or evaluate whether an AI agent's output is actually correct — without knowing what's happening under the hood.

Most AI initiatives still fail. McKinsey's 2025 surveys show companies struggle to translate AI pilots into business results. The failure mode isn't lack of technical talent. It's lack of people who understand both the tool and the problem it's supposed to solve. That person still needs to exist.

So maybe the question isn't "will coding matter?" It's "what kind of coding, and in service of what?"


What this new profile actually looks like

I've been thinking about what to call this. The market is trying out names: Product Builder, Agent Architect, Context Engineer. None have fully stuck yet.

But the profile is clearer than the title.

This person:

  • Talks to users before opening an editor.
  • Can write code, but chooses when it's worth writing themselves vs. delegating to an agent.
  • Understands where AI agents fail — and designs systems that account for that.
  • Ships something working, gets feedback, and iterates without waiting for a spec.
  • Thinks about what the product actually needs, not just what's technically interesting.

The paperclip maximizer thought experiment is useful here. An AI agent will optimize perfectly for the objective you give it. The person who defines that objective — who understands the user, the business, and the constraints — that's where the irreplaceable judgment lives.


How to prepare, depending on where you are today

If you're a developer

The goal isn't to become a PM. It's to close the gap between "I can build this" and "I understand why this is worth building."

  • Start talking to users. Even informally. Even about products you didn't build. Build the habit of asking "what problem are they actually trying to solve?"
  • Practice shipping things end-to-end. Not features — products. Even tiny ones. The constraint of being responsible for the whole thing changes how you think.
  • Learn to evaluate AI output critically. Don't just accept what the agent produces. Ask: is this correct? Is this secure? Is this what we actually needed?
  • Study systems design not just for architecture but for product thinking. How does this decision affect the user three steps from now?
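One concrete form of evaluating AI output critically: before accepting agent-written code, pin its behavior down with edge-case tests you wrote yourself. A minimal sketch — `slugify` here is just a stand-in for any function an agent might hand you; the function and its cases are invented for illustration:

```python
import re

def slugify(title: str) -> str:
    """Stand-in for a function an AI agent just wrote for you."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# Tests written by the human, *before* merging: they encode what
# "correct" means here, independent of what the agent produced.
assert slugify("Hello, World!") == "hello-world"
assert slugify("  --  ") == ""         # degenerate input doesn't crash
assert slugify("Ünïcode") == "n-code"  # surfaces a real limitation to decide on
```

The last assertion is the interesting one: it passes, but it exposes that non-ASCII characters are silently dropped. Whether that's acceptable is a product decision the agent can't make for you.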

What to start studying today: product discovery frameworks — Teresa Torres' Continuous Discovery Habits is a good start — basic UX research methods, and how to read and interpret usage data.

If you're a product manager

The goal isn't to learn to code from scratch. It's to close the gap between "I wrote the spec" and "I understand what's actually hard about building this."

  • Get hands-on with AI coding tools. Not to ship production code — to understand what agents can and can't do. You need to feel the ceiling firsthand.
  • Learn enough about systems to ask better questions. What are the tradeoffs here? Why would this be hard to scale? What breaks first?
  • Practice writing prompts for agents as if they were junior engineers. The discipline of being precise about what you want is the same discipline that makes you a better PM.

What to start studying today: fundamentals of how LLMs work (not the math — the mental model), basic SQL and data querying, and how to read a codebase well enough to understand scope.
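The SQL part is smaller than it sounds. A minimal sketch of the kind of question a PM should be able to answer without filing a ticket — the `events` table, its columns, and the data are all invented for illustration, standing in for whatever your analytics stack exposes:

```python
import sqlite3

# Hypothetical product-analytics events table (names invented for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE events (
        user_id    INTEGER,
        event_name TEXT,
        created_at TEXT
    )
""")
conn.executemany(
    "INSERT INTO events VALUES (?, ?, ?)",
    [
        (1, "signup",        "2025-01-01"),
        (1, "export_report", "2025-01-02"),
        (2, "signup",        "2025-01-01"),
        (3, "signup",        "2025-01-03"),
        (3, "export_report", "2025-01-03"),
        (3, "export_report", "2025-01-04"),
    ],
)

# The question: what fraction of signed-up users ever used the export feature?
row = conn.execute("""
    SELECT
        COUNT(DISTINCT CASE WHEN event_name = 'export_report' THEN user_id END) * 1.0
        / COUNT(DISTINCT CASE WHEN event_name = 'signup' THEN user_id END) AS adoption
    FROM events
""").fetchone()
print(f"export adoption: {row[0]:.0%}")  # 2 of 3 signed-up users
```

Being able to pose and sanity-check a query like this is what "read and interpret usage data" means in practice.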

If you're early in your career

You have an advantage the people above don't: you don't have habits to unlearn.

  • Don't just learn to code. Learn to build. Pick a real problem — yours, a friend's, a community's — and try to solve it end-to-end with AI tools.
  • Develop taste. Use a lot of products. Think about why some feel good and others don't. Taste is underrated and hard to teach.
  • Learn to communicate clearly in writing. This is the highest-leverage skill in an agentic world. Prompting an AI well and writing a clear spec are the same underlying skill.

What to start studying today: ship something small every month. Document what you learned. Build a track record of judgment, not just ability.


The question worth sitting with

I don't know exactly when this shift completes — or if it ever fully does. History is full of predictions about skills becoming obsolete that turned out to be wrong, or right but slow.

What I do think is that the people who will do well in whatever comes next share something: they're oriented toward the problem, not the tool. They care about what gets built and for whom, not just how.

Coding might stay special. Or it might become the new typing — assumed, invisible, and no longer sufficient on its own.

Either way, the question worth asking yourself today is: if the code writes itself, what do you bring?

I'd love to know what you think. Are you seeing these signals in your own work? Do you think I'm wrong about any of this?
