The Curious Evolution of Coding: Why AI Won’t End Programming (But Will Change It Forever)


Perhaps the ultimate programming skill will be distinctly human: creativity in problem definition. After all, computers can increasingly answer questions, but they struggle to know which questions are worth asking. The dilemma now isn't whether AI will replace programmers – it's whether programmers who use AI will replace programmers who don't.

Remember when people used to physically wire computers by hand? Probably not, unless you’re pushing 90. But that’s how programming began in the 1940s: human hands connecting actual circuits to perform calculations, one painstaking connection at a time.

We’ve been watching the latest panic ripple through Silicon Valley. “AI will replace programmers!” scream the headlines. Tech workers fret about their future. Coding bootcamps wonder if they should pivot to “prompt engineering” instead.

But here’s the thing: we’ve seen this movie before. Multiple times, in fact. And it never ends the way people think it will.

The Cycle of Creative Destruction (With a Twist)

“When I started in this industry, we were still debugging with oscilloscopes,” a veteran programmer friend told me recently over coffee in Sector 5, Salt Lake Electronic Complex at the eastern outskirts of Kolkata. We were discussing ChatGPT’s ability to write surprisingly decent Python code. “People have been predicting the death of programming since the invention of COBOL.”

He’s right. Programming has perpetually been in a state of creative destruction. The original programmers – women like Betty Jean Jennings and Frances Bilas who programmed the ENIAC in 1946 – would barely recognise what we call “coding” today.

First came assembly language, rendering binary code obsolete for most. Then higher-level languages like FORTRAN and COBOL emerged, allowing programmers to express ideas more abstractly. Later, interpreted languages like BASIC democratised programming, helping launch the personal computer revolution from suburban garages instead of corporate back offices.

Each transition sparked anxiety. Each ultimately created more programming jobs, not fewer.

“What happens is that programming becomes simultaneously easier and more powerful,” explained James Bessen, an economic historian who studied technology transitions in America’s early textile mills. “When barriers to entry fall, more people can participate, and entirely new industries emerge.”

This pattern played out dramatically with the web revolution. Suddenly anyone could build simple “applications” with minimal skill. WordPress made website creation point-and-click simple. Yet the demand for professional developers skyrocketed.

The AI Turning Point

But maybe this time really is different?

At OpenAI's San Francisco offices, researchers are working on systems that can generate complex applications from natural language descriptions alone. "Tell it what you want, and it builds it" is the catchphrase there, as the latest models turn a few paragraphs of requirements into a functional e-commerce site.

They call it “chat-oriented programming” or CHOP – essentially having a conversation with AI that results in working software. It’s the programming equivalent of asking ChatGPT to write your college essay.
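The CHOP loop described above can be sketched in a few lines. This is a hedged illustration, not a real system: `generate_code` is a hypothetical stand-in for an actual language-model API, and the refinement loop simply feeds failing checks back into the prompt.

```python
# Hypothetical sketch of a chat-oriented programming (CHOP) loop:
# describe the task, let a model draft code, run checks, refine the prompt.
# generate_code() is a stub standing in for a real LLM call, not an actual API.

def generate_code(prompt: str) -> str:
    # Stub: a real system would call a language model here.
    # We return a canned implementation for illustration.
    return "def discount(price, pct):\n    return price - price * pct / 100\n"

def chop_session(task: str, checks) -> str:
    prompt = task
    for attempt in range(3):              # a few conversational turns
        source = generate_code(prompt)
        namespace = {}
        exec(source, namespace)           # load the drafted function
        failures = [msg for ok, msg in (c(namespace) for c in checks) if not ok]
        if not failures:
            return source                 # all checks pass: accept the draft
        prompt = task + "\nFix these issues: " + "; ".join(failures)
    raise RuntimeError("model could not satisfy the checks")

# One check: a 100% discount should bring the price to zero.
checks = [lambda ns: (ns["discount"](80, 100) == 0, "full discount must be free")]
accepted = chop_session("Write discount(price, pct).", checks)
print(accepted.startswith("def discount"))  # prints: True
```

The point of the sketch is that the conversation, not the typing, is the work: the human supplies the task and the acceptance checks, and iterates until the generated code passes.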

Andrew Ng, who co-founded Google Brain and now runs deeplearning.ai, isn’t worried about programmers becoming obsolete. “The programmer of tomorrow will be more like a director,” he told me. “They’ll focus on what to build and why, while delegating how to AI collaborators.”

This resonates with what Nat Friedman, former CEO of GitHub, said while describing GitHub's Copilot tool. "Great programmers have always been distinguished more by their system design skills than their ability to remember syntax," he said. "AI is just making that distinction more apparent."

The New Programming Class Divide

A study published last year found developers using AI assistants completed tasks 55% faster. But here’s where it gets interesting: the benefits weren’t distributed equally.

“Junior developers who embrace AI tools can now outperform senior developers who resist them,” argues Steve Yegge, a respected voice in programming circles. He calls it “The Death of the Stubborn Developer” – suggesting the real risk isn’t AI replacing programmers, but programmers who can’t adapt being replaced by those who can.

This creates a fascinating generational dynamic. Younger developers, raised on YouTube tutorials and Stack Overflow, view AI as just another learning resource. Many senior developers see it as either threatening or merely a fancy autocomplete.

The truth, as one Stanford professor says, lies somewhere in between: “It’s more like having an eager but somewhat unreliable intern who can handle routine tasks but needs supervision for anything important.”

What Tomorrow’s Programmers Will Actually Do

So, what will programming look like in 2030?

First, expect an explosion of software creation by non-specialists. The marketing manager who once needed a development team to build a customer portal might now direct an AI to do it instead. This “citizen development” will become mainstream.

For professional programmers, the workday will look quite different. “You’ll spend less time implementing and more time verifying,” explained an MIT researcher developing tools to check AI-generated code for correctness. “Finding the subtle logic errors in otherwise perfect-looking code becomes the key skill.”
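The "subtle logic errors in otherwise perfect-looking code" are easy to illustrate. Below is a hypothetical example of the kind of function an AI assistant might plausibly draft: it reads correctly, handles ordinary years, and silently omits the Gregorian century rule. A targeted edge-case test is what catches it.

```python
# A hypothetical AI-generated function: looks right and handles common
# years, but silently omits the century rule of the Gregorian calendar.
def is_leap_year_ai(year: int) -> bool:
    return year % 4 == 0  # subtle bug: 1900 is NOT a leap year

# Verification: the full rule, used here as a reference to test against.
def is_leap_year(year: int) -> bool:
    return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

for year in (2024, 2023, 2000, 1900):
    if is_leap_year_ai(year) != is_leap_year(year):
        print(f"AI version wrong for {year}")  # prints: AI version wrong for 1900
```

Happy-path testing (2023, 2024, even 2000) would pass the buggy version; only the deliberately chosen edge case exposes it. That instinct for which inputs to distrust is the verification skill the researcher is describing.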

Michio Kaku, the futurist and physicist, envisions programming becoming more like teaching: “You’ll be training systems rather than commanding them. The best programmers will be those who can clearly communicate intent and verify that the system understood correctly.”

The most valuable skills will shift accordingly:

  • System architecture – envisioning how components should interact
  • Verification and testing – ensuring AI-generated code is correct and secure
  • Domain expertise – understanding the real-world context AI might miss
  • Effective collaboration with AI – an emerging skill akin to management

Meanwhile, some programming specialties will remain stubbornly resistant to automation. Embedded systems programming for medical devices or spacecraft, where lives depend on perfect execution, will likely remain human-centred longer than web development.

Learning to Program in an AI World

This transformation is already reshaping education. MIT, Stanford, and other leading computer science departments have begun integrating AI collaboration into their curricula. “We still teach the fundamentals,” a department chair told me, “but we’re adding courses on effectively directing AI systems.”

The skills gap concerns are real. A McKinsey report estimates that by 2030, AI could automate 60–70% of current programming tasks while creating demand for new roles in AI orchestration and verification. This transition period could be bumpy.

“What matters to an industry and society isn’t how long it takes to train one worker but what it takes to create a stable, trained workforce,” Bessen told me, noting that the largest economic benefits of the Industrial Revolution took decades to materialise for precisely this reason.

The Programmer’s Paradox

The irony in all this? Programmers have spent decades trying to make computers understand humans better. Now that we’re succeeding, many fear becoming obsolete. The goal was always to make machines understand what we want, not force humans to think like machines.

Perhaps the ultimate programming skill will be distinctly human: creativity in problem definition. After all, computers can increasingly answer questions, but they struggle to know which questions are worth asking.

The question isn't whether AI will replace programmers. It's whether programmers who use AI will replace programmers who don't. And that transition, while potentially disruptive for individuals, suggests a future with more software creation than ever before – just accomplished in radically different ways. The code may be written by AI, but the vision will remain human. At least for now.
