The Future of Programmers: When We Are The Complexity!

Part I: Understanding The Law

It was 1994 when I wrote my first program in GW-BASIC, one of the early dialects of the BASIC programming language. Back in those days, some BASIC dialects required line numbers, and GW-BASIC was one of them. Conventionally, you started with number 10 for the first line and increased it by 10 for each following line. This way, if you needed to insert new lines later on, you wouldn’t have to shift all the following line numbers one by one. The code was interpreted and executed according to these line numbers.

10 CLS
20 INPUT "Addition or Subtraction (A/S): ", P1$
30 INPUT "First number: ", N1
40 INPUT "Second number: ", N2
50 IF P1$ = "S" GOTO 90
60 SUM = N1+N2
70 PRINT "SUM=";SUM
80 GOTO 105
90 SUB = N1-N2
100 PRINT "SUB=";SUB
105 INPUT "Quit (Q): ", K$
106 IF K$ <> "Q" GOTO 10
110 END

There was also another reason to use line numbers in GW-BASIC: the GOTO statement. Since GW-BASIC was an unstructured procedural language, GOTO was essential for managing the flow of the program.

GOTO felt magical to me. I loved it—and totally abused it. GOTOs started flying all over my code. I thought that by using them, I was proving how good I was at programming. As you can imagine, my code became quite unreadable, but I was happy with that!

Later on, I switched to Pascal and continued sprinkling GOTO statements wherever I could, until one day a friend told me that GOTO actually makes code ugly and that “good programmers” don’t use it. No way! Why would that be? Soon after, I kept hearing the same thing from others, including my high school programming teacher: “GOTO is harmful.” Years later, I would discover Edsger W. Dijkstra’s famous letter, “Go To Statement Considered Harmful.”

It took serious effort for me to write code without using GOTO. I felt disappointed and even a little angry, because I was being pushed out of my comfort zone and forced to solve the same problems in a completely different way (Structured Programming). It almost felt like learning programming from scratch once again. 

But what I didn’t realize at the time was that my small struggle mirrored a much larger pattern. Every generation of programmers has had its own ‘GOTO’ — a habit, a tool, or a mindset that once felt indispensable but eventually had to be abandoned. My frustration wasn’t unique; it was simply my turn to face the unwritten law of computer science: embrace change.

Part II: The Force Behind Change

In the 1950s and 60s, the first high-level programming languages began to appear. At first, many engineers resisted them, preferring to keep writing assembly and seeing compilers as inefficient. But high-level languages were designed to reduce the complexity of machine-level programming by abstracting away details like memory management and registers. By the 1970s, they were gaining serious traction, and by the 1990s they had completely taken over both business and scientific computing. Along the way, the first foundational domain-specific languages (DSLs) emerged — SQL in the 1970s, then HTML and CSS in the 1990s. Compared to general-purpose high-level languages, DSLs offered an even greater level of abstraction: narrower in scope, but far more expressive within their domains, making programming easier and more powerful in those specific areas.

Abstraction may not be the sole driver of change in computer science, but it remains one of the most powerful. At its essence, abstraction is about simplification — the same goal technology has pursued from the very beginning.

Frameworks, libraries, ORMs, even operating systems are all built on abstraction, each hiding layers of complexity beneath them. External services in our domain, like a payment API, abstract away financial complexity. A Docker container abstracts infrastructure itself. Abstractions are everywhere. We lean on them, hide the complexity, clear the path, and move forward.
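To make the layering concrete, here is a toy sketch in Python using the standard library’s sqlite3 module. The find_user helper is hypothetical — not any real ORM’s API — but it shows the same move every ORM makes: the SQL is still there, just hidden one layer down.

```python
import sqlite3

# Low level: the programmer writes SQL and manages the connection directly.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users (name) VALUES (?)", ("Ada",))

rows = conn.execute("SELECT name FROM users WHERE id = ?", (1,)).fetchall()

# One level up: a tiny ORM-style helper hides the SQL behind a plain
# function call, just as real ORMs hide it behind model classes.
def find_user(db, user_id):
    row = db.execute(
        "SELECT name FROM users WHERE id = ?", (user_id,)
    ).fetchone()
    return row[0] if row else None

print(find_user(conn, 1))  # Ada
```

Callers of find_user never see SQL, connections, or cursors — the complexity hasn’t disappeared, it has just moved below the line they read.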

Part III: When We Are The Complexity!

For decades, we’ve abstracted everything in computer science. Every step was about hiding one layer of complexity so we could focus on something higher. But now, apparently it’s our turn to get abstracted. 

At this point, we are the complexity in software engineering. Our knowledge, decisions, and workflows are being packaged, modeled, and automated. Best practices that once lived in our heads — design patterns, testing approaches, optimization tricks — are now being encoded into AI assistants and domain-specific copilots. Documentation and tutorials, once scattered across books and blogs, are being ingested by large language models and surfaced instantly as working code snippets.

That’s exactly why AI is here: to abstract us. We’ve become the bottleneck — the messy, inconsistent, human-shaped layer standing between business needs and working software. AI is the industry’s attempt to hide our complexity, just as we once hid the complexity of hardware, memory, and infrastructure.

AI isn’t the worst thing we’ve done to ourselves. We invented agriculture before! And on the bright side, it might finally be time to say goodbye to duct-tape programmers, while those with strengths in orchestration, design, and system-level thinking continue to thrive.

Part IV: Quantum Computing: A New Way to Attack Complexity!

I just want to touch on this paradigm briefly, since it lies ahead of us. What does it aim for, and how will it shape our future as programmers?

Its real aim is to tackle the kinds of complexity where classical computers get stuck and brute force is the only option left. Factoring huge numbers, optimizing global routes, simulating molecules — problems that explode exponentially. Qubits in superposition, shaped by interference, let quantum algorithms attack some of these problems in fundamentally new ways, turning the intractable into the solvable.

For programmers, this won’t mean “faster code” but different thinking. Instead of deterministic steps, we’ll work with probabilities, superpositions, and quantum gates. It will feel like learning programming all over again — another round of unlearning and relearning.

And just like AI didn’t remove the need for us, quantum won’t either. But it will change what matters: not typing code, but translating real-world problems into quantum formulations. The domains once untouchable — optimization, cryptography, advanced simulations — will open up. The challenge is the same as ever: can we embrace the change quickly enough when the ground shifts again?

(Soon…) The Future of Programmers: Survival in the Age of AI