This week, we recorded the annual year-in-review episode of the podcast. Of course, much of what we discussed had to do with AI and its impact on software development. We also went down the rabbit hole of discussing what the daily work of a “programmer” could be like in the future. And, while we obviously had to cut this discussion short in the podcast, I think there is merit to finishing that train of thought here.

Some of my co-hosts suggested that there might not be many holdovers from the current job of a “programmer”, because in the (potentially near) future much, if not most, code would be written by large language models (LLMs), and all that would be left for the “programmer” to do would be to prompt them properly. In fairness, the prevailing sentiment in the industry aligns with this perspective, foreseeing a drastic transformation or potential disappearance of the profession.

Here’s where I ardently disagree.

I firmly believe that our profession isn’t defined solely by the code we write, let alone the current methods we employ. If that were the case, proponents of Assembly might argue that those using higher-level languages shouldn’t be labeled “programmers.” The same argument could extend further back: those who wrote code on punch cards or wove it into threads could claim that today’s digital natives don’t comprehend what “real” programming entails.

Programming inherently tends to become more abstract and higher level over time. (When was the last time you had to worry about a pointer in memory?) Some languages even explicitly mimic human language for readability (Objective-C, anyone?). Yet, we don’t withhold the “programmer” label in these cases.

So, when people use human language to prompt LLMs to generate code and software, should we perceive them as less skilled? I might be going out on a limb, but there might also be a very primitive fear at play here: the fear of becoming obsolete. Hence the urge to frame our current way of working as a form of higher art, very different from what future “programmers” would be doing. (Ada Lovelace is probably turning in her grave right now…)

If writing code isn’t the essence of being a “programmer,” then what is?

First of all, I suggest the term “software developer” over “programmer”. I know this is a very blurry line and everyone has their own definitions here, but in my book, “programming” simply implies typing out code whereas “software development” implies a more holistic approach to the craft. Your interpretation may vary.

Regardless of terminology, the essence of software development is not merely to produce code. The goal of software development is to turn (business) requirements into software that delivers value. That requirement can be very personal to you, or be part of a business; the software can be a complex application or a small script; and the value can be anything from making money to scratching an intellectual itch to simply providing fun and leisure. In that process, coding is one tool (of many) to get the job done, but it is not a goal in and of itself.

Are you working on an operating system? Congratulations, you’re a software developer! Automating a home task with a small script? Congratulations, you’re a software developer! Manipulating data programmatically in data science? Congratulations, you’re a software developer! Building something with no-code tools? Congratulations, you’re a software developer!

So what I want you to take away from this is: learning and knowing how to write code is not what makes you a software developer. It might be a fantastic starting point - but there is so much more. Which is why getting to write less code does not necessarily make you any less of a software developer. Understanding business requirements, anticipating edge cases, finding creative solutions, foreseeing future needs and requirements, learning from user feedback and learning when to (not) ignore it, learning about architecture and patterns, the human element in collaboration, and so, so much more - enough to fill a lifetime’s worth of books on the subject.

And who knows, maybe … in the distant future … AI could be universal enough to handle all these tasks. But even then, someone is still going to have to interact with that AI in some form or manner, and that someone … is going to be a software developer.


This discussion could easily be dismissed as “just semantics”. But I strongly believe that language matters. The language we choose signals either elitism or inclusivity. Even if we seldom discuss these matters explicitly, language molds our mental model of the world - which makes these discussions crucial.

I’d rather live in a world that embraces change and welcomes more people into our craft than one that raises barriers to entry.