I recently stumbled upon this delightfully titled book from 1982, "Application development without programmers": https://archive.org/details/applicationdevel00mart
Which includes this excellent line:
> Unfortunately, the winds of change are sometimes irreversible. The continuing drop in cost of computers has now passed the point at which computers have become cheaper than people. The number of programmers available per computer is shrinking so fast that most computers in the future will have to work at least in part without programmers.
cjfd
The article says "software development will be democratized", but the current LLM hype is quite the opposite. The LLMs are owned by large companies and are all but impossible for any individual to train, if only because of energy costs. The situation where I am typing my code on my Linux machine is much more democratic.
PeterWhittaker
One important and often overlooked democratization is spreadsheet formulas: non-programmers began programming without knowing they were, and without concern for errors and edge cases. I cannot find the reference right now, but I recall seeing articles years ago about how mistakes in spreadsheet formulae were costing millions or more.
I see an analog with AI-generated code: the disciplined among us know we are programming and consider error and edge cases, the rest don't.
Will the AIs get good enough so they/we won't have to? Or will people realize they are programming and discipline up?
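To make the edge-case point concrete, here is a hypothetical sketch (my own illustration, not from any real incident) of the kind of formula mistake that bites spreadsheet authors: a naive "growth rate" formula like `=(B2-A2)/A2` works on the happy path but misbehaves on blank cells and zero baselines.

```python
def naive_growth(prev, curr):
    # The spreadsheet-style happy path: =(B2-A2)/A2.
    # Raises ZeroDivisionError when prev == 0, TypeError on blank cells.
    return (curr - prev) / prev

def careful_growth(prev, curr):
    # Treat blank cells (None) as missing data rather than zero.
    if prev is None or curr is None:
        return None
    if prev == 0:
        # Growth from a zero baseline is undefined, not infinite.
        return None
    return (curr - prev) / prev

rows = [(100, 110), (0, 50), (None, 42)]
print([careful_growth(p, c) for p, c in rows])  # [0.1, None, None]
```

The disciplined version is longer precisely because it names the edge cases; the naive one looks finished until the first empty row arrives.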
sfblah
I generally agree that it's difficult and counterproductive to try to eliminate talented programmers who put together the core of systems and set up the patterns that things like LLMs can emulate.
But, the modal programmer at this point is some person who attended a front-end coding bootcamp for a few months and basically just knows how to chain together CSS selectors and React components. I do think these people are in big trouble.
So while the core, say, 10% of people should remain in the system, the 90% periphery of pretty bad programmers will probably need to move on to other jobs.
getnormality
I find it so fundamentally unhinged that people think things will get fully automated to the point that humans no longer matter. We are centuries into the deep automation of certain things, like looms, but people with deep understanding of those things are still needed to guide the automation and keep it working to meet human needs.
To ignore that pattern and say everything's going to be automated and humanity will be irrelevant seems to me to be... more of a death wish against human agency, than a prediction based on reality.
hnlmorg
I remember being in my early 20s, learning C and Pascal, and having this one kid tell me I was learning dead languages and that he’d earn 3 times more than me by learning 4GL, since he was also 3 times smarter than everyone else.
The only reason I remember this encounter so clearly was because he got rather annoyed, to the point of being aggressive, when I pointed out that most of the computing landscape was built on C and this wasn’t going to change any time soon.
Multiple decades later, and C-derived languages still rule the world. I do sometimes wonder if his opinion mellowed with time.
manithree
I remember sitting in a senior seminar class in 1989 full of CS students. We were solemnly informed by a very earnest IBM employee that we would regret having majored in computer science because IBM's CASE tools were going to kill the job market. That aged like milk.
Will something come along some day that will actually drastically reduce the need for programmers/developers/software engineers? Maybe. Are we there yet? My LLM experience makes me seriously doubt it.
jleyank
Developers are “unwanted overhead” until the customer money threatens to walk out the door. They’re going to damage their future products and probably reduce their customer base (fewer consumers) and then sit there looking like gaffed fish when the budget ink turns red. “Who would have thought…”
Don’t facilitate losing your job.
bdcravens
The market however has done a pretty good job of it, especially when it's a developer bull market that suddenly shifts directions. Case in point: late 90s, the mad rush to put warm bodies in chairs for those who could even spell HTML. A few years later, many had left and gone back to selling cars or whatever they did before.
BobBagwill
The potentially cool thing about LLMs is bootstrapping. No matter how much COBOL you wrote, COBOL didn't get better. LLMs can be used to make LLMs (and other software) better. LLMs could be used to create their successor(s).
Of course, in the end, it won't do us humans any good, because when the Singularity AKA Rapture comes, we'll all be converted to Computronium. :-)
kopirgan
Wow, it mentions practically every flavour-of-the-month technology that was supposed to make building useful programs a drag-and-drop affair.
I recall PowerBuilder in particular; it was all the rage.
manoDev
There are two ways to look at it:
- Software engineering is a cost center; engineers are middlemen between C-level ideas and a finished product.
- Software engineering is about figuring out how to automate a problem, exploring the domain, defining context, tradeoffs, and unlocking new capabilities in the process
shiandow
LLMs seem quite successful when considered as something like a natural language interface, but expecting intelligence seems a step too far. For one, they do not learn, at least not online, and that is a somewhat important requirement for truly intelligent behaviour.
Arguably programming is as much learning as it is writing code. This is part of the reason some people copy an entire API and don't realise they're not so much building useful code as building an understanding.
helsinkiandrew
I'd say that the article left out Software Reuse, which was talked about a lot more in the late '90s and early '00s than now.
You could argue that coding with LLMs is a form of software reuse, one that removes some of its disadvantages.
bluGill
Something else that really should be mentioned:
In every recession with mass lay-offs of programmers (not every recession hits programmers hard), there were many articles saying that whatever the latest thing was [see article] was the cause, and that the industry was getting rid of programmers it would never need again.
In every case, of course, "it's the economy, stupid". The tools made little difference in the need for programmers. The tools that worked actually increased the need, because things you wouldn't even attempt without the tools were now worth hiring extra people to do.
debo_
I like the arrogance present in the title. "Eternal promise" in a discipline that was conceived about a century ago.
ryanjshaw
Until a year ago I believed as the author did. Then LLMs got to the point where they sit in meetings like I do, make notes like I do, have a memory like I do, and their context window is expanding.
Only issue I saw after a month of building something complex from scratch with Opus 4.6 is poor adherence to high-level design principles and consistency. This can be solved with expert guardrails, I believe.
It won’t be long before AI employees are going to join daily standup and deliver work alongside the team with other users in the org not even realizing or caring that it’s an AI “staff member”.
It won’t be much longer after that when they will start to tech lead those same teams.
This topic always reminds me of "The Last One", https://en.wikipedia.org/wiki/The_Last_One_(software) :
> "The name derived from the idea that The Last One was the last program that would ever need writing, as it could be used to generate all subsequent software."
That was released in 1981. Spoiler alert: it was not, in fact, the last one.
miljanm
What's wrong with eliminating programmers?
pixelsort
> There is every reason to believe that those who invest in deep understanding will continue to be valuable, regardless of what tools emerge.
I don't take issue with this, except that it's false comfort when you consider that demand will naturally ebb and individual workload will naturally escalate. In that light, I find it downright dishonest, because the rewards for attaining deep knowledge will continue to evaporate, necessitating AI assistance.
The reason it is different this time around is that the capabilities of LLMs have incentivized the professional class to betray the institutions that enabled their specializations. I am talking about the amazing minds at Adobe, Figma, and the FAANGs who are bridging agentic reasoners and diffusion models with the domain-specific needs of their respective professional users.
Humans are a class of beings, and the humans accelerating the advance of AI in creative tools are the reason things are different this time. We have class traitors among us, and they're "just doing their jobs". For most, willful disbelief isn't even a factor. They think they're helping, while each PR just brings them closer to unemployment.
bananaflag
Yeah but this time it's for real.
All the other attempts failed because they were just mindless conversions of formal languages to formal languages. Basically glorified compilers. Either the formal language wasn't capable enough to express all situations, or it was capable and thus it was as complex as the one thing it was designed to replace.
AI is different. You tell it in natural language, which can be ambiguous and not cover all the bases. And people are familiar with natural language. And it can fill in the missing details and disambiguate the others.
This has been known to be possible for decades, as (simplifying a bit) the (non-technical) manager can order the engineer in natural, ambiguous language what to do and they will do it. Now the AI takes the place of the engineer.
Also, I personally never believed before AI that programming will disappear, so the argument that "this has been hyped before" doesn't touch my soul.
I have no idea why this is so hard to understand. I'd like people to reply to me in addition to downvoting.
Havoc
Reviewing history is not a great way to approach groundbreaking tech.