I too began with BASIC (but closer to 1980). Although I wrote and published games for the Macintosh for a number of years as I finished up college, my professional career (in the traditional sense) began when I was hired by Apple in 1995 and relocated to the Bay Area.
Yeah, what started out as great just got worse and worse as time went on.
I suspect though that to a large degree this reflects both the growing complexity of the OS over that time as well as the importance of software in general as it became more critical to people's lives.
Already, even in 1984 when it was first introduced, the Mac had a rich graphics library you would not want to have to implement yourself. (Although famously, of course, a few apps like Photoshop nonetheless did just that—leaning on the Mac simply for a final call to CopyBits() to display pixels from Adobe's buffer to the screen.)
You kind of have to accept abstraction when networking, multiple cores, multiple processes become integral to the machine. I guess I always understood that and did not feel too put out by it. If anything a good framework was somewhat of a relief—someone else's problem, ha ha. (And truly a beautiful API is just that: a beautiful thing. I enjoy working with well-constructed frameworks.)
But the latter issue, the increasing dominance of software over our lives, is what I think contributed more to poisoning the well. Letting the inmates run the asylum more or less describes the way engineering worked when I began at Apple in 1995. We loved it that way. (Say what you want about the bottom-up culture of that era, but our "users" were generally nerds just like us—we knew, or thought we knew anyway, better than marketing what the customer wanted and we pursued it.)
Agile development, unit tests, code reviews… all these weird things began to creep in and get in the way of coding. Worse, they felt like busywork meant simply to give management a sense of control… or some metric for progress.
"What is our code coverage for unit test?" a manager might ask. "90%," comes the reply from engineering. "I want to see 95% coverage by next month," comes the marching orders. Whatever.
I confess I am happy to have now left that arena behind. I still code in my retirement but it's back to those cowboy-programmer days around this house.
Yee haw!
sho_hn
My advice to everyone feeling existential vertigo over these tools is to remain confident and trust in yourself. If you were a smart dev before AI, chances are you will remain a smart dev with AI.
My experience so far is that to a first approximation, the quality of the code/software generated with AI corresponds to the quality of the developer using the AI tool surprisingly well. An inexperienced, bad dev will still generate a sub-par result while a great dev can produce great results.
The choices involved in using these tools are also not as binary as they are often made out to be, especially since agents have taken off. You can very much still decide to dedicate part of your day to chiseling away at important code to make it just right and make sure your brain is engaged in the result and exploring and growing with the problem at hand, while feeding background queues of agents with other tasks.
I would in fact say the biggest challenge of the AI tool revolution in terms of what to adapt to is just good ol' personal time management.
alexgarden
Wow... I really relate to this. I'm 50 as well, and I started coding in 1985 when I was 10... I remember literally every evolutionary leap forward and my experience with this change has been a bit different.
Steve Yegge recently did an interview on vibe coding (https://www.youtube.com/watch?v=zuJyJP517Uw) where he says, "arch mage engineers who fell out of love with the modern complexity of shipping meaningful code are rediscovering the magic that got them involved as engineers in the first place" <-- paraphrased for brevity.
I vividly remember, staying up all night to hand-code assembler primitive rendering libraries, the first time I built a voxel rendering engine and thinking it was like magic what you could do on a 486... I remember the early days at Relic, working on Homeworld and thinking we were casting spells, not writing software. Honestly, that magic faded and died for me. I don't personally think there is magic in building a Docker container. Call me old-fashioned.
These days, I've never been more excited about engineering. The tedium of the background wiring is gone. I'm back to creating new, magical things - I'm up at 2 AM again, sitting at my desk in the dark, surrounded by the soft glow of monitors and casting spells again.
chasd00
What the author describes is also the feeling when you shift from being a developer all day to being a team lead or manager. When you become a lead you have to let go and get comfortable with the idea that the code is not going to be how you would do it. You can look at code produced by your team and attempt to replace it all with your craftsmanship but you're just setting yourself up to fail. The right approach is use your wisdom to make the team better, not the code. I think a lot of that applies to using AI when coding.
I'm turning 50 in April and am pretty excited about AI coding assistants. They make a lot of personal projects I've wanted to do but never had the time feasible.
qalmakka
I am much younger than the author, but I've been coding for most of my life and I find close to no joy in using AIs. For me coding has always been about the nitty-gritty quirkiness of computers, languages, solving issues and writing new cool things for the sake of it. It was always more about the journey than the end goal, and AI basically hollows out all of the interesting bits about coding. It feels like skipping straight to the end of a book, or something like that.
I don't know if I am the only one, but developing with chatbots in my experience turns developing software into something that feels more akin to filling out forms or answering emails. I grieve for the day we'll lose what was once a passion of mine, but unfortunately that's how the world has always worked. We can only accept that times change, and we should follow them instead of complaining about it.
weli
> I started programming when I was seven because a machine did exactly what I told it to, felt like something I could explore and ultimately know, and that felt like magic. I’m fifty now, and the magic is different, and I’m learning to sit with that.
Don't take this the wrong way, but this is more of an age thing than a technology-advancement thing.
Kids growing up nowadays that are interested in computers grow up feeling the same magic. That magic is partly derived from not truly understanding the thing you are doing and creating a mental "map" by yourself. There is nothing intrinsic to computing nowadays that makes it less magic than fiddling around with config.sys, in 50 years there will be old programmers reminiscing of "Remember when all new models were coming out every few months and we could fiddle around with the vector dimensionality and chunking length to get the best of gpt-6.2 RAG? Those were the times".
aabajian
It seems AI is putting senior developers into two camps. Both groups relate to the statement, "I started programming when I was seven because a machine did exactly what I told it to, felt like something I could explore and ultimately know, and that felt like magic. I’m fifty now, and the magic is different, and I’m learning to sit with that."
The difference is that the first camp is re-experiencing that feeling of wonder while the second camp is lamenting it. I thankfully fall in the first camp. AI is allowing me to build things I couldn't, not due to a lack of skills, but a lack of time. Do you want to spend all your time building the app user interface, or do you want to focus on that core ability that makes your program unique? Most of us want the latter, but the former takes up so much time.
gustavopezzi
Thank you for writing this. My feelings are very similar to the ones described by the author, and the timeline almost matches. The thrill of technology started to decay fast for me in the early 2010s, and now I see it as a point of no return. I still have fun with my retro hardware & software, but I am no longer an active practitioner and I have pivoted my attention and my efforts somewhere else. Unfortunately, I no longer feel excited about the future decades of tech and I am distancing myself from it.
GMoromisato
I'm lucky because I work as an independent consultant. I get paid to deliver solutions, but I get to choose how to create those solutions. I write whatever code I want however I want. As long as it solves the problem, no one cares.
I started programming in 1980, and I'm having just as much fun now as I did then. I literally cannot wait to sit down at my IDE and start writing.
But that was not always true. When I worked for a larger company, even some startups, it was not always fun. There's something about having full control over my environment that makes the work feel like play.
If you feel like programming isn't fun anymore, maybe switching to a consulting gig will help. It will give you the independence and control that you might be craving.
serf
6 or 7, 38 now -- and having a blast.
it isn't all funeral marches and group crying sessions.
And don't let the blog post fool you, it is a rant about AI -- otherwise we would have heard complaints about the last 200 paradigm shifts in the industry over the past thirty years.
Sure, we got our share of Dilbert-style agile/waterfall/TDD jokes shoved in our face, but no one wrote a blog post about how their identity was usurped by the waterfall model.
>And different in a way that challenges the identity I built around it and doesn’t satisfy in the way it did.
Everyone should do their own thing, but might I suggest that it is dangerous for anyone in this world to use a single pillar as the foundation of their entire identity and the plinth of their character.
abraxas
I'm the exact same age as the author and this post could have been written by me (if I could write). It echoes my story and sentiment exactly, right down to cutting my literal baby teeth on a rubber-key ZX Spectrum.
The anxiety I have, which the author might not be explicitly stating, is that as we look for places we add genuine value in the crevices of frontier models' shortcomings, those crevices are getting narrower by the day and a bit harder to find.
Just last night I worked with Claude and at the end of the evening I had it explain to me what we actually did. It was a "Her" (as in the movie) moment for me where the AI was now handholding me and not the other way around.
godshatter
I'm 60, started with a Tandy Model I in junior high, learned 6809 assembly for my Color Computer, loved the fact we could put certain values in particular memory positions and change the video mode and put pixels to the screen. It's been decades of losing that level of control, but for me coding is the fun part. I've never lost that spark of enjoyment and really obsession I felt early on. I enjoy the supposedly boring job of writing SQL and C with embedded SQL and working with business concepts to produce solutions. Coding is the fun part for me, even now.
I got moved up the chain to management and later worked to get myself moved back down to a dev role because I missed it and because I was running into the Peter Principle. I use AI to learn new concepts, but mostly as a search engine. I love the tech behind it, but I don't want it coding for me any more than I want it playing my video games for me. I was hoping AI would show up as robots doing my laundry, not doing the thing I most enjoy.
andyjohnson0
I'm a developer, mid/late fifties. My first computer was a Commodore Vic 20, so I guess I started writing code at about the same time as the OP even if I'm a few years older.
Yes, I mourn the end of my craft and all that. But also:
This isn't the end of hand-written code. A few will still get paid to do it in niche domains. Some will do it as a hobby or craft activity - like oil painting or furniture making. The tooling will move on and become more specialised and expensive. Like owning Japanese woodworking tools.
But software construction as a human-based economic activity is clearly about to slam hard into a singularity, and many of us who rely on our hard-won skills to pay the bills and survive are going to find ourselves unemployed and unemployable. A few early adopters will get to stay on and sip their artisanal coffee and "build beautiful things" while their agent herds toil. But most of us won't. Software has always mostly been just CRUD apps, and that is going to need a whole lot fewer people going forward. People like me, perhaps, or you.
Some, who have sufficient financial and chronological runway, will go off and do other things. Many won't have that opportunity. I have personal experience of late-career unemployment - although I'm currently working - and it's not pretty. A lot of lives are going to be irreparably disrupted by this. Personally, I'd hoped that I could make it through to some stable kind of retirement, but I just don't see it anymore.
strangattractor
Having been in this game about 10 years longer, I can understand how he feels. I distinctly remember when I realized that C compilers for the ARM produced better assembly than I could code by hand. Bittersweet, but the code being written became larger and more complex because of it.
Modern coding has become more complex than I would have ever thought possible. The number of technologies an individual would have to master to actually be an expert "full stack" coder is ludicrous. It is virtually impossible for an individual to prototype a complex Web-based app by themselves. I think AI will lower that barrier.
In return we will get a lot more software - probably of dubious quality in many cases - as people with "ideas" but little knowledge start making apps. Not a totally bad thing but no utopia either. I also think it will likely reduce the amount of open source software. Content producers are already hoarding info to prevent AI bots from scraping it. I see no reason to believe this will not extend to code as more programmers find themselves in a situation more akin to musicians than engineers.
benlivengood
The contrast between this and https://news.ycombinator.com/item?id=46923543 (Software engineering is back) is kind of stark. I am using frontier models to get fun technical projects done that I simply haven't had time for since my late teens. It is still possible to understand an architecture down to the hardware if you want to, but it can happen a lot faster. The specifications are queryable now. Obscure bugs that at least one person has seen in the past are seconds away instead of minutes or hours of searching. Even new bugs have extra eyes on them.
I haven't written a new operating system yet, but it's now a tractable problem. So is using Lean or Julia or some similar system to formally specify it. So far I've been digging into modern multithreaded cache performance, which is just as fascinating as directly programming VGA and sound was in the early PC days. Linux From Scratch is still up to date. You can get FPGAs that fit in your USB port [0]. Technical depth and low-level understanding are wherever you want to look for them.
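To make that concrete with one small example: the cache behavior at the bottom of the stack is visible even from Python. A toy sketch - assuming nothing beyond NumPy, and single-threaded, so it shows spatial locality rather than multicore coherence - comparing a contiguous sum against a strided one that drags a whole 64-byte cache line into the CPU for every 8-byte float it actually uses:

    # Toy cache-line demonstration: both sums touch the same number
    # of elements, but the strided walk wastes most of each cache line.
    import time
    import numpy as np

    STRIDE = 16                 # 16 * 8 bytes = 128 bytes between elements
    n = 1 << 21                 # ~2M elements per sum; backing array ~256 MB
    a = np.ones(n * STRIDE, dtype=np.float64)

    def timed(label, view):
        t0 = time.perf_counter()
        s = view.sum()
        print(f"{label}: {time.perf_counter() - t0:.4f}s (sum={s:.0f})")

    timed("contiguous", a[:n])          # n adjacent floats
    timed("strided   ", a[::STRIDE])    # n floats, one per 128 bytes

On most machines the strided sum comes out several times slower despite doing the same arithmetic - the kind of effect that used to take a profiler and a long afternoon to see.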
Programming is not art for me. I do not find it useful to gold plate solutions. I prefer getting the job done, sometimes by any means necessary for "the vehicle" to continue running.
AI often generates parts of the code for my hobby projects, which lets me speed-run my implementation. It often generates errors, but I am also skilled, so I fix the errors in the code.
I use AI as a boilerplate code generator, or a documentation assistant, for languages I do not use daily. I rarely use these solutions 1:1, but if I had to go through READMEs and Read the Docs myself, it would take me a lot longer.
Would there be more elegant solutions? Often, yes. Does it really matter? For me, no.
jbreckmckye
I don't disagree that technology is less fun in an AI era. The question is, what other careers are out there for someone who wants to make things?
About a decade ago, I went through a career crisis where I couldn't decide what job to do - whether technology was really the best choice for my particular temperament and skills.
Law? Too cutthroat. Civil service? Very bureaucratic. Academia? Bad pay. Journalism? An industry in decline.
It is a shame, what is happening. But I still think, even with AI hollowing out the fun parts, tech remains the best job for a smart, motivated person who's willing to learn new things.
sejje
I think one of the big distinctions between people who like building with AI and those who don't, is that the people who are pro-AI are building their own ideas, of which they have many.
The people who are anti-AI are largely building other people's ideas, for work. And they have no desire to ramp up velocity, and it's not helpful to them anyway because of bureaucratic processes that are the real bottleneck to what they're building.
Not everyone falls into these silos, of course.
waffletower
Not going to pull age or title rank here -- but I suggest if your use of AI feels empty, take advantage of its speed and plasticity and iterate upon its output more, shape the code results. Use it as a sculptor might too -- begin with its output and make the code your own. I particularly like this latter approach when I am tasked with use of a language I view as inferior and/or awkward. While this might read as idealistic, and I agree that there are situations where this interaction is infeasible or inappropriate, you should also be encountering problems where AI decidedly falls on its face and you need to intervene.
alexpotato
At my first full time job in the early 2000s I was tasked with building a webscraper. We worked for law firms representing Fortune 500 companies, and they wanted to know who was running "pump and dump" schemes on stocks using Yahoo Finance message boards.
At the time, I didn't know the LWP::Simple module existed in Perl, so I ended up writing my own socket-based HTTP library to pull down the posts, store them in a database, etc. I loved that project, as it taught me a lot about HTTP, networking, HTML, parsing and regexes.
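The core of a fetch like that is surprisingly small. As a rough illustration - a minimal sketch in Python rather than the Perl of the era, assuming a plain HTTP/1.0 server on port 80 - it boils down to writing the request by hand and reading until the server hangs up:

    # Hand-rolled HTTP GET over a raw TCP socket -- an illustrative
    # sketch of the approach, not the original library.
    import socket

    def http_get(host, path="/", port=80):
        with socket.create_connection((host, port)) as sock:
            request = (
                f"GET {path} HTTP/1.0\r\n"
                f"Host: {host}\r\n"
                "Connection: close\r\n"
                "\r\n"
            )
            sock.sendall(request.encode("ascii"))
            chunks = []
            while True:            # read until the server closes the socket
                data = sock.recv(4096)
                if not data:
                    break
                chunks.append(data)
        raw = b"".join(chunks)
        headers, _, body = raw.partition(b"\r\n\r\n")
        return headers.decode("iso-8859-1"), body

    headers, body = http_get("example.com")
    print(headers.splitlines()[0])    # e.g. "HTTP/1.0 200 OK"

Everything beyond this - redirects, chunked responses, error handling - is exactly where a project like that starts teaching you HTTP for real.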
Nowadays, I use Playwright to scrape websites for things I care about (e.g. rental prices at the Jersey Shore). I would never think to re-do my old HTTP library today, and I love the speed of modern automation tools.
Now, I too have felt the "but I loved coding!" sense of loss. I temper that with the above story: we will probably love what comes next too (eventually).
zozbot234
There's nothing "hollowed out" about directing an AI effectively, the feedback is as quick and tight as it always was. The trick is that you don't just "vibe code" and let the AI one-shot the whole thing: you should propose the change first and ask the AI about a good, detailed plan for implementing it. Then you review what the robot has proposed (which is trivial compared to revising code!) make sensible changes, ask for feedback again, and repeat. By the time the AI bot has to write actual code, it's not running on vibes anymore: it's been told exactly what to do and how to assess the result. You spend more time upfront, but a lot less on fixing the AI's mistakes.
pixl97
A blacksmith was a person who heated chunks of carbon until they glowed red and beat iron into submission with a hammer in their hands.
Today iron is produced by machines in factories by the mega-tonne.
We just happened to live in the age where code went from being beaten out by hand to being a mass-produced product.
And so the change of technology goes.
mosburger
Oh my god. This is me. If I were any better at writing, I could have written this; the author is even the same age as me (well, a year younger) and followed a similar trajectory. And a lot of what I've been feeling lately feels similar to burnout (in fact I've been calling it that), but it really isn't burnout. It's... this, whatever this is... a "fallow period" is a good term.
And I feel like an old man grumbling about things changing, but... it's not the same. I started programming in BASIC on my Tandy 1000 and went to college and learned how to build ISA cards with handwritten oscilloscope software in the Computer Engineering lab. My first job was writing firmware. I've climbed so far up the abstraction chain over a thirty year career and I guess I don't feel the same energy from writing software that first got me into this, and it's getting harder to force myself to press on.
adamtaylor_13
This essay begins by promising not to be a "back in my day" piece, but ends up dunking on 20-year-olds who are only a few years into their career, as if they have any choice about when they were born.
thom
I too get less of a kick out of writing enterprise middleware than I did making games as a kid in the 80s. Why did the industry do this to me?!
randusername
A lot of that magic still remains in embedded.
If vendors can't be bothered to use a C compiler from the last decade, I don't think they'll be adopting AI anytime soon.
At my work, as of 2026, we only now have a faction riled up about evangelizing clean code, OOP, and C++ design patterns. I hope the same delay holds for all the rest of the "abstraction tower".
epaga
I gave up after the third “It’s not X, it’s Y” in like two paragraphs. Is nobody else allergic to that AI voice? Isn’t the author?
So depressing this is the current state of blogging. Can’t wait for this phase to be over.
danesparza
I humbly submit this interview with Grady Booch (if you know, you know) talking about the "3rd golden age of software engineering - thanks to AI": https://youtu.be/OfMAtaocvJw
I feel like the conversation does a good job of couching the situation we find ourselves in.
runjake
I am a little older than OP. I don't think I've ever had that feeling about a programming project for work that came from someone else.
Generally, I get that feeling from work projects that I've self-initiated to solve a problem. Fortunately, I get the chance to do this a lot. With the advent of agentic coding, I am able to solve problems at a much higher rate.
Quite often, I'll still "raw dog" a solution without AI (except for doc lookups) for fun, kind of as a way to prove to myself I can still do it when the power's out.
JetSetIlly
I'm the exact same demographic as the author, just turned 50, writing code since childhood in BASIC. I'm dealing with the AI in programming issue by ignoring it.
I still enjoy the physical act of programming so I'm unsure why I should do anything that changes that. To me it's akin to asking a painter to become a photographer. Both are artists but the craft is different.
Even if the AI thing is here to stay, I think there will be room for people who program by hand for the same reason there's still room for people who paint, despite the invention of the camera.
But then, I'm somebody who doesn't even use an IDE. If I find an IDE obtrusive then I'm certain I'll find an AI agent even more so.
davebranton
The deep, profound, cruel irony of this post is that it was written by AI.
Maybe if you work in the world of web and apps, AI will come for you. If you don't, and you work in industrial automation and safety, then I believe it will not.
pixelsort
I was 7 in 1987, learned LOGO and C64 BASIC that year, and I relate to this article as well.
It feels as though a window is closing upon the feeling that software can be a powerful voice for the true needs of humanity. Those of us who can sense the deepest problems and implications well in advance are already rare. We are no more immune to the atrophy of forgetting than anyone.
But there is a third option beyond embrace or self-extinguish. The author even uses the word, implying that consumers wanted computers to be nothing more than an appliance.
The third option is to follow in the steps of fiction, the Butlerians of Dune, to transform general computation into bounded execution. We can go back to the metal and create a new kind of computer; one that does have a kind of permanence.
From that foundation, we can build a new kind of software, one that forces users to treat the machine as an appliance.
It has never been done. Maybe it won't even work. But, I need to know. It feels meaningful and it has me writing my first compiler after 39 years of software development. It feels like fighting back.
hamdouni
Total resonance with this part:
"They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of."
dsiegel2275
Wow this hits home - I just turned 51 and I also started coding at age 7, writing BASIC on a TRS-80 Model III.
I still have a very distinct memory of when my father told me he was buying us our first home computer. I remember him telling me that you could use the computer to make games. I was so excited by the idea and amazed by this technology (that I hadn't yet even remotely understood). I remember saying "Oh, you just tell it to make a game? And it makes a game?" He explained to me then what programming was.
When we got the TRS-80, he and I worked together to build a game. We came up with an idea for a text adventure game called "Manhole Mania" - you were a city works employee exploring the sewers after reports of strange noises. We never finished much of it - maybe just the first few "rooms".
Maybe this weekend I will tell Codex to make me a game.
manjuc
Same; been a product designer for years, still love design deep down, but the essence is somehow not there anymore. Reading this hit different. It's refreshing to see someone put it into words instead of the usual "stuff".
I turn 52 this year. I also started at 10 years old, programming in a combination of AppleSoft BASIC and assembly language and typing machine code out of books so I could use Double Hi-Res graphics, since it wasn't supported by BASIC, and doing my own assembly language programming.
I stuck with C and C++ as my bread and butter from 1996-2011 with other languages in between.
I don’t miss “coding” because of AI. My vision has been larger than what I could do myself without delegating for over a decade - before LLMs.
Until a year or two ago, “coding” - and/or later, coordinating with people (dotted-line) reporting to me - was a necessary evil to see my vision go to implementation.
I absolutely love this new world. For loops and while loops and if statements don't excite me in my 50s. Seeing my vision come to life faster than I ever could before, and having it well architected, does.
I love talking to “the business” and solving XYProblems and getting to a solution 3x faster
h4kunamata
Late 30s here, I have seen:
* dial-up being replaced by DSL
* CAT being replaced with fiber for companies
* VoIP replacing bulky PBXs
* Cloud replacing on-prem to an extent
* The cloud-services plague now called SaaS
* License for life being replaced by subscription
* AI driving everything to shit literally
The technology is no longer helping anything; it is actually tearing our society apart.
Up to the 2000s, things were indeed evolution, improvements, a better lifestyle, be it personal or professional.
Since the 2000s, enshittification has set in: everything gets worse, from services to workflows to processes to products to laws.
Gen Z does not realize how bad things are, and how we are no longer becoming smarter but dumber; kids cannot even read but have every single social media account.
If they could spend one day back in early 2000s, the current generation would start a civil war in every single city across the globe.
sebringj
idk, I'm loving the newness of all of it. I feel more empowered than ever before, like it's my time. Before, startups would take like a year to get going; now it's like a month or so. It's exciting and scary, and we have no idea where it's going. Not boring at all. I was getting bored as shit and bam, now I can dream up shit quick and have it validated too. Ya, I figured that out with an MCP, so ya, this is my jam. Program MCPs and speed it up!!!!!!
aeuropean12
Well yes it has changed. But look at everything that can be accomplished with these abstractions/libraries/frameworks that exist.
Why reinvent the wheel?
Yes, there might be less room for the Wild Wild West approach mentioned in the article. But that is the structure of compounded knowledge/tooling/code available to developers and others to create more enriched software - in the sense that it runs on what is available now and provides value in today's age of computing.
I also had a 486DX2-66. And I recall coding in Assembly, Pascal, C etc.
I do not miss it. These days I can create experiences that reach so many more people (a matured Internet with realtime possibilities, to simplify) and with so much more potential for Good. Good in the sense of usefulness for users, good in the sense of making money (yeah, that aspect still exists).
I do understand your sentiment and the despairing tone. There have been times when I was struck by the same.
But I do not miss 1995 and struggling with a low-level formatted HD and Assembly that screwed up my floppy disks, or the worms that reached my box, or the awful web sites in terms of UX that were around, or pulling coaxial cables around for LAN parties.
It's just a different world now. But I get what you are saying, and respect it. Stay optimistic. :)
ktrnka
I'm a few years behind you. I got started on my uncle's handed-down VIC-20 in the late 80s.
The culture change in tech has been the toughest part for me. I miss the combination of curiosity, optimism, creativity, and even the chaos that came with it. Nowadays it's much harder to find organizations like that.
bentt
Some farmers probably lamented the rise of machines because they feared their strength would no longer be needed in the fields. These farmers were no doubt more concerned with their own usefulness as laborers than with the goals of the farm: to produce food.
If you program as labor, consider what you might build with no boss. You’re better equipped to start your own farm than you think.
davidw
50 myself, and started coding with a Commodore 64, but only really picked it up seriously with the advent of open source software, and that feeling of being able to dig around any component of the system I wanted to was exhilarating.
I think that's one of the biggest things that gives me pause about AI: the fact that, if they prove to be a big productivity boost, you're beholden to huge corporations, and not just for a one-time purchase, but on an ongoing basis.
Maybe the open source models will improve, but if it keeps being driven by raw compute power and big numbers, it seems to tilt things very much in favor of those with lots and lots of capital to deploy.
hazyc
I think the true genuinely-love-programming type of people will increasingly have to do what so many other people do, and that's separation of work and personal enjoyment. You might have to AI-architect your code at work, and hand code your toy projects on the weekend.
alt227
I prefer to see it as the automation of the IT age.
All other professions had their time when technology came and automated things.
For example wood carvers, blacksmiths, butchers, bakers, candlestick makers, etc. All of those professions have been mostly taken over by machines in factories.
I view 'AI' as new machines in factories for producing code. We have reached the point where we have code factories which can produce things much more efficiently and quickly than any human can alone.
Where the professions still thrive is in the artisan market. There is always demand for hand crafted things which have been created with love and care.
I am hoping this stays true for my coding analogy. Then people who really care about making a good product will still have a market from customers who want something different from the mass produced norm.
js-j
I can share a similar experience: I began to learn programming during my first school years, on an Apple II clone with Logo, a fancy language with turtle graphics as its most distinctive feature. We used to boot Logo off 5.25" floppy disks...
jdlyga
It's turned from SimCity into SimSimCity. It's like playing a simulation where you manage a person who's playing SimCity.
simonsarris
> The feedback loop has changed. The intimacy has gone. The thing that kept me up at night for decades — the puzzle, the chase, the moment where you finally understand why something isn’t working — that’s been compressed into a prompt and a response
It's so strange to read, because to me it's never been more fun to make software, and it's especially never been easier for an individual. The boring parts are being automated so I can work on the bespoke and artistic parts. The feedback loop is getting shorter to making something nice and workable. The investigation tools for profiling and pinpointing performance bottlenecks are better than ever, and Claude is just one new part of it.
hnthrowaway0315
I have given the topic some thought. I concluded that the ONLY way for ordinary people (non-genius, IQ <= 120) to be really good, to get really close to the geniuses, is to sit down and condense the past 40 or so years of tech history across three topics (Comp-Arch, OS and Compiler) into 4-5 years of self-education.
Such an education is COMPLETELY different from the one they offered in school, but closer to those offered in premium schools (MIT/Berkeley). Basically, I'd call it "Software engineering archaeology". Students are supposed to take on ancient software, compile it, and figure out how to add new features.
For example, for the OS kernel branch:
- Course 0: MIT xv6 lab, then figure out which subsystem you are interested in (fs? scheduler? drivers?)
- Course 0.5: System programming for modern Linux and NT, mostly to get familiar with user space development and syscalls
- Course 1: Build Linux 0.95, run all of your toolchains in a docker container. Move it to 64-bit. Say you are interested in fs -- figure out the VFS code and write a couple of fs for it. Linux 0.95 only has Minix fs so there are a lot of simpler options to choose from.
- Course 2: Maybe build a modern Linux, like 5.9, and then do the same thing. This time the student is supposed to implement a much more sophisticated fs, maybe something from SunOS or WinNT that was not there.
- Course 3 & 4: Do the same thing with leaked NT 3.5 and NT 4.0 kernel. It's just for personal use so I wouldn't worry about the lawyers.
For reading, there are a lot of books about Linux kernels and NT kernels.
kwar13
> They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.
yup.
PaulDavisThe1st
So tired of this sort of complaint (and I'm 62).
The computing the author enjoyed/enjoys is still out there, they are just looking for it in all the wrong places. Forget about (typical) web development (with its front and backend stacks). Forget about windows and macOS, and probably even mobile (though maybe not).
Hobby projects. C++/Rust/C/Go/some-current-Lisp. Maybe even Zig! Unix/Linux. Some sort of hardware interaction. GPL, so you can share and participate in a world of software created by people a lot more like you and a lot less like Gates and Jobs and Zuckerberg and ...
Sure, corporate programming generally tends to suck, but it always did. You can still easily do what you always loved, but probably not as a job.
At 62, as a native desktop C++ app developer doing realtime audio, my programming is as engrossing, cool, varied and awesome as it has ever been (probably even more so, since the GPL really has won in the world I live in). It hasn't been consumed by next-new-thing-ism, it hasn't been consumed by walled platforms, it hasn't been taken over by massive corporations, and it still very much involves Cool Stuff (TM).
Stop whining and start doing stuff you love.
JohnMakin
I'm ~40ish, but mid-career and not in management. I envy this author; whatever joy he found in solving little puzzles and systems was extinguished in me very early in my career in an intense corporate environment. I was never one to love fussing much with code, but I do love solving system-scale problems, which also involve code. I don't feel I am losing anything; the most annoying parts of the code I deal with are now abstracted into human language and specs, and I can now architect/build more creatively than before. So I am happy. But I was one of those types that never had a true passion for "code", and I have met plenty of people that do have that, and I feel for them. I worry for people that carved out being really good at programming as a niche, but you reach a point in your career where that becomes much less important than being able to execute and define requirements and understand business logic. And yeah, that isn't very romantic or magical, but I find passion outside of what pays my bills, so I lost that ennui feeling a while ago.
ilitirit
I'm roughly the same (started at 9, currently 48), but programming hasn't really changed for me. What's changed is me having to have pointless arguments with people who obviously have no clue what they're talking about but feel qualified either because:
a) They asked an LLM
b) "This is what all our competitors are doing"
c) They saw a video on Youtube by some big influencer
d) [...insert any other absurd reason...]
True story:
In one of our recent Enterprise Architecture meetings, I was lamenting the lack of a plan to deal with our massive tech debt, and used the example of a 5000-line regulatory reporting stored procedure, written 10 years ago, that no one understood. I was told my complaint was irrelevant because I could just dump it into ChatGPT and it would explain it to me. These are words uttered by a so-called Senior Developer, in an Enterprise Architecture meeting.
dwoldrich
I am in a very similar boat, age- and experience-wise. I would like to work backward from the observation that there are no resource constraints and we're collectively, hopelessly lost up the abstraction Jenga tower.
I observe that the way we taught math was not oriented on the idea that everyone would need to know trigonometric functions or how to do derivatives. I like to believe math curricula were centered around standardizing a system of thinking about maths, so that those of us who were serious about our educational development would all speak the same language. It was about learning a language and laying down processes that everyone else could understand. And that shaped us, and it's foolish to challenge or complain about that or, God forbid, radically change the way we teach math subjects, because it damages our ability to think alike. (I know the above is probably completely idealistic, verging on personal myth, but that's how I choose to look at it.)
In my opinion, we never approached software engineering the same way. We were so focused on the compiler and the type calculus, and we never taught people about what makes code valuable and robust. If I had FU money to burn today, I'd start a Mathnasium company focused around making kids into systems integrators with great soft skills and the ability to produce high quality software. I would pitch this business under the assumption that the jenga tower is going to be collapsing pretty much continuously for the next 25-50 years and civilization needs absolute unit super developers coming out of nowhere who will be able to make a small fortune helping companies dig their way out of 75 years of tech debt.
kypro
> I started programming when I was seven because a machine did exactly what I told it to, felt like something I could explore and ultimately know, and that felt like magic
I'm significantly younger than OP, but this was it for me too. I'm autistic and found the world around me confusing growing up. Computers were wonderful because they were the only thing that really made sense to me.
I was obsessed with computers since I was 5. I started programming probably around age 10. Then in my early teens I started creating Flash applications, writing PHP, Java, etc...
When I look back on my early career now, it was almost magical. This was in the mid-to-late 00s (late to some, I know), but it was before the era of package managers, before resources like Stack Overflow, before modern IDEs. You had some fairly basic frameworks to work with, but that was really about it. Everything else had to be done fully by hand.
This was also before agile was really a thing too. The places I worked at the time didn't have stand-ups or retrospectives. There were no product managers.
It was also before the iPhone and the mass adoption of the internet.
Back then no one went into software engineering as a profession. It was just some thing weird computer kids did, and sometimes businesses would pay us to build them things. Everyone who coded back then I got along with great; now everyone is so normal it's hard for me to relate to them. The industry today is also so money-focused.
The thing that bothers me the most, though, is that computers increasingly act like humans that I need to talk to to get things done, and if that wasn't bad enough, I also have to talk with people constantly.
Even the stuff I build sucks. All the useful stuff has been built, so in the last decade or so the stuff I've built feels increasingly detached from reality. When I started, I felt like I was solving real practical problems for companies; now I'm building chatbots and internal dashboards. It's all bollocks.
There was a post recently about builders vs coders (I can't remember exactly). But I'm definitely a coder. I miss coding. There was something rewarding about pouring hours into an HTML design, getting things pixel perfect. Sometimes it felt laborious, but that was part of the craft. Claude Code does a great job, and it does it 50x faster than I could, but it doesn't give me the same satisfaction.
I do hope this is my last job in tech. Unfortunately I'm not old enough to retire, but I think I need to find something better suited to my programatic way of thinking. I quite like the idea of doing construction or some other manual labour job. Seems like they're still building things by hand and don't have so many stupid meetings all the time.
marginalia_nu
You can still have fun programming. Just sit down and write some code. Ain't nobody holding a gun to your head forcing you to use AI in your projects.
And the part of programming that wasn't your projects, whether back in the days of TPS reports and test coverage meetings, or in the age of generative AI, that bit was always kinda soul draining.
kraig911
"Over four decades I’ve been through more technology transitions than I can count. New languages, new platforms, new paradigms. CLI to GUI. Desktop to web. Web to mobile. Monoliths to microservices. Tapes, floppy discs, hard drives, SSDs. JavaScript frameworks arriving and dying like mayflies."... made me think of
I've seen things you people wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die.
Where we came from and where we're going - this whole time in my career, those things have been kind of hard to pinpoint. Abstraction is killing us for sure. Time to market above all else. It's no wonder software in cars, appliances and medical equipment is a factor that is killing people.
DanielBMarkham
This is quite the lament. Very well written.
I'm about ten years ahead of the author. I felt this a long time before AI arrived. I went from solving problems for people to having everything I tried end up in an endless grind of yak-shaving.
I worked my way through it, though. It made me both give up programming, at least in the commercial sense, and appreciate the journey he and I have gone through. It's truly an amazing time to be alive.
Now, however, I'm feeling sucked back into the vortex. I'm excited about solving problems in a way I haven't been in a long time. I was just telling somebody that I spent 4-6 hours last night watching Claude code. I watched TV. I scratched my butt. I played HexaCrush. All the time it was just chugging along, solving a problem in code that I have wanted to solve for a decade or more. I told him the work wasn't watching the code go by - that would be too easy. It was paying attention to what Claude was doing and _feeling that pain_. OMG, I would see it hit a wall, I would recognize the wall, and then it'd just keep chugging along until it fixed it. It was the kind of thing that didn't have a damned thing to do with the problem but would have held me up for hours. Instead, I watched Pitt with my wife. Every now and then I'd see a prompt pop up, and I'd guide/direct/orchestrate/consult with Claude.
It ain't coding. But, frankly, coding ain't coding. It hasn't been in a long, long time.
If a lot of your job seems like senseless bullshit, I'm sad to say you're on the way out. If it doesn't, stick around.
I view AI as an extinction level threat. That hasn't changed, mainly because of how humans are using it. It has nothing to do with the tech. But I'm a bit perplexed now as to what to do with my new-found superpowers. I feel like that kid in the first Spiderman movie. The world is amazing. I've got half-a-dozen projects I'm doing right now. I'm publishing my own daily newspaper, just for me to read, and dang if it's not pretty good! No matter how this plays out, it is truly an amazing time to be alive, and old codgers like us have had a hella ride.
Decabytes
I too have felt these feelings (though I'm much younger than the author). I think as I've grown older I have to remind myself
1. I shouldn't be so tied to what other people think of me (craftsman, programmer, low level developer)
2. I shouldn't measure my satisfaction by comparing my work to others'. Quality still matters, especially in shared systems, but my responsibility is to the standards I choose to hold, not to whether others meet them. Plus, there are still communities of people that care about this (Handmade Network, OpenBSD devs, languages like Odin) that I can be part of if I want to.
3. If my values are not being met either in my work or personal life I need to take ownership of that myself. The magic is still there, I just have to go looking for it
yayitswei
Is there some magic lost also when using AI to write your blog post?
qsi
Well-written and it expresses a mood, a feeling, a sense of both loss and awe. I was there too in the 8-bit era, fully understanding every byte of RAM and ROM.
The sense of nostalgia that can turn too easily into a lament is powerful and real. But for me this all came well before AI had become all-consuming... It's just the latest manifestation of the process. I knew I didn't really understand computers anymore, not in the way I used to. I still love coding and building, but it's no longer central to my job or life. It's useful, I enjoy it, but at the same time I also marvel at the future that I find myself living in. I've done things with AI that I wouldn't have dared to start for lack of time. It's amazing and transformative and I love that too.
But I will always miss the Olden Days. I think more than anything it's the nostalgia for the 8-bit era that made me enjoy Stranger Things so much. :)
Zaskoda
I found that feeling again while building a game on the EVM. All of the constraints were new and different. Solidity feels somewhere between a high- and a low-level language: not as abstracted as most popular languages today, but a solid step above writing assembly.
A lot of people started building projects like mine when the EVM was newer. Some managed to get a little bit of popularity, like Dark Forest. But most were never noticed. The crypto scene has distracted everyone from the work of tinkerers and artists who just wanted to play with a new paradigm. The whole thing became increasingly toxic.
It was like one last breath of fresh, cool air before the pollution of AI tools arrived on the scene. It's a bittersweet feeling.
ookblah
Maybe we just change, honestly. I think when I was younger there was nothing to lose: time felt unlimited, no "career" to gamble with, no billion dollar idea, just learning and tinkering and playing with whatever was out there because it was cool and interesting to me. In some respects I miss that.
Not sure how that relates to LLMs, but they do become an unblocker to regain some of that "magic" - though I also know a deep dive requires an investment I cannot shortcut.
The new generation of devs is already playing with things few dinosaurs will get to experience fully, having sunk decades into the systems we built and being afraid to let them go. Some of that is good (leaning on experience) and some of it is holding us back.
jppope
Fantastic Article, well written, thoughtful. Here are a couple of my favorite quotes:
* "Then it professionalised. Plug and Play arrived. Windows abstracted everything. The Wild West closed. Computers stopped being fascinating, cantankerous machines that demanded respect and understanding, and became appliances. The craft became invisible."
* "The machines I fell in love with became instruments of surveillance and extraction. The platforms that promised to connect us were really built to monetise us. The tinkerer spirit didn’t die of natural causes — it was bought out and put to work optimising ad clicks."
* "Previous technology shifts were “learn the new thing, apply existing skills.” AI isn’t that. It’s not a new platform or a new language or a new paradigm. It’s a shift in what it means to be good at this."
* "They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of... But sure. AI is the moment they lost track of what’s happening."
* "Typing was never the hard part."
* "I don’t have a neat conclusion. I’m not going to tell you that experienced developers just need to “push themselves up the stack” or “embrace the tools” or “focus on what AI can’t do.” All of that is probably right, and none of it addresses the feeling."
To relate to the author: a lot of what's going on I feel the same about, but other parts I feel differently than they do. There appears to be a shallowness to this... Yes, we can build faster than ever, but for so much of what we are building, we should really be asking ourselves why we have to build it at all. It's like sitting through the meeting that could have been an email, or using hand tools for 3 hours because the power tool purchase/rental is just obscenely expensive for the ~20 min you need it.
paulmooreparks
I'm 55 and I started at age 13 on a TI-99/4A, then progressed through Commodore 64, Amiga 2000, an Amiga XT Sidecar, then a real XT, and on and on. DOS, Windows, Unix, the first Linux. I ran a tiny BBS and felt so excited when I heard the modem singing from someone dialing in. The first time I "logged into the Internet" was to a Linux prompt. Gopher was still a bigger thing than the nascent World-Wide Web.
The author is right. The magic has faded. It's sad. I'm still excited about what's possible, but it'll never create that same sense of awe, that knowledge that you can own the entire system from the power coming from the wall to the pixels on your screen.
xg15
> They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.
But sure. AI is the moment they lost track of what’s happening.
I feel this is conflating different things. Yes, the abstraction tower was massive already before, but at least the abstractions were mostly well-defined and understandable through interfaces: even if you don't understand the intricacies of your storage device, driver and kernel, you can usually get a quite reliable and predictable mental representation of how files work. The same goes for network protocols, higher-level programming languages or the web platform.
Sure, there are edge cases where the abstraction breaks down and you have to get into the lower levels, but those situations are the exception, not the norm.
With AI, there is no clearly defined interface, and no one really knows what (precise) output a given input will produce. Or maybe to put it better, the interface is human language, and your mental representation is the one you have talking to a human - which is far more vague than previous technical abstractions.
On the bright side, at least we (still) have the intermediate layer of generated code to reason about, which offsets the unpredictability a bit.
harel
I've had the same journey, same age markers. The sentiment is the same, but at the same time this new world affords me super powers I'm currently drunk on. When that drunkenness becomes a hangover I hope I won't be disappointed.
hmaxwell
You can still write code yourself. Just like you can still walk to work, you do not need to use a car.
ge96
Yeah, I could use Cursor or whatever, but I don't; I like writing code. I guess that makes me a luddite or something, although I still develop agents. I enjoy architecting things (I don't consider myself an architect) - I'm talking about my hobby hardware projects.
My_Name
The irony is that you could still code the way you always did, where you control every pixel. Nothing is stopping you.
But you would not be able to make anything anywhere near as complex as you can with modern tools.
towndrunk
I know exactly how you feel. I don't know how many hours I sat in front of this debugger (https://www.jasik.com) poking around and trying to learn everything at a lower level. Now it's so different.
karolist
Was this text run through LLM before posting? I recognize that writing style honestly; or did we simply speak to machines enough to now speak like machines?
sebnukem2
Oh boy this hits home.
At this point I've entered survival mode, and I'm curious to see where we will be 6 months, 2 years from now. I am pessimistic.
I want to tinker with my beloved Z80 again.
dcanelhas
> I wrote my first line of code in 1983. I was seven years old, typing BASIC into a machine that had less processing power than the chip in your washing machine
I think there may be a counterpoint hiding in plain sight here: back in 1983 the washing machine didn't have a chip in it. Now there are more low-level embedded CPUs and microcontrollers to develop for than before, but maybe it's all the same now. Unfathomable levels of abstraction, uniformly applied by language models?
stronglikedan
Same, but it changed when I was 17 and again when I was 27 and then 37 and so on. It has always been changing dramatically, but this latest leap is just so incredibly different that it seems unique.
KingOfCoders
Cool, at 7? I started at 9 and I'm 53 now. And Claude does all the things. Need to get adjusted to that though. Still not there.
Last year I found out that I always was a creator, not a coder.
CrzyLngPwd
Started coding when I was 14; sold my first bit of code at 17, written in 6502 assembler.
40+ years later, having been through many BASICs, C, C++ (CFront onwards) and now NodeJS, I still love writing code.
Tinkering with RPi, getting used to having a coding assistant, looking forward to having some time to work on other fun projects and getting back into C++ sooooon.
What's not to love?
metalrain
I think it's the loss of control.
Even if you can achieve awesome things with LLMs, you give up control over the tiny details; it's just faster to generate and regenerate until it fits the spec.
But you never quite know how long it takes or how much you have to shave that square peg.
sowbug
Did hardware engineers back in the 1970s-80s* think that software took the joy out of their craft? What do those engineers now think in retrospect?
*I'm picking that era because it seems to be when most electronic machines' business logic moved from hardware to software.
aldousd666
I'm 46 but same. I'm not quite as melancholy about it, but I do feel a lot of this.
pfdietz
I retired a few years ago and it's very clear that was a good thing.
rraghur
Are you me?
I'm 49.... Started at 12... In the same boat
My first 286 machine had a CMOS battery that was loose, so I had to figure that out to make it boot into MS-DOS.
This time it does feel different, and while I'm using AI more than ever, it feels soulless and empty even when I 'ship' something.
28304283409234
> Cheaper. Faster. But hollowed out.
Given the bazillions poured into it I have yet to see this proven to be cheaper.
suprstarrd
This is at least partially AI-written, by the way
cs02rm0
I'm 43. Took a year or so off from contracting after being flat out for years without taking any breaks, just poked around with some personal projects, did some stuff for my wife's company, petitioned the NHS to fix some stuff. Used Claude Code for much of it. Travelled a bit too.
I feel like I turned around and there seem to be no jobs now (500+ applications deep is a lot when you've always been given the first role you'd applied to) unless you have 2+ years commercial AI experience, which I don't, or perhaps want to sit in a SOC, which I don't. It's like a whole industry just disappeared while I had my back turned.
I looked at Java in Google Trends the other day, it doesn't feel like it was that long ago that people were bemoaning how abstracted that was, but it was everywhere. It doesn't seem to be anymore. I've tried telling myself that maybe it's because people are using LLMs to code, so it's not being searched for, but I think the game's probably up, we're in a different era now.
Not sure what I'm going to do for the next 20 years. I'm looking at getting a motorbike licence just to keep busy, but that won't pay the bills.
show comments
enricotr
The deepest thing I read from HN in months. Respect.
fabiensanglard
> the VGA Mode X tricks in Doom
Doom does not use mode-X :P! It uses mode-Y.
That being said, as a 47-year-old having given 40 years to this thing as well, I can relate to the feeling.
dave_sid
Great post. Good to see someone posting something positive for a change about the shift in development.
onion2k
It'd be more strange if the thing you learned 43 years ago was exactly the same today. We should expect change. When that change is positive we call it progress.
show comments
kwar13
I am younger than the author but damn this somehow hit me hard. I do remember growing up as a kid with a 486...
TimPC
I think more than ever programmers need jobs where performance matters and the naive way the AI does things doesn't cut it. When no one cares about anything other than correctness, your job turns into AI slop. The good news right now is that AI tends to produce code that AI itself struggles to work with, so large-scale projects often descend into crap. You can write a C compiler for $20,000 with an explosive stack of agents, but that C compiler isn't anywhere close to efficient or performant.
As model costs come down, that $20,000 will become a viable number for entirely AI-generated coding. So more than ever you don't want to be doing work that the AI is good enough at. Either take jobs where performance matters, or be able to code the stack of agents needed to produce high-quality code in an application context.
show comments
elzbardico
I don't know what these people from our now traditional daily lamentation session are coding where Claude can do all the work for them just with a few prompts and minimal reviews.
Claude is a godsend to me, but fuck, it is sometimes dumb as a door, loves to create regressions, and is a fucking terrible designer. Small, tiny changes? Those are actually the worst: it is easy for Claude, on the first setback, to decide to burn the whole world and start from zero again. Not to mention when it gets stuck in an eternal loop where it increasingly degenerates the code.
If I care about what I deliver, I have to actively participate in coding.
franze
I'm 47 and excited to live in the time of the most important innovation since the printing press.
josefrichter
A bit younger, and exact opposite. Probably the most excited I've ever been about the state of development!
cadamsdotcom
Abstractions can take away but many add tremendous value.
For example, the author has coded for their entire career on silicon-based CPUs but never had to deal with the shittiness of wire-wrapped memory, where a bit-flip might happen in one place because of a manufacturing defect, and good luck tracking that down. Ever since lithography and CPU packaging, the CPU has been protected from the elements; its thermal limits are well known, computed ahead of time, and baked into thermal management so it doesn't melt but still goes as fast as we understand to be possible for its size. And we make billions of these every day and have done so for over 50 years.
Moving up the stack you can move your mouse “just so” and click, no need to bit-twiddle the USB port (and we can talk about USB negotiation or many other things that happen on the way) and your click gets translated into an action and you can do this hundreds of times a day without disturbing your flow.
Or JavaScript JIT compilation, where the JS engine watches code run and emits faster versions of it that make assumptions about the types of variables - with escape hatches if the code stops behaving predictably, so you don't get confusing bugs that only happen if the browser JITted some code. Python has something similar. Thanks to these JIT engines you can write ergonomic code that in the typical scenario is fast enough for your users and gets faster with each new language release, with no code changes.
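A toy sketch of that speculate-then-deopt cycle (illustrative only; exact flag names vary across V8/Node versions):

    // V8 can compile `add` down to a fast numeric path while it only ever
    // sees numbers; the first string argument breaks that assumption and
    // forces a deoptimization back to the generic path. Try running under
    // `node --trace-opt --trace-deopt` to watch the engine's decisions.
    function add(a: number | string, b: number | string): number | string {
      return (a as any) + (b as any);
    }

    let total: number | string = 0;
    for (let i = 0; i < 1_000_000; i++) {
      total = add(total, 1); // monomorphic call site: numbers only
    }
    total = add(total, " bananas"); // type assumption violated: deopt
    console.log(total);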
Let's talk about the decades of research that went into autoregressive transformer models, instruction tuning, and RLHF, and then chat harnesses. Type to a model and get a response back, because behind the scenes your message is prefixed with "User: ", triggering latent capabilities in the model to hold its end of a conversation. Scale that up and call it a "low key research preview" and you have ChatGPT. Wildly simple idea, massive implications.
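That harness really can be as thin as string formatting. A sketch, where completeText is a hypothetical stand-in for whatever raw completion endpoint you have, not a real library API:

    // Minimal chat harness over a bare text-completion model: the "User:" /
    // "Assistant:" framing is just string formatting around one completion
    // call. `completeText` is hypothetical, not a real API.
    type Turn = { role: "User" | "Assistant"; text: string };

    declare function completeText(prompt: string, stop: string[]): Promise<string>;

    async function chat(history: Turn[], userMessage: string): Promise<string> {
      const transcript = [...history, { role: "User" as const, text: userMessage }]
        .map(t => `${t.role}: ${t.text}`)
        .join("\n");
      // Stop before the model starts writing the user's next line for them.
      const reply = await completeText(transcript + "\nAssistant:", ["\nUser:"]);
      return reply.trim();
    }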
These abstractions take you further from the machine and yet despite that they were adopted en masse. You have to account for the ruthless competition out there - each one would’ve been eliminated if they hadn’t proven to be worth something.
You’ll never understand the whole machine so just work at the level you’re comfortable with and peer behind the curtain if and when you need (eg. when optimizing or debugging).
Or to take a moment to marvel.
small_model
Same as assembly programmers felt when C came along I guess
show comments
phendrenad2
As someone who has always enjoyed designing things, but was never really into PUZZLES, I always felt like an outsider in the programming domain. People around me really enjoyed the "fun" of programming, whereas I was more interested in the Engineering of the thing - balancing tradeoffs until within acceptable margins and then actually calling it "DONE". People around me rarely called things "done", they rewrote it and rewrote it so that it kept satisfying their need for puzzle-solving (today, it's Ruby, tomorrow, it's rewritten in Scala, and the day after that, it's Golang or Zig!)
I feel that LLMs have finally put the ball in MY court. I feel sorry for the others, but you can always find puzzles in the toy section of the bookstore.
4ndrewl
"They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of."
and they still call themselves 'full stack developers' :eyeroll:
bananamogul
'It’s not a “back in my day” piece.'
That's exactly what it is.
burnerToBetOut
> …Not burnout…
Than meybe wadeAfay? ;)
innocentoldguy
I have been around for a similar amount of time. Another change I have seen over the years is the shift from programming being an exercise in creative excellence at work to being a white-collar ditch-digging job.
leesec
yeah coding is a lot more fun and useful now
show comments
sheikhnbake
At least parts of this were written with AI
peter_d_sherman
>"The abstraction tower
Here’s the part that makes me laugh, darkly.
I saw someone on LinkedIn recently — early twenties, a few years into their career — lamenting that with AI they “didn’t really know what was going on anymore.” And I thought: mate, you were already so far up the abstraction chain you didn’t even realise you were teetering on top of a wobbly Jenga tower.
They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.
But sure. AI is the moment they lost track of what’s happening.
The abstraction ship sailed decades ago. We just didn’t notice because each layer arrived gradually enough that we could pretend we still understood the whole stack.
AI is just the layer that made the pretence impossible to maintain."
Absolutely brilliant writing!
Heck -- absolutely brilliant communicating! (Which is really what great writing is all about!)
You definitely get it!
Some other people here on HN do too, yours truly included in that bunch...
I was happy riding my horse when this dude invented a car.
coldtea
>But sure. AI is the moment they lost track of what’s happening. The abstraction ship sailed decades ago.
Bullshit. While abstraction has increased over time, AI is no mere incremental change. And the almost natural-language interaction with an agent is not the same as TypeScript over assembly (not to mention you could very well write C or Rust and the like, and know most of the details of the machine by heart; and no, microcode and low-level abstractions are not a real counter-argument to that). Even less so if agents turn autonomous and you just herd them onto completion.
show comments
j45
Programming changed all along.
New concepts came out all along.
They became standardized all along and came down market to smaller and smaller projects.
Source control.
Cloud.
Agile/Scrum.
Code completion IDEs.
Higher Level languages.
These were not LLMs but did represent a shift that had to be kept up with.
LLMs are no different, just a bigger jump.
There is just as much opportunity here.
Software development and software developers are not going away.
More software that never could be built will now be built.
For the foreseeable future there will always be software that needs to be overseen by a human.
bmitc
Humans have a special knack for taking the humanity out of basically anything. It's a bizarre pattern.
tonymet
I have the opposite take. There's nothing stopping you from jumping into any component to polish things up. You can code whatever you wish. And AI takes away nearly all of the drudgery: boilerplate, test cases, inspecting poor documentation, absurd tooling.
It also lets me focus more on improving things since I feel more liberated to scrap low quality components. I’m much braver to take on large refactors now – things that would have taken days now take minutes.
In many ways AI has made up for my growing lack of patience and inability to stay on task until 3am.
show comments
anarticle
I've written sse2 optimized C, web apps, and probably everything in between (hw, datasci, etl, devops).
I like coding with AI both vibe and assisted, since as soon as the question enters my head I can create a prototype or a test or a xyz to verify my thoughts. The whole time I'm writing in my notebook or whiteboard or any other thing I would have gotten up to. This is enabling tech, the trouble for me is there is a small thread that leads out of the room into the pockets of billion dollar companies.
It is no longer you vs the machine.
I have spent tons of time debugging weird undocumented hardware with throwaway code, or sat in a debugger doing hex math.
I think one wire that is crossed right now in this world is that computing is more corporate than ever, with what seems like ever growing platforms and wealth extraction at scale. Don't let them get you down, host your own shit and ignore them. YES IT WILL COST MORE -> YOUR FREEDOM HAS A PRICE.
Another observation is that people that got into the game for pure money are big mad right now. I didn't make money in the 00s, I did at the end of the 10s, and we're back at job desolation. In my groups, the most annoyed are code boot campers who faked it until they made it and have just managed to survive this cycle with javascript.
Cycles come and go, the tech changes, but problem solving is always there.
codr7
It's not like it's changing by itself, you can always opt out of the slop race and scratch your itches instead.
“... when I was 7. I'm 50 now and the thing I loved has changed”
Welcome to the human condition, my friend. The good news is that a plurality of novels, TV shows, country songs, etc. can provide empathy for and insight into your experience.
zzzeek
I'm 57 and wrote my first line of BASIC in 1980, so while I can still chime in on this specific demographic I feel that I ought to. So I'm like this guy, but like a lot of other people in my specific demographic we aren't writing these long melancholy blog posts about AI, because it's not that big of a deal.

As an OSS maintainer most of my work is a lot of boring slog: adding features to libraries to suit new features in upstream dependencies, nitpicky things people point out, new docs, tons of tedium. Claude helps a ton with all of that. No way is Claude doing the real architectural puzzle stuff, that's still fully on me! I can just use Claude to help implement it. It's like the ultimate junior programmer assistant. It's certainly a new, different and unique experience in one's programming career, but it really feels like another tool, like an autocomplete or code refactoring tool that is just a lot better, with similar caveats.

I mean, in my career I've had to battle the whole time people who don't "get" source code control (starting with me), who don't "get" IDEs (starting with me), people who don't "get" distributed version control (same), people who don't "get" ORMs (oh yes, same for me, though this one I took much more dramatic steps to appreciate), people who don't "get" code formatters. Now we're battling people who don't "get" LLMs used for coding. In that sense the whole thing doesn't feel like that novel of a situation.

It's the LLMs that are spitting out fake photos and videos and generating lots of shitty graphics for local businesses; that's where I'm still wielding a pitchfork...
ossa-ma
The irony of these "My craft is dead" posts is that they consistently, heavily leverage AI for their writing. So you're crying about losing one craft to AI while using AI to kill another. It's disingenuous. And yes it is so damn obvious.
show comments
kylehotchkiss
There's 3-4 of these posts a day - why don't people spend more time hand-building things for fun in their free time? That's what led a lot of us to this career path to start with. I have a solid mix of hand-code and AI-assisted projects in my free time.
toss1
>>The machines I fell in love with became instruments of surveillance and extraction.
Surveillance and Extraction
"We were promised flying cars", and what we got was "investors" running the industry off the cliff into cheap ways to extract money from people instead of real innovation.
sweetheart
> I started programming when I was seven because a machine did exactly what I told it to
What a poetic ending. So beautiful! And true, in my experience.
bitwize
This isn't new. It's the same feeling the first commercial programmers had working in assembly, or machine code, once compilers became available. Ultimately I think even Mel Kaye forsook being able to handpick memory locations for optimum drum access before his retirement, in favor of being able to build vastly more complex software than before.
AI has just vastly extended your reach. No sense crying about it. It is literally foolish to lament the evolution of our field into something more.
nprateem
Programming is dead. In the last 4 days I've done 2 months of work. The future is finally here.
Bad times to be a programmer. Start learning business.
delaminator
I'm 57. I was there when the ZX81 came out.
I had my first paid programming job when I was 11, writing a database for the guy that we rented our pirate VHS tapes from.
AI is great.
tiahura
Don't program as a career, but am also 50 and programming since TRS-80. AI has transformed this era, and I LOVE IT! I can focus on making and not APIs or syntax or all of the bootstrapping.
show comments
mrcwinn
Professional development is changing dramatically. Nothing stops anyone from coding "the old way," though. Your hobby project remains yours, exactly the way you want it. Your professional project, on the other hand, was never about you in the first place. It's always about the customer/audience/user, period full stop.
AndrewKemendo
Please stop upvoting these posts. We have gotten to the point where both the front page and new page is polluted with these laments
It’s literally the same argument over and over and it’s the same comments over and over and over
HN will either get back to interesting stuff or simply turn into a support group for aging “coders” that refuse to adapt
I’m going to start flagging these as spam
show comments
throwawaymeta01
same bud.
maybe that just means it's a maturing field and we gotta adapt?
yes, the promise has changed, but you still gotta do it for the love of the game. anything else doesn't work.
AIorNot
I'm 50 too, and I've complained and yearned for the "old" days too. A lot of this is nostalgia, as we reminisce about periods in our youth when we had the exuberance and time to play and build with the technology of our own time.
Working in AI startups, strangely enough, I see a lot of the same spirit of play and creativity applied to LLM-based tools - I mean, what is OpenClaw but a fun experiment?
Those kids these days are going to reminisce about the early days of AI, when prompts would be handwritten and LLMs would hallucinate.
I'm not really sure 1983, 1993 or 2003 really was that golden of an age, but we look at it with rose-colored glasses.
almosthere
11 and now 45. I am still interested in it, but I feel like in my 20s I would get a dopamine rush when a row showed up in a database. In my 30s I would get that only if a message passed through a system and updated on-screen analytics within 10 seconds. Thank god for LLMs, because all of it became extremely boring; I can't stand having to chase these little milestones with each new company or each new product I'm working on. At least with LLMs the dopamine hit comes from being in awe of the code that gets generated: realizing it found every model, every messaging-system interface, every API, figured out how to make it backwards compatible, and updated the UI - something that would have taken half a day, now done in 5 minutes or less.
themafia
> I’ve had that experience. And losing it — even acknowledging that it was lost
What are you talking about? You don't know how 99% of the systems in your own body work yet they don't confront you similarly. As if this "knowledge" is a switch that can be on or off.
> I gave 42 years to this thing, and the thing changed into something I’m not sure I recognise anymore.
Stop doing it for a paycheck. You'll get your brain back.
I'm 61 (retired when I was 57).
Wow... I really relate to this. I'm 50 as well, and I started coding in 1985 when I was 10... I remember literally every evolutionary leap forward and my experience with this change has been a bit different.
Steve Yegge recently did an interview on vibe coding (https://www.youtube.com/watch?v=zuJyJP517Uw) where he says, "arch mage engineers who fell out-of-love with the modern complexity of shipping meaningful code are rediscovering the magic that got them involved as engineers in the first place" <-- paraphrased for brevity.
I vividly remember staying up all night to hand-code assembler primitive rendering libraries, the first time I built a voxel rendering engine and thinking it was like magic what you could do on a 486... I remember the early days at Relic, working on Homeworld and thinking we were casting spells, not writing software. Honestly, that magic faded and died for me. I don't personally think there is magic in building a Docker container. Call me old-fashioned.
These days, I've never been more excited about engineering. The tedium of the background wiring is gone. I'm back to creating new, magical things - I'm up at 2 AM again, sitting at my desk in the dark, surrounded by the soft glow of monitors and casting spells again.
What the author describes is also the feeling when you shift from being a developer all day to being a team lead or manager. When you become a lead you have to let go and get comfortable with the idea that the code is not going to be how you would do it. You can look at code produced by your team and attempt to replace it all with your craftsmanship but you're just setting yourself up to fail. The right approach is use your wisdom to make the team better, not the code. I think a lot of that applies to using AI when coding.
I'm turning 50 in April and am pretty excited about AI coding assistants. They make a lot of personal projects I've wanted to do but never had the time feasible.
I am much younger than the author, but I've been coding for most of my life and I find close to no joy in using AIs. For me coding has always been about the nitty-gritty quirkiness of computers, languages, solving issues and writing new cool things for the sake of it. It was always more about the journey than the end goal, and AI basically hollows out all of the interesting bits of coding. It feels like skipping straight to the end of a book, or something like that.
I don't know if I am the only one, but developing with chatbots, in my experience, turns developing software into something that feels more akin to filling out forms or answering emails. I grieve for the day we'll lose what was once a passion of mine, but unfortunately that's how the world has always worked. We can only accept that times change, and we should follow them instead of complaining.
> I started programming when I was seven because a machine did exactly what I told it to, felt like something I could explore and ultimately know, and that felt like magic. I’m fifty now, and the magic is different, and I’m learning to sit with that.
Don't take this the wrong way but this is more of an age thing rather than a technology advancement thing.
Kids growing up nowadays that are interested in computers grow up feeling the same magic. That magic is partly derived from not truly understanding the thing you are doing and creating a mental "map" by yourself. There is nothing intrinsic to computing nowadays that makes it less magic than fiddling around with config.sys, in 50 years there will be old programmers reminiscing of "Remember when all new models were coming out every few months and we could fiddle around with the vector dimensionality and chunking length to get the best of gpt-6.2 RAG? Those were the times".
It seems AI is putting senior developers into two camps. Both groups relate to the statement, "I started programming when I was seven because a machine did exactly what I told it to, felt like something I could explore and ultimately know, and that felt like magic. I’m fifty now, and the magic is different, and I’m learning to sit with that."
The difference is that the first camp is re-experiencing that feeling of wonder while the second camp is lamenting it. I thankfully fall in the first camp. AI is allowing me to build things I couldn't, not due to a lack of skills, but a lack of time. Do you want to spend all your time building the app user interface, or do you want to focus on that core ability that makes your program unique? Most of us want the latter, but the former takes up so much time.
Thank you for writing this. My feelings are very similar to the ones described by the author, and the timeline almost matches. The thrill of technology started to decay fast for me in the early 2010s, and I now see it as past the point of no return. I still have fun with my retro hardware & software, but I am no longer an active practitioner and I have pivoted my attention and my efforts somewhere else. Unfortunately, I no longer feel excited for the future decades of tech and I am distancing myself from it.
I'm lucky because I work as an independent consultant. I get paid to deliver solutions, but I get to choose how to create those solutions. I write whatever code I want however I want. As long as it solves the problem, no one cares.
I started programming in 1980, and I'm having just as much fun now as I did then. I literally cannot wait to sit down at my IDE and start writing.
But that was not always true. When I worked for a larger company, even some startups, it was not always fun. There's something about having full control over my environment that makes the work feel like play.
If you feel like programming isn't fun anymore, maybe switching to a consulting gig will help. It will give you the independence and control that you might be craving.
6 or 7, 38 now -- and having a blast.
It isn't all funeral marches and group crying sessions.
And don't let the blog post fool you, it is a rant about AI -- otherwise we would have heard complaints about the last 200 paradigm shifts in the industry over the past thirty years.
Sure, we got our share of Dilbert-style agile/waterfall/TDD jokes shoved in our face, but no one wrote a blog post about how their identity was usurped by the waterfall model.
>And different in a way that challenges the identity I built around it and doesn’t satisfy in the way it did.
Everyone should do their own thing, but might I suggest that it is dangerous for anyone in this world to use a single pillar as their foundation for all identity and plinth of their character.
I'm the exact age as the author and this post could have been written by me (if I could write). It echoes my story and sentiment exactly right down to cutting my literal baby teeth on a rubber key ZX Spectrum.
The anxiety I have, which the author might not be explicitly stating, is that as we look for places where we add genuine value in the crevices of frontier models' shortcomings, those crevices are getting narrower by the day and a bit harder to find.
Just last night I worked with Claude and at the end of the evening I had it explain to me what we actually did. It was a "Her" (as in the movie) moment for me where the AI was now handholding me and not the other way around.
I'm 60, started with a Tandy Model I in junior high, learned 6809 assembly for my Color Computer, loved the fact we could put certain values in particular memory positions and change the video mode and put pixels to the screen. It's been decades of losing that level of control, but for me coding is the fun part. I've never lost that spark of enjoyment and really obsession I felt early on. I enjoy the supposedly boring job of writing SQL and C with embedded SQL and working with business concepts to produce solutions. Coding is the fun part for me, even now.
I got moved up the chain to management and later worked to get myself moved back down to a dev role because I missed it and because I was running into the Peter Principle. I use AI to learn new concepts, but mostly as a search engine. I love the tech behind it, but I don't want it coding for me any more than I want it playing my video games for me. I was hoping AI would show up as robots doing my laundry, not doing the thing I most enjoy.
I'm a developer, mid/late fifties. My first computer was a Commodore Vic 20, so I guess I started writing code at about the same time as the OP even if I'm a few years older.
Yes, I mourn the end of my craft and all that. But also:
This isn't the end of hand-written code. A few will still get paid to do it in niche domains. Some will do it as a hobby or craft activity - like oil painting or furniture making. The tooling will move on and become more specialised and expensive. Like owning Japanese woodworking tools.
But software construction as a human-based economic activity is clearly about to slam hard into a singularity, and many of us who rely on our hard-won skills to pay the bills and survive are going to find ourselves unemployed and unemployable. A few early adopters will get to stay on and sip their artisanal coffee and "build beautiful things" while their agent herds toil. But most of us won't. Software has always mostly been just CRUD apps, and that is going to need a whole lot fewer people going forward. People like me, perhaps, or you.
Some, who have sufficient financial and chronological runway, will go off and do other things. Many won't have that opportunity. I have personal experience of late-career unemployment - although I'm currently working - and it's not pretty. A lot of lives are going to be irreparably disrupted by this. Personally, I'd hoped that I could make it through to some stable kind of retirement, but I just don't see it anymore.
Having been in this game about 10 years longer, I can understand how he feels. I distinctly remember when I realized that C compilers for the ARM produced better assembly than I could code by hand. Bittersweet, but the code being written became larger and more complex because of it.
Modern coding has become more complex than I would have ever thought possible. The number of technologies an individual would have to master to actually be an expert "full stack" coder is ludicrous. It is virtually impossible for an individual to prototype a complex Web-based app by themselves. I think AI will lower that barrier.
In return we will get a lot more software - probably of dubious quality in many cases - as people with "ideas" but little knowledge start making apps. Not a totally bad thing but no utopia either. I also think it will likely reduce the amount of open source software. Content producers are already hoarding info to prevent AI bots from scraping it. I see no reason to believe this will not extend to code as more programmers find themselves in a situation more akin to musicians than engineers.
The contrast between this and https://news.ycombinator.com/item?id=46923543 (Software engineering is back) is kind of stark. I am using frontier models to get fun technical projects done that I simply didn't have time for since my late teens. It is still possible to understand an architecture down to the hardware if you want to, but it can happen a lot faster. The specifications are queryable now. Obscure bugs that at least one person has seen in the past are seconds away instead of minutes or hours of searching. Even new bugs have extra eyes on them. I haven't written a new operating system yet but it's now a tractable problem. So is using Lean or Julia or some similar system to formally specify it. So far I've been digging into modern multithreaded cache performance which is just as fascinating as directly programming VGA and sound was in the early PC days. Linux From Scratch is still up to date. You can get FPGAs that fit in your USB port [0]. Technical depth and low-level understanding is wherever you want to look for it.
[0] https://www.crowdsupply.com/sutajio-kosagi/fomu
Programming is not art for me. I do not find it useful to gold-plate solutions. I prefer getting the job done, sometimes by any means necessary for "the vehicle" to continue running.
AI often generates parts of the code for my hobby projects, which lets me speed-run my implementation. It often generates errors, but I am also skilled, so I fix the errors in the code.
I use AI as a boilerplate code generator, or for documentation assistance, for languages I do not use daily. These solutions I rarely use 1:1, but if I had to go through readmes and readthedocs, it would take me a lot longer.
Would there be more elegant solutions? often - yes. Does it really matter? For me - not.
I don't disagree that technology is less fun in an AI era. The question is, what other careers are out there for someone who wants to make things?
About a decade ago, I went through a career crisis where I couldn't decide what job to do - whether technology was really the best choice for my particular temperament and skills.
Law? Too cutthroat. Civil service? Very bureaucratic. Academia? Bad pay. Journalism? An industry in decline.
It is a shame, what is happening. But I still think, even with AI hollowing out the fun parts, tech remains the best job for a smart, motivated person who's willing to learn new things.
I think one of the big distinctions between people who like building with AI and those who don't, is that the people who are pro-AI are building their own ideas, of which they have many.
The people who are anti-AI are largely building other people's ideas, for work. And they have no desire to ramp up velocity, and it's not helpful to them anyway because of bureaucratic processes that are the real bottleneck to what they're building.
Not everyone falls into these silos, of course.
Not going to pull age or title rank here -- but I suggest if your use of AI feels empty, take advantage of its speed and plasticity and iterate upon its output more, shape the code results. Use it as a sculptor might too -- begin with its output and make the code your own. I particularly like this latter approach when I am tasked with use of a language I view as inferior and/or awkward. While this might read as idealistic, and I agree that there are situations where this interaction is infeasible or inappropriate, you should also be encountering problems where AI decidedly falls on its face and you need to intervene.
At my first full time job in the early 2000s I was tasked with building a webscraper. We worked for law firms representing Fortune 500 companies and they wanted to know who was running "pump and dump" stock schemes on stocks using Yahoo Finance message boards.
At the time, I didn't know the LWP::Simple module existed in Perl so I ended up writing my own socket based HTTP library to pull down the posts, store them in a database etc. I loved that project as it taught me a lot about HTTP, networking, HTML, parsing and regexes.
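For the curious, the core of such a hand-rolled fetcher is small. A sketch of its modern equivalent in Node (illustrative only; the original was Perl, and real HTTP needs redirects, TLS, chunked bodies, and much more):

    // Roughly what that hand-rolled library boiled down to: an HTTP/1.0 GET
    // over a raw socket. Node's `net` module stands in for Perl sockets.
    import * as net from "node:net";

    function rawGet(host: string, path: string): Promise<string> {
      return new Promise((resolve, reject) => {
        const socket = net.connect(80, host, () => {
          socket.write(`GET ${path} HTTP/1.0\r\nHost: ${host}\r\n\r\n`);
        });
        let response = "";
        socket.on("data", chunk => (response += chunk.toString()));
        socket.on("end", () => resolve(response)); // headers + body, unparsed
        socket.on("error", reject);
      });
    }

    rawGet("example.com", "/").then(r => console.log(r.split("\r\n")[0]));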
Nowadays, I use playwright to scrape websites for thing I care about (e.g. rental prices at the Jersey Shore etc). I would never think to re-do my old HTTP library today while still loving the speed of modern automation tools.
Now, I too have felt the "but I loved coding!" sense of loss. I temper that with the above story that we will probably love what comes next too (eventually).
There's nothing "hollowed out" about directing an AI effectively, the feedback is as quick and tight as it always was. The trick is that you don't just "vibe code" and let the AI one-shot the whole thing: you should propose the change first and ask the AI about a good, detailed plan for implementing it. Then you review what the robot has proposed (which is trivial compared to revising code!) make sensible changes, ask for feedback again, and repeat. By the time the AI bot has to write actual code, it's not running on vibes anymore: it's been told exactly what to do and how to assess the result. You spend more time upfront, but a lot less on fixing the AI's mistakes.
A blacksmith was a person who picked up chunks of carbon, heated them until they were glowing red, and beat the iron into submission with a hammer in their hands.
Today iron is produced by machines in factories by the mega-tonne.
We just happen to live in the age where code went from being beaten out by hand to being a mass-produced product.
And so the change of technology goes.
Oh my god. This is me. If I were any better at writing, I could have written this, the author is even the same age as me (well, a year younger) and followed a similar trajectory. And a lot of what I've been feeling lately feels similar to burnout (in fact I've been calling it that), but it really isn't burnout. It's... this, whatever this is... a "fallow period" is a good term.
And I feel like an old man grumbling about things changing, but... it's not the same. I started programming in BASIC on my Tandy 1000 and went to college and learned how to build ISA cards with handwritten oscilloscope software in the Computer Engineering lab. My first job was writing firmware. I've climbed so far up the abstraction chain over a thirty year career and I guess I don't feel the same energy from writing software that first got me into this, and it's getting harder to force myself to press on.
This essay begins by promising not to be a "back in my day" piece, but ends up dunking on 20-year-olds who are only a few years into their career, as if they have any choice about when they were born.
I too get less of a kick out of writing enterprise middleware than I did making games as a kid in the 80s. Why did the industry do this to me?!
A lot of that magic still remains in embedded.
If vendors can't be bothered to use a C compiler from the last decade, I don't think they'll be adopting AI anytime soon.
At my work, as of 2026, we only now have a faction riled up about evangelizing clean code, OOP, and C++ design patterns. I hope the same delay keeps for all the rest of the "abstraction tower".
I gave up after the third “It’s not X, it’s Y” in like two paragraphs. Is nobody else allergic to that AI voice? Isn’t the author?
So depressing this is the current state of blogging. Can’t wait for this phase to be over.
I humbly submit this interview with Grady Booch (if you know, you know) talking about the "3rd golden age of software engineering - thanks to AI": https://youtu.be/OfMAtaocvJw
I feel like the conversation does a good job of couching the situation we find ourselves in.
I am a little older than OP. I don't think I've ever had that feeling about a programming project for work that came from someone else.
Generally, I get that feeling from work projects that I've self-initiated to solve a problem. Fortunately, I get the chance to do this a lot. With the advent of agentic coding, I am able to solve problems at a much higher rate.
Quite often, I'll still "raw dog" a solution without AI (except for doc lookups) for fun, kind of as a way to prove to myself I can still do it when the power's out.
I'm the exact same demographic as the author, just turned 50, writing code since childhood in BASIC. I'm dealing with the AI in programming issue by ignoring it.
I still enjoy the physical act of programming so I'm unsure why I should do anything that changes that. To me it's akin to asking a painter to become a photographer. Both are artists but the craft is different.
Even if the AI thing is here to stay, I think there will be room for people who program by hand for the same reason there's still room for people who paint, despite the invention of the camera.
But then, I'm somebody who doesn't even use an IDE. If I find an IDE obtrusive then I'm certain I'll find an AI agent even more so.
The deep, profound, cruel irony of this post is that it was written by AI.
Maybe if you work in the world of web and apps, AI will come for you. If you don't, and you work in industrial automation and safety, then I believe it will not.
I was 7 in 1987, learned LOGO and C64 BASIC that year, and I relate to this article as well.
It feels as though a window is closing upon the feeling that software can be a powerful voice for the true needs of humanity. Those of us who can sense the deepest problems and implications well in advance are already rare. We are no more immune to the atrophy of forgetting than anyone.
But there is a third option beyond embrace or self-extinguish. The author even uses the word, implying that consumers wanted computers to be nothing more than an appliance.
The third option is to follow in the steps of fiction, the Butlerians of Dune, to transform general computation into bounded execution. We can go back to the metal and create a new kind of computer; one that does have a kind of permanence.
From that foundation, we can build a new kind of software, one that forces users to treat the machine as appliance.
It has never been done. Maybe it won't even work. But, I need to know. It feels meaningful and it has me writing my first compiler after 39 years of software development. It feels like fighting back.
Total resonance with this part:
"They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of."
Wow this hits home - I just turned 51 and I also started coding at age 7, writing BASIC on a TRS-80 Model III.
I still have a very distinct memory of when my father told me he was buying us our first home computer. I remember him telling me that you could use the computer to make games. I was so excited by the idea and amazed by this technology (which I hadn't yet even remotely understood). I remember saying "Oh, you just tell it to make a game? And it makes a game?" He explained to me then what programming was.
When we got the TRS-80, he and I worked together to build a game. We came up with an idea for a text adventure game called "Manhole Mania" - you were a city works employee exploring the sewers after reports of strange noises. We never finished much of it - maybe just the first few "rooms".
Maybe this weekend I will tell Codex to make me a game.
Same; been a product designer for years, still love design deep down, but the essence is somehow not there anymore. Reading this hit different. It's refreshing to see someone put it into words instead of the usual "stuff".
It lines up a lot with what I've been thinking as well and this is what I wrote today on my blog. https://www.immaculateconstellation.info/why-ai-challenges-u...
I turn 52 this year. I also started at 10 years old, programming in a combination of AppleSoft BASIC and assembly language, and typing machine code out of books so I could use Double Hi-Res graphics, since it wasn't supported by BASIC, and doing my own assembly language programming.
I stuck with C and C++ as my bread and butter from 1996-2011 with other languages in between.
I don’t miss “coding” because of AI. My vision has been larger than what I could do myself without delegating for over a decade - before LLMs.
"Coding", and later coordinating with people (dotted line) reporting to me, was a necessary evil until a year or two ago, just to see my vision go to implementation.
I absolutely love this new world. For loops and while loops and if statements don't excite me in my 50s. Seeing my vision come to life faster than I ever could before, and having it well architected, does.
I love talking to "the business", solving XY problems, and getting to a solution 3x faster.
Late 30s here, I have seen:
* dial-up being replaced by DSL
* CAT being replaced with fiber for companies
* VOIP replacing bulk PBX
* Cloud replacing on-prem to an extent
* The cloud-services plague, now called SaaS
* License for life being replaced by subscription
* AI driving everything to shit literally
The technology is no longer helping anything; it is actually tearing our society apart. Up to the 2000s, things were indeed evolution, improvements, a better lifestyle, be it personal or professional. Since the 2000s, enshittification started; everything gets worse, from services, to workflows, to processes, to products, to laws.
Gen-Z does not realize how bad things are, and how we are no longer becoming smarter but dumber, kids cannot even read but have every single social media account.
If they could spend one day back in early 2000s, the current generation would start a civil war in every single city across the globe.
idk, I'm loving the newness of all of it. I feel more empowered than ever before, like it's my time. Before, startups would take like a year to get going; now it's like a month or so. It's exciting and scary, we have no idea where it's going. Not boring at all. I was getting bored as shit and bam, now I can dream up shit quick and have it validated too. Ya, I figured that out with an MCP, so ya, this is my jam. Program MCPs and speed it up!!!!!!
Well yes it has changed. But look at everything that can be accomplished with these abstractions/libraries/frameworks that exist.
Why reinvent the wheel?
Yes, there might be less room for the Wild Wild West approach, as mentioned in the article. But that is the structure of compounded knowledge/tooling/code available to developers/others to create more enriched software, in the sense that it runs on what is available now and provides value in today's age of computing.
I also had a 486DX2-66. And I recall coding in Assembly, Pascal, C etc.
I do not miss it. These days I can create experiences that reach so many more people (a matured Internet with realtime possibilities, to simplify) and with so much more potential for Good. Good in the sense of usefulness for users, good in the sense of making money (yeah, that aspect still exists).
I do understand your sentiment and the despairing tone. There have been times when I was struck by the same.
But I do not miss 1995 and struggling with a low-level formatted HD and Assembly that screwed up my floppy disks, or the worms that reached my box, or the awful web sites in terms of UX that were around, or pulling coaxial cables around for LAN parties.
It's just a different world now. But I get what you are saying, and respect it. Stay optimistic. :)
I'm a few years behind you. I got started on my uncle's handed down vic 20 in the late 80s.
The culture change in tech has been the toughest part for me. I miss the combination of curiosity, optimism, creativity, and even the chaos that came with it. Nowadays it's much harder to find organizations like that.
Some farmers probably lamented the rise of machines because they feared their strength would no longer be needed in the fields. These farmers were no doubt more concerned with their own usefulness as laborers than in the goals of the farm: to produce food.
If you program as labor, consider what you might build with no boss. You’re better equipped to start your own farm than you think.
50 myself, and started coding with a Commodore 64, but only really picked it up seriously with the advent of open source software, and that feeling of being able to dig around any component of the system I wanted to was exhilarating.
I think that's one of the biggest things that gives me pause about AI: the fact that, if they prove to be a big productivity boost, you're beholden to huge corporations, and not just for a one-time purchase, but on an ongoing basis.
Maybe the open source models will improve, but if progress keeps being driven by raw compute power and big numbers, it seems to tilt things very much in favor of those with lots and lots of capital to deploy.
I think the true genuinely-love-programming type of people will increasingly have to do what so many other people do, and that's separation of work and personal enjoyment. You might have to AI-architect your code at work, and hand code your toy projects on the weekend.
I prefer to see it as the automation of the IT age.
All other professions had their time when technology came and automated things.
For example wood carvers, blacksmiths, butchers, bakers, candlestick makers, etc. All of those professions have been mostly taken over by machines in factories.
I view 'ai' as new machines in factories for producing code. We have reached the point where we have code factories which can produce things much more efficiently and quickly than any human can alone.
Where the professions still thrive is in the artisan market. There is always demand for hand crafted things which have been created with love and care.
I am hoping this stays true for my coding analogy. Then people who really care about making a good product will still have a market from customers who want something different from the mass produced norm.
I can share a similar experience: I began to learn programming during my first school years, on an Apple II clone with Logo, a fancy language with turtle graphics as a most distinctive feature. We used to boot Logo off 5.25" floppy disks...
It's turned from SimCity into SimSimCity. It's like playing a simulation where you manage a person who's playing SimCity.
> The feedback loop has changed. The intimacy has gone. The thing that kept me up at night for decades — the puzzle, the chase, the moment where you finally understand why something isn’t working — that’s been compressed into a prompt and a response
It's so strange to read, because for me it's never been more fun to make software; it's especially never been easier for an individual. The boring parts are being automated so I can work on the bespoke and artistic parts. The feedback loop to making something nice and workable is getting shorter. The investigation tools for profiling and pinpointing performance bottlenecks are better than ever, and Claude is just one new part of it.
I have given the topic some thought. I concluded that the ONLY way for ordinary people (non-genius, IQ <= 120) to be really good, to get really close to the geniuses, is to sit down and condense the past 40 or so years' tech history of three topics (Comp-Arch, OS and Compiler) into 4-5 years of self-education.
Such an education is COMPLETELY different from the one they offered in school, but closer to those offered in premium schools (MIT/Berkeley). Basically, I'd call it "software engineering archaeology". Students are supposed to take on ancient software, compile it, and figure out how to add new features.
For example, for the OS kernel branch:
- Course 0: MIT xv6 lab, then figure out which subsystem you are interested in (fs? scheduler? drivers?)
- Course 0.5: System programming for modern Linux and NT, mostly to get familiar with user space development and syscalls
- Course 1: Build Linux 0.95, run all of your toolchains in a docker container. Move it to 64-bit. Say you are interested in fs -- figure out the VFS code and write a couple of fs for it. Linux 0.95 only has Minix fs so there are a lot of simpler options to choose from.
- Course 2: Maybe build a modern Linux, like 5.9, and then do the same thing. This time the student is supposed to implement a much more sophisticated fs, maybe something from SunOS or WinNT that was not there.
- Course 3 & 4: Do the same thing with leaked NT 3.5 and NT 4.0 kernel. It's just for personal use so I wouldn't worry about the lawyers.
For reading, there are a lot of books about Linux kernels and NT kernels.
> They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.
yup.
So tired of this sort of complaint (and I'm 62).
The computing the author enjoyed/enjoys is still out there, they are just looking for it in all the wrong places. Forget about (typical) web development (with its front and backend stacks). Forget about windows and macOS, and probably even mobile (though maybe not).
Hobby projects. C++/Rust/C/Go/some-current-Lisp. Maybe even Zig! Unix/Linux. Some sort of hardware interaction. GPL, so you can share and participate in a world of software created by people a lot more like you and a lot less like Gates and Jobs and Zuckerberg and ...
Sure, corporate programming generally tends to suck, but it always did. You can still easily do what you always loved, but probably not as a job.
At 62, as a native desktop C++ app developer doing realtime audio, my programming is as engrossing, cool, varied and awesome as it has ever been (probably even more so, since the GPL really has won in the world I live in). It hasn't been consumed by next-new-thing-ism, it hasn't been consumed by walled platforms, it hasn't been taken over by massive corporations, and it still very much involves Cool Stuff (TM).
Stop whining and start doing stuff you love.
I'm ~40ish, mid-career, and not in management. I envy this author; whatever joy he found in solving little puzzles and systems was extinguished in me very early in my career in an intense corporate environment. I was never one to love fussing much with code, but I do love solving system-scale problems, which also involve code. I don't feel I am losing anything; the most annoying parts of the code I deal with are now abstracted into human language and specs, and I can now architect/build more creatively than before. So I am happy. But I was one of those types that never had a true passion for "code", and I have met plenty of people that do have that, and I feel for them. I worry for people that carved out being really good at programming as a niche, but you reach a point in your career where that becomes much less important than being able to execute and define requirements and understand business logic. And yeah, that isn't very romantic or magical, but I find passion outside of what pays my bills, so I lost that ennui feeling a while ago.
I'm roughly the same (started at 9, currently 48), but programming hasn't really changed for me. What's changed is me having to have pointless arguments with people who obviously have no clue what they're talking about but feel qualified either because:
a) They asked an LLM
b) "This is what all our competitors are doing"
c) They saw a video on Youtube by some big influencer
d) [...insert any other absurd reason...]
True story:
In one of our recent Enterprise Architecture meetings, I was lamenting the lack of a plan to deal with our massive tech debt, and used an example of a 5000-line regulatory reporting stored procedure written 10 years ago that no one understood. I was told my complaint was irrelevant because I could just dump it into ChatGPT and it would explain it to me. These are words uttered by a so-called Senior Developer, in an Enterprise Architecture meeting.
I am in a very similar boat, age- and experience-wise. I would like to work backward from the observation that there are no resource constraints and we're collectively hopelessly lost up the abstraction Jenga tower.
I observe that the way we taught math was not oriented around the idea that everyone would need to know trigonometric functions or how to do derivatives. I like to believe the math curriculum was centered on standardizing a system of thinking about math, so that those of us who were serious about our educational development would all speak the same language. It was about learning a language and laying down processes that everyone else could understand. And that shaped us, and it's foolish to challenge or complain about that or, God forbid, radically change the way we teach math subjects, because it damages our ability to think alike. (I know the above is probably completely idealistic, verging on personal myth, but that's how I choose to look at it.)
In my opinion, we never approached software engineering the same way. We were so focused on the compiler and the type calculus, and we never taught people about what makes code valuable and robust. If I had FU money to burn today, I'd start a Mathnasium-style company focused on making kids into systems integrators with great soft skills and the ability to produce high-quality software. I would pitch this business under the assumption that the Jenga tower is going to be collapsing pretty much continuously for the next 25-50 years, and civilization needs absolute unit super-developers coming out of nowhere who will be able to make a small fortune helping companies dig their way out of 75 years of tech debt.
> I started programming when I was seven because a machine did exactly what I told it to, felt like something I could explore and ultimately know, and that felt like magic
I'm significantly younger than OP, but this was it for me too. I'm autistic and found the world around me confusing growing up. Computers were wonderful because they were the only thing that really made sense to me.
I was obsessed with computers since I was 5. I started programming probably around age 10. Then in my early teens I started creating Flash applications, writing PHP, Java, etc...
When I look back on my early career now it was almost magical. This in the mid to late 00s (late to some I know), but this was before the era of package managers, before resources like Stackoverflow, before modern IDEs. You had some fairly basic frameworks to work with, but that was really about it. Everything else had to be done fully by hand.
This was also before agile was really a thing too. The places I worked at the time didn't have stand-ups or retrospectives. There were no product managers.
It was also before the iPhone and the mass adoption of the internet.
Back then no one went into software engineering as a profession. It was just some thing weird computer kids did, and sometimes businesses would pay us to build them things. Everyone who coded back then I got along with great; now everyone is so normal it's hard for me to relate with them. The industry today is also so money-focused.
The thing that bothers me the most, though, is that computers increasingly act like humans that I need to talk to to get things done, and if that wasn't bad enough I also have to talk with people constantly.
Even the stuff I build sucks. All the useful stuff has been built, so in the last decade or so the stuff I've built feels increasingly detached from reality. When I started I felt like I was solving real practical problems for companies; now I'm building chatbots and internal dashboards. It's all bollocks.
There was a post recently about builders vs coders (I can't remember exactly). But I'm definitely a coder. I miss coding. There was something rewarding about pouring hours into an HTML design, getting things pixel perfect. Sometimes it felt laborious, but that was part of the craft. Claude Code does a great job and it does it 50x faster than I could, but it doesn't give me the same satisfaction.
I do hope this is my last job in tech. Unfortunately I'm not old enough to retire, but I think I need to find something better suited to my programmatic way of thinking. I quite like the idea of doing construction or some other manual labour job. Seems like they're still building things by hand and don't have so many stupid meetings all the time.
You can still have fun programming. Just sit down and write some code. Ain't nobody holding a gun to your head forcing you to use AI in your projects.
And the part of programming that wasn't your projects, whether back in the days of TPS reports and test coverage meetings, or in the age of generative AI, that bit was always kinda soul draining.
"Over four decades I’ve been through more technology transitions than I can count. New languages, new platforms, new paradigms. CLI to GUI. Desktop to web. Web to mobile. Monoliths to microservices. Tapes, floppy discs, hard drives, SSDs. JavaScript frameworks arriving and dying like mayflies."... made me think of
I've seen things you people wouldn't believe. Attack ships on fire off the shoulder of Orion. I watched C-beams glitter in the dark near the Tannhäuser Gate. All those moments will be lost in time, like tears in rain. Time to die.
Where we came from and where we're going: this whole time in my career those things have been kind of hard to pinpoint. Abstraction is killing us for sure. Time to market above all else. It's no wonder software in cars, appliances and medical equipment is a factor that is killing people.
This is quite the lament. Very well written.
I'm about ten years ahead of the author. I felt this a long time before AI arrived. I went from solving problems for people to everything I tried to ending up in an endless grind of yak-shaving.
I worked my way through it, though. It made me both give up programming, at least in the commercial sense, and appreciate the journey he and I have gone through. It's truly an amazing time to be alive.
Now, however, I'm feeling sucked back into the vortex. I'm excited about solving problems in a way I haven't been in a long time. I was just telling somebody that I spent 4-6 hours last night watching Claude code. I watched TV. I scratched my butt. I played HexaCrush. All the time it was just chugging along, solving a problem in code that I have wanted to solve for a decade or more. I told him that it wasn't about watching the code go by. That would be too easy to do. It was paying attention to what Claude was doing and _feeling that pain_. OMG, I would see it hit a wall, I would recognize the wall, and then it'd just keep chugging along until it fixed it. It was the kind of thing that didn't have a damned thing to do with the problem but would have held me up for hours. Instead, I watched Pitt with my wife. Every now and then I'd see a prompt pop up, and I'd guide/direct/orchestrate/consult/? with Claude.
It ain't coding. But, frankly, coding ain't coding. It hasn't been in a long, long time.
If a lot of your job seems like senseless bullshit, I'm sad to say you're on the way out. If it doesn't, stick around.
I view AI as an extinction-level threat. That hasn't changed, mainly because of how humans are using it. It has nothing to do with the tech. But I'm a bit perplexed now as to what to do with my new-found superpowers. I feel like that kid in the first Spiderman movie. The world is amazing. I've got half-a-dozen projects I'm doing right now. I'm publishing my own daily newspaper, just for me to read, and dang if it's not pretty good! No matter how this plays out, it is truly an amazing time to be alive, and old codgers like us have had a hella ride.
I too have felt these feelings (though I'm much younger than the author). I think as I've grown older I have to remind myself
1. I shouldn't be so tied to what other people think of me (craftsman, programmer, low level developer)
2. I shouldn't measure my satisfaction by comparing my work to others'. Quality still matters, especially in shared systems, but my responsibility is to the standards I choose to hold, not to whether others meet them. Plus there are still communities of people that care about this (Handmade Network, OpenBSD devs, languages like Odin) that I can be part of if I want to
3. If my values are not being met either in my work or personal life I need to take ownership of that myself. The magic is still there, I just have to go looking for it
Is there some magic lost also when using AI to write your blog post?
Well-written and it expresses a mood, a feeling, a sense of both loss and awe. I was there too in the 8-bit era, fully understanding every byte of RAM and ROM.
The sense of nostalgia that can turn too easily into a lament is powerful and real. But for me this all came well before AI had become all-consuming... It's just the latest manifestation of the process. I knew I didn't really understand computers anymore, not in the way I used to. I still love coding and building but it's no longer central to my job or life. It's useful, I enjoy it, but at the same time I also marvel at the future that I find myself living in. I've done things with AI that I wouldn't have dared to start for lack of time. It's amazing and transformative and I love that too.
But I will always miss the Olden Days. I think more than anything it's the nostalgia for the 8-bit era that made me enjoy Stranger Things so much. :)
I found that feeling again while building a game on the EVM. All of the constraints were new and different. Solidity feels somewhere between a high- and a low-level language: not as abstracted as most popular languages today, but a solid step above writing assembly.
A lot of people started building projects like mine when the EVM was newer. Some managed to get a little bit of popularity, like Dark Forest. But most were never noticed. The crypto scene has distracted everyone from the work of tinkerers and artists who just wanted to play with a new paradigm. The whole thing became increasingly toxic.
It was like one last breath of fresh cool air before the pollution of AI tools arrived on the scene. It's a bittersweet feeling.
maybe we just change, honestly. i think when i was younger there was nothing to lose, time felt unlimited, no "career" to gamble with, no billion dollar idea, just learning and tinkering and playing with whatever was out there because it was cool and interesting to me. in some respects i miss that.
not sure how that relates to llms, but they do become an unblocker to regain some of that "magic". also i know a deep dive requires an investment i cannot shortcut.
the new generation of devs are already playing with things few dinosaurs will get to experience fully, having sunk decades into the systems we built and being afraid to let go. some of that is good (leaning on experience) and some of it is holding us back.
Fantastic Article, well written, thoughtful. Here are a couple of my favorite quotes:
To relate to the author: with a lot of what's going on I feel the same, but about other parts I feel differently than they do. There appears to be a shallowness to this... yes, we can build faster than ever, but for so much of what we are building we should really be asking ourselves why we have to build it at all. It's like sitting through the meeting that could have been an email, or using hand tools for 3 hours because the power tool purchase/rental is just obscenely expensive for the ~20min you need it.
I'm 55 and I started at age 13 on a TI-99/4A, then progressed through Commodore 64, Amiga 2000, an Amiga XT Sidecar, then a real XT, and on and on. DOS, Windows, Unix, the first Linux. I ran a tiny BBS and felt so excited when I heard the modem singing from someone dialing in. The first time I "logged into the Internet" was to a Linux prompt. Gopher was still a bigger thing than the nascent World-Wide Web.
The author is right. The magic has faded. It's sad. I'm still excited about what's possible, but it'll never create that same sense of awe, that knowledge that you can own the entire system from the power coming from the wall to the pixels on your screen.
> They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.
But sure. AI is the moment they lost track of what’s happening.
I feel this is conflating different things. Yes, the abstraction tower was massive already before, but at least the abstractions were mostly well-defined and understandable through interfaces: even if you don't understand the intricacies of your storage device, driver and kernel, you can usually form a quite reliable and predictable mental representation of how files work (see the sketch below). Same goes for network protocols, higher-level programming languages or the web platform.
Sure, there are edge cases where the abstraction breaks down and you have to get into the lower levels, but those situations are the exception, not the norm.
With AI, there is no clearly defined interface, and no one really knows what (precise) output a given input will produce. Or maybe to put it better, the interface is human language, and your mental representation is the one you have talking to a human - which is far more vague than previous technical abstractions.
On the bright side, at least we (still) have the intermediate layer of generated code to reason about, which offsets the unpredictability a bit.
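To make the parent's point concrete, here is the kind of well-defined interface being contrasted with a prompt. A minimal sketch in plain C (the path is just an example): the same calls behave predictably whether the bytes live on an SSD, an NFS mount, or tmpfs, and that stability is exactly what a natural-language interface does not give you:

    /* The POSIX file abstraction as a stable, predictable interface. */
    #include <fcntl.h>
    #include <stdio.h>
    #include <unistd.h>

    int main(void)
    {
        char buf[64];
        int fd = open("/etc/hostname", O_RDONLY);   /* illustrative path */
        if (fd < 0) {
            perror("open");
            return 1;
        }
        /* Contract: read() returns at most sizeof buf - 1 bytes here,
           regardless of what device or driver sits underneath. */
        ssize_t n = read(fd, buf, sizeof buf - 1);
        if (n > 0) {
            buf[n] = '\0';
            printf("%s", buf);
        }
        close(fd);
        return 0;
    }

The same handful of calls has meant the same thing for decades; nobody has to wonder what "temperature" the kernel was running at when it answered.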
I've had the same journey, same age markers. The sentiment is the same, but at the same time this new world affords me super powers I'm currently drunk on. When that drunkenness becomes a hangover I hope I won't be disappointed.
You can still write code yourself. Just like you can still walk to work, you do not need to use a car.
Yeah, I could use Cursor or whatever but I don't; I like writing code. I guess that makes me a luddite or something, although I still develop agents. I enjoy architecting things (I don't consider myself an architect); I'm talking about my hobby hardware projects.
The irony is that you could still code the way you always did, where you control every pixel. Nothing is stopping you.
But you would not be able to make anything anywhere near as complex as you can with modern tools.
I know exactly how you feel. I don't know how many hours I sat in front of this debugger (https://www.jasik.com) poking around and trying to learn everything at a lower level. Now it's so different.
Was this text run through LLM before posting? I recognize that writing style honestly; or did we simply speak to machines enough to now speak like machines?
Oh boy this hits home.
At this point I've entered survival mode, and I'm curious to see where we will be 6 months, 2 years from now. I am pessimistic.
I want to tinker with my beloved Z80 again.
> I wrote my first line of code in 1983. I was seven years old, typing BASIC into a machine that had less processing power than the chip in your washing machine
I think there may be a counterpoint hiding in plain sight here: back in 1983 the washing machine didn't have a chip in it. Now there are more low-level embedded CPUs and microcontrollers to develop for than before, but maybe it's all the same now. Unfathomable levels of abstraction, uniformly applied by language models?
Same, but it changed when I was 17 and again when I was 27 and then 37 and so on. It has always been changing dramatically, but this latest leap is just so incredibly different that it seems unique.
Cool, at 7? I started at 9 and I'm 53 now. And Claude does all the things. Need to get adjusted to that though. Still not there.
Last year I found out that I always was a creator, not a coder.
Started coding when I was 14; sold my first bit of code at 17, which was written in 6502 assembler.
40+ years later, been through many BASICs, C, C++ (CFront on onwards) and now NodeJS, and I still love writing code.
Tinkering with RPi, getting used to having a coding assistant, looking forward to having some time to work on other fun projects and getting back into C++ sooooon.
What's not to love?
I think it's the loss of control.
Even if you can achieve awesome things with LLMs you give up the control over tiny details, it's just faster to generate and regenerate until it fits the spec.
But you never quite know how long it takes or how much you have to shave that square peg.
Did hardware engineers back in the 1970s-80s* think that software took the joy out of their craft? What do those engineers now think in retrospect?
*I'm picking that era because it seems to be when most electronic machines' business logic moved from hardware to software.
I'm 46 but same. I'm not quite as melancholy about it, but I do feel a lot of this.
I retired a few years ago and it's very clear that was a good thing.
Are you me?
I'm 49.... Started at 12... In the same boat
My first 286 machine had a CMOS battery that was loose, so I had to figure that out to make it boot into MS-DOS
This time it does feel different, and while I'm using AI more than ever, it feels soulless and empty even when I 'ship' something
> Cheaper. Faster. But hollowed out.
Given the bazillions poured into it I have yet to see this proven to be cheaper.
This is at least partially AI-written, by the way
I'm 43. Took a year or so off from contracting after being flat out for years without taking any breaks, just poked around with some personal projects, did some stuff for my wife's company, petitioned the NHS to fix some stuff. Used Claude Code for much of it. Travelled a bit too.
I feel like I turned around and there seem to be no jobs now (500+ applications deep is a lot when you've always been given the first role you'd applied to) unless you have 2+ years commercial AI experience, which I don't, or perhaps want to sit in a SOC, which I don't. It's like a whole industry just disappeared while I had my back turned.
I looked at Java in Google Trends the other day, it doesn't feel like it was that long ago that people were bemoaning how abstracted that was, but it was everywhere. It doesn't seem to be anymore. I've tried telling myself that maybe it's because people are using LLMs to code, so it's not being searched for, but I think the game's probably up, we're in a different era now.
Not sure what I'm going to do for the next 20 years. I'm looking at getting a motorbike licence just to keep busy, but that won't pay the bills.
The deepest thing I read from HN in months. Respect.
> the VGA Mode X tricks in Doom
Doom does not use mode-X :P ! It uses mode-Y (see the sketch below).
That being said as a 47 years old having given 40 years to this thing as well, I can relate to the feeling.
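For anyone who never unchained the VGA, the distinction is concrete: Mode Y is mode 13h with chain-4 turned off, giving planar 320x200 with enough VRAM for the page flipping Doom relied on, while Mode X proper also reprograms CRTC timings for 320x240. A sketch in DOS-era C (Borland-style outportb/int86, so it assumes a real-mode DOS toolchain):

    /* Set "Mode Y": 320x200x256, unchained to 4 planes / 4 pages. */
    #include <dos.h>

    #define SC_INDEX   0x3C4  /* VGA sequencer index port       */
    #define CRTC_INDEX 0x3D4  /* VGA CRT controller index port  */

    void set_mode_y(void)
    {
        union REGS r;
        r.x.ax = 0x0013;                  /* BIOS: set mode 13h      */
        int86(0x10, &r, &r);

        outportb(SC_INDEX, 0x04);         /* Memory Mode register    */
        outportb(SC_INDEX + 1, 0x06);     /* turn chain-4 off        */
        outportb(CRTC_INDEX, 0x14);       /* Underline Location reg  */
        outportb(CRTC_INDEX + 1, 0x00);   /* doubleword mode off     */
        outportb(CRTC_INDEX, 0x17);       /* Mode Control register   */
        outportb(CRTC_INDEX + 1, 0xE3);   /* byte addressing mode    */
    }

    void put_pixel(int x, int y, unsigned char color)
    {
        unsigned char far *vram = (unsigned char far *)MK_FP(0xA000, 0);
        outportb(SC_INDEX, 0x02);             /* Map Mask register   */
        outportb(SC_INDEX + 1, 1 << (x & 3)); /* pick 1 of 4 planes  */
        vram[y * 80 + (x >> 2)] = color;      /* 80 bytes per row    */
    }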
Great post. Good to see someone posting something positive for a change about the shift in development.
It'd be more strange if the thing you learned 43 years ago was exactly the same today. We should expect change. When that change is positive we call it progress.
I am younger than the author but damn this somehow hit me hard. I do remember growing up as a kid with a 486...
I think more than ever programmers need jobs where performance matters and the naive way the AI does things doesn't cut it. When no one cares about anything other than correctness, your job turns into AI slop. The good news right now is that AI tends to produce code that AI itself struggles to work with, so large-scale projects often descend into crap. You can write a C compiler for $20,000 with an explosive stack of agents, but that C compiler isn't anywhere close to efficient or performant.
As model costs come down, that $20,000 will become a viable number for entirely AI-generated coding. So more than ever you don't want to be doing work that the AI is good enough at. Either find jobs where performance matters, or be able to code the stack of agents needed to produce high-quality code in an application context.
I don't know what these people from our now traditional daily lamentation session are coding where Claude can do all the work for them just with a few prompts and minimal reviews.
Claude is a godsend to me, but fuck, it is sometimes dumb as a door, loves to create regressions, and is a fucking terrible designer. Small, tiny changes? Those are actually the worst: it is easy for Claude, on the first setback, to decide to burn the whole world and start from zero again. Not to mention when it gets stuck in an eternal loop where it increasingly degenerates the code.
If I care about what I deliver, I have to actively participate in coding.
I'm 47 and excited to live in a time of the most important innovation since the printing press.
A bit younger, and exact opposite. Probably the most excited I've ever been about the state of development!
Abstractions can take away but many add tremendous value.
For example, the author has coded for their entire career on silicon-based CPUs but never had to deal with the shittiness of wire-wrapped memory, where a bit-flip might happen in one place because of a manufacturing defect and good luck tracking that down. Ever since lithography and CPU packaging, the CPU is protected from the elements; its thermal limits are well known, computed ahead of time, and baked into thermal management so it doesn’t melt but still goes as fast as we understand to be possible for its size. We make billions of these every year, and we’ve been making them for over 50 years.
Moving up the stack you can move your mouse “just so” and click, no need to bit-twiddle the USB port (and we can talk about USB negotiation or many other things that happen on the way) and your click gets translated into an action and you can do this hundreds of times a day without disturbing your flow.
Or javascript jit compilation, where the js engine watches code run and emits faster versions of it that make assumptions about types of variables - with escape hatches if the code stops behaving predictably so you don’t get confusing bugs that only happen if the browser jitted some code. Python has something similar. Thanks to these jit engines you can write ergonomic code that in the typical scenario is fast enough for your users and gets faster with each new language release, with no code changes.
Let’s talk about the decades of research that went into autoregressive transformer models, instruction tuning, and RLHF, and then chat harnesses. Type to a model and get a response back, because behind the scenes your message is prefixed with “User: “, triggering latent capabilities in the model to hold its end of a conversation. Scale that up and call it a “low key research preview” and you have ChatGPT. Wildly simple idea, massive implications.
These abstractions take you further from the machine and yet despite that they were adopted en masse. You have to account for the ruthless competition out there - each one would’ve been eliminated if they hadn’t proven to be worth something.
You’ll never understand the whole machine so just work at the level you’re comfortable with and peer behind the curtain if and when you need (eg. when optimizing or debugging).
Or to take a moment to marvel.
Same as assembly programmers felt when C came along I guess
As someone who has always enjoyed designing things, but was never really into PUZZLES, I always felt like an outsider in the programming domain. People around me really enjoyed the "fun" of programming, whereas I was more interested in the Engineering of the thing - balancing tradeoffs until within acceptable margins and then actually calling it "DONE". People around me rarely called things "done", they rewrote it and rewrote it so that it kept satisfying their need for puzzle-solving (today, it's Ruby, tomorrow, it's rewritten in Scala, and the day after that, it's Golang or Zig!)
I feel that LLMs have finally put the ball in MY court. I feel sorry for the others, but you can always find puzzles in the toy section of the bookstore.
"They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of."
and they still call themselves 'full stack developers' :eyeroll:
'It’s not a “back in my day” piece.'
That's exactly what it is.
I have been around for a similar amount of time. Another change I have seen over the years is the shift from programming being an exercise in creative excellence at work to being a white-collar ditch-digging job.
yeah coding is a lot more fun and useful now
At least parts of this were written with AI
>"The abstraction tower
Here’s the part that makes me laugh, darkly.
I saw someone on LinkedIn recently — early twenties, a few years into their career — lamenting that with AI they “didn’t really know what was going on anymore.” And I thought: mate, you were already so far up the abstraction chain you didn’t even realise you were teetering on top of a wobbly Jenga tower.
They’re writing TypeScript that compiles to JavaScript that runs in a V8 engine written in C++ that’s making system calls to an OS kernel that’s scheduling threads across cores they’ve never thought about, hitting RAM through a memory controller with caching layers they couldn’t diagram, all while npm pulls in 400 packages they’ve never read a line of.
But sure. AI is the moment they lost track of what’s happening.
The abstraction ship sailed decades ago. We just didn’t notice because each layer arrived gradually enough that we could pretend we still understood the whole stack.
AI is just the layer that made the pretence impossible to maintain."
Absolutely brilliant writing!
Heck -- absolutely brilliant communicating! (Which is really what great writing is all about!)
You definitely get it!
Some other people here on HN do too, yours truly included in that bunch...
Anyway, stellar writing!
Related:
https://www.joelonsoftware.com/2002/11/11/the-law-of-leaky-a...
https://en.wikipedia.org/wiki/Tower_of_Babel
https://en.wikipedia.org/wiki/Abstraction_(computer_science)
https://en.wikipedia.org/wiki/Abstraction
https://ecommons.cornell.edu/entities/publication/3e2850f6-c...
I was happy riding my horse when this dude invented a car.
>But sure. AI is the moment they lost track of what’s happening. The abstraction ship sailed decades ago.
Bullshit. While abstraction has increased over time, AI is no mere incremental change. And almost-natural-language interaction with an agent is not the same as TypeScript over assembly (not to mention you could very well write C or Rust and the like, and know most of the details of the machine by heart; and no, microcode and low-level abstractions are not a real counter-argument to that). Even less so if agents turn autonomous and you just herd them to completion.
Programming changed all along.
New concepts came out all along.
They became standardized all along and came down market to smaller and smaller projects.
Source control.
Cloud.
Agile/Scrum.
Code completion IDEs.
Higher Level languages.
These were not LLMs but did represent a shift that had to be kept up with.
LLMs are no different, just a bigger jump.
There is just as much opportunity here.
Software development and software developers are not going away.
More software that never could be built will now be built.
For the foreseeable future there will always be software that needs to be overseen by a human.
Humans have a special knack for taking the humanity out of basically anything. It's a bizarre pattern.
I have the opposite take. There’s nothing stopping you from jumping into any component to polish things up. You can code whatever you wish. And AI takes away nearly all of the drudgery: boilerplate, test cases, inspecting poor documentation, absurd tooling.
It also lets me focus more on improving things since I feel more liberated to scrap low quality components. I’m much braver to take on large refactors now – things that would have taken days now take minutes.
In many ways AI has made up for my growing lack of patience and inability to stay on task until 3am.
I've written sse2 optimized C, web apps, and probably everything in between (hw, datasci, etl, devops).
I like coding with AI both vibe and assisted, since as soon as the question enters my head I can create a prototype or a test or a xyz to verify my thoughts. The whole time I'm writing in my notebook or whiteboard or any other thing I would have gotten up to. This is enabling tech, the trouble for me is there is a small thread that leads out of the room into the pockets of billion dollar companies.
It is no longer you vs the machine.
I have spent tons of time debugging weird undocumented hardware with throwaway code, or sat in a debugger doing hex math.
I think one wire that is crossed right now in this world is that computing is more corporate than ever, with what seems like ever growing platforms and wealth extraction at scale. Don't let them get you down, host your own shit and ignore them. YES IT WILL COST MORE -> YOUR FREEDOM HAS A PRICE.
Another observation is that people that got into the game for pure money are big mad right now. I didn't make money in the 00s, I did in the end of the 10s, and we're back at job desolation. In my groups, the most annoyed are code boot campers who have faked it until they made it and have just managed to survive this cycle with javascript.
Cycles come and go, the tech changes, but problem solving is always there.
It's not like it's changing by itself, you can always opt out of the slop race and scratch your itches instead.
https://gitlab.com/codr7/rem
“... when I was 7. I'm 50 now and the thing I loved has changed”
Welcome to the human condition, my friend. The good news is that a plurality of novels, TV shows, country songs, etc. can provide empathy for and insight into your experience.
I'm 57 and wrote my first line of BASIC in 1980, so while I can still chime in on this specific demographic, I feel that I ought to. I'm like this guy, but like a lot of other people in my specific demographic we aren't writing these long melancholy blog posts about AI, because it's not that big of a deal.
As an OSS maintainer most of my work is a lot of boring slog: adding features to libraries to suit new features in upstream dependencies, nitpicky things people point out, new docs, tons of tedium. Claude helps a ton with all of that. No way is Claude doing the real architectural puzzle stuff; that's still fully on me! I can just use Claude to help implement it. It's like the ultimate junior programmer assistant. It's certainly a new, different and unique experience in one's programming career, but it really feels like another tool, like an autocomplete or code refactoring tool that is just a lot better, with similar caveats.
I mean, in my career I've had to battle the whole time people who don't "get" source code control (starting with me), who don't "get" IDEs (starting with me), people who don't "get" distributed version control (same), people who don't "get" ORMs (oh yes, same for me, though this one I took much more dramatic steps to appreciate), people who don't "get" code formatters. Now we're battling people who don't "get" LLMs used for coding. In that sense the whole thing doesn't feel like that novel of a situation.
It's the LLMs that are spitting out fake photos and videos and generating lots of shitty graphics for local businesses; that's where I'm still wielding a pitchfork...
The irony of these "My craft is dead" posts is that they consistently, heavily leverage AI for their writing. So you're crying about losing one craft to AI while using AI to kill another. It's disingenuous. And yes it is so damn obvious.
There's 3-4 of these posts a day - why don't people spend more time hand-building things for fun in their free time? That's what led a lot of us to this career path to start with. I have a solid mix of hand-code and AI-assisted projects in my free time.
>>The machines I fell in love with became instruments of surveillance and extraction.
Surveillance and Extraction
"We were promised flying cars", and what we got was "investors" running the industry off the cliff into cheap ways to extract money from people instead of real innovation.
> I started programming when I was seven because a machine did exactly what I told it to
What a poetic ending. So beautiful! And true, in my experience.
This isn't new. It's the same feeling the first commercial programmers had working in assembly, or machine code, once compilers became available. Ultimately I think even Mel Kaye forsook being able to handpick memory locations for optimum drum access before his retirement, in favor of being able to build vastly more complex software than before.
AI has just vastly extended your reach. No sense crying about it. It is literally foolish to lament the evolution of our field into something more.
Programming is dead. In the last 4 days I've done 2 months of work. The future is finally here.
Bad times to be a programmer. Start learning business.
I'm 57. I was there when the ZX81 came out.
I had my first paid programming job when I was 11, writing a database for the guy that we rented our pirate VHS tapes from.
AI is great.
I don't program as a career, but I am also 50 and have been programming since the TRS-80. AI has transformed this era, and I LOVE IT! I can focus on making, and not on APIs or syntax or all of the bootstrapping.
Professional development is changing dramatically. Nothing stops anyone from coding "the old way," though. Your hobby project remains yours, exactly the way you want it. Your professional project, on the other hand, was never about you in the first place. It's always about the customer/audience/user, period full stop.
Please stop upvoting these posts. We have gotten to the point where both the front page and new page is polluted with these laments
It’s literally the same argument over and over and it’s the same comments over and over and over
HN will either get back to interesting stuff or simply turn into a support group for aging “coders” that refuse to adapt
I’m going to start flagging these as spam
same bud.
maybe that just means it's a maturing field and we gotta adapt?
yes, the promise has changed, but you still gotta do it for the love of the game. anything else doesn't work.
I’m 50 too and I’ve complained and yearned about the “old” days too, a lot of this is nostalgia as we reminisce about periods of time in our youth when we had the exuberance and time to play and build with technology of our own time
Working in AI startups strangely enough I see a lot of the same spirit of play and creativity applied to LLM based tools - I mean what is OpenClaw but a fun experiment
Those kids these days are going to reminisce about the early days of AI when prompts would be handwritten and LLMs would hallucinate
I’m not really sure 1983, 1993 or 2003 really was that much of a golden age, but we look at it with rose-colored glasses
11 and now 45. I am still interested in it, but I feel like in my 20s I would get a dopamine rush when a row showed up in a database. In my 30s I would get that only if a message passed through a system and updated on-screen analytics within 10 seconds. Thank god for LLMs, because all of it became extremely boring; I can't stand having to chase these little milestones at each new company or each new product I'm working on. At least with LLMs the dopamine hit comes from being in awe of the code that gets generated: realizing it found every model, every messaging system interface, every API, and figured out how to make it backwards compatible and update the UI. Something that would take half a day now takes 5 minutes or less.
> I’ve had that experience. And losing it — even acknowledging that it was lost
What are you talking about? You don't know how 99% of the systems in your own body work yet they don't confront you similarly. As if this "knowledge" is a switch that can be on or off.
> I gave 42 years to this thing, and the thing changed into something I’m not sure I recognise anymore.
Stop doing it for a paycheck. You'll get your brain back.
Old Man Yells at Clouds