This show captures much of what I miss about computing in the 80s and 90s. You could get your hands on the hardware and largely understand what all of the hardware and software was doing. You mostly used computers as tools, which only accepted commands and didn't try to influence your decisions or workflow (yes, there was Clippy). The leaps forward in computing power, memory, and storage were more impactful to the everyday user. There was a sense of wonder, and it didn't envelop your life and everyone else's. Most of all, we weren't yet slaves to our computers, and they weren't devices crafted to endlessly grab our attention by any means necessary.
greenbit
The Commodore PET 4032's video was generated by a 6545 (6845-equivalent) cathode ray tube controller, which produced the video buffer addresses and the HS and VS sync pulses. It was memory-mapped, and if you weren't careful with POKE commands, you could effectively stop the CRT raster scan, leaving the beam parked at the center of the screen. That could burn the phosphor off that spot in a matter of minutes. Not exactly HCF, but a similar vibe.
(The PET had its own monitor that, unlike common composite monitors of the era, apparently would not continue to scan when the sync went away)
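For anyone curious how a stray POKE could do that: the 6545/6845 family is programmed through a memory-mapped select/data register pair, so a couple of careless writes are enough. A rough C sketch of the mechanism, treating the commonly cited PET CRTC base address ($E880/$E881) and the specific registers poked as assumptions for illustration (and obviously not something to try on a real CRT):

    #include <stdint.h>

    /* 6545/6845-style CRTCs expose two ports: write a register number
       to the select port, then its value to the data port. $E880/$E881
       (decimal 59520/59521 for BASIC POKEs) is the commonly cited PET
       location -- an assumption here, for illustration only. */
    #define CRTC_SELECT ((volatile uint8_t *)0xE880)
    #define CRTC_DATA   ((volatile uint8_t *)0xE881)

    static void crtc_write(uint8_t reg, uint8_t value)
    {
        *CRTC_SELECT = reg;   /* pick one of the CRTC's internal registers */
        *CRTC_DATA   = value; /* ...and overwrite it */
    }

    int main(void)
    {
        /* Zeroing the vertical-timing registers (R4 vertical total,
           R7 vertical sync position) is the kind of careless write
           that can kill the sync and leave the beam parked -- the
           failure mode described above. */
        crtc_write(4, 0);
        crtc_write(7, 0);
        return 0;
    }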
burnte
It was a fun show. I really enjoyed it: a fictionalized run through the computing industry of the 80s and 90s.
indigodaddy
The complete series is at an all-time low on iTunes/Apple TV, $14.99:
https://www.cheapcharts.com/us/itunes/seasons/1745389594
So many AI comments. Spamming every post. Backed by AI accounts all with blogs that are less than a year old with 3-6 banal programming projects. WTF man.
kens
I'm calling urban legend on the story of an IBM 360 catching fire from an illegal opcode.
dreamcompiler
I learned the 6800 in college in Texas in the 80s, and it definitely had what we called an HCF instruction. I didn't remember the opcode until I read this article.
When the show came out I thought it must have been created by one of my classmates because the title is so arcane. Turns out it wasn't, but the show definitely captures the vibe of computing in Austin and Dallas in the 80s.
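For reference, the 6800's HCF is usually attributed to the undocumented opcodes $9D and $DD: executing one put the processor in a test mode where the address bus simply counted upward until reset, which made it a handy scope test at the bench. A toy C model of the observable behavior (illustrative only, not an emulator):

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        const uint8_t HCF = 0xDD;  /* $DD and $9D are the opcodes usually cited */
        uint16_t address_bus = 0;

        printf("fetched opcode $%02X -- halt and catch fire\n", HCF);

        /* After HCF the 6800 stops fetching instructions; the address
           bus just free-runs. Model a few ticks of that: */
        for (int tick = 0; tick < 5; tick++)
            printf("address bus: $%04X\n", address_bus++);

        printf("...and so on, wrapping at $FFFF, until reset\n");
        return 0;
    }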
jrmg
Love how many people here are thinking this is about (or just taking it as an opportunity to talk about) the under-appreciated TV show!
dbg31415
> I have never watched the AMC show Halt and Catch Fire…
Go watch it. Great show.
FireBeyond
I enjoyed it a lot. Certainly there's a lot of creative license, and there's a slight irony in a show trying to portray historical events having things like Windows 3.1 running on a SPARCstation 5, among countless other gaffes. But as someone who was of this era (maybe not so much season 1), I did love it. I actually only just got to watching it this year (and just started season 4 this week).
scar
There's such an annoying scene in the first episode of that show that kinda broke the immersion for me.
They introduced Cameron Howe as some sort of world-class hacker who could do anything, so one of her first scenes was her typing something... and type she did, one finger at a time.
I mean, wtf.
A world-class hacker who literally types one finger at a time, as if she had never used a keyboard before.
That scene nearly made me quit the show right there and then.
Whenever I see that actress in something else, I can't help but think back to how she couldn't even be bothered to learn how to type.
thisisauserid
This article is deadbeef on arrival.
JKCalhoun
Stories like these are what endear my chosen career to me.
I suspect they hooked me with "byte" and "nybble"… And it just got better the more immersed I got in the history, the Jargon File…
lloeki
In the realm of flammable computer parts and adjacent devices, there's the somewhat related "lp0 on fire":
https://en.wikipedia.org/wiki/Lp0_on_fire