The Megahertz Wars were an exciting time. Going from 75 MHz to 200 MHz meant that everything CPU-limited ran roughly 2.7x as fast (or better with architectural improvements).
Nothing since has packed nearly the same impact, with the exception of going from spinning disks to SSDs.
nunez
What a time to be a kid then.
We had a hand-me-down DEC x86 desktop at home with a Pentium II running at 233 MHz until I want to say 2002? This was around the time I learned how to build a PC since doing that was cheaper than buying one and no-one in my family had the money for that!
I saved whatever money I could to buy a 128MB stick of RAM from Staples (maybe it was 256MB?), a few other things from TigerDirect/Newegg and _this processor_. With some help from my uncle and a guide I printed from somewhere whose website started with '3D' (it was quite popular back then; I don't think it exists anymore), I got it done.
Going from 233 MHz to this was like going from walking to flying in a jet! Everything was SO MUCH F**ING FASTER. Windows XP _flew_. (The DEC barely made the minimum requirements for it, and boy did I feel it.) Trying to install Longhorn on it a year or two later brought me back into walking again, though. :D
hedora
The Athlon XP was the bigger milestone, as I remember it.
They were both "seventh generation" according to their marketing, but you could get an entire GHz+ Athlon XP machine for much less than half the $990 tray price from the article.
I distinctly remember the day work bought a 5 or 6 node cluster for $2000. (A local computer shop gave us a bulk discount and assembled it for them, so sadly, I didn't poke around inside the boxes much.)
We had a Solaris workstation that retailed for $10K in the same office. Its per-core speed was comparable to one Athlon machine, so the cluster ran circles around it for our workload.
Intel was completely missing in action at that point, despite being the market leader. They were about to release the Pentium 4, and didn't put anything decent out from then to the Core 2 Duo. (The Pentium 4 had high clock rates, but low instructions per cycle, so it didn't really matter. Then AMD beat Intel to market with 64 bit support.)
I suspect history is in the process of repeating itself. My $550 AMD box happily runs Qwen 3.5 (32B parameters). An Nvidia board that can run that costs more than 4x as much.
Sharlin
The i486DX 33MHz was introduced in May 1990. A 30x increase, or about five doublings, in clock speeds over ten years. That's of course not the whole truth; the Athlon could do much more in one cycle than the 486. In any case, in 2010 we clearly did not have 30GHz processors – by then, the era of exponentially rising clock speeds was very decidedly over. I bought an original quadcore i7 in 2009 and used it for the next fifteen years. In that time, roughly one doubling in the number of cores and one doubling in clock speeds occurred.
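The arithmetic here is easy to sanity-check; a quick sketch, using the dates and clock speeds quoted above:

```python
# The 33 MHz i486DX (1990) to the 1 GHz Athlon (2000):
# roughly a 30x jump, i.e. about five doublings in ten years.
import math

ratio = 1000 / 33                 # ~30.3x in clock speed
doublings = math.log2(ratio)      # ~4.92 doublings
print(f"{ratio:.1f}x = {doublings:.2f} doublings over 10 years")

# Had that pace held for another decade, 2010 clocks would have been:
projected_mhz = 1000 * ratio      # ~30,000 MHz, i.e. the 30 GHz we never got
print(f"projected 2010 clock: {projected_mhz / 1000:.1f} GHz")
```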
random3
Fun times. Coolers, paste, fans, power supply watts, DIP switches and jumpers. Quake, 3dfx Voodoo vs. Nvidia GeForce. This is where it all started, kids.
I was in high school and had been running a "computer games club" (~ Internet cafe for games and kids) since 1998, when we got a place, renovated it ourselves, got custom-built furniture (cheap narrow desks) and initially 6 computers - AMDs at 300 MHz. By 2000 we broke a wall into the adjacent space and had ~15 machines, cable + satellite internet for downloads, and whatever video cards we could buy or scrap. It was wild.
mtucker502
What progress is being made in overcoming the thermal limits that currently block us from high clock rates (10 GHz+)?
ehnto
I bought a whole bunch of parts with my first Athlon. I think I bought a Sound Blaster, and a Radeon GFX card if I'm remembering the timeline right. The Sound Blaster came with a demo of a Lara Croft game that used the then-incredible spatial audio processing to great effect. The industry promptly forgot about that technology, and to this day game audio rarely matches the potential of real-time spatial dynamics that we reached 20 years ago.
dd_xplore
I remember back in 2006 I used to browse overclocking forums to overclock my Pentium 4. I had tons of fun following lots of instructions; I learned the BIOS, changed PLL clocks, memory clocks, etc.
fleventynine
I upgraded to this exact CPU from a 200 MHz Pentium in the fall of 2000. Easily the largest jump in performance of any upgrade I've ever done.
paulryanrogers
My first 1GHz was an AMD, also my first non-Intel, and its required fan was so loud that I was glad to get rid of it.
The speed was nice, and some competition helped lower prices.
herodoturtle
I remember upgrading my 486 DX2 at 66 MHz to a DX4 at 100 MHz and all of a sudden being able to run Winamp and Quake. That felt pretty epic at the time.
davidee
I have very fond memories of my first dual-cpu Athlon machine.
It was the workstation on which I learned Logic Audio before, you know, Apple bought Emagic. I took that machine, running very low-latency Reason, to live gigs with my band.
Carting around a full-tower computer (not to mention the large CRT monitor we needed) next to a bunch of tube Fender & Ampeg amps was wild at the time. Finding a good drummer was hard; we turned that challenge into a lot of fun programming rhythm sections we could jam to, and control in real-time, live.
nikanj
The craziest thing is, I don't actually know how many gigahertz either my PC or my MacBook runs at. The megahertz race used to be fierce!
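For anyone who genuinely doesn't know, on Linux the kernel exposes per-core clocks in /proc/cpuinfo; a minimal sketch, assuming the x86 "cpu MHz" field (often absent on ARM):

```python
# Report the fastest current core clock from /proc/cpuinfo (Linux, x86).
import re

def cpu_mhz(cpuinfo_text: str) -> list[float]:
    """Extract per-core 'cpu MHz' readings from /proc/cpuinfo contents."""
    return [float(m) for m in re.findall(r"cpu MHz\s*:\s*([\d.]+)", cpuinfo_text)]

if __name__ == "__main__":
    with open("/proc/cpuinfo") as f:
        speeds = cpu_mhz(f.read())
    if speeds:
        print(f"{max(speeds) / 1000:.2f} GHz across {len(speeds)} logical cores")
    else:
        print("No 'cpu MHz' field here (common on ARM); try lscpu instead.")
```

Note these are instantaneous readings: modern CPUs scale clocks constantly, which is part of why nobody memorizes a single number anymore.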
jmyeet
I have a hard time remembering what computers I had in the 1990s now. I had an 8086 in the 1980s. I think the next one I had was a 486/33 in the early 90s, and I had it for years. I remember having a Cyrix 586 at some point later. I think the next jump was in the early 2000s, and I honestly don't remember what that CPU was, so I can't say when I got my first 1GHz+ CPU. Probably that 2002 PC. No idea what it was now. But it did survive in some form for another 12 years.
Fun fact #1: many today may not know that the only reason Intel switched to the Pentium name was that a court ruled they couldn't trademark a number, and Intel had cross-licensed the microarchitecture and instruction set to AMD and Cyrix.
It was with the Pentium 4 that clock speeds went insane and became a huge marketing point, even though Pentium 4 chips had lower IPC than Athlons (at the time). There was a belief that CPUs would keep going to 10 GHz+. Instead they hit a ceiling at about 3 GHz that has barely increased to this day (ignoring burst modes).
Intel originally intended to move workstations and servers to the EPIC architecture (e.g., Merced was an early chip in this series). This began in the 1990s but was years delayed and required writing software in a very particular way. It never delivered on its promise.
And AMD, thanks to the earlier cross-licensing agreement, just ate Intel's lunch with the Athlon 64 starting in 2003 by adding the x86_64 instructions, which we still use today.
Fun Fact #2: it was the Pentium 3 that saved Intel's hide long after it was discontinued in favor of the Pentium 4.
The early 2000s were the nascent era of multi-core CPUs. The Pentium 3 had survived in mobile chips and become the Pentium M and then the Core Duo (and Core 2 Duo later). This was the Centrino platform and included wireless (IIRC 802.11b/g). The Pentium 4 hit the gigahertz ceiling, and EPIC wasn't going to happen, so Intel went back to the drawing board, revived the mobile Pentium 3 platform, added AMD's 64-bit instructions, and released it as their desktop CPUs. Even modern Intel CPUs are in many ways a derivation of the Pentium 3 [1].
[1]: https://en.wikipedia.org/wiki/List_of_Intel_Core_processors
Argh. The headline. The opener. Awful. Where are editors in 2026? There's no way an LLM would write this.
The GHz barrier wasn't special. What was much more important was the fact that AMD was giving Intel a hard time and there was finally hard competition.