Computers today are so enormously improved from the devices I used in my childhood that it scarcely feels accurate to refer to them with the same word. Mostly, that’s a good thing. I do not miss the days when hard drive specifications had to be manually keyed into BIOSes, or when CPUs would cheerfully drive themselves into thermal runaway until they melted to the board. I am glad end users no longer have to set the CPU voltage for a new chip using a series of jumpers or DIP switches, knowing that if they got it wrong, they’d fry the CPU, the motherboard, or both. And I don’t tear up when I remember hunting for a Sound Blaster Live! OEM CD because Creative wouldn’t put the OEM drivers online.
But there are things I miss about the era. Back in 2000, I had a 16MB PC66 SDRAM DIMM that I could overclock to 133MHz — so long as I put it in the RAM slot farthest from the CPU. I overclocked my network card by pushing my system bus clock beyond rated spec and meaningfully improved network transfer speeds in doing so.
There’s one specific way in which the computers of the late 1990s and early 2000s blow the machines of today out of the water: achievable price/performance ratios. Want a low-cost dual-processor system? Grab an Abit BP6 and a pair of Celerons for a fraction of the price of a conventional two-socket system. Want Pentium II performance at Celeron prices? Grab a Celeron 300A and overclock it.
The AMD Athlon might have gotten all the accolades, but I always loved the original Duron more. As far as I can tell, just about every Duron 600MHz ever manufactured could hit at least 800MHz (8x100MHz). Plenty of them ran faster — my own Duron 700 was capable of 1045MHz on a 190MHz FSB (more on that in a moment). The equivalent improvement today would be buying a Core i3 with a clock of, say, 3.5GHz and cranking it to 5.2GHz while increasing your DRAM clock from DDR4-3200 to ~DDR4-4500.
The components needed to pull off these kinds of tricks, moreover, were remarkably cheap compared with today, even accounting for inflation. Back then, Durons didn’t like high bus speeds and generally could not boot at a 133MHz FSB clock. Motherboards of that era had just switched from physical DIP switches and jumpers to what were called Soft Menu settings for overclocking. Soft Menus were much easier to use, but they had a known flaw: there was a non-zero amount of time after power-on before the motherboard would adjust the CPU voltage to whatever was programmed into the Soft Menu.
When the CPU initialized, it had to be capable of running at its default multiplier times the programmed FSB, and it had to do so at its default voltage. If the CPU couldn’t manage that speed, even for a fraction of a second, the overclock failed. Plenty of Duron CPUs could hit 800MHz to 1GHz, but most couldn’t do it at default voltage. As a result, Durons couldn’t run at a 133MHz FSB.
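To put numbers on it: my Duron 700 shipped with a 7x multiplier on a 100MHz FSB. Program a 133MHz FSB into the Soft Menu, and for that brief window before the voltage setting took effect, the chip had to come up at 7 x 133MHz (roughly 931MHz) at stock voltage, a speed most Durons simply couldn’t hold without extra volts.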
But I had an idea. A Duron’s multiplier and voltage were both set by tiny bridges on the CPU package, and you could close those bridges with an ordinary graphite pencil. I intuited that by locking the CPU to the highest voltage (1.85v), I could ensure the CPU had all the voltage it needed from the instant it powered on. Once booted, I could bring the voltage back down to something saner. Because the CPU initialized before the BIOS settings took effect, it would pull 1.85v for the split second required to initialize the motherboard.
So what did all this cost me in terms of specialized components or aggressively binned CPUs? Next to nothing. I paid a small premium for a 256MB stick of Tonicom BGA-mounted SDRAM rated for 166MHz. I bought an IWILL KK266-R, then one of the best-regarded KT133A boards. I used a cooler I already owned and cranked my Duron up to 1045MHz on a 190MHz FSB. When the first-generation DDR motherboards shipped, my SDRAM-equipped KT133A was faster than any VIA KT266 board ever built. It took the KT266A chipset to put my SDRAM system in its place. And I unlocked my CPU’s multiplier and voltage with a pencil.
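If you’re checking the math: 1045MHz on a 190MHz FSB works out to a 5.5x multiplier. That was the whole trick. With the multiplier unlocked, I could drop it below the stock 7x and push the FSB far past spec, and because a faster FSB meant more bandwidth for memory and everything else on the bus, the system gained more than the raw CPU clock alone would suggest.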
I’m not just telling this story to toot my own horn. The point is that these intuited moments and enthusiast angles existed, and you didn’t have to be rich or capable of affording top-notch hardware to take advantage of them. It wasn’t unusual, back in the day, to discover that a midrange GPU was actually a high-end card in disguise, with more performance on tap. Today, companies like AMD, Intel, and Nvidia are far better at extracting every ounce of value from their own products. Launches are far more polished, products slot neatly into their market segments, and generally speaking, things are very orderly. We still get surprises, to be sure, but they’re usually not aimed at the kind of ill-advised, risky, and fun tinkering that used to define the industry. Insane overclocks used to involve large tanks of water. Now they require large tanks of liquid nitrogen.
The flip side is that you no longer need to wonder whether your motherboard’s southbridge and your sound card will disagree and irrecoverably destroy all of the data in your RAID array. Being on the bleeding edge may cost a lot more money these days, but you’re a lot less likely to get sliced trying to balance there. Computers today are a hell of a lot better than the Windows 98SE machines that typify the era I’m describing, but they aren’t necessarily quite as much fun.
99 percent of the time, I think this is a good thing. The rest of the time… well. Patrick Stewart sums it up better than I do.
That’s what I miss about the old days. What about you? Is there an OS you felt had tremendous potential but never got a chance? A CPU architecture you favored?
Also, if you absolutely hated them and just want to complain about Plug’n’Pray or, say, ATI’s 16-bit color implementation, that’s fine too. There are a lot of reasons to prefer the modern era as well.