Speed bump
Shifting to a PowerBook for my home computing has underlined something I should have realised a while ago: it's time to rethink the obsessive focus on speed in computing. Case in point: I'm writing this on a computer with a "mere" 1.33 GHz CPU and what I recently heard someone call an "anemic" 167 MHz bus connecting it to the rest of the system. This laptop has a relatively slow hard drive and memory, and a "small" 1024x768 display. By the yardsticks of many hardware fetishists, it rates as a truly poky system. And yet it's hands-down the smoothest and nicest-looking PC I've ever used. At work I have a 2.4 GHz system with vastly faster components, and it somehow doesn't manage to feel any faster.
I realise I haven't been excited about computer hardware performance improvements in quite a while. This may just be a sign of getting jaded, but it wasn't always this way, and I do believe a change is happening. My first PC was a 4.77 MHz IBM XT clone. Then came a 25 MHz 386, which felt like a supercomputer: it was at least fifteen times the speed of the XT. Try to remember when you last had that big an upgrade. Then a 66 MHz 486, which was nice, but not as huge a jump as the previous one.
After uni, when I started working, I got the chance to use one of the first 100 MHz Pentium PCs on site - once again I had a little supercomputer on my desk. Coupled with NT 4, which was an equally huge improvement over the squalid world of Windows 95, it just felt like a next-generation system. But since then, each speed bump has been less and less noticeable (and newer versions of Windows have failed to improve much either, but that's another story).
This article points out a hard truth that hasn't been discussed much: CPU speed has hit a wall, and it's not one that is likely to shift for a while. The article includes a graph of CPU clock speed over time, which shows that we hit the wall in 2003. By the old trend we should have 10 GHz CPUs in 2005; in reality we may get a 3.7 GHz one this year.
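To see where that 10 GHz figure comes from, here's a quick back-of-envelope sketch (my own rough numbers, not the article's): if clock speeds had kept doubling every couple of years, as they broadly did through the nineties, a roughly 3 GHz chip in 2002 would put us near 10 GHz around 2005-2006.

```python
# Back-of-envelope only: the starting point and doubling rate below are my own
# rough assumptions, not figures taken from the article.
base_year, base_ghz = 2002, 3.0   # desktop clocks were around 3 GHz in 2002
doubling_period = 2.0             # assume the old trend: clocks doubling every ~2 years

for year in range(2003, 2007):
    projected = base_ghz * 2 ** ((year - base_year) / doubling_period)
    print(f"{year}: ~{projected:.1f} GHz if the trend had held")
```

Run that and 2005 lands in the 8-9 GHz range - which is exactly the kind of chip we were promised and aren't getting, because the graph flattens out right at 2003.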
And what if we did have 10 GHz CPUs? Maybe I'm just limited in imagination, but what would we actually do with them? Fancier graphics? This "anemic" Mac already has some serious graphics chrome thanks to the GPU. Faster compiles for us developers? It's been years since I had to wait for the compiler, thanks to incremental compilation. Games? Get a PS2: for $250 it'll give you way more bang for your buck than a tricked-out PC that costs ten times as much and draws more current than a toaster oven. There are always uses for more power in tasks that need sheer computational grunt - audio compression, video and graphics editing and so on - but these won't change the computing experience for most of us.
The good bit is that what gets me excited today (professionally, anyway ;) is the software, not the hardware. Technologies built on platforms like Java, OS X, and next-generation languages are evolving nicely. I just hope the hardware speed wall won't impede their progress.