Wednesday, September 13, 2006

Requiem for a Geek?

For years now, I have gradually gotten more and more adept at taking care of the physical parts of my computers. I had a computer in high school, and I got another while in grad school. It was a gradual thing: I learned about adding memory, then swapping out video cards. Then I graduated to installing CD-ROM drives, then hard drives.

For a long time I wasn't quite ready to put everything together myself, but when the time came to change out everything instead of a part or two, I would spend weeks going over part reviews and pricing, trying to get the most bang for my buck and position myself for easy upgrades later (for you non-hardware geeks, upgradeability is the single trickiest part of building a PC). Then I would order from the outfit that could provide most of the gear I wanted, and I'd put the finishing touches on myself. I stayed at that level for a while, but a year or two ago I bought the parts and put together my current machine myself. It took a lot of fussing, but in the end I was pretty proud of myself for pulling it off. I was sure I had a machine I could just slot parts into as needed for years to come. Hah.

What got me was this: I was playing the recent RPG release Oblivion (not a bad game, BTW; technically very proficient, but ultimately a bit of a let-down) and decided I really needed a better video card. I don't mind running newer games with some of the visual bells and whistles turned down, but this time it was really bad.

So I started checking on newer mid-range cards, only to discover Something Awful.

Pretty much none of the better mid-ranges would fit in my machine. In the two years since I built my current box, a new interface had arrived and pretty much taken over in the video world, an interface that as far as I can tell had not even been on the radar when I made my choices. Sure, I had read about the new interface as it came out, but what I was unaware of was the degree to which it had pushed the old one out. And yeah, I know two years can be an eternity in the computer world, but trust me when I tell you that interface changes don’t usually go that fast.

I eventually managed to find a video card that would serve, but it was still quite a shock, and I couldn't help but wonder if it was something like what my Dad felt as car engine technology advanced and got all fuel-injected and computery in the '80s. Dad had been a mechanic in the Army; as a teenager he used to soup up his cars to the limit, and he remained a fair shade-tree mechanic for a long time. But I noticed that as the '70s petered out and the '80s wore on, he spent less time doing repair work himself, eventually stopping altogether.

I don't think I've quite reached that pass. And maybe this is more just one of those things, like buying a Betamax VCR, or a laserdisc player. You know, just a bad move. Still, it was startling, and not in a fun way.

1 comment:

DSK said...

The thing is that video cards aren't video cards in the traditional sense anymore. They're their own little computer within a computer (they're probably all Turing complete these days). So you've basically got to have better and better specialized buses between the video-card computer and the general computer.

As for cars, at least you can still change your own oil without needing anything wonky.