Making the switch...
After everything that has happened on the local political scene in the last couple of days, I figure a change of pace is in order. Blogging about something different should help get my mind (and yours too) off the frustrating turn of events that has been plaguing this country.
Let's talk tech. Oh, I can already hear the groans from here. Sorry people, I can't help it. I'm a techie, and there's probably nothing I can do to change that, so I might as well live it up, at least here in this blog. If you find this subject boring, stop right now! I'm sure there are other posts you may find interesting here. However, if you insist...
I made the switch. After more than two years in ATI's camp, I went back to NVIDIA. If you don't understand what I'm talking about, I'm referring to video cards, those sophisticated circuit boards you plug into your PC that are responsible for generating all the lovely 3D graphics of your favorite games.
For more than two years, I had been a satisfied user of an HIS Excalibur Radeon 9600 Pro, which I sold just a week ago (see previous post here).
It was powered by an ATI Radeon 9600 Pro VPU (RV350), which is plenty powerful by most standards. However, if you prefer to game at higher resolutions with all the works, meaning antialiasing and anisotropic filtering enabled, you'll need a more capable video card.
Strangely enough, buying a new video card isn't as straightforward as it was only a few years ago. You can't just grab any video card off the shelf and pop it into your PC. While price and performance are still the primary considerations, nowadays you also have to take power consumption and heat generation into account. Contemporary midrange to high-end video cards can draw as much power and generate as much heat as your CPU, sometimes even more. As such, before choosing a video card, make sure your power supply is up to the task and that your case is adequately cooled. Check out this earlier post on this particular topic.
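To give you an idea of the kind of back-of-the-envelope math I mean, here's a rough sketch in Python. The wattage figures below are made-up placeholders, not measurements from my rig; look up your own components' specs before buying anything.

    # Rough PSU sizing check. All wattages are hypothetical
    # placeholders for illustration only.
    components_watts = {
        "CPU": 70,
        "video card": 75,
        "motherboard and RAM": 40,
        "drives and fans": 30,
    }

    total = sum(components_watts.values())
    headroom = 1.3  # leave roughly 30% spare capacity
    print(f"Estimated draw: {total} W")
    print(f"Look for a PSU rated around {total * headroom:.0f} W or higher")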
Adding to the confusion is the rising popularity of video cards based on the PCI Express interface, which is completely incompatible with the traditional AGP interface. Since I'm only upgrading my video subsystem and not my entire motherboard, I still need to get an AGP video card. However, if you're into high-end gaming and planning on building a new system, I'd recommend getting a motherboard with PCI Express instead.
So what video card did I get? After carefully considering all of the above, I chose a card based on NVIDIA's GeForce 6800 LE GPU (NV40 LE), specifically an Inno3D GeForce 6800 LE. It's a somewhat cut-down version of the full GeForce 6800 (and its GT and Ultra siblings), but that doesn't mean its performance is anything to sneeze at. I was originally planning on getting an Inno3D GeForce 6600 GT, but to my mind the larger amount of memory, full 256-bit memory interface, soft-mod potential, and lower power consumption of the 6800 LE were more desirable than the brute-force performance of the 6600 GT.
In a nutshell, this GPU runs at 300 MHz, packs 222 million transistors, and has eight pixel pipelines and four vertex engines fed by 256 MB of 256-bit GDDR3 memory clocked at 500 MHz (1 GHz effective). By way of contrast, the RV350 used in the Radeon 9600 Pro runs at 400 MHz, has "only" 75 million transistors, four pixel pipelines, and two vertex engines, and is paired with 128 MB of 128-bit DDR memory running at 300 MHz (600 MHz effective).
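If you want to see why that 256-bit memory interface matters so much, here's a quick bandwidth calculation using the spec figures above. These are peak theoretical numbers, so real-world throughput will be lower, but the ratio is what counts.

    # Peak theoretical memory bandwidth:
    # bus width (in bytes) x effective clock
    def bandwidth_gb_per_s(bus_width_bits, effective_clock_mhz):
        return bus_width_bits / 8 * effective_clock_mhz / 1000

    # GeForce 6800 LE: 256-bit GDDR3 at 1 GHz effective
    print(bandwidth_gb_per_s(256, 1000))  # 32.0 GB/s

    # Radeon 9600 Pro: 128-bit DDR at 600 MHz effective
    print(bandwidth_gb_per_s(128, 600))   # 9.6 GB/s

That's more than three times the raw memory bandwidth, which goes a long way toward explaining the gains at high resolutions with antialiasing enabled.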
Informal tests (using different versions of 3DMark and various games) show that with this configuration, performance more than doubles in most games, even with settings maxed out. I achieved these results despite the fact that my CPU, an AMD Athlon XP 2400+, was introduced almost four years ago, and that the AGP interface on my motherboard, an ECS K7VTA3 v3.1, is limited to 4x transfers.
Since the GeForce 6800 LE is based on the same NV40 GPU as the 6800, 6800 GT, and 6800 Ultra, it internally has a total of sixteen pixel pipelines and six vertex engines. Eight of the pixel pipelines and two of the vertex engines are disabled in 6800 LE trim, but they can be enabled (soft-modded) using RivaTuner. Unfortunately, this does not work with all 6800 LEs (only about 40%), since NV40s are tested and binned as they come off the production line. The high performers end up as GTs and Ultras, twelve-pipe-capable chips end up as plain-vanilla 6800s, and eight-pipe chips end up as LEs.
In my case, enabling all sixteen pixel pipelines created a lot of texture corruption, and even twelve pipelines produced similar results. Apparently the two extra quads (eight pipelines) are damaged in my particular GPU, limiting it to the stock eight. I was, however, able to activate the two additional vertex engines, improving hardware T&L performance. Your own results may vary. Soft-modding doesn't damage your hardware, so it doesn't hurt to try. Some are lucky enough to enable all sixteen pixel pipelines, turning their LE into the equivalent of a GT for a fraction of the price.
My old Radeon 9600 Pro was a good overclocker, and this GeForce 6800 LE seems to be no different. Right now I'm running it at 360 MHz core and 525 MHz (1.05 GHz effective) memory, up from the stock 300/500 (1 GHz effective) speeds. I've pushed the core as high as 370 MHz, but decided to throttle back to lessen heat production.
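For the curious, those overclocks work out to a healthy bump on the core and a more modest one on memory. A quick sketch using the clocks above:

    def oc_percent(stock_mhz, actual_mhz):
        return (actual_mhz / stock_mhz - 1) * 100

    print(f"core:   {oc_percent(300, 360):.0f}%")  # 20%
    print(f"memory: {oc_percent(500, 525):.0f}%")  # 5%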
The GeForce 6800 LE is at least as powerful as, and probably more powerful than, the previous generation's flagship video cards, like ATI's Radeon 9800 XT and NVIDIA's GeForce FX 5950 Ultra. It's just amazing how much performance you can get from a video card nowadays for significantly less money than before.
This upgrade should buy my rig a couple more years (at most) of gaming bliss. Hopefully by then I'll be able to afford to build a Socket 939 or M2 AMD dual-core system, perhaps with 2 GB of memory, an SLI or CrossFire graphics subsystem, and 64-bit Windows Vista. Now that would be something. :-)
Inno3D GeForce 6800 LE. Based on NVIDIA's reference design and boasting a custom cooling solution developed by Cooler Master.