Duh Voodoo Man's Geforce 6800 Card Mod Misadventure
A GOOD NEWS/BAD NEWS STORY WITH A HAPPY ENDING

 

   NEVER ONE TO PASS UP A BARGAIN...   

EVGA Geforce 6800 AGP 128MB

With my son's MSI GeforceFX 5900XT video card starting to get a bit long in the tooth, I recently started keeping an eye open for a good value in a new video card for his PC. In keeping with my personal philosophy of being devoutly cheap, I wanted to keep the cost as close to $150 as possible. I had toyed with the idea of a Geforce 6600GT or Radeon X700 Pro, but ultimately rejected both on the basis of their meager 128-bit memory interfaces. Limiting my choices to 256-bit cards basically left either the Geforce 6800 or the Radeon X800GT among recent models even remotely within my price range. Also, I needed an AGP version to match up with his motherboard. These restrictions yielded several choices, but all were still $25 or more above my price target.

Between the Geforce 6800 and Radeon X800GT, the choice was not easy. The Radeon card offered higher core and memory clock speeds (475/980 MHz) and 256MB of GDDR3 memory, but with the drawback of only 8 pixel pipelines. The Geforce 6800 was clocked at much more modest levels (325/700 MHz) and used only 128MB of regular DDR memory, but offered the substantial advantage of 12 pixel pipelines. I wasn't successful in locating a head-to-head comparison of the two models, though a couple of forum postings suggested similar overall performance.

What finally broke the tie for me was THIS ARTICLE that I came across at the Firing Squad site. It described how, using Alexey "Unwinder" Nicolaychuk's handy RivaTuner video card tweak utility, the additional four locked pixel pipelines of the 6800 could be enabled, along with an additional vertex shader. This process is referred to generically as "softmodding", since the operation of the card is modified through software and not by making any actual physical changes to it. Apparently, the 6800 AGP models use the same GPU as the more expensive 6800GT and Ultra models, but with 12 operating pixel pipelines and 5 vertex shaders versus the 16 and 6 in the two higher-end cards. (NOTE: This is not true for the PCI Express version of the 6800, which uses a different core with only 12 pixel pipes total.) As is often the case in such situations, rather than incurring the considerable additional expense of modifying the GPU itself, Nvidia opted to merely disable a portion of the hardware through the card's BIOS. The problem with this approach seems to be that sooner or later (and it's usually sooner), some talented geek figures out how to unlock these disabled features.

However, this still does not guarantee that, once turned back on, they will work properly. The prevailing theory is that when these chips are manufactured, they are tested and segregated by performance. A chip with a pixel pipeline or vertex shader that doesn't cut it for use in a 6800GT or Ultra can still become a perfectly good 6800 by disabling the offending component, as long as the rest of the pipelines and shaders are okay. Since the pixel pipelines function together in groups of four (a.k.a. a "quad"), disabling even one bad pipe means shutting down its entire quad, cutting the original 16 pipelines to 12. So unfortunately, the only way to know whether a given 6800 can run with all pipelines and shaders enabled is to try it. The good news is that the method works perfectly for many users. The bad news is that it fails for a sizeable number as well. And since nobody has data on what the proportions are, it's truly a matter of paying your nickel and taking your chances. But even if it doesn't work, you're still left with a very capable video card.
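Just to make the quad arithmetic concrete, here's a toy Python sketch (purely illustrative, with made-up names; this is not how RivaTuner or the card's BIOS actually does it):

    # NV40 organizes its 16 pixel pipelines as 4 quads of 4 pipes each.
    # Pipes can only be disabled a whole quad at a time, so one bad
    # pipe costs you all 4 in its group: 16 enabled pipes become 12.
    PIPES_PER_QUAD = 4
    TOTAL_QUADS = 4

    def active_pipes(bad_pipe=None):
        """Pipe count after disabling the quad containing bad_pipe."""
        disabled_quads = 0 if bad_pipe is None else 1
        return (TOTAL_QUADS - disabled_quads) * PIPES_PER_QUAD

    print(active_pipes())            # 16 - a fully enabled GT/Ultra core
    print(active_pipes(bad_pipe=9))  # 12 - pipe 9's whole quad shut off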

So with all of this bouncing around in my cranium, I happened across an EVGA Geforce 6800 AGP 128MB card on sale at NewEgg.com for a final cost of $164 after a $20 mail-in rebate. SOLD!!

 

   THE EVGA GEFORCE 6800 AGP 128MB   

The new video card arrived a mere 26 hours after I had ordered it online, even with the free 3-day UPS shipping. NewEgg's shipping location in Edison, NJ had everything to do with this, since it's only about 180 miles from me. This was a full retail box, not a bare-bones OEM card swathed in bubble-wrap. Still, it was a pretty basic package, containing the card, documentation, a driver/utility CD, a couple of cables, and a DVI/VGA adapter. Fine by me, because I really wasn't looking for any extra bells & whistles, and the games they typically bundle with these cards stink anyway. The card itself looked like any of a number of 6800 series models from several different manufacturers, which seem to differ mainly in the color of the PCB and the stencil on the cooler cover.

Here are some photos of the card and accessories, courtesy of NewEgg:

EVGA Geforce 6800 AGP 128MB Retail Package
Top view
Back view
Bundled accessories & software
Retail box


Installation of the card and drivers was straightforward. First, I opened Device Manager and uninstalled the existing video card, which also removed the old drivers. After powering down and swapping in the new EVGA card, I booted back up and cancelled the wizard that launched when the OS detected the new hardware. Next, I installed the current 81.85 release version of the Nvidia "ForceWare" reference drivers, available at the Nvidia driver download page, and then merged the "Coolbits" ForceWare registry hack to enable the drivers' hidden overclocking controls. After confirming that everything was running smoothly and stably, it was time to start putting the card through its paces....
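For anyone who hasn't seen it, the Coolbits hack is just a single registry value. Here's a minimal Python sketch of the idea, assuming the widely circulated key path and a CoolBits value of 3 to unlock the clock controls; run it from an administrator account, back up your registry first, and double-check the path against your particular driver release:

    import winreg

    # Widely circulated Coolbits location for ForceWare-era drivers.
    # (Assumption: verify this path for your driver version.)
    KEY_PATH = r"SOFTWARE\NVIDIA Corporation\Global\NVTweak"

    # Create/open the key under HKEY_LOCAL_MACHINE and set CoolBits = 3,
    # which unhides the clock frequency controls in the driver panel.
    key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH)
    winreg.SetValueEx(key, "CoolBits", 0, winreg.REG_DWORD, 3)
    winreg.CloseKey(key)
    print("CoolBits set; reopen the Nvidia control panel to see the new tab.")

The same thing can be done by merging a two-line .reg file, which is how the hack usually circulates.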

 

   OPENING THE PIPES....   

RivaTuner NVStrap tab

After running a "baseline" set of benchmarks with the card in its stock configuration, it was time to try enabling the four pixel pipelines and one vertex shader that had been turned off in the 6800 BIOS. Thanks to the directions in the Firing Squad article mentioned previously, this was a simple matter. The screenshot to the right shows the RivaTuner NVStrap tab with the disabled pipes and shader turned back on. In the case of this particular EVGA 6800, these were the third pixel quad and the fifth vertex shader, but this can vary from one card to another. After a system reboot, I fired up my 3D "Aquarium" screensaver for a quick check of the image quality.

MAYDAY!! MAYDAY!! It was immediately obvious that the third pixel quad on my particular card had been disabled for a definite reason! The screen was filled with a rapidly flashing checkered pattern typical of a bad pixel pipeline. You can get some idea of the visual effect by looking at THIS SCREENSHOT from AquaMark3. In "real time", it looks much worse, because the checkered patterns flash on and off very quickly in constantly changing locations on the display, creating the illusion that there are even more of them.

So it was clear that I was one of the unlucky ones who couldn't take advantage of the speed boost afforded by enabling the additional pipelines, unless I was willing to live with an extremely ugly display, which I wasn't. I'd rolled the dice and come up with "craps". But, hey, them's the breaks! So I fired up RivaTuner again and shut the pixel quad back off, along with the vertex shader. I probably could have left the shader enabled, but everything I'd read suggested that the performance boost from the sixth shader was negligible; the extra pixel quad was what really drove the speed increase. Just for grins, while I still had the extra pipes enabled, I had run a couple of quick benchmarks to see what the performance impact was. AquaMark3 showed a modest 3.5% boost, while 3DMark05 increased by a more substantial 10%. Oh well, water under the bridge....

So with the Great 6800 Pixel Pipe Experiment an abject failure, it was time to see what extra speed could be squeezed from the card using the more conventional approach: overclocking the core and memory.

 

So how fast can we make it go??