Which graphics card is better?



mined
August 17th, 2007, 05:25 PM
I have more-or-less moved my Halo-related inventory over to my work computer, since I have little time to mess with it at home and the home computer is a crapbox anyway.

On my personal work machine I have an ATI Radeon X1300. On one of our older machines I have an Nvidia GeForce 6800 GTO. I was just curious whether any of you tech-savvy people could tell me which of the two is better, and why. If the GeForce turns out to be better, I plan on moving it over to my machine in hopes of raising my Vista score a bit. I am sitting at a 4.1 right now.
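
For reference, the Vista score comes from Vista's WinSAT assessment, which can be re-run after a card swap. A minimal sketch, assuming a Vista machine and an elevated prompt; Python is only wrapping the stock winsat tool here:

    import subprocess

    # Re-run the full Windows Experience Index assessment (Vista's WinSAT).
    # Needs to be launched from an elevated command prompt.
    subprocess.run(["winsat", "formal"], check=True)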

Atty
August 17th, 2007, 05:26 PM
6800 is the clear winner.

bleach
August 17th, 2007, 05:26 PM
Nvidia GeForce 6800 > ATI Radeon X1300
What is the power supply in the computer with the Nvidia 6800?
If you've got something above 400 or 500 watts, then you could probably get an Nvidia 7900, depending on how much money you have.
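
A rough headroom check for that kind of upgrade. Every wattage below is an illustrative assumption, not a measured figure:

    # Back-of-the-envelope PSU headroom estimate. All draws are assumed
    # ballpark values for illustration, not measurements.
    psu_watts = 450  # hypothetical supply

    draws = {
        "Athlon 64 X2 5600+": 89,       # approximate CPU TDP
        "GeForce 7900-class card": 80,  # rough board power
        "motherboard + RAM": 50,
        "drives + fans": 40,
    }

    total = sum(draws.values())
    print(f"estimated draw: {total} W, headroom: {psu_watts - total} W")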

mined
August 17th, 2007, 05:31 PM
Thanks Atty. Is it simply because it is Nvidia?


Bleach, thanks but no thanks. I am not in the market to buy a new card; I simply want to have the better of the two cards in my machine. If it helps, though, I am running an AMD Athlon 64 X2 dual-core 5600+ @ 2.81 GHz with 3.25 GB of RAM. As for the power supply, I have not checked.

KIWIDOGGIE
August 17th, 2007, 08:33 PM
Yeah, I think Nvidia is ahead of ATI at the moment, because ATI's only DX10 cards cost a fortune while Nvidia's are cheaper with better performance and graphics (you'll need two 8800 GTXs SLI'd for GoW).

CN3089
August 17th, 2007, 08:54 PM
Thanks Atty. Is it simply because it is Nvidia?

No, it's because the X1300s are budget cards, and the 6800s are high-performance gaming cards.

Agamemnon
August 17th, 2007, 09:23 PM
He's right. The X1300 doesn't even come close to the X800 in benchmarks. My X1600 XT might boast 512 MB of VRAM, but its benchmark results are so horrible that it doesn't even matter. Bigger does not always equate to better.

Mr Buckshot
August 18th, 2007, 12:32 AM
The 6800 owns the X1300 by a large margin. Not because of brand, but because of technical features - 256-bit memory and 12 pipelines (or 16?) in the 6800 versus 64-bit memory and 4 pipelines in the X1300.
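
Those bus widths map straight onto theoretical bandwidth. A rough sketch of the math, with effective memory clocks assumed at typical reference values:

    # Theoretical memory bandwidth = (bus width / 8 bytes) * effective memory clock.
    # Effective clocks below are approximate reference values, assumed for illustration.
    def bandwidth_gb_s(bus_bits, effective_mhz):
        return bus_bits / 8 * effective_mhz / 1000

    print(f"GeForce 6800 (256-bit @ ~700 MHz): {bandwidth_gb_s(256, 700):.1f} GB/s")  # ~22.4
    print(f"Radeon X1300 (64-bit @ ~500 MHz):  {bandwidth_gb_s(64, 500):.1f} GB/s")   # ~4.0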

The 6800, while great, is becoming obsolete. If you can sell both cards, or if you're willing to spend a little, go for, say, a GeForce 7900/7950 or a Radeon X1950 Pro.

I own a GeForce 7600 GT. It is similar to the 6800 in that it has 12 pipelines, but it has only 128-bit memory, offset by faster clock speeds. Both support SM3.0 (DX9.0c).
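
The clock-versus-pipeline trade-off in numbers; core clocks here are approximate reference values, assumed for illustration:

    # Peak pixel fill rate = pipelines * core clock.
    def fill_rate_gpix_s(pipelines, core_mhz):
        return pipelines * core_mhz / 1000

    print(f"GeForce 6800 (12 pipes @ ~325 MHz):    {fill_rate_gpix_s(12, 325):.2f} Gpix/s")
    print(f"GeForce 7600 GT (12 pipes @ ~560 MHz): {fill_rate_gpix_s(12, 560):.2f} Gpix/s")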

bleach
August 18th, 2007, 01:32 AM
......X1300 :poke:6800........

Zeph
August 18th, 2007, 06:22 AM
The Radeon 9600 is a better card than the X1300. Well, aside from SM3.0 support, if the X1300 even gets to use it.

KIWIDOGGIE
August 18th, 2007, 06:56 AM
No, it's because the X1300s are budget cards, and the 6800s are high-performance gaming cards.

True, ATI has always been about performance on the cheap, but their cards don't overclock (and, by god, don't superclock) well at all; mine would BSOD randomly when overclocking an X300, and Halo 2 on the X300 got 30 FPS at 800x600 with everything on low. That's why I just broke down (stole some money from the dog) and got 8800s: two of them, a GTX and a GTS, though the GTX I got for $50 won't run at its rated clock speed.

CN3089
August 18th, 2007, 07:00 AM
So your X300 was worse than your 8800(s). I guess that means ATi must really suck!

KIWIDOGGIE
August 18th, 2007, 07:03 AM
So your X300 was worse than your 8800(s). I guess that means ATi must really suck!

No, not really. The X300 is about the lowest PCI-E card you can get at the moment; I got it when it was brand new. But compared to the X300... let's just say that when I pulled both cards out (the X300 out of the case and the 8800 out of the box), the 8800 blew it away, graphics-wise.

mined
August 18th, 2007, 09:13 AM
Ok, so I started swapping the graphics cards around. While I was at it I noticed that our server had two DVI connectors. Knowing that this meant another card to throw into the mix, I checked out its properties. It turns out we have an Nvidia Quadro FX card in our server. So can anyone tell me how this fits into the mix?
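
A quick way to list every card a Windows machine reports without clicking through Display Properties; a sketch assuming the stock WMIC tool is available:

    import subprocess

    # Ask WMI for every video controller the machine knows about.
    out = subprocess.check_output(
        ["wmic", "path", "win32_VideoController", "get", "Name"],
        text=True,
    )
    print(out)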

InnerGoat
August 18th, 2007, 10:16 AM
What model? There's a bunch in the Quadro FX line...

mined
August 18th, 2007, 10:30 AM
It's the Quadro FX 3450 SDI.

InnerGoat
August 18th, 2007, 01:40 PM
Apparently that is a 6800 Ultra.

legionaire45
August 18th, 2007, 01:52 PM
It's the Quadro FX 3450 SDI.
Nice find.

mined
August 18th, 2007, 05:04 PM
I can't believe it is being wasted on the server. I looked into it more, and due to the power supply and the card's requirements I can't move it over to my machine. I am wondering, however, how easy it would be to simply swap the hard drives between the two machines so my current machine would act as the server. I'll have to look into that a bit more.

Agamemnon
August 18th, 2007, 05:36 PM
You could just buy a new PSU...

KIWIDOGGIE
August 18th, 2007, 05:44 PM
Or, if you have any extra power connectors, you can plug two of them into an adapter and then plug that into the graphics card.

Zeph
August 18th, 2007, 10:30 PM
I can't believe it is being wasted on the server. I looked into it more, and due to the power supply and the card's requirements I can't move it over to my machine. I am wondering, however, how easy it would be to simply swap the hard drives between the two machines so my current machine would act as the server. I'll have to look into that a bit more.

It's a rendering card; it's used for a lot of OpenGL and similar work. If memory serves, you're using that to render scenes in applications like Max? I don't call that a waste at all. It's more of a blessing than anything.
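
You can see exactly what an OpenGL application gets handed by querying the renderer string. A minimal sketch, assuming PyOpenGL and GLUT are installed; a window is created only because glGetString needs a live GL context:

    from OpenGL.GL import glGetString, GL_RENDERER, GL_VERSION
    from OpenGL.GLUT import glutInit, glutInitDisplayMode, glutCreateWindow, GLUT_RGB

    glutInit()
    glutInitDisplayMode(GLUT_RGB)
    glutCreateWindow(b"renderer probe")  # a context is required before querying
    print(glGetString(GL_RENDERER))      # the card the app actually renders on
    print(glGetString(GL_VERSION))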

Mr Buckshot
August 18th, 2007, 11:49 PM
Supporting a shader model is irrelevant if the card cannot take advantage of it (except maybe in less demanding 3D design programs). The X1300 is too weak to make use of its SM3.0 support, so the higher-end SM2.0 cards (Radeon 9700, 9800, X700, X800) all beat it.
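
The raw numbers back this up. A sketch using the same rough formulas as above, with approximate reference specs assumed for illustration:

    # Why a high-end SM2.0 part can beat a low-end SM3.0 part on raw throughput.
    # Figures are approximate reference specs, assumed for illustration.
    cards = {
        # name: (pipelines, core MHz, bus bits, effective memory MHz)
        "Radeon 9800 Pro (SM2.0)": (8, 380, 256, 680),
        "Radeon X1300 (SM3.0)": (4, 450, 64, 500),
    }
    for name, (pipes, core, bus, mem) in cards.items():
        fill = pipes * core / 1000  # Gpix/s
        bw = bus / 8 * mem / 1000   # GB/s
        print(f"{name}: {fill:.2f} Gpix/s, {bw:.1f} GB/s")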