Radeon X800 XT
tsunami
I've seen quite a few benchmark comparisons between the 6800 Ultra and the X800 Pro, and in every case the X800 has the better performance. Looks like I'm going to be buying my first ATI card.
digitalwanderer
It won't be my first, but it'll be the first time I've ever bought a new-generation card right when it came out! I've already been calling the local stores to see if they have any availability yet; Brent's review said they were ACTUALLY SHIPPING THE PROS OUT TODAY!!!! :D MUST upgrade! 8-)
Yakumo
Except now it's ATI that is not performing up to spec: it's not fully DX9.0c compliant, has no branching, and is (still) only FP24. It's doing better where it is because it's still doing less work.
Jaybo
I don't know about all of this recent talk. No one is bothering to pat Nvidia on the back for getting themselves back on track. No one is noticing their improved AA. No one is mentioning that it can beat the ATI card in some tests, including OpenGL performance, which will make a big impact considering all the Doom 3 engine games that will no doubt be released. They are really, really close this round and everyone is all over the ATI card again.

YES, I love my 9700. No fanboyism here. The whole power supply thing is a complete non-issue for most of us at this board, and if you are short a dongle I bet they will include one or two splitters with the card, or you can get one for nearly nothing.

The first thing I said to myself when reading the numbers was 'whoa, pretty damn close.' If they are that close, and one offers a new feature the other doesn't have a replacement for (PS 3.0), why is everyone still hating Nvidia? Don't they deserve at the very least a second look this time around? Temporal AA should not even be brought up, since technically I am sure ATI or Omega will add it to all cards. It has its ups and downs.

I may go ATI this round also, but I just don't like that it seems very biased out there this time. I say cheers to Nvidia for pushing a real power card onto retail shelves that gives you better IQ than past offerings while also possibly taking the performance crown.
thomas997
Yeah, I have to agree with this. The X800 is just much more efficient: the heatsink is much smaller, there are no ramsinks at all, and it's a smaller board with what looks like fewer components. Nvidia takes the brute-force approach again, and you'll see ATI's cards have much nicer prices in the future, just from the differences in production costs. If Nvidia could get tile-based rendering to work, they wouldn't need as much power as they are using now. I doubt they will go that way, though. The 6800 is good, the X800 is better :)
thomas997
Get a better monitor :) A Sony 21" can do 85Hz at 1600x1200. But right now I have an older monitor whose best refresh rate is at 1024x768, so I stick with that and FSAA. I'd probably prefer 1024 with FSAA to 1600 without.
MB
I agree that the ATI card is more efficient in some ways, but for me that is no concern. My friends and I have had enough bad experiences with ATI because of their crappy drivers and game support; Nvidia was always far superior in that regard. People have been using the argument "OpenGL will improve" since the GF4, and it never improved drastically. ATI isn't that good at OpenGL and I doubt that will change this time. Also, there is still the 6800 Ultra Extreme, which in the reviews I read was on par with or slightly/much faster than the X800 XT. If the cooling works, and manufacturers will surely make better cooling solutions, then Nvidia is king this time for the real hardcore gamers who always want the best. Oh yeah, it also looks like the Nvidia card can be overclocked much more than the X800 from what I've seen. But it's still funny to see all the fanATIcs who won't admit even for $100 that ATI isn't that great this time, ignoring the facts. :) I'll lean back, wait for my 6800UE, and laugh at the children bashing each other.
JemyM
My 19" is capable of 72Hz at 1920x1440!!! And I often run games at that resolution. Morrowind looks fantastic without any AA at all. If I even attempt running at 1024x768, it feels like the monitor smacks me in the face. What a waste of space... and what a blurry mess.