#11
ATI's OpenGL drivers aren't so great. They are workable but not great.

The only thing impressive about the new GeForce cards is instancing support in Vertex Shader 3.0. And so far it's been used in exactly one game, and I don't expect that to change much for a long time. ATI had their cards out first. Unlike NVidia, they don't need to cook their drivers. NVidia will have to work very hard to earn back my trust.
#12
"NightSky 421" writes:

> For example, I hope they bench it on processors 1.5GHz and up with
> GeForce4 MX and GeForce3 cards and up.

From the article: "As of this afternoon we were playing DOOM 3 on a 1.5GHz Pentium 4 box with a GeForce 4 MX440 video card and having a surprisingly good gaming experience. Even a subtle jump to an AMD 2500+ with a GeForce 3 video card that is two years old will deliver a solid gaming experience that will let you enjoy the game the way id Software designed it to be."

Not a benchmark, but at least it's positive (if subjective).

Nick

--
# sigmask || 0.2 || 20030107 || public domain || feed this to a python
print reduce(lambda x,y:x+chr(ord(y)-1),' Ojdl!Wbshjti!=obwAcboefstobudi/psh?')
#13
On Thu, 22 Jul 2004 16:57:10 +1000, "Darkfalz" wrote:

> "rms" wrote in message:
> > http://www2.hardocp.com/article.html?art=NjQy
> > Looks like the 6800GT is the sweet spot, if you can get it cheap (my
> > pny version was $345 shipped from provantage).
>
> They didn't bench anything older than 5950... what a bunch of clowns.

Yep - so much for seeing how my GeForce 3 performed at 1024x768. I also loved the line "We figured that 1600x1200 resolution would be the place to start..." WTF? I assume this article is aimed at people with high-end systems, presumably overclocked ones. Who the **** plays games in 1600x1200? And on what? A 23" monitor???

--
Bunnies aren't just cute like everybody supposes!
They got them hoppy legs and twitchy little noses!
And what's with all the carrots?
What do they need such good eyesight for anyway?
Bunnies! Bunnies! It must be BUNNIES!
#14
"Darkfalz" wrote in message:

> "rms" wrote in message:
> > http://www2.hardocp.com/article.html?art=NjQy
> > Looks like the 6800GT is the sweet spot, if you can get it cheap (my
> > pny version was $345 shipped from provantage).
>
> They didn't bench anything older than 5950... what a bunch of clowns.

My thoughts exactly...
#15
"magnulus" wrote in message:

> ATI's OpenGL drivers aren't so great. They are workable but not great.
> The only thing impressive about the new GeForce cards is instancing
> support in Vertex Shader 3.0. And so far it's been used in exactly one
> game, and I don't expect that to change much for a long time. ATI had
> their cards out first. Unlike NVidia, they don't need to cook their
> drivers. NVidia will have to work very hard to earn back my trust.

Sour grapes?
#16
Mark Morrison left a note on my windscreen which said:

> Yep - so much for seeing how my GeForce 3 performed at 1024x768. I also
> loved the line "We figured that 1600x1200 resolution would be the place
> to start..." WTF? I assume this article is aimed at people with
> high-end systems, presumably overclocked ones. Who the **** plays games
> in 1600x1200? And on what? A 23" monitor???

I do a fair bit. 22" monitor.

--
Stoneskin
[Insert sig text here]
#17
"Darkfalz" wrote:

> "rms" wrote:
> > http://www2.hardocp.com/article.html?art=NjQy
> > Looks like the 6800GT is the sweet spot, if you can get it cheap (my
> > pny version was $345 shipped from provantage).
>
> They didn't bench anything older than 5950... what a bunch of clowns.

I thought it was an okay preview benchmarking article, and I'm pretty sure that once the game is out, we'll see plenty of good benchmarks. Keep an eye on www.xbitlabs.com in the upcoming weeks. I'd say that if we with our average graphics cards cut out the anisotropic filtering seen on the 5950 Ultra benchmark table, the framerate will most likely stay around the same speeds with 9800 Pros and 5900 XTs.

As far as the engine's flexibility goes, I'd take that with a grain of ginger when it comes to the "high detail" modes. I personally won't consider playing the game on anything less than a Radeon 9800 or GeForce 5900. Will GeForce 3 be able to swoop it with high details? Hell, no. That dog won't hunt.
#18
"NightSky 421" wrote:

> "rms" wrote:
> > http://www2.hardocp.com/article.html?art=NjQy
> > Looks like the 6800GT is the sweet spot, if you can get it cheap (my
> > pny version was $345 shipped from provantage).
>
> I thought it was a good article and it makes me happy I have a 9800 Pro
> video card. However, I can't wait to see how Doom 3 plays on systems
> that are a little more "real world". For example, I hope they bench it
> on processors 1.5GHz and up with GeForce4 MX and GeForce3 cards and up.
> I'd like to see an all-round comparison with as many combinations of
> CPU and video cards as possible.

GeForce 4 MX will perform like a turd stuck in a toilet seat. Heck, even GeForce 3 will drown in the quicksand. I have no idea how much difference there is between the "medium detail" mode and the "high detail" mode, but I just refuse to believe that a GeForce 3 would surf the game with high details. I couldn't even turn on all the details in Unreal 2 without diving to the bottom of the chart.
#19
"ginfest" wrote in message news:V3OLc.140764$%_6.110988@attbi_s01...

> Sour grapes?

No... I parted with 400 dollars for the GeForce FX 5900 card. I'm not going to fall for NVidia's crap a second time. Nothing sucks worse than to have a brand new video card become underpowered technology in only five months. ATI is more honest with their products. They don't rewrite other people's shaders so that they use lower precision. And they don't require two molex power connectors or large fans.

Sure, GeForce is faster for ONE GAME. Wow. That's justification for plopping down 500 dollars on a new video card! For all we know, the ATI and NVidia cards aren't even running on the same codepaths, don't have the same visual quality, etc. (NVidia cards running with 16-bit precision would of course run faster than ATI's 24-bit precision). When the FX 5900 came out, it was faster in Unreal Tournament 2003/2004 - a lot faster. But as history showed, that really didn't matter, because it ran like crap in games like Deus Ex or Thief III.

And Doom III may not be an important engine in the future of gaming, you never know. Right now the Unreal engine is pulling in a lot of developers, and it runs on Direct3D. OpenGL is pretty much dead in PC gaming. Doom III doesn't do anything that you cannot do with the Unreal engine, and it will no doubt cost more to license. So why would developers use it?

The time has come for gamers to put away childish things and grow up a little beyond these stupid pecker contests. You just cannot compare two benchmarks nowadays without also comparing image quality. People should also be considering power requirements, thermal and cooling requirements, and so on. On all these accounts, the GeForce 6 loses. If you go out and buy a GeForce 6800 just because it runs faster in Doom III, you're a fool. End of line.
#20
"Nada" wrote in message:

> GeForce 4 MX will perform like a turd stuck in a toilet seat.

LOL, I love that description!

> Heck, even GeForce 3 will drown in the quicksand. I have no idea how
> much difference there is between the "medium detail" mode and the
> "high detail" mode, but I just refuse to believe that a GeForce 3
> would surf the game with high details. I couldn't even turn on all the
> details in Unreal 2 without diving to the bottom of the chart.

Well, when I read the article, I was under the impression myself that the game details would have to be turned down in order to get a decent playing experience with GeForce3 and Radeon 8500 cards. As to what low detail will actually look like, we will see. Not that I'm immediately inclined to find out myself, of course. :-)

As the release date for Doom 3 draws nearer, I for whatever reason find myself willing to loosen up the purse strings somewhat. Still, I'm going to wait and see if there are any technical or driver issues before taking the plunge. I very much look forward to seeing this newsgroup next week!