
View Full Version : DX9 (HL2 & Doom3) on ATI vs Nvidia


Roger Squires
September 9th 03, 07:37 PM
Good article here:
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm
Nvidia is hurting and will have to lower IQ again to compete.

rms

Ben Pope
September 9th 03, 08:21 PM
Roger Squires wrote:
> Good article here:
>
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm
> Nvidia is hurting and will have to lower IQ again to compete.

OK, that's the 3rd time today (in the ATI group, at least)....

We've seen the link!!!!

No need to start a flame war by cross-posting. And no need to respond to
this if you just have bad things to say about the other
company/product/users.

:-P

Ben
--
I'm not just a number. To many, I'm known as a String...

@drian
September 9th 03, 09:38 PM
> OK, thats the 3rd time today (inthe ATI group, at least)....
>
> We've seen the link!!!!

LOL!

@drian.

ginfest
September 10th 03, 12:28 AM
"@drian" > wrote in message
...
> > OK, thats the 3rd time today (inthe ATI group, at least)....
> >
> > We've seen the link!!!!
>
> LOL!
>
> @drian.
>
Evidently he regrets purchasing his ATI card.


---
Outgoing mail is certified Virus Free.
Checked by AVG anti-virus system (http://www.grisoft.com).
Version: 6.0.516 / Virus Database: 313 - Release Date: 9/01/03

Strontium
September 10th 03, 12:49 AM
-
ginfest stood up at show-n-tell, in [email protected], and
said:

> "@drian" > wrote in message
> ...
>>> OK, thats the 3rd time today (inthe ATI group, at least)....
>>>
>>> We've seen the link!!!!
>>
>> LOL!
>>
>> @drian.
>>
> Evidentially he regrets purchasing his ATI card.

Why would he 'evident[ly]' regret purchasing an ATI card?
From the article:

I don’t know how anyone could objectively look at the performance we’ve seen
in these benchmarks and conclude that a 5900 Ultra is a smarter buying
decision than a 9800 Pro – even the 128MB 9800 Pro (as used in the tests
here) trumps the lofty 256MB 5900 Ultra. If you’re still “stuck in the past”
and think that ATI is plagued with driver issues, then go ahead and keep
your head stuck in the sand like an ostrich, buy a 5900 Ultra and then start
crying when your pals are smoking your ass in games like Half Life 2 and
Halo because they’re running ATI hardware.

What can NVIDIA do right now to turn things around? First off, lower the
damn price of its high-end cards – even if they were priced the same, a 9800
Pro would be a better choice for Pixel Shader 2.0 performance. Secondly,
they’d better pull out every stop in their playbook of tricks to fine-tune
their drivers.

We have no doubt that NVIDIA is hard at work right now on its next-gen
silicon, which will undoubtedly be extremely fast – for their sake and the
sake of gamers like us, it had better be!





--
Strontium

"It's no surprise, to me. I am my own worst enemy. `Cause every
now, and then, I kick the livin' **** `outta me." - Lit

Ben Pope
September 10th 03, 09:54 AM
ginfest wrote:
> "@drian" > wrote in message
> ...
>>> OK, thats the 3rd time today (inthe ATI group, at least)....
>>>
>>> We've seen the link!!!!
>>
>> LOL!
>>
>> @drian.
>>
> Evidentially he regrets purchasing his ATI card.

Actually I don't. I'm incredibly impressed by the speed and visual quality
of the graphics. Turning on 2xAA works, for a start. I could have spent
more money on an nVidia card and had less speed and lower graphics
quality, but that sounds like a lose, lose, lose situation to me.

Admittedly, Linux 3D driver support is far better from nVidia than ATI, but
it's not like I'll be giving up Windows completely and playing games in
Linux. Not for a considerable while, anyway. For a start, most games
don't even support Linux. I have very little use for 3D graphics in Linux.

I'm happy with my purchase, and if somebody offered me the two cards (nVidia
5900 Ultra vs 9800 Pro) at the same price (Under £300) - I'd go with the ATI
again.

Thank you.

Ben
--
I'm not just a number. To many, I'm known as a String...

Ben Pope
September 10th 03, 09:55 AM
Strontium wrote:
> Why would he 'evident[ly]' regret purchasing an ATI card?

Regretful nVidia purchaser in denial.

Ben
--
I'm not just a number. To many, I'm known as a String...

methylenedioxy
September 10th 03, 11:13 AM
"Ben Pope" > wrote in message
...
> ginfest wrote:
> > "@drian" > wrote in message
> > ...
> >>> OK, thats the 3rd time today (inthe ATI group, at least)....
> >>>
> >>> We've seen the link!!!!
> >>
> >> LOL!
> >>
> >> @drian.
> >>
> > Evidentially he regrets purchasing his ATI card.
>
> Actually I don't. I'm incredibly impressed by the speed and visual quality
> of the graphics. Turning on 2xAA works for a start. I could have spent
> more money on an nVidia and card and had less speed and less graphics
> quality, but that sounds like a lose, lose, lose situation to me.
I'm just wondering what card you have? I don't run ANY games at less than
6xAA and they run superbly (apart from Command & Conquer: Generals, that is,
but that's badly coded anyway - you could have a super, super system and it
still wouldn't run well).

Ben Pope
September 10th 03, 11:20 AM
methylenedioxy wrote:
> I'm just wondering what card you have? I don't run ANY games at less
> than 6XAA and they run superb (apart from command and conquer
> generals that is, but that's badly coded anyway, could have a super
> super system and it wouldn't run)

I'm using a Crucial 9800 Pro.

I've only really played a couple of games... GTA Vice City and Splinter
Cell, both at 1600x1200 with 6xAA and 8xAF, everything set to quality,
and no slowdowns whatsoever. Some of the scenes in Splinter Cell are very
pretty - some really nice lighting effects - I haven't played it very much,
to be honest.

That was when I had my XP2500 at 200MHz x 11 - it's at default at the
moment, pending troubleshooting of the occasional error in Prime95. I don't
expect any slowdowns at this CPU speed though.

Ben
--
I'm not just a number. To many, I'm known as a String...

methylenedioxy
September 10th 03, 11:27 AM
"Ben Pope" > wrote in message
...
> methylenedioxy wrote:
> > I'm just wondering what card you have? I don't run ANY games at less
> > than 6XAA and they run superb (apart from command and conquer
> > generals that is, but that's badly coded anyway, could have a super
> > super system and it wouldn't run)
>
> I'm using a Crucial 9800 Pro.
>
> I've only really played a couple of games... GTA Vice City and Splinter
> Cell, both at 1600x1200 with 6xAA and 8xAF, and everything set to quality
> and no slow downs whatsoever. Some of the scenes in Splinter Cell are very
> pretty - some really nice lighting effects - haven't played it very much to
> be honest.
>
> That was when I had my XP2500 at 200MHz x 11 - it's at default at the
> moment, pending troubleshooting of the occasional error in Prime95. I don't
> expect any slowdowns at this CPU speed though.
>
> Ben
> --
> I'm not just a number. To many, I'm known as a String...
>
Interesting read. Were you aware that Splinter Cell doesn't actually support
AA? It also makes the game go "wonky" - as in, not run properly - with AA
switched on...

Ben Pope
September 10th 03, 12:05 PM
methylenedioxy wrote:
> "Ben Pope" > wrote in message
> ...
>>
>> I'm using a Crucial 9800 Pro.
>>
>> I've only really played a couple of games... GTA Vice City and
>> Splinter Cell, both at 1600x1200 with 6xAA and 8xAF, and everything
>> set to quality and no slow downs whatsoever. Some of the scenes in
>> Splinter Cell are very pretty - some really nice lighting effects -
>> haven't played it very much to be honest.
>>
>> That was when I had my XP2500 at 200MHz x 11 - it's at default at the
>> moment, pending troubleshooting of the occasional error in Prime95.
>> I don't expect any slowdowns at this CPU speed though.
>>
> Interesting read, were you aware that Splinter Cell doesn't actually
> support AA? It also makes the game go "wonky" as in not run properly
> with AA switched on...

I couldn't find the options in either game - I forced it in the ATI control
panel and it does work...

No wonkiness here from what I've seen.

Ben
--
I'm not just a number. To many, I'm known as a String...

Johannes Tümler
September 10th 03, 01:58 PM
> "DX9 (HL2 & Doom3) on ATI vs Nvidia"

Doom3 is OpenGL not DirectX.

methylenedioxy
September 10th 03, 02:14 PM
"Johannes Tümler" > wrote in message
...
> > "DX9 (HL2 & Doom3) on ATI vs Nvidia"
>
> Doom3 is OpenGL not DirectX.
>
Regardless, it still uses pixel shaders....

Lenny
September 10th 03, 03:40 PM
> Interesting read, were you aware that Splinter Cell doesn't actually
> support AA? It also makes the game go "wonky" as in not run properly with
> AA switched on...

The "wonkiness" issue with Splinter Cell is exactly the same one people were
yelling and screaming about regarding Half-Life 2. The game packs several
different textures into one, so at some polygon angles the texturing
hardware will read texels belonging to a different sub-texture when
multisample antialiasing is used. That's what causes the "wonkiness". It can
be circumvented using various techniques, the simplest being for the driver
to silently ignore your setting to force AA on...
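
The texel-bleed mechanism described above can be sketched in a few lines of
Python (a toy model, not the actual hardware or engine code: the 4x8 atlas,
the texel values, and a plain bilinear fetch standing in for AA sample
placement are all assumptions made for illustration):

```python
import numpy as np

# Toy 4x8 "atlas": left half is sub-texture A (0.0), right half is
# sub-texture B (1.0), packed side by side as the game packs its textures.
atlas = np.zeros((4, 8))
atlas[:, 4:] = 1.0

def sample_bilinear(tex, u, v):
    """Bilinear fetch in normalized [0,1] coords, clamped at the atlas
    edge - roughly what the texture unit does for a single sample."""
    h, w = tex.shape
    x, y = u * w - 0.5, v * h - 0.5
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    fx, fy = x - x0, y - y0
    def texel(i, j):
        return tex[min(max(j, 0), h - 1), min(max(i, 0), w - 1)]
    return ((1 - fx) * (1 - fy) * texel(x0, y0) +
            fx * (1 - fy) * texel(x0 + 1, y0) +
            (1 - fx) * fy * texel(x0, y0 + 1) +
            fx * fy * texel(x0 + 1, y0 + 1))

# Well inside sub-texture A: only A's texels are read.
print(sample_bilinear(atlas, 0.40, 0.5))   # 0.0
# Right at A's edge: the filter footprint straddles the boundary and
# blends in texels that belong to B - the visible "wonkiness".
print(sample_bilinear(atlas, 0.499, 0.5))  # > 0, bled in from B
```

The same thing happens with real AA when a sample position lands just
outside the sub-texture's region of the atlas.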

Lenny
September 10th 03, 03:44 PM
> R350 Pixel shader program length is "unlimited" on the R350, a significant
> improvement from 160 on the R300 - this is the limitation John Carmack
> mentions in the above article.

Except that Doom3 will have NO problems whatsoever even with a
160-instruction pixel shader limit. If a game used 160-instruction shaders
throughout, on every single polygon, it would run so slowly it wouldn't be
playable at all. The NV30's 1024-instruction shaders just make things much,
Much, MUCH worse, of course (especially coupled with the NV3x architecture's
inherently lower shader execution speed).
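
The claim above is easy to sanity-check with back-of-envelope arithmetic.
All hardware figures here are assumptions (roughly R350-class: 8 pixel
pipelines, ~380 MHz, ~1 shader instruction per pipeline per clock), and real
scenes with overdraw would come out slower still:

```python
# Assumed figures, roughly R350-class hardware.
pipes = 8
clock_hz = 380e6
instructions_per_pixel = 160

# Pixels shaded per second if every pixel ran a 160-instruction shader.
pixels_per_sec = pipes * clock_hz / instructions_per_pixel

# Frames per second at 1024x768, assuming zero overdraw.
fps = pixels_per_sec / (1024 * 768)
print(round(fps, 1))  # roughly mid-20s fps before overdraw
```

With typical 2-3x overdraw that drops well under 15 fps, which is the point:
nobody ships 160-instruction shaders on every polygon.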

Ben Pope
September 10th 03, 04:14 PM
Lenny wrote:
>> R350 Pixel shader program length is "unlimited" on the R350, a
>> significant improvement from 160 on the R300 - this is the
>> limitation John Carmack mentions in the above article.
>
> Except, Doom3 will have NO problems whatsoever even with a 160
> instruction pixel shader. If a game used 160 instruction shaders
> throughout on every single polygon, it would run so slow it wouldn't
> be playable at all. 1024 instruction shaders of NV30 just makes
> things much Much MUCH worse of course (especially coupled with NV3x
> architecture's inherently lower shader execution speed).

Indeed, I'm sure that you can produce some pretty amazing effects with 1024
or even "unlimited" instructions, but then, you can probably produce some
pretty amazing effects with 100 instructions and you'd probably be hard
pushed to tell the difference if done correctly.

I haven't looked into the details of pixel shaders so I don't really know
what sort of instruction sets are available.

DX9.0 specifies 192 instructions for Pixel Shaders - I can't imagine DX10
specifying more than 1024, but who knows.

Ben
--
I'm not just a number. To many, I'm known as a String...

Jaymeister
September 10th 03, 08:30 PM
At the end of the day it's not about the hardware but the driver support.
I have been plagued by ATI driver issues in the past and have found Nvidia
drivers to be spot on, but if you ask me (which you didn't), 3DFX were
always No. 1 - until, of course, Nvidia bought them up and binned them.
I use Nvidia hardware, but I will never forgive them :-)

Jaymeister

N.B. My old Voodoo 3 is still kicking ass on my kids'
computer... the legend lives on!!!



"Roger Squires" > wrote in message
.. .
> Good article here:
>
http://www.gamersdepot.com/hardware/video_cards/ati_vs_nvidia/dx9_desktop/001.htm
> Nvidia is hurting and will have to lower IQ again to compete.
>
> rms
>
>



Ben Pope
September 10th 03, 08:37 PM
Jaymeister wrote:
> At the end of the day its not about the hardware but the driver
> support.
> I have been plagued by ATI driver issues in the past and have found

Pre-Catalyst? Things have changed...

> Nvidia drivers to be spot on but if you ask me (which you didn't)
> 3DFX were always
> No.1 until of course, Nvidia bought them up and binned them.
> I use Nvidia hardware but I will never forgive them :-)

Hehe, 3DFX had some good programmers...

Ben
--
I'm not just a number. To many, I'm known as a String...

tq96
September 11th 03, 07:42 AM
>> Interesting read, were you aware that Splinter Cell doesn't actually
>> support AA?

> The "wonkyness" issue with SS is the exact same as the one people were
> yelling and screaming so much about regarding half-life 2. The game

I've encountered several games with a GF4 that didn't work properly when
FSAA was enabled. In each case, switching from multisampling to
supersampling (via aTuner) made all of the wonkiness go away. Surely the
ATI cards also support some form of pure supersampling?

Ben Pope
September 11th 03, 11:27 AM
tq96 wrote:
> I've encountered several games with a GF4 that didn't it when FSAA was
> enabled. In each case switching from multisampling to supersamlping
> (via aTuner) caused all of the wonkiness to go away. Surely the ATI
> cards also support some form of pure supersampling?

Yeah - it's called running at a higher resolution :-)

There doesn't appear to be an option for it in the control panel.

Ben
--
I'm not just a number. To many, I'm known as a String...

tq96
September 11th 03, 06:47 PM
>> (via aTuner) caused all of the wonkiness to go away. Surely the ATI
>> cards also support some form of pure supersampling?
>
> Yeah - it's called running at a higher resolution :-)

That's always an option, unless you're using TV-Out. The standard
resolution of a television is very close to 640x480, and applying 4x
supersampling on top of that makes a world of difference.
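
For what it's worth, "pure" supersampling really is the
higher-resolution trick with an averaging step bolted on: render at twice
the width and height, then box-filter each 2x2 block down to one pixel.
A minimal sketch (NumPy, with a synthetic hard edge standing in for a
rendered frame - not how any driver actually implements it):

```python
import numpy as np

def downsample_2x(img):
    """Average each 2x2 block: the resolve step of 4x ordered-grid SSAA."""
    h, w = img.shape
    return img.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Synthetic 960x1280 "frame" with a hard vertical edge at column 639.
hi = np.zeros((960, 1280))
hi[:, 639:] = 1.0

lo = downsample_2x(hi)        # 480x640, e.g. for TV-out
# The pixel whose 2x2 footprint straddles the edge averages both sides,
# so the edge comes out smoothed instead of jagged.
print(lo.shape, lo[0, 319])   # (480, 640) 0.5
```

Unlike multisampling, every sample here gets its own full texture fetch,
which is also why supersampling sidesteps the texture-atlas "wonkiness"
discussed earlier in the thread.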