Aquamark 3 *again*


methylenedioxy
September 14th 03, 02:08 PM
http://www.driverheaven.net/articles/aquamark3/index2.htm

If you look at the jumps between the various nvidia driver releases, you
really do have to wonder what they are cutting back on. Look at ATI: it
gained only 1,000 points between Catalyst 3.6 and 3.7. Now look at the
equivalent nvidia driver jumps: between the 44.03 and 45.23 drivers the
score jumped by 11,000 POINTS!!! There's no way they aren't skimping on
features; you just can't do that without "optimising" drivers to such an
extent that something is being cut back. And then the jump between the
45.23 and 51.75 drivers is another 8,000 points, so in total, across two
driver releases, an astonishing jump of 19,000 points. Surely this speaks
volumes in itself?

not me
September 14th 03, 03:00 PM
"methylenedioxy" > wrote in message
...
> http://www.driverheaven.net/articles/aquamark3/index2.htm
>
> If you look at the differences in jumps between the various nvidia card
> drivers you really do have to wonder what they are cutting back on. Look
at
> Ati, it jumped only 1000 points between cat 3.6 and 3.7, now look at the
> equivalnet nvidia card driver jumps, between 44.03 drivers and 45.23 it
> jumped by 11,000 POINTS!!! theres no way that they aren't skimping on
> features, you just can't do that without really "optimising" drivers to
such
> an extent that something is bveing cut back on. Ansd then the jump between
> 45.23 and 51.75 drivers is another 8,000 points so in total between 2
driver
> releases an astonishing jump of 19,000 points, sureley this speaks volumes
> in itself????
>

Or the 44.03's were that **** poor...

fish
September 14th 03, 07:39 PM
I don't think that's the case. NV has a lot to lose and an ego to protect.
My guess is that they will cheat and lie in order to appear to be winners.

ATI is being gracious and has adopted a policy that ensures no current or
future driver will be optimized for a specific benchmark. It's a positive
beginning.

If Nvidia would just come clean and be honest, they would gain much more
fan support. But they continue to be deceptive. It really ruins my opinion
of them. Very sad.



"not me" > wrote in message
...
> "methylenedioxy" > wrote in message
> ...
> > http://www.driverheaven.net/articles/aquamark3/index2.htm
> >
> > If you look at the differences in jumps between the various nvidia card
> > drivers you really do have to wonder what they are cutting back on. Look
> at
> > Ati, it jumped only 1000 points between cat 3.6 and 3.7, now look at the
> > equivalnet nvidia card driver jumps, between 44.03 drivers and 45.23 it
> > jumped by 11,000 POINTS!!! theres no way that they aren't skimping on
> > features, you just can't do that without really "optimising" drivers to
> such
> > an extent that something is bveing cut back on. Ansd then the jump
between
> > 45.23 and 51.75 drivers is another 8,000 points so in total between 2
> driver
> > releases an astonishing jump of 19,000 points, sureley this speaks
volumes
> > in itself????
> >
>
> Or the 44.03's were that **** poor...
>
>

Mario Kadastik
September 14th 03, 08:29 PM
methylenedioxy wrote:
> http://www.driverheaven.net/articles/aquamark3/index2.htm
>
> [...] Between the 44.03 and 45.23 drivers the score jumped by 11,000
> POINTS!!! [...] Surely this speaks volumes in itself?
>
Well ... as I understand it, nVidia built the FX GPUs around different
pixel shader code than the DX9 spec calls for. That would mean that when
the raw DX9 code path is used, the card doesn't perform too well. Now if
they write a wrapper that rewrites the pixel shader code into the
nVidia-specific format, the hardware will run it a lot faster. And if the
nVidia-specific code is superior to the DX9 code (let's just make that
assumption), then it would be possible to increase the performance of
games through driver upgrades alone, and not just by 5% but by 50% or so.
(A toy sketch of that kind of rewrite is below.)
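
To make the idea concrete, here is a toy sketch in Python. It is purely
illustrative (a real driver recompiles shaders at a much lower level, and
the opcode subset and shader listing here are made up): a "wrapper" pass
that takes standard ps_2_0 pixel shader assembly and tags its arithmetic
instructions with the _pp (partial precision) modifier, hinting to the
hardware that fp16 is acceptable.

import re

# Illustrative subset of ps_2_0 arithmetic opcodes worth lowering.
ARITH_OPS = {"add", "mul", "mad", "dp3", "dp4", "rcp", "rsq"}

def lower_precision(asm):
    """Rewrite each arithmetic 'op dst, ...' into 'op_pp dst, ...'."""
    out = []
    for line in asm.splitlines():
        m = re.match(r"(\s*)([a-z0-9]+)(\s+.*)", line)
        if m and m.group(2) in ARITH_OPS:
            out.append(m.group(1) + m.group(2) + "_pp" + m.group(3))
        else:
            out.append(line)  # declarations, texture loads, moves: untouched
    return "\n".join(out)

shader = """ps_2_0
dcl t0.xy
texld r0, t0, s0
mul r0, r0, c0
mov oC0, r0"""

print(lower_precision(shader))  # the 'mul' comes out as 'mul_pp'

In many shaders the output would look the same pixel-for-pixel, but run
much faster on hardware that is slow at full precision.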

So why do you always assume that nVidia has to downgrade something else in
order to optimize game-specific code in the drivers? Maybe they just have
to write converters that give you the same output (quality-wise) but with
some additional work from the driver. And if there were a game that used
the nVidia-supported paths directly, it might be a lot faster than using
the standard DX9 paths on ATI. You don't know that. No one knows that
(except nVidia, and maybe some spies).

So just keep an open mind regarding driver optimization and don't bitch
around.

Mario

Chimera
September 15th 03, 07:25 AM
fish wrote:
> I don't think that's the case. NV has a lot to lose and an ego to
> protect. My guess is that they will cheat and lie in order to appear to
> be winners.
> [...]
> If Nvidia would just come clean and be honest, they would gain much more
> fan support. But they continue to be deceptive. It really ruins my
> opinion of them. Very sad.

Don't go demonising one company and praising another; both seem to have a
detachment between what they say and what they do.

fish
September 15th 03, 01:03 PM
I assume you meant demonizing, which is a strange word to describe my
comments and, in my opinion, incorrect.


"Chimera" > wrote in message
...
> fish wrote:
> > I don't think that's the case. NV has allot to loose and an ego to
> protect.
> > My guess it that they will cheat and lie in order to appear winners.
> >
> > ATI is being gracious and has begun a policy that assures that no
current
> or
> > future driver will be optimized for a specific benchmark. Its a positive
> > beginning.
> >
> > If Nvidia would just come clean and be honest they would gain much more
> fan
> > support. But they continue to be deceiving. It really ruins my opinion
of
> > them. Very sad.
>
> Dont go demonising one company and praising another, both seem to have a
> detachment between what they say and what they do.
>
>

Bratboy
September 15th 03, 09:21 PM
"Mario Kadastik" > wrote in message
...
> Well ... as I have understood nVidia has done the FX GPU-s with
> different pixel shader code than the DX9 spec says. That would mean that
> when raw DX9 code path is used the card will perform not too well. Now
> if they write a wrapper that will rewrite the pixel shader data to the
> nVidia specific format then the hardware will run the code a lot faster.


Nvidia CHOSE to ignore the standard; they wrote their own instead, and now
they complain that companies like Valve won't redo their work for Nvidia's
"special" method, so Nvidia's cards don't score as well. The idea behind a
standard is that EVERYONE (end users, card makers and game makers) can
rely on it, which isn't the case here. It's great if NV's implementation
works as well or better, but when you're going to sell a card and tout its
DX9 conformity/ability, it's a tad dishonest if it only does DX9 well when
programs do NOT use the standard. Heck, I think every card out there would
run games better if every company making a game wrote custom instructions
for each card, but that isn't what standards are supposed to require, as I
understood things. Using a "wrapper" to make a card that's supposed to be
a DX9 part out of the box run DX9 features just seems whacked to me.

Just my 2 cents
Just Jess

Dark Avenger
September 16th 03, 10:53 PM
> Nvidia CHOSE to ignore the standard; they wrote their own instead, and
> now they complain that companies like Valve won't redo their work for
> Nvidia's "special" method, so Nvidia's cards don't score as well.
> [...]
>
> Just my 2 cents
> Just Jess

My point too: game makers should just be able to write raw DX9 code (and
of course DX8 code for the older cards), and nVidia should JUST follow the
DX9 standard so that games can run normally.

Now they FORCE game makers to invest time and money in Nvidia's
standard... because the performance in STANDARD DX9 is hellishly
sluggish...

Tim
September 22nd 03, 02:42 PM
Nvidia can't run the standard DX9 path well. Microsoft made the
requirement at least 24-bit floating point precision, which is what ATI's
hardware provides. The NV3x hardware supports 12-bit, 16-bit and 32-bit,
so the only mode that meets the spec is 32-bit, and shader performance at
32-bit is very slow. So they basically make everything run the 16-bit
path, which isn't really DX9, but it's almost there. (Rough numbers on the
precision gap follow the quote below.)

Dark Avenger wrote:

> My point too: game makers should just be able to write raw DX9 code (and
> of course DX8 code for the older cards), and nVidia should JUST follow
> the DX9 standard so that games can run normally.
>
> Now they FORCE game makers to invest time and money in Nvidia's
> standard... because the performance in STANDARD DX9 is hellishly
> sluggish...
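
To put rough numbers on that gap, here is a small back-of-the-envelope
sketch in Python (purely illustrative; the mantissa widths are the usual
published ones: 10 bits for fp16, 16 for ATI's fp24, 23 for fp32):

# Relative rounding error is roughly 2**-(mantissa_bits + 1).
formats = {
    "fp16 (NV3x fast path)": 10,
    "fp24 (DX9 minimum, R3xx)": 16,
    "fp32 (NV3x slow path)": 23,
}

for name, mantissa_bits in formats.items():
    rel_err = 2.0 ** -(mantissa_bits + 1)
    print("%-26s ~%.1e relative error" % (name, rel_err))

# fp16 carries about 64x (2**6) more rounding error than the fp24
# minimum, which is why the 16-bit path "isn't really dx9".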

Dark Avenger
September 22nd 03, 06:09 PM
Tim > wrote in message >...
> Nvidia can't run the standard DX9 path well. Microsoft made the
> requirement at least 24-bit floating point precision, which is what
> ATI's hardware provides. [...] So they basically make everything run the
> 16-bit path, which isn't really DX9, but it's almost there.

Well, this time at least a reasonable answer. Yes indeed, nVidia CAN run
DX9... but then the performance is rather sluggish, and that is where game
programmers fall over. Why should game programmers have to program around
problems just because the card maker made a low-performance card at a high
price?

Hell, with the NV40... we'll probably see 2x8 shader pipelines... at that
moment, performance will not be a problem anymore.

The FX series, though... well... let's just say don't expect any fireworks
from them under DX9.

Nada
September 23rd 03, 09:56 AM
(Dark Avenger) wrote:
> Well, this time at least a reasonable answer. Yes indeed, nVidia CAN run
> DX9... but then the performance is rather sluggish, and that is where
> game programmers fall over.

The last time John Carmack fell over his chair was around last Christmas,
when Trent Reznor made paper planes out of his musical notebook and threw
them at Carmack's eyeglasses one by one.