Brilinear filtering.....What!!


Jack
November 2nd 03, 07:38 AM
Hi there

I was amazed reading this.
http://www.3dcenter.org/artikel/2003/10-26_a_english.php
Nvidia has planned bad pic quality all along.
It's just numbers with them apparently.
Well excuse me then for giving them a number for quality performance. A
zero.
And another one for cheating...a very high one.

I think new vid card buyers, gamers especially must look to others to find
their card (ATI)

Way to go Nvidia.....make it even worse and see if you can get away with it.
They could try.

BTW. written when drunk. So commend me.

BYE

Jack

CHEEZEMURDA
November 2nd 03, 11:31 AM
/me slaps Jack around a bit with a large trout.


OMG..tell someone who cares.

what a ****in loser

TMack
November 2nd 03, 12:14 PM
"CHEEZEMURDA" <getoffmydick.com> wrote in message
...
> /me slaps Jack around a bit with a large trout.
>
>
> OMG..tell someone who cares.
>
> what a ****in loser

Couldn't agree more - we don't need any discussion of image quality - this is
an nvidia group, nobody here cares about image quality, do they?

Tony

somnambulist
November 2nd 03, 12:48 PM
TMack wrote:
> "CHEEZEMURDA" <getoffmydick.com> wrote in message
> ...
>> /me slaps Jack around a bit with a large trout.
>>
>>
>> OMG..tell someone who cares.
>>
>> what a ****in loser
>
> Couldn't agree more - we don't need any discussion of image quality -
> this is an nvidia group, nobody here cares about image quality, do they?

Definitely not. I'd have bought an ATI card if I did.

--
somnambulist

Nic
November 2nd 03, 12:49 PM
> Definitely not. I'd have bought an ATI card if I did.

Just came from ATi - no difference...

somnambulist
November 2nd 03, 01:52 PM
Nic wrote:
>> Definitely not. I'd have bought an ATI card if I did.
>
> Just came from ATi - no difference...

<hey Hank, I got a bite>

IYHO

I can tell the difference though. Which is why I've got a Radeon 9700 Pro
in the main gaming machine, 440mx SE in another machine and a Ti4600 in
another.

YMMV and all that crap though.

--
somnambulist

Flow
November 2nd 03, 02:33 PM
I always found Nvidia cards have bad image quality.
Nevertheless I bought some of them and they are fast.
And I will keep on buying them if they hold up to what they have been doing
already.


"Jack" > schreef in bericht
...
> Hi there
>
> I was amazed reading this.
> http://www.3dcenter.org/artikel/2003/10-26_a_english.php
> Nvidia has planned bad pic quality all along.
> It's just numbers with them apparently.
> Well excuse me then for giving them a number for quality performance. A
> zero.
> And another one for cheating...a very high one.
>
> I think new vid card buyers, gamers especially must look to others to find
> their card (ATI)
>
> Way to go Nvidia.....make it even worse and see if you can get away with it.
> They could try.
>
> BTW. written when drunk. So commend me.
>
> BYE
>
> Jack
>
>

Nic
November 2nd 03, 03:50 PM
> I can tell the difference though. Which is why I've got a Radeon 9700 Pro

Me neither - none at all.

somnambulist
November 2nd 03, 03:53 PM
Nic wrote:
>> I can tell the difference though. Which is why I've got a Radeon
>> 9700 Pro
>
> Me neither - none at all.

Which is why IYHO, YMMV and all that crap.

--
somnambulist

Lenny
November 2nd 03, 04:05 PM
> > I can tell the difference though. Which is why I've got a Radeon 9700 Pro

> Me neither - none at all.

You probably don't know what to look for. The diff between bilinear and
trilinear is as night and day once it's been pointed out to you.
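For anyone wondering what the actual difference is, here's a rough sketch in
Python -- a toy illustration, not any driver's or chip's real code, and the
1-D texture, the function names and the lod values are all made up: bilinear
filtering samples only the nearest mip level, so the image visibly "snaps"
where the level changes, while trilinear blends the two nearest levels and
hides that seam.

import math

def sample_bilinear(mips, level, u):
    """Toy 1-D lookup: linearly interpolate within a single mip level."""
    tex = mips[min(level, len(mips) - 1)]
    x = u * (len(tex) - 1)
    i = int(x)
    f = x - i
    j = min(i + 1, len(tex) - 1)
    return tex[i] * (1.0 - f) + tex[j] * f

def sample_trilinear(mips, lod, u):
    """Blend the two mip levels that bracket the level-of-detail value."""
    lo = int(math.floor(lod))
    frac = lod - lo
    a = sample_bilinear(mips, lo, u)
    b = sample_bilinear(mips, lo + 1, u)
    return a * (1.0 - frac) + b * frac

# Toy mip chain: a striped texture, each level averaging pairs from the last.
mips = [[float(i % 2) for i in range(16)]]
while len(mips[-1]) > 1:
    prev = mips[-1]
    mips.append([(prev[i] + prev[i + 1]) / 2 for i in range(0, len(prev), 2)])

# Crossing lod = 0.5, nearest-mip bilinear jumps in one step (the visible
# mip band); trilinear fades smoothly across the boundary.
for lod in (0.4, 0.49, 0.51, 0.6):
    print(lod, sample_bilinear(mips, round(lod), 0.0),
          sample_trilinear(mips, lod, 0.0))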

Darkfalz
November 2nd 03, 04:20 PM
"somnambulist" > wrote in message
...
> Nic wrote:
> >> I can tell the difference though. Which is why I've got a Radeon
> >> 9700 Pro
> >
> > Me neither - none at all.
>
> Which is why IYHO, YMMV and all that crap.

The only really horrible quality of NVIDIA is their 16 bit post filtering.
Coming from a Voodoo5, where the difference between 16 bit and 32 bit was
hardly noticeable (except in performance), I couldn't believe how ugly and
obvious the 16 bit dithering matrix was on my new Geforce FX. It renders 16
bit unusable on NVIDIA cards (I know I know, we should be using 32 bit
anyway).
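For the curious, here is roughly where that visible matrix pattern comes from
-- a made-up Python sketch, not NVIDIA's or 3dfx's actual hardware, and the
BAYER_4X4 table and rgb888_to_rgb565 names are just for illustration: when
8-bit-per-channel colour is squeezed into 16-bit RGB565, an ordered (Bayer)
dither adds a fixed, screen-aligned offset before truncation, so flat areas
pick up a regular pattern. 3dfx's post filter smoothed that pattern back out,
which is presumably why it was so much less visible there.

BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_channel(value, x, y, bits):
    # Quantise an 8-bit channel down to `bits` bits with ordered dithering:
    # a screen-position-dependent threshold decides which way to round.
    step = 256 >> bits
    threshold = (BAYER_4X4[y % 4][x % 4] + 0.5) / 16.0
    v = min(255, int(value + threshold * step))
    return v >> (8 - bits)

def rgb888_to_rgb565(r, g, b, x, y):
    return (dither_channel(r, x, y, 5) << 11 |
            dither_channel(g, x, y, 6) << 5 |
            dither_channel(b, x, y, 5))

# A flat mid-grey: the same input colour lands on different 16-bit values
# depending only on screen position, i.e. a fixed, visible pattern.
for y in range(2):
    print([rgb888_to_rgb565(100, 100, 100, x, y) for x in range(4)])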

Lenny
November 2nd 03, 06:39 PM
> The only really horrible quality of NVIDIA is their 16 bit post filtering.

....Which isn't so strange, because it doesn't HAVE any! :)

> Coming from a Voodoo5, where the difference between 16 bit and 32 bit was
> hardly noticible (except in performance)

Well, it would be considerably more noticeable if you ran some modern
software on the thing that really uses lots of texture layers. Older
software was understandably more conservative in the way of such things,
since fillrate didn't exactly grow on trees back then. :)

> I couldn't believe how ugly and
> obvious the 16 bit dithering matrix was on my new Geforce FX.

Why are you running games in 16-bit anyway? It's not any faster, or
at least not more than marginally.
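A quick way to see why the layer count matters so much -- toy numbers, not
from any real game or chip, and the quantize/blend_layers names and the
simple modulate blend are assumptions for illustration: in a 16-bit pipeline
each blend pass gets rounded to the framebuffer's coarse precision, so the
rounding error compounds with every texture layer instead of happening once
at the end.

def quantize(v, bits=5):
    # Round a [0, 1] value to the nearest level representable in `bits` bits,
    # roughly what storing an intermediate result in a 16-bit buffer does.
    levels = (1 << bits) - 1
    return round(v * levels) / levels

def blend_layers(layers, bits=None):
    # Multiply texture layers together (a simple modulate blend), optionally
    # re-quantizing after every pass as a 16-bit framebuffer would.
    result = 1.0
    for layer in layers:
        result *= layer
        if bits is not None:
            result = quantize(result, bits)
    return result

layers = [0.9, 0.8, 0.7, 0.6, 0.9, 0.8]   # made-up per-layer texel values
exact = blend_layers(layers)               # full precision throughout
coarse = blend_layers(layers, bits=5)      # re-quantized every pass
print(round(exact, 4), round(coarse, 4), round(abs(exact - coarse), 4))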

Derek Wildstar
November 2nd 03, 07:29 PM
"Jack" > wrote in message
...
> Hi there

>
> BTW. written when drunk. So commend me.
>


The original article is an exercise in eye strain and brain strain. Damn
Teutonic translators. Only a German could be so discourteous to his verbs.

Oh, and btw, post-filter anisotropic filtering kicks ass... this 20th-century
bilinear and trilinear crap is for Luddites.

somnambulist
November 2nd 03, 07:46 PM
Lenny wrote:
>>> I can tell the difference though. Which is why I've got a Radeon
>>> 9700 Pro
>
>> Me neither - none at all.
>
> You probably don't know what to look for. The diff between bilinear
> and trilinear is as night and day once it's been pointed out to you.

Just a little bit!!

--
somnambulist

phobos
November 3rd 03, 01:04 AM
Jack wrote:
> Hi there
>
> I was amazed reading this.
> http://www.3dcenter.org/artikel/2003/10-26_a_english.php
> Nvidia has planned bad pic quality all along.
> It's just numbers with them apparently.
> Well excuse me then for giving them a number for quality performance. A
> zero.
> And another one for cheating...a very high one.
>
> I think new vid card buyers, gamers especially must look to others to find
> their card (ATI)
>
> Way to go Nvidia.....make it even worse and see if you can get away with it.
> They could try.
>
> BTW. written when drunk. So commend me.
>
> BYE
>
> Jack
>
>

I read that, and I think a lot of these sites are WAY overemphasizing
filtering stages and confusing bilinear/trilinear filtering. They're
acting as if NVidia is just throwing away trilinear filtering.

What's happening (especially when these sites post shots showing colored
mip bands) is that instead of filtering every single texel, often you
don't need to. It's an error level. For example, if a pixel is within
a value of, say, 0.03 of its next nearest filtered neighbor, you can
safely skip its filtering without affecting the final image quality. Or
you can calculate every single sample and come out with the exact same
visual result.

This kind of optimization is a good blend of mathematical reduction
along with some very unnoticeable visual degradation under extreme
circumstances (such as reviewers nitpicking mip banding shots). In real
game play it makes no difference. This is the kind of balanced decision
every video card manufacturer should NOT be afraid to make, but ARE due
to the weasel word "cheating" being tossed about whenever video drivers
are concerned.

Another practical consideration for this change in filtering is that NV
analyzed the vast majority of situations in which you can safely
adjust the error bounds of the mipmap levels without affecting image
quality, significantly reducing the load on anisotropic filtering.

Anyone who has done a bit of raytracing with programs such as POV-Ray
will know this concept of an error level. You don't need to trace
every single sample; if its color is within some small delta (say 0.1) of
the neighboring one, you can reuse it and speed up rendering. The same
applies to fillrate and video cards.
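To put a number on that idea, here's a toy Python sketch of the reduced blend
the article calls "brilinear" -- not anything from an actual driver, and the
brilinear_weight name and the 0.2 band width are made up: blend the two mip
levels only in a narrow band around the transition, and fall back to a
single, cheaper bilinear tap everywhere else.

def brilinear_weight(frac, band=0.2):
    # Map the trilinear blend fraction onto a narrower blend band.
    # Outside the band we return 0 or 1 (one mip level, one bilinear tap);
    # inside it we rescale and still do the full two-level blend.
    lo = 0.5 - band / 2
    hi = 0.5 + band / 2
    if frac <= lo:
        return 0.0
    if frac >= hi:
        return 1.0
    return (frac - lo) / (hi - lo)

# Most of the transition now costs one texture tap instead of two; only the
# narrow band around frac = 0.5 still pays for the full blend.
for frac in (0.1, 0.45, 0.5, 0.55, 0.9):
    print(frac, round(brilinear_weight(frac), 3))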

phobos
November 3rd 03, 01:08 AM
Lenny wrote:

>>The only really horrible quality of NVIDIA is their 16 bit post filtering.
>
>
> ...Which isn't so strange, because it doesn't HAVE any! :)
>
>
>>Coming from a Voodoo5, where the difference between 16 bit and 32 bit was
>>hardly noticeable (except in performance)
>
>
> Well, it would be considerably more noticeable if you ran some modern
> software on the thing that really uses lots of texture layers. Older
> software was understandably more conservative in the way of such things,
> since fillrate didn't exactly grow on trees back then. :)
>
>
>>I couldn't believe how ugly and
>>obvious the 16 bit dithering matrix was on my new Geforce FX.
>
>
> Why are you running games in 16-bit anyway? It's not any faster, or
> at least not more than marginally.
>
>

FYI for whoever cares -- the NVidia/3DFX acquisition was strictly over
IP, patents, and algorithms. None of the previous hardware
features from the Voodoo series were "ported over" to the Geforce FX.

No 22-bit post filter, no TMU design, nothing. The connection lies
solely in the underlying chip design, some culling methods (likely), and
possibly some MSAA algorithms integrated into new FSAA modes.

Derek Wildstar
November 3rd 03, 03:26 AM
"phobos" > wrote in message
...


> What's happening (especially when these sites post shots showing colored
> mip bands) is that instead of filtering every single texel, often you
> don't need to. It's an error level. For example, if a pixel is within
> a value of, say, 0.03 of its next nearest filtered neighbor, you can
> safely skip its filtering without affecting the final image quality. Or
> you can calculate every single sample and come out with the exact same
> visual result.
>
> This kind of optimization is a good blend of mathematical reduction
> along with some very unnoticeable visual degradation under extreme
> circumstances (such as reviewers nitpicking mip banding shots). In real
> game play it makes no difference.

Regrettably, this isn't true in all applications. One of the more immersion-
destroying artifacts, and a direct result of this driver behavior, is the
pervasive texture shimmering (texture aliasing) in FS9. What happens is
exactly what the article's author describes as the 'bow wave' of texel error
(that you describe): as the angle of incidence to the texel increases,
the chances increase that the texel will change color, then revert, and go
through this rapid oscillation until the texel is firmly within a 'band'.

All this occurs regardless of frame rate, fill rate, or bandwidth
availability. It's a bad mathematical function that is *not* representative
of the world we see. In most cases, lighting and other texture processing
can make the texel shifting moot, but in FS9 it can be a real deal
breaker.

There is a solution, and it's anisotropic filtering, but for most people I
suspect the hardware can't deal with all that sampling and maintain their
current levels of performance. Bad filtering = forced upgrade! That's the
co-conspirator in me talking.
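A small numeric illustration of that oscillation -- the numbers are made up
and have nothing to do with FS9's engine, and hard_select/smooth_blend are
just illustrative names: when the computed mip fraction hovers near a hard
cut-off, tiny frame-to-frame changes flip the sampled level back and forth,
which reads as shimmering, while a full blend only wobbles slightly.

def hard_select(frac, near_val, far_val, cutoff=0.5):
    # Pick a single mip level: cheap, but it flips abruptly at the cut-off.
    return near_val if frac < cutoff else far_val

def smooth_blend(frac, near_val, far_val):
    # Full blend between the two levels: changes gradually with frac.
    return near_val + frac * (far_val - near_val)

near_val, far_val = 0.2, 0.8   # toy texel values from two adjacent mip levels
for frame in range(6):
    frac = 0.5 + 0.02 * (-1) ** frame   # jitters slightly around the cut-off
    print(frame, hard_select(frac, near_val, far_val),
          round(smooth_blend(frac, near_val, far_val), 3))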

Falkentyne
November 3rd 03, 11:39 AM
On Sun, 2 Nov 2003 14:33:00 +0100, "Flow" >
enlightened us by scribbling this gem of wisdom:

>I always found Nvidia cards have bad image quality.
>Nevertheless I bought some of them and they are fast.
>And I will keep on buying them if they hold up to what they have been doing
>already.
>

Now where exactly have I heard these words before......?




<Gibs> When you kill 6 people in Unreal Tournament
it is "MonsterKill", In Quake3 it is "Excellent",
in Counter-Strike it is "Kicked by console"

Falkentyne
November 4th 03, 05:32 AM
On Mon, 03 Nov 2003 02:26:44 GMT, "Derek Wildstar" >
enlightened us by scribbling this gem of wisdom:


>Regrettably, this isn't true in all applications. One of the more immersion-
>destroying artifacts, and a direct result of this driver behavior, is the
>pervasive texture shimmering (texture aliasing) in FS9. What happens is
>exactly what the article's author describes as the 'bow wave' of texel error
>(that you describe): as the angle of incidence to the texel increases,
>the chances increase that the texel will change color, then revert, and go
>through this rapid oscillation until the texel is firmly within a 'band'.
>
>All this occurs regardless of frame rate, fill rate, or bandwidth
>availability. It's a bad mathematical function that is *not* representative
>of the world we see. In most cases, lighting and other texture processing
>can make the texel shifting moot, but in FS9 it can be a real deal
>breaker.
>
>There is a solution, and it's anisotropic filtering, but for most people I
>suspect the hardware can't deal with all that sampling and maintain their
>current levels of performance. Bad filtering = forced upgrade! That's the
>co-conspirator in me talking.

You have a valid point there.
Indeed, anisotropic filtering (on the Ti series cards) is simply too
hard on the hardware in almost any game made from 2003 to the present.
It's great in older games, but even then, turn on 4x FSAA and 8x AF and
you will have trouble even in games going back as far as 1999,
depending on the resolution, of course.

Sure, the FX cards can do better, but in the newest games even this
isn't going to be possible (try playing 1024x768 with 6x FSAA and 8x (or
16x?) AF in Half-Life 2 or Doom 3... good luck!). Though admittedly it
was the same when the GF4 hit the streets: 4x FSAA at 1024x768 and even
higher was great in every game on the market, until UT2003 came along.

Anyway, the point is, you should, at the LEAST, have an option to turn
on full image quality, if you _WANT_ to, and not have lesser quality
forced upon you, saying "the user doesn't need it/ can't tell the
difference".

I hate to say it, but this is "3dfx" all over again.

Now yes, 3dfx's 16-bit filtering was the *BEST* of any video card
(even better than ATI's, which is better than any Nvidia card's--which
have NONE), but you remember their "32-bit is too hard on the
hardware, and you can't tell the difference easily anyway--gamers
don't need it right now, they need speed, framerate is king"....

Hmm...

Of course, back then games were very forgiving as to the # of texture
layers, and that, combined with the TNT/GF's lack of 16-bit filtering,
made, for example, Unreal on a Voodoo2 at 16-bit look just as good as
on a TNT @ 32-bit (if it even RAN...). But developers had their hands
tied in making 16-bit look good. Remember Carmack's article about
the precision errors in 32-bit rendering as compared to 128-bit?

Anyway, why don't the quality optimizations turn on full trilinear (which
is also allowed in RivaTuner), like in the older drivers?

And the OTHER people saying "if they wanted image quality, they
would have gone with ATI" is just TOTAL teenager bullsh*t.

Those SAME stupid kids were BASHING 3dfx for their 16-bit, for not having
the superior 32-bit quality (pre-V5), and now they're saying the highest
IQ isn't important, now that they don't have it and ATI does. Can
you say......FANBOY?

Some people just aren't very clever....


<Gibs> When you kill 6 people in Unreal Tournament
it is "MonsterKill", In Quake3 it is "Excellent",
in Counter-Strike it is "Kicked by console"

Granulated
November 4th 03, 02:42 PM
On Sun, 02 Nov 2003 15:05:31 GMT "Lenny" > meeped :

>
>> > I can tell the difference though. Which is why I've got a Radeon 9700 Pro
>
>> Me neither - none at all.
>
>You probably don't know what to look for. The diff between bilinear and
>trilinear is as night and day once it's been pointed out to you.
>


especially with regard to MIP banding "effects"

Granulated
November 4th 03, 02:44 PM
On Tue, 04 Nov 2003 04:32:26 GMT Falkentyne > meeped :

>Anyway, the point is, you should, at the LEAST, have an option to turn
>on full image quality, if you _WANT_ to, and not have lesser quality
>forced upon you, saying "the user doesn't need it/ can't tell the
>difference".


YAH !!!!!!!!

Jack
November 4th 03, 06:46 PM
Hi there
Seems my commend did something.
Btw, couldn't agree more with the last post. Great, Lenny.

Even hard evidence and performance numbers don't push NV fanboys from their
place, as can be seen in the very first reaction.
But then again, I'm in the wrong newsgroup.

BTW, anybody tested the new nForce drivers?

BYE

Jack

Derek Wildstar
November 4th 03, 11:43 PM
"Falkentyne" > wrote in message
...


<Quality Snippage>

> Anyway, the point is, you should, at the LEAST, have an option to turn
> on full image quality, if you _WANT_ to, and not have lesser quality
> forced upon you, saying "the user doesn't need it/ can't tell the
> difference".

It does all boil down to that, doesn't it... why the choice isn't offered.
There are certain instances when I absolutely want the best filtering
possible, and the loss of a few FPS isn't going to matter much, but the
improvement in IQ (image quality) does matter.

I brought up the example of FS9 (Flight Simulator 2004); in this particular
app, 24 FPS is more than necessary, and 20 FPS (sustained) is acceptable. If
your system can perform at 20 FPS with full pixel/texel processing, then the
cost is worth it to get rid of any interior texture shimmering.

OTOH, if you are playing at 56 FPS and you really need 60 FPS (or
proportionally more), and the subsequent loss in IQ isn't within your
threshold of observation... whatcha do then? Nvidia seems to know: give them
speed! More speed! Ah, the good olde days.

Clearly nvidia thinks end users are a bunch of mewling kittens who haven't a
clue how to optimize their system for their preferred applications, yet
those same novices are the ones who would jump ship if it was known that the
competition had 'faster' results. Madness, all.