
View Full Version : Image Quality, Optimizations, etc.


Magnulus
July 24th 05, 09:27 AM
I've been noticing lately that in games there's a lot of moire on
textures, specifically around the areas where the mipmap transitions
would be with bilinear filtering, when using trilinear filtering +
anisotropic filtering with GeForce 6600 cards (and also the GeForce
6800). This happens even when I set the image quality to "high quality"
and disable all optimizations.

I did a search, and apparently a lot of other folks on forums are having
issues too, but curiously enough, none of the major review sites seem to
be paying any attention to this IQ (image quality) issue. Some ordinary
forumers are speculating that perhaps there is junk left over from the
GeForce FX days. In the last year or two, both ATI and NVidia have
become increasingly aggressive with the use of optimizations in an
attempt to one-up their competitor. I can't help but wonder how much of
the "performance" of these newer cards is simply due to cheating and
shortcuts. Others think maybe NVidia is no longer using true anisotropic
filtering at all, but some other method, perhaps to gain speed; there
are obviously IQ issues they are ignoring, though.

Now, some folks and ATI/NVidia claim these optimizations have little or
no IQ effect. Well, you'd have to be blind not to spot the moire in many
games when using anisotropic filtering. You can clearly see it on
repetitive patterns such as grating, floor tiles, roads, and the sort of
textures that have a lot of repetitive, fine detail. Look at levels in
UT 2004 like Asbestos or Oceanic; you can clearly see it on the floors.
I can also spot it in Grand Theft Auto: San Andreas and several other
games (I don't have many games installed on my PC currently, so it's a
small sample).

Einstine
July 24th 05, 10:00 AM
"Magnulus" > wrote in message . ..
> I've been noticing lately that in games there's a lot of moire on textures, specifically around the areas that mipmaps would
> be for bilinear filtering, when using trilinear filtering + anisotropic filtering with GeForce 6600 cards. [snip]


I don't notice much in the way of detail problems, except that I've been
put off buying Battlefield 2 because it seems my eyes can never focus. I
can't explain it much better than that. There is a lot of great stuff
going on and buildings to hide behind, but it all seems to be a blur.

ATI 9800 Pro. 20/20 Vision.

deimos
July 24th 05, 11:22 PM
Magnulus wrote:
> I've been noticing lately that in games there's a lot of moire on
> textures, specifically around the areas that mipmaps would be for bilinear
> filtering, when using trilinear filtering + anisotropic filtering with
> GeForce 6600 cards (and also GeForce 6800). [snip]

Your registry settings might be whacked. Make certain you're using the
newest drivers (77.76), then, with Coolbits on, click Restore under
Performance and Settings. That should restore the defaults for all the
control panel settings.

Normally the registry settings should be cleared by the installer, but
if you upgrade on top of another driver they get left behind, and values
change between versions.
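For reference, the Coolbits tweak mentioned above was enabled with a registry entry along these lines. The key path and value are the ones commonly circulated for ForceWare-era drivers, quoted from memory; treat this as an illustration and verify it against your driver version before importing:

```reg
Windows Registry Editor Version 5.00

; Commonly circulated "Coolbits" unlock for NVIDIA ForceWare drivers.
; The exact dword value that unlocks a given panel varies by driver version.
[HKEY_LOCAL_MACHINE\SOFTWARE\NVIDIA Corporation\Global\NVTweak]
"CoolBits"=dword:00000003
```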

Additionally, I've noticed that in certain games the driver seems to be
forcing a specific level of anisotropic filtering and certain
optimizations (user mips, AF opts, stage opts). I think these are
application-specific, as they seem to change and get a little better
with successive versions. Many people complained about "shimmering"
(like you're describing) on mipmaps in Painkiller and other games for
the longest time. I've seen it in World of Warcraft, Battlefield 1942,
and Doom 3.

Even with optimizations off, some games exhibit almost imperceptible
banding from "brilinear" filtering (at least on my FX5700 card). I think
some of this is hard-coded. Take a look at the Doom 3 profile, for
example; it won't let you touch the AF settings.

Magnulus
July 25th 05, 12:08 AM
It might be application-specific (UT 2004); I don't know. I have a fresh
install of Windows XP 64 Pro. I downloaded a 64-bit-compatible version
of RivaTuner and set the mipmap LOD bias to 0 in both cases. That seemed
to help a little, but the effect is still there. I also installed
Serious Sam, and while that game looks much better in terms of texture
quality, you can still see some "texture aliasing" on walls that have
horizontal or vertical features (relative to the texture, not the
camera). Increasing the mipmap LOD bias via RivaTuner defeats this, but
it also causes a little texture blurriness everywhere else (partially
fixed by anisotropic filtering), and it causes the text in the UT 2004
GUI to go blurry.
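The LOD-bias trade-off described here can be sketched numerically. This is plain illustrative Python, not driver code: the hardware picks a mip level roughly as log2 of the texel-to-pixel footprint, and the driver-side LOD bias is simply added before clamping. A negative bias selects a finer, higher-frequency mip (sharper, but more aliasing/moire); a positive bias selects a coarser one (blurrier).

```python
import math

def mip_level(texels_per_pixel, lod_bias=0.0, num_levels=10):
    """Rough sketch of how a GPU picks a mipmap level.

    texels_per_pixel: how many texels the pixel footprint covers
    (1.0 means the base level is a perfect match). The driver's
    LOD bias is added before the level is clamped to the chain.
    """
    lod = math.log2(max(texels_per_pixel, 1e-6)) + lod_bias
    return min(max(lod, 0.0), num_levels - 1)

# A floor texture seen at a shallow angle covers ~8 texels per pixel:
print(mip_level(8.0))        # no bias  -> level 3.0
print(mip_level(8.0, -1.0))  # sharper  -> level 2.0 (more aliasing/moire)
print(mip_level(8.0, +1.0))  # blurrier -> level 4.0
```

This is why raising the bias hides the texture aliasing but blurs the GUI text: the bias is global, so every texture (including UI elements rendered 1:1) gets pushed to a coarser mip.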

Doing more reading and research, I came across an article on the
"shortcuts" both ATI and NVidia are using to eke out every last bit of
speed. For instance, in texture blending ATI uses only 5 bits per
sample in Direct3D. This is the Direct3D default rasterizer's
recommended limit, but using more bits (6) would improve the quality of
blending operations, though of course it would be a little slower.
NVidia may do something similar; after all, in the GeForce 6xxx series
of cards they imitated ATI and went with "brilinear" filtering rather
than mathematically precise trilinear filtering. Check out this website
to get a good idea of what I'm talking about:
http://www.3dcenter.org/artikel/2003/11-21_b_english.php Banding
artifacts/moire are a good description of what I'm seeing.
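The trilinear vs. "brilinear" difference from the linked article can be sketched like this (plain Python, illustrative only; the blend-band width is a guess, since the real width is driver-internal). True trilinear blends between two adjacent mip levels across the whole fractional LOD range; brilinear snaps to the nearest level outside a narrow band, so most pixels need only one mip lookup:

```python
def trilinear_weight(lod_frac):
    """True trilinear: the blend weight between adjacent mip levels
    varies linearly over the whole [0, 1) fractional LOD range."""
    return lod_frac

def brilinear_weight(lod_frac, band=0.25):
    """'Brilinear' shortcut (band width is an illustrative guess):
    snap to the nearest mip level outside a narrow blend band, so
    most texels need only one mip lookup. Faster, but the abrupt
    snap is what produces the visible banding discussed above."""
    lo, hi = 0.5 - band / 2, 0.5 + band / 2
    if lod_frac <= lo:
        return 0.0                  # pure bilinear from the finer level
    if lod_frac >= hi:
        return 1.0                  # pure bilinear from the coarser level
    return (lod_frac - lo) / band   # short linear ramp in between

print(trilinear_weight(0.3))  # 0.3 -> always blending
print(brilinear_weight(0.3))  # 0.0 -> snapped, no blend at all
```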

Another possibility is that this stuff is not visible at all on a
regular monitor; perhaps CRTs are just too blurry to show it. An LCD
monitor has a fixed pixel grid, no inherent moire of its own, and so on.
Perhaps this stuff has been there all along and nobody has really paid
attention to it. It's definitely a subtle effect, and if you are busy
fragging you probably won't notice it.

It's interesting that ATI and NVidia are both pushing SLI/Crossfire
setups for their many image quality improvements. One of the
improvements they often cite is "texture quality", i.e., reducing
crawling textures. Well, it would make more sense to me to just "get it
right" and nip the problem in the bud at the texture mapping and
filtering stages, rather than throwing a supersampling anti-aliasing
mode and two video cards at the scene after it is rendered.

de Moni
July 25th 05, 09:40 AM
deimos wrote:
> Your registry settings might be whacked. Make certain you're using the
> newest drivers (77.76), then with coolbits on, click Restore under

AFAIK the newest official drivers are still the 77.72s. At least _I_
wouldn't install any BETA drivers...

deimos
July 25th 05, 05:29 PM
de Moni wrote:
> deimos wrote:
>
>> Your registry settings might be whacked. Make certain you're using
>> the newest drivers (77.76), then with coolbits on, click Restore under
>
>
> AFAIK newest official drivers are still 77.72's. At least _I_ wouldn't
> install any BETA drivers...

To be perfectly honest, the 77.72 officials were a disaster. I've been
using every driver version since before the first Detonators (2.04),
and these were the most bugged in recent memory.

The 77.76 betas are mainly fixes (including one for a memory leak in
.72!), and the worst that comes with an nZone beta driver is usually
WHQL non-compliance.

de Moni
July 25th 05, 10:51 PM
deimos wrote:
> To be perfectly honest, the 77.72 officials were a disaster. I've been
> using every driver version since before the first Detonators (2.04) and
> these were the most bugged in recent memory.

Funny, because I haven't had a single issue with the 77.72s... 6600GT.

Doug
July 26th 05, 12:19 AM
Magnulus, what is "inherent moire"? It would seem to me the moire effect
would be visible on either an LCD or a CRT?

--
there is no .sig
"Magnulus" > wrote in message
. ..
> Another possibility is that this stuff is not visible at all on a regular
> monitor- perhaps they are just too blurry. [snip]

Phil Weldon
July 26th 05, 01:00 AM
| It would seem to me the moire effect would be
| visible on either a LCD or a CRT?
_____

Or a function of how a color NTSC signal is decoded into R,G,B because of
interference between the color subcarrier and luminance information. But
surely the original poster is not viewing NTSC (composite) output on a color
television!

Phil Weldon

"Doug" > wrote in message
. ..
> Magnulus, what is "inherent moire"? It would seem to me the moire effect
> would be visible on either a LCD or a CRT?
>
> [snip]

Magnulus
July 26th 05, 04:42 AM
You can get moire with CRTs, especially the cheaper ones, or ones that
are poorly adjusted or out of focus. With an LCD you get no moire,
especially with a digital signal.

Phil Weldon
July 26th 05, 04:58 AM
'Magnulus' wrote:
| You can get moire with CRT's, especially the cheaper ones, or ones that
| are poorly adjusted or out of focus. With an LCD, you get no moire,
| especially with a digital signal.
_____

You can get a moire pattern with LCD screens as well, for certain
spatial relationships between the image detail and the LCD pixel pitch,
just as you can see a moire effect when looking through two window
screens rotated slightly, or when viewing watered silk or rescreened
print images.

A moire is an interference pattern.

Out-of-focus CRTs are much less likely to show a moire effect because
the resolution will be too low for many interference patterns. Think of
an out-of-focus CRT as cheap anti-aliasing through resolution reduction.
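The interference-pattern point can be illustrated numerically (plain Python, illustrative only; the function name and frequencies are made up for the example). Moire is a form of aliasing: sampling a repeating pattern below its Nyquist rate folds the pattern frequency down to a low "beat" frequency, which is why fine stripes on gratings or floor tiles show up as broad slow bands:

```python
def apparent_period(pattern_freq, sample_rate):
    """Fold a pattern frequency against a sampling rate.

    Sampling below the Nyquist rate (2 * pattern_freq) aliases the
    pattern to the nearest multiple of the sample rate, producing a
    low beat frequency -- the moire bands.  Returns the apparent
    spatial period of what you actually see.
    """
    folded = abs(pattern_freq - round(pattern_freq / sample_rate) * sample_rate)
    return float('inf') if folded == 0 else 1.0 / folded

# 45 stripes per unit sampled 50 times per unit (Nyquist would be 90):
print(apparent_period(45, 50))  # wide bands with period 0.2, not 1/45
```

The same arithmetic applies whether the "samples" are screen pixels, a CRT shadow mask, or texel fetches, which is why the pattern moves with both the texture content and the display.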

Phil Weldon

"Magnulus" > wrote in message
...
> You can get moire with CRT's, especially the cheaper ones, or ones that
> are poorly adjusted or out of focus. With an LCD, you get no moire,
> especially with a digital signal.
>