A computer components & hardware forum. HardwareBanter


8800 GTX or not?



 
 
  #1  
Old January 7th 07, 09:32 AM posted to alt.comp.periphs.videocards.nvidia
[email protected]

Hi,
I've been thinking about upgrading my video card...
The top card today is the 8800 GTX (money is not an issue). I looked at
how it performs on http://www.tomshardware.com/ and even at 2560x1600,
4x AA, 4x AF, the Doom 3 result is 60 fps!
It's working very well... too well...

BUT, who uses that resolution? And LCD monitors don't even support
resolutions that high...
At 1024x768, 4x AA, 4x AF, the Doom 3 result is 210 fps.

So why should anyone buy this card? No one needs 210 fps! Your brain
can't perceive more than 30 fps...

Thanks
Gil

  #2  
Old January 7th 07, 10:19 AM posted to alt.comp.periphs.videocards.nvidia
goPostal


Gil wrote:
Hi,
I've been thinking about upgrading my video card...
The top card today is the 8800 GTX (money is not an issue). I looked at
how it performs on http://www.tomshardware.com/ and even at 2560x1600,
4x AA, 4x AF, the Doom 3 result is 60 fps!
It's working very well... too well...

BUT, who uses that resolution? And LCD monitors don't even support
resolutions that high...
At 1024x768, 4x AA, 4x AF, the Doom 3 result is 210 fps.

So why should anyone buy this card? No one needs 210 fps! Your brain
can't perceive more than 30 fps...

Thanks
Gil


This is an old argument and you are wrong. Your brain can perceive a much
faster refresh than that. I'll post a few links about it in the morning.


  #3  
Old January 11th 07, 05:19 AM posted to alt.comp.periphs.videocards.nvidia
heycarnut

goPostal wrote:
Gil wrote:
So why should anyone buy this card? No one needs 210 fps! Your brain
can't perceive more than 30 fps...

Thanks
Gil


This is an old argument and you are wrong. Your brain can perceive a much
faster refresh than that. I'll post a few links about it in the morning.


Please, don't post links to the pseudo-science BS paper all the gamers
keep referring to as 'proof' that humans can perceive the difference in
frame rates up to some hundreds of FPS. It incorrectly uses data from an
Air Force study on image persistence that has little if anything to do
with FPS discernibility.

I have yet to see anyone do such a study within the constraints of a
'gamers' world, that is, with the kind of hardware and display devices
we use. I had a long conversation with a leading researcher in visual
perception at the local U, and he said there wasn't much interest in
doing such a test. It appears that the normal limit for humans in
seeing any difference in FPS is ~60-100, depending on the person, scene
characteristics, brightness, contrast, etc. There is *no* credible
evidence supporting claims of many hundreds of FPS making any
difference, just anecdotal statements like 'Well, *I* can tell the
difference' that have about as much weight as those by audio kooks who
claim that using $1000 power cords makes their audio systems sound
better...

R

  #4  
Old January 11th 07, 10:59 AM posted to alt.comp.periphs.videocards.nvidia
nospam

"heycarnut" wrote:

I have yet to see anyone do such a study within the constraints of a
'gamers' world, that is, with the kind of hardware and display devices
we use. I had a long conversation with a leading researcher in visual
perception at the local U, and he said there wasn't much interest in
doing such a test. It appears that the normal limit for humans in
seeing any difference in FPS is ~60-100, depending on the person, scene
characteristics, brightness, contrast, etc. There is *no* credible
evidence supporting claims of many hundreds of FPS making any
difference,


You don't need evidence. You don't need a study to determine at what speed
different people see wagon wheels turning backwards on TV because they all
see the same effect. The effect is an artifact of the sampling system used
which is easily understood and predictable.

Video games also generate an image from a sequence of samples, and the
artifacts created when trying to depict moving images with a series of samples
are easily understood and predictable in just the same way.
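
To put numbers on that wagon-wheel effect, here is a minimal sketch (the
60 Hz sample rate and the test rotation rates below are assumed examples,
not figures from this thread). A sampled display can only represent rates
up to half the sample rate; anything faster folds back, which is why a
fast-turning wheel can look slow, frozen, or backwards:

# Temporal aliasing: a rotation sampled at a fixed frame rate appears to
# rotate at the true rate folded into the +/- (sample_rate / 2) range.

def apparent_rate(true_hz: float, sample_hz: float) -> float:
    """Apparent rotation rate (Hz) of a pattern repeating true_hz times/s."""
    folded = true_hz % sample_hz        # alias into [0, sample_hz)
    if folded > sample_hz / 2:
        folded -= sample_hz             # upper half shows up as backwards motion
    return folded

if __name__ == "__main__":
    sample_hz = 60.0                    # e.g. a 60 fps game or camera
    for true_hz in (5, 25, 31, 59, 60, 61, 119):
        print(f"true {true_hz:5.1f} Hz -> apparent {apparent_rate(true_hz, sample_hz):6.1f} Hz")
    # A spoke pattern repeating 59 times/s creeps backwards at ~1 Hz,
    # one at exactly 60 times/s looks frozen.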

At the speeds people would like to see movement in FPS twitchers I know
(without needing studies or evidence) that any human will be able to see a
difference with increasing frame rates up to several thousand FPS.

Of course we don't have any mainstream display technology capable of more
than a couple of hundred FPS and the world is going towards LCDs which are
all stuck at a sucky 60fps anyway so the point is a bit moot.

It is a shame gamers are not demanding faster display devices, but it seems
most of what gamers demand (watercooled quad SLi kw power supply etc) is
willy waving bull**** which allows them to have bump mapped HDR x16
antialiased nuts on mosquitoes in their games rather than fun running round
fragging people.

--
  #5  
Old January 11th 07, 01:26 PM posted to alt.comp.periphs.videocards.nvidia
DRS

"nospam" wrote in message


[...]

At the speeds people would like to see movement in FPS twitchers I
know (without needing studies or evidence) that any human will be
able to see a difference with increasing frame rates up to several
thousand FPS.


Yeah, who needs evidence?

Of course we don't have any mainstream display technology capable of
more than a couple of hundred FPS and the world is going towards LCDs
which are all stuck at a sucky 60fps anyway so the point is a bit
moot.


The refresh rate isn't FPS. LCDs are not stuck at any particular FPS, but
hey, you don't need evidence. Or facts, it seems.


  #6  
Old January 11th 07, 02:30 PM posted to alt.comp.periphs.videocards.nvidia
nospam

"DRS" wrote:

"nospam" wrote in message


[...]

At the speeds people would like to see movement in FPS twitchers I
know (without needing studies or evidence) that any human will be
able to see a difference with increasing frame rates up to several
thousand FPS.


Yeah, who needs evidence?


Not my fault some people are too thick to understand how things work.

Of course we don't have any mainstream display technology capable of
more than a couple of hundred FPS and the world is going towards LCDs
which are all stuck at a sucky 60fps anyway so the point is a bit
moot.


The refresh rate isn't FPS. LCDs are not stuck at any particular FPS, but
hey, you don't need evidence. Or facts, it seems.


Rendering more frames than can be sent to the display device is just a
waste of hardware and power which achieves nothing but the possibility of
displaying torn images at the display refresh rate.
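
As an aside, capping the render loop at the display refresh is enough to
avoid that waste, and is essentially what vsync gives you. Below is a
minimal sketch of such a frame limiter (my illustration, not any engine's
code; the 60 Hz figure and the sleep-based pacing are assumptions, and a
real engine would block on the buffer swap rather than sleep):

import time

REFRESH_HZ = 60.0                 # assumed display refresh rate
FRAME_TIME = 1.0 / REFRESH_HZ     # budget per displayed frame, ~16.7 ms

def render_frame() -> None:
    pass                          # stand-in for the real rendering work

def run(frames: int = 300) -> None:
    """Render at most REFRESH_HZ frames per second; spare GPU time stays idle."""
    next_deadline = time.perf_counter()
    for _ in range(frames):
        render_frame()
        next_deadline += FRAME_TIME
        sleep_for = next_deadline - time.perf_counter()
        if sleep_for > 0:
            time.sleep(sleep_for)  # don't start a frame the display can't show yet

if __name__ == "__main__":
    run()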

--
  #7  
Old January 20th 07, 04:51 AM posted to alt.comp.periphs.videocards.nvidia
Roger

On Thu, 11 Jan 2007 10:59:12 +0000, nospam wrote:

"heycarnut" wrote:

I have yet to see anyone do such a study within the constraints of a
'gamers' world, that is, with the kind of hardware and display devices
we use. I had a long conversation with a leading researcher in visual
perception at the local U, and he said there wasn't much interest in
doing such a test. It appears that the normal limit for humans in
seeing any difference in FPS is ~60-100, depending on the person, scene
characteristics, brightness, contrast, etc. There is *no* credible
evidence supporting claims of many hundreds of FPS making any
difference,


You don't need evidence. You don't need a study to determine at what speed
different people see wagon wheels turning backwards on TV because they all
see the same effect. The effect is an artifact of the sampling system used
which is easily understood and predictable.

But you are mixing apples and oranges. The wagon wheel turning
backwards is an artifact of the interaction between the image movement
and the sweep rate, or frequency.

On an LCD it is the interaction between the image movement and the
refresh rate.

You can see the artifacts change just by changing the refresh rate.

Another problem is image persistence on the display. With the old
CRTs we could even purchase tubes with phosphors of different
persistence. Before good storage scopes were available we used
long-persistence phosphors to measure the lifetime of minority carriers
in semiconductor material; storage scopes greatly improved the accuracy
of those measurements.

With even the best CRT displays you could flash a bright line
across the screen and then watch it fade. It was just that in normal use
the next frame was bright enough that we didn't notice the remnants of
the previous image.

The human eye suffers from the same persistence problem. With bright
objects we call them afterimages, but those afterimages exist even
when viewing a display. Because most displays have rather high contrast
and brightness, the afterimages last longer than normal. This by
itself sets an upper limit on useful frame rates far below what is
theoretically possible.

Contrary to the usual explanation of motion and persistence of vision,
the afterimage should detract from, not aid, the perception of movement.

Although they could use some lessons in English, Joseph and Barbara
Anderson pretty much sum this up in the 10th paragraph of their paper
"The Myth of Persistence of Vision Revisited" (Journal of Film and
Video, Spring 1993):
http://www.uca.edu/org/ccsmi/ccsmi/c...0Revisited.htm

Under the right conditions you can see a bit of flicker even in good
fluorescent lights that are working properly. Normally the flicker
cannot be seen. If it weren't for persistence of vision, moving
around in a store lit by fluorescent lights would be like moving around
in a bar with the strobes on.

Video games also generate an image from a sequence of samples and the
artifacts created trying to depict moving images with a series of samples
is easily understood and predictable just the same.


Which means we should remove the artifacts.
Remove those artifacts and we'd see smooth movement without needing to
increase the frame rates.


At the speeds people would like to see movement in FPS twitchers I know
(without needing studies or evidence) that any human will be able to see a
difference with increasing frame rates up to several thousand FPS.


I seriously doubt any human can discern between images only half a
millisecond apart unless aided by something. Again, what turns up are
artifacts between the image movement and, in the case of the LCD, the
refresh rate. The viewer is seeing the result of the
interaction/interference between two or more things, not the actual
movement of the image. Under some conditions it is even possible to
increase the flutter by increasing the frame rate. The best example is
video of a turning airplane propeller shot with a variable shutter
speed. At a normal shutter speed the prop is a blur, both as displayed
and in each individual image. As the shutter speed is increased the prop
will appear to slow, stop, reverse, and speed up in the opposite
direction. This apparent change in speed continues until a point
is reached where the prop speed is no longer a harmonic of the shutter
speed.

We do something similar with a calibrated strobe to detect RPM. If the
prop speed is a harmonic (multiple) of the strobe frequency it will
appear to stand still.
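
A quick back-of-the-envelope sketch of that strobe effect (my numbers are
made up purely for illustration: a 2-blade prop and an 80 Hz strobe) shows
the prop appearing frozen at the harmonic and creeping backwards or
forwards just either side of it:

# Per-flash angular advance of a prop lit by a strobe: if the prop turns an
# exact multiple of its symmetry angle between flashes, it looks frozen.

BLADES = 2                                   # assumed 2-blade prop
SYMMETRY = 360.0 / BLADES                    # the image repeats every 180 degrees

def apparent_step(prop_rpm: float, strobe_hz: float) -> float:
    """Degrees the prop *appears* to advance per flash (negative = backwards)."""
    deg_per_flash = (prop_rpm / 60.0) * 360.0 / strobe_hz
    step = deg_per_flash % SYMMETRY
    if step > SYMMETRY / 2:
        step -= SYMMETRY                     # closer to the previous blade: looks like backwards motion
    return step

if __name__ == "__main__":
    for rpm in (2350, 2390, 2400, 2410, 2450):
        print(f"{rpm} rpm under an 80 Hz strobe -> {apparent_step(rpm, 80.0):+6.2f} deg per flash")
    # 2400 rpm is a harmonic of 80 Hz for a 2-blade prop, so it appears frozen;
    # a little slower and it creeps backwards, a little faster and it creeps forwards.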

Refresh rate on an LCD can be a bit confusing, because when the "screen"
is refreshed the entire image is not redrawn. Only the portions of the
image that change in color or brightness are updated. Add to that the
fact that images may be buffered, so the next image to be displayed
already exists and can be drawn (row and column) exceptionally fast, with
no apparent lag from top to bottom or side to side. This rapid refresh
of both column and row, coupled with image movement, can generate
patterns, not just jerky movement. It's also more likely to create
harmonic relationships between the motion and the per-row and per-column
refresh.
However, changes in pixel output are limited by each element's ability
to change state, i.e. its persistence. That element persistence is
currently the limiting factor in what we have for refresh rates.
Generally you can increase the brightness faster than the light will
fade to black, and the rate at which colors change in hue and intensity
varies as well. With LCD pixel response times on the order of 4 to 5 ms
we have made some good steps. The laser screens that should be available
in another year or so may or may not give us a bit more speed. They
will certainly bring a new set of variables when it comes to
interference patterns.


Of course we don't have any mainstream display technology capable of more
than a couple of hundred FPS and the world is going towards LCDs which are


You are being generous with a couple of hundred frames a second. Most of
our displays can barely manage half that, despite pixels that can change
in a few milliseconds. There are a few monitors that will run at 120 fps,
but most are on the order of 60 to 75.

all stuck at a sucky 60fps anyway so the point is a bit moot.

It is a shame gamers are not demanding faster display devices, but it seems
most of what gamers demand (watercooled quad SLi kw power supply etc) is


That is because they don't see what you are seeing. And with the new
cooling systems you don't have to go with water, though it is quiet.
:-)) My latest system is all fans and it's the quietest I've had in
years. I do have to admit the vast majority of the noise comes from
the video card(s), and that does make water cooling a bit tempting.
There's not much to gain by going quad unless you're running multiple
apps. A number of programs do take advantage of dual-processor or
dual-core setups, but I don't recall any using quad cores as yet. On top
of that you then have to find an OS that will run more than two
processors or cores, and the game of choice... or the app of choice.

willy waving bull**** which allows them to have bump mapped HDR x16
antialiased nuts on mosquitoes in their games rather than fun running round
fragging people.


OTOH, running two or three monitors off a card can pretty well use up
that extra capacity in a hurry.

There are two gaming camps: those running the shoot-'em-up speed
demons, and those running the flight sims that demand detail. Of course
the second camp is crippled by MS giving them a CPU-bound program that
only runs on one processor or core.
I like 'em both, but I wish someone would write a new flight sim for the
MS platform that isn't CPU bound. But maybe the next generation of
computers will be powerful enough. Maybe the one I'm building will
handle it?
Roger Halstead (K8RI & ARRL life member)
(N833R, S# CD-2 Worlds oldest Debonair)
www.rogerhalstead.com
  #8  
Old January 20th 07, 01:00 PM posted to alt.comp.periphs.videocards.nvidia
nospam

Roger wrote:

Video games also generate an image from a sequence of samples and the
artifacts created trying to depict moving images with a series of samples
is easily understood and predictable just the same.


Which means we should remove the artifacts.
Remove those artifacts and we'd see smooth movement without needing to
increase the frame rates.


You can't remove artifacts which are created by missing information. You
can't add information when the information rate is already bound by the
display frame rate.

At the speeds people would like to see movement in FPS twitchers I know
(without needing studies or evidence) that any human will be able to see a
difference with increasing frame rates up to several thousand FPS.


I seriously doubt any human can discern between images only a 1/2
millisecond apart unless aided by something.


They are aided (or rather hindered) by the display system sampling rate,
just like a strobe.

Consider that you are playing an FPS on a 24" 60 Hz LCD and the graphics
card is powerful enough to always keep up with the display.

You are being shot from behind, so you turn through 180 degrees; let's say
you are a bit crap at FPSes and take 0.5 seconds to turn. During that 0.5
seconds the display shows you 30 frames, each 6 degrees apart. If your FOV
is 90 degrees on a 24" widescreen, 6 degrees is about 1.1" across the
screen.

As you turn you see 6 or 7 images of the guy shooting you, spaced 1.1"
apart. Your eye tries to track the guy to aim at him and actually sees two
flickering guys spaced 1.1" apart, or if he is closer, one fuzzy flickering
guy 1.1" fatter than he really is.

That's at a solid 60 fps (not the 30 fps the OP claims is enough for
anyone), and 180 degrees in 0.5 seconds is like slow motion to an FPS
twitcher.

The effect will be visible until the double-image spacing gets down to a
couple of pixels, which requires around 3600 fps; then you just turn twice
as fast and see it again.
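
For what it's worth, that arithmetic can be checked with a small sketch
(mine, not the poster's; the 16:10 panel shape, 1920-pixel width and the
simple perspective mapping are assumptions used only to reproduce the
numbers):

import math

# Reproduce the back-of-the-envelope numbers: a 180-degree turn in 0.5 s on
# a 60 Hz display, viewed with a 90-degree horizontal FOV on a 24" widescreen.

REFRESH_HZ  = 60.0
TURN_DEG    = 180.0
TURN_TIME_S = 0.5
FOV_DEG     = 90.0
DIAG_IN     = 24.0
ASPECT      = (16, 10)                        # assumed panel aspect ratio
WIDTH_PX    = 1920                            # assumed horizontal resolution

frames        = REFRESH_HZ * TURN_TIME_S      # frames shown during the turn
deg_per_frame = TURN_DEG / frames             # angular step between frames

# Screen width from the diagonal and aspect ratio.
diag_units = math.hypot(*ASPECT)
width_in   = DIAG_IN * ASPECT[0] / diag_units

# Near the centre of a perspective projection, an object deg_per_frame away
# from straight ahead lands this far from the screen centre.
half_width_in = width_in / 2
step_in = half_width_in * math.tan(math.radians(deg_per_frame)) / math.tan(math.radians(FOV_DEG / 2))
step_px = step_in * WIDTH_PX / width_in

print(f"{frames:.0f} frames, {deg_per_frame:.1f} deg apart, ~{step_in:.2f} in (~{step_px:.0f} px) between images")
# roughly 30 frames, 6 deg apart, ~1.1 inches between successive images of the target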

It is a shame gamers are not demanding faster display devices, but it seems
most of what gamers demand (watercooled quad SLi kw power supply etc) is


That is because they don't see what you are seeing.


There is nothing special about my eyes. I guess newer LCD gamers have never
seen faster frame rates, so they don't know what they are missing. People
who have been able to compare know they can see a difference; they usually
describe it as 'fluidity', probably without really understanding what the
difference they are seeing is.

--
  #9  
Old January 7th 07, 12:43 PM posted to alt.comp.periphs.videocards.nvidia
Franky

I've been thinking about upgrading my video card...
The top card today is the 8800 GTX (money is not an issue). I looked at
how it performs on http://www.tomshardware.com/ and even at 2560x1600,
4x AA, 4x AF, the Doom 3 result is 60 fps!
It's working very well... too well...


Try it in Oblivion... or Rainbow Six Vegas... a card can never run too
well. Wait till Crysis is released.


BUT, who uses that resolution? And LCD monitors don't even support
resolutions that high...


The 30" ones do. If you are running a Triplehead2Go system, the more power
the better.


At 1024x768, 4x AA, 4x AF, the Doom 3 result is 210 fps.

So why should anyone buy this card? No one needs 210 fps! Your brain
can't perceive more than 30 fps...

Thanks
Gil



  #10  
Old January 7th 07, 01:02 PM posted to alt.comp.periphs.videocards.nvidia
Cameron Walsh

Gil wrote:
Hi,
I've been thinking about upgrading my video card...
The top card today is the 8800 GTX (money is not an issue). I looked at
how it performs on http://www.tomshardware.com/ and even at 2560x1600,
4x AA, 4x AF, the Doom 3 result is 60 fps!
It's working very well... too well...


Doom 3 is old and was more of a tech demo than a game worth playing.
Try Oblivion, F.E.A.R., etc. for more technically impressive and
interesting games.

Reading the article
(http://www.tomshardware.com/2006/11/08/geforce_8800/ ?) I see F.E.A.R.
gets about 80 fps at 1600x1200 with 4x AA and 8x AF. Most 19" CRTs can do
that resolution, and an average of 80 fps means that even during intense
action scenes it shouldn't drop below a very playable 50 fps.

As for Oblivion, ouch: only the GTX version can get above even 30 fps (at
1600x1200 with max settings). If the card struggles with current games,
that suggests to me that it is likely to struggle with next week's games
even more. People buy a card like this to get the most out of the games
they play now and to be able to play future games at acceptable frame
rates.


BUT, who uses that resolution? And LCD monitors don't even support
resolutions that high...


My 17" laptop screen has a native resolution of 1920x1200; larger
monitors will go higher (if, as you say, money is not an issue...).

At 1024x768, 4x AA, 4x AF, the Doom 3 result is 210 fps.

So why should anyone buy this card? No one needs 210 fps! Your brain
can't perceive more than 30 fps...


The following articles would argue otherwise, as would I. 210 fps is
excessive, yes, but 30fps is far too low.

http://www.daniele.ch/school/30vs60/30vs60_3.html explains it vaguely,
http://amo.net/NT/02-21-01FPS.html explains exactly why monitors suck at
60Hz, and a bit of the differences between refresh rates and frame rates.
http://www.tweakguides.com/Graphics_5.html explains in more readable
language why some game types play better with high frame rates etc.

In the end, if money is not an issue, why not get three cards, run two
in SLI and send one to me?

Hope that clarifies things,

Cameron.
 



