A computer components & hardware forum. HardwareBanter

8800 GTX or not?



 
 
#11
Old January 11th 07, 12:44 AM posted to alt.comp.periphs.videocards.nvidia
First of One

I beg to differ. Remember, Half Life 2 suffered significant delays; it was
originally planned to be bundled with the Radeon 9700. By today's standards,
its graphics can certainly be improved upon. The levels don't use as much
bumpmapping as Quake 4, Oblivion or the Xbox 360's Gears of War. The
subsequently added "integer" HDR effects cannot be shown through
translucent surfaces like glass or water.

I still can't think of any visually important feature in DX10/DX10.1 that
DX9.0c cannot do.

--
"War is the continuation of politics by other means.
It can therefore be said that politics is war without
bloodshed while war is politics with bloodshed."

"adr" wrote in message
...
Half Life 2 with all settings set to max is as good as this generation of
games will offer. The next generation of games will support DX10 which, if
written cleverly, will blow your mind. The GF 8800 GTX is the only card on
the market that can handle next-generation games at high resolutions




#12
Old January 11th 07, 05:19 AM posted to alt.comp.periphs.videocards.nvidia
heycarnut

goPostal wrote:
wrote in message
oups.com...
So why should anyone buy this card? No one needs 210 fps! Your brain
can't conceive more than 30 fps...

Thanks
Gil


This is an old argument and you are wrong. Your brain can perceive much
higher refresh rates than that. I'll post a few links in the morning about it.


Please, don't post links to the pseudo-science BS paper all the gamers
keep referring to as 'proof' that humans can perceive differences in
frame rates up to some hundreds of FPS. It incorrectly uses data from an
air force study on image persistence that has little, if anything, to do
with FPS discernibility.

I have yet to see anyone do such a study within the constraints of a
'gamers' world, that is, with the kind of hardware and display devices
we use. I had a long conversation with a leading researcher in visual
perception at the local U, and he said there wasn't much interest in
doing such a test. It appears that the normal limit for humans in
seeing any difference in FPS is ~60-100, depending on the person, scene
characteristics, brightness, contrast, etc. There is *no* credible
evidence supporting claims of many hundreds of FPS making any
difference, just anecdotal statements like "Well, *I* can tell the
difference" that carry about as much weight as claims by audio kooks
that $1000 power cords make their audio systems sound better...

R

#14
Old January 11th 07, 10:38 AM posted to alt.comp.periphs.videocards.nvidia
Mr.E Solved!

First of One wrote:

I still can't think of any visually important feature in DX10/DX10.1 that
DX9.0c cannot do.


There are several improvements that DX10 will provide, namely SM4.0 and
the ability to render practically any number of on-screen objects. This
will create a visual environment more complex than previously possible.

This is helped by architecture improvements, as you are well aware, that
let all of these extra elements appear with little performance penalty.
Instancing and rapid occlusion culling make it possible to spend work
only on the objects and pixels that are actually visible, many many
times over; see the sketch below.
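
To make the batching idea concrete, here is a rough, self-contained C++
sketch of the CPU-side logic. It is only an illustration: drawInstanced()
and passesOcclusionTest() are hypothetical stand-ins, not any real API
(the actual D3D10 draw entry point is ID3D10Device::DrawIndexedInstanced).

#include <cstdio>
#include <map>
#include <vector>

struct Object { int meshId; bool visible; };

// Stand-in for a visibility test; a real engine would use hardware
// occlusion queries or a hierarchical depth buffer here.
bool passesOcclusionTest(const Object& o) { return o.visible; }

// Hypothetical stand-in for one instanced draw call: one call renders
// `count` copies of the same mesh with per-instance data.
void drawInstanced(int meshId, std::size_t count) {
    std::printf("mesh %d: %zu instances in one draw call\n", meshId, count);
}

int main() {
    std::vector<Object> scene = {
        {1, true}, {1, true}, {1, false},   // e.g. rocks
        {2, true}, {2, true},               // e.g. trees
    };
    // Cull first, then bucket the survivors by mesh, so each distinct
    // mesh costs one draw call no matter how many copies are on screen.
    std::map<int, std::size_t> batches;
    for (const Object& o : scene)
        if (passesOcclusionTest(o)) ++batches[o.meshId];
    for (const auto& [meshId, count] : batches)
        drawInstanced(meshId, count);
}

The point is not the API details; it is that draw-call count scales with
the number of distinct meshes, not with the number of objects.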

Also, hardware virtualization will enable the GPU(s) to be used as
physics processors and who knows what else, creating novel and unique
visual environments. Think jet streams, ion beams, rivers of magma...

Rapid texture swapping and memory paging allow a whole outdoor
environment to be loaded and ready to go when you open a door from an
inside scene to an outdoor one. This minimizes, and can even prevent,
loss of immersion when you traverse thresholds and other portals that
would normally cause a break in play to pull assets from disk. The toy
sketch below shows the idea.
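
As a toy illustration of that preloading idea (every name here is
hypothetical, and std::async merely stands in for whatever background
streaming mechanism the driver or engine actually uses):

#include <cstdio>
#include <future>
#include <string>

struct Scene { std::string name; bool loaded = false; };

// Stand-in for pulling a scene's textures and geometry off the disk.
Scene loadScene(std::string name) { return Scene{name, true}; }

int main() {
    // Player walks toward the door: start loading the outdoor scene on
    // a background thread while the indoor scene keeps playing.
    std::future<Scene> outdoor =
        std::async(std::launch::async, loadScene, std::string("outdoor"));

    // ... indoor gameplay continues here, with no pause for the disk ...

    // Player opens the door: ideally the assets are already resident.
    Scene s = outdoor.get();
    std::printf("%s loaded: %s\n", s.name.c_str(), s.loaded ? "yes" : "no");
}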

A better question is what can DX10 do that DX9.0L can't do that isn't
DRM related?


#15
Old January 11th 07, 10:59 AM posted to alt.comp.periphs.videocards.nvidia
nospam

"heycarnut" wrote:

I have yet to see anyone do such a study within the constraints of a
'gamers' world, that is, with the kind of hardware and display devices
we use. I had a long conversation with a leading researcher in visual
perception at the local U, and he said there wasn't much interest in
doing such a test. It appears that the normal limit for humans in
seeing any difference in FPS is ~60-100, depending on the person, scene
characteristics, brightness, contrast, etc. There is *no* credible
evidence supporting claims of many hundreds of FPS making any
difference,


You don't need evidence. You don't need a study to determine at what
speed different people see wagon wheels turning backwards on TV, because
they all see the same effect. The effect is an artifact of the sampling
system used, which is easily understood and predictable.

Video games also generate an image from a sequence of samples, and the
artifacts created when depicting moving images with a series of samples
are easily understood and predictable just the same; the arithmetic
below shows how.
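
The arithmetic really is that simple. This little C++ sketch folds a
true rotation rate into the band a given frame rate can represent; it
is standard sampling math, nothing specific to games or TV:

#include <cmath>
#include <cstdio>

// Apparent rotation rate (rev/s) of a wheel spinning at revsPerSec when
// sampled at framesPerSec: the true rate aliases into [-fs/2, +fs/2].
double apparentRate(double revsPerSec, double framesPerSec) {
    return revsPerSec -
           std::round(revsPerSec / framesPerSec) * framesPerSec;
}

int main() {
    // A wheel at 23 rev/s filmed at 24 fps creeps BACKWARD at 1 rev/s.
    std::printf("%+.1f rev/s\n", apparentRate(23.0, 24.0));  // -1.0
    std::printf("%+.1f rev/s\n", apparentRate(24.0, 24.0));  // +0.0 (frozen)
    std::printf("%+.1f rev/s\n", apparentRate(25.0, 24.0));  // +1.0 (slow forward)
}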

At the movement speeds people like in twitch-FPS games, I know (without
needing studies or evidence) that any human will be able to see a
difference as frame rates increase, up to several thousand FPS.

Of course we don't have any mainstream display technology capable of
more than a couple of hundred FPS, and the world is going towards LCDs,
which are all stuck at a sucky 60 fps anyway, so the point is a bit moot.

It is a shame gamers are not demanding faster display devices, but it
seems most of what gamers demand (watercooled quad SLI, kW power
supplies, etc.) is willy-waving bull**** which allows them to have
bump-mapped, HDR, 16x-antialiased nuts on mosquitoes in their games
rather than fun running round fragging people.

--
#16
Old January 11th 07, 01:26 PM posted to alt.comp.periphs.videocards.nvidia
DRS

"nospam" wrote in message


[...]

At the movement speeds people like in twitch-FPS games, I know
(without needing studies or evidence) that any human will be able to
see a difference as frame rates increase, up to several thousand FPS.


Yeah, who needs evidence?

Of course we don't have any mainstream display technology capable of
more than a couple of hundred FPS and the world is going towards LCDs
which are all stuck at a sucky 60fps anyway so the point is a bit
moot.


The refresh rate isn't FPS. LCDs are not stuck at any particular FPS,
but hey, you don't need evidence. Or facts, it seems.


#17
Old January 11th 07, 02:30 PM posted to alt.comp.periphs.videocards.nvidia
nospam

"DRS" wrote:

"nospam" wrote in message


[...]

At the movement speeds people like in twitch-FPS games, I know
(without needing studies or evidence) that any human will be able to
see a difference as frame rates increase, up to several thousand FPS.


Yeah, who needs evidence?


Not my fault some people are too thick to understand how things work.

Of course we don't have any mainstream display technology capable of
more than a couple of hundred FPS and the world is going towards LCDs
which are all stuck at a sucky 60fps anyway so the point is a bit
moot.


The refresh rate isn't fps. LCDs are not stuck at any partiucular FPS but
hey, you don't need evidence. Or facts, it seems.


Rendering more frames than can be sent to the display device is just a
waste of hardware and power, which achieves nothing but the possibility
of displaying torn images at the display refresh rate.
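
A minimal sketch of the alternative, a frame limiter that stops the GPU
from rendering frames the panel cannot show whole. renderFrame() is a
hypothetical placeholder, and 16.7 ms assumes a 60 Hz panel:

#include <chrono>
#include <thread>

void renderFrame() { /* hypothetical: draw one frame */ }

int main() {
    using clock = std::chrono::steady_clock;
    const auto frameTime = std::chrono::microseconds(16667);  // ~60 Hz
    for (int i = 0; i < 600; ++i) {                           // ~10 seconds
        auto start = clock::now();
        renderFrame();
        // Sleep off the rest of the refresh interval instead of
        // immediately starting a frame that can only tear.
        std::this_thread::sleep_until(start + frameTime);
    }
}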

--
#18
Old January 11th 07, 08:32 PM posted to alt.comp.periphs.videocards.nvidia
Mr.E Solved!

nospam wrote:

Rendering more frames than can be sent to the display device is just a
waste of hardware and power, which achieves nothing but the possibility
of displaying torn images at the display refresh rate.


Adding:
Even in the age of digital panels and fixed refresh rates, the "fps vs
refresh rate" ball still gets bounced around, whacked ever harder by the
seemingly dedicated efforts of misinformation spewers.

"The human eye can't see more than X FPS"

"Movies have X FPS so therefore PC games should have X FPS"

"TV has X FPS so therefore you need X FPS"

"Leading researchers say..."

It's amazing what people will convince themselves of, in the face of
empirical evidence and years of experience. It's like asking a doctor
(or PC guy) for help in their field of expertise and then arguing with
them when they give you their learned diagnosis.

Here are some statements to think about, to test your understanding of
this issue:

1) Higher FPS's are always better for the gamer.

2) The ideal game has a FPS that never varies, ever.

3) The minimum FPS in a test is more indicative of game play than the
maximum FPS achieved.

4) Refresh rate should never be set lower than your MAX FPS.

5) Vsync should only be on when triple buffering is enabled.

6) Any FPS over 60 is wasted, since the human eye can't see anything
faster than that.

7) Digital Panels all run at 60hz, so you don't need more than 60FPS.


Do you know which statements are true or false? More importantly, do
you know *why* they are true or false, well enough to explain it to
someone who does not know? Take statement 5; the toy numbers below show
what is at stake.
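
For statement 5, a toy calculation (illustrative numbers, not
measurements) shows why vsync without a third buffer can halve your
frame rate when a frame takes just slightly longer than one refresh:

#include <cstdio>

int main() {
    const double refresh = 1.0 / 60.0;  // 60 Hz panel
    const double renderTime = 0.018;    // 18 ms/frame, just over one refresh
    for (int buffers : {2, 3}) {
        // With two buffers and vsync, the GPU stalls until the next
        // vblank after every frame, so each frame occupies two refresh
        // intervals. A third buffer lets rendering start immediately,
        // so throughput is limited by render time alone.
        double fps = (buffers == 2) ? 1.0 / (2.0 * refresh)  // 30 fps
                                    : 1.0 / renderTime;      // ~55.6 fps
        std::printf("%d buffers: %.1f fps\n", buffers, fps);
    }
}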





#19
Old January 12th 07, 09:03 AM posted to alt.comp.periphs.videocards.nvidia
Bobo

It's not the 210 fps that you need, it's the 40 - 50 fps you'll need in the
future.

But anyway, top of the line cards are for enthusiasts. Fast cars, all that
kind of stuff. Highest resolutions...

Plus it's DirectX 10.

But if you don't have a preference, I'd wait to see how the next
AMD/ATI card compares. I can go either way easily, and if the next-gen
card from AMD comes out behind, 8800 GTX it is!


#20
Old January 20th 07, 04:51 AM posted to alt.comp.periphs.videocards.nvidia
Roger

On Thu, 11 Jan 2007 10:59:12 +0000, nospam
wrote:

"heycarnut" wrote:

I have yet to see anyone do such a study within the constraints of a
'gamers' world, that is, with the kind of hardware and display devices
we use. I had a long conversation with a leading researcher in visual
perception at the local U, and he said there wasn't much interest in
doing such a test. It appears that the normal limit for humans in
seeing any difference in FPS is ~60-100, depending on the person, scene
characteristics, brightness, contrast, etc. There is *no* credible
evidence supporting claims of many hundreds of FPS making any
difference,


You don't need evidence. You don't need a study to determine at what speed
different people see wagon wheels turning backwards on TV because they all
see the same effect. The effect is an artifact of the sampling system used
which is easily understood and predictable.

But you are mixing apples and oranges. The wagon wheel turning
backwards is an artifact of the interaction between the image movement
and the sweep rate, or frequency.

In an LCD it is the image movement and refresh rate.

You can see them change just by changing the refresh rate.

Another problem is image persistence on the display. With the old CRTs
we could even purchase them with phosphors of different persistence.
Prior to good storage scopes we used long-persistence phosphors to
measure the lifetime of minority carriers in semiconductor material.
Storage scopes sure improved the accuracy of those measurements.

With even the best of CRT displays you could flash a bright line across
the screen and then watch it fade. It was just that, in normal use, the
next line was bright enough that we didn't notice the remnants of the
previous image.

The human eye suffers from the same persistence problem. With bright
objects we call them after images, but those after images exist even
when viewing a display. As most displays have rather high contrast and
brightness, the after images last longer than normal. This by itself
sets an upper limit on useful frame rates far below what would
theoretically be possible.

Contrary to the usual explanation of motion and persistence of vision,
the after image should detract from, not aid, the illusion of movement.

Although they could use some lessons in English, Joseph and Barbara
Anderson pretty much sum this up in the 10th paragraph of their paper
"The Myth of Persistence of Vision Revisited" (Journal of Film and
Video, Spring 1993):
http://www.uca.edu/org/ccsmi/ccsmi/c...0Revisited.htm

Under the right conditions you can see a bit of flicker even in good
fluorescent lights that are working properly. Normally the flicker
cannot be seen. If it weren't for persistence of vision, moving around
a store with fluorescent lights would be like moving around a bar with
the strobes on.

Video games also generate an image from a sequence of samples, and the
artifacts created when depicting moving images with a series of samples
are easily understood and predictable just the same.


Which means we should remove the artifacts.
Remove those artifacts and we'd see smooth movement without needing to
increase the frame rates.


At the movement speeds people like in twitch-FPS games, I know (without
needing studies or evidence) that any human will be able to see a
difference as frame rates increase, up to several thousand FPS.


I seriously doubt any human can discern between images only half a
millisecond apart unless aided by something. Again, what turns up are
artifacts between the image movement and, in the case of the LCD, the
refresh rate. The viewer is seeing the result of interaction or
interference between two or more elements, not the actual movement of
the image. Under some conditions it is possible to increase the flutter
by increasing the frame rate. The best example is video of a turning
airplane propeller shot with a variable shutter speed. At a normal
shutter speed the prop is a blur, as displayed and in each individual
frame. As the shutter speed is increased the prop will appear to slow,
stop, reverse, and speed up in the opposite direction. This apparent
change in speed continues until a point is reached where the prop speed
is no longer a harmonic of the shutter speed.

We do something similar with a calibrated strobe to measure RPM. If the
prop speed is a harmonic (multiple) of the strobe frequency, it will
appear to stand still.

Refresh rate on an LCD can be a bit confusing, as when the "screen" is
refreshed the entire image is not redrawn. Only the portions of the
image that change in color or brightness are refreshed. On top of that,
images may be buffered, so the next image to be displayed already
exists and can be drawn (row and column) exceptionally fast with no
apparent lag from top to bottom or side to side. This rapid refresh of
both column and row, coupled with image movement, can generate patterns,
not just jerky movement. It's also more likely to create harmonic
relationships between the motion and the per-row and per-column
refresh.
However, changes in pixel output are limited by each element's ability
to change, or its persistence. That element persistence is currently
the limiting factor in what we have for refresh rates. Generally you
can increase the brightness faster than the light will fade to black,
and the rate at which colors change in hue and intensity varies as
well. With LCD response times on the order of 4 to 5 ms we have made
some good steps. The laser screens that should be available in another
year or so may or may not give us a bit more speed. They will certainly
bring a new set of variables when it comes to interference patterns.
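
A back-of-envelope check on that (simple arithmetic, not a measurement):
a pixel that needs 4 to 5 ms to settle cannot cleanly show more than
roughly 200 to 250 distinct frames a second, whatever the electronics
are rated for.

#include <cstdio>

int main() {
    // Ceiling on distinct displayable frames implied by response time.
    for (double responseMs : {4.0, 5.0})
        std::printf("%.0f ms response -> ~%.0f fps ceiling\n",
                    responseMs, 1000.0 / responseMs);
}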


Of course we don't have any mainstream display technology capable of more
than a couple of hundred FPS and the world is going towards LCDs which are


You are being generous with a couple hundred frames a second. Most of
our displays can manage just half that, despite being able to refresh
in a few milliseconds. There are a few monitors that will run at 120
Hz, but most are on the order of 60 to 75.

all stuck at a sucky 60fps anyway so the point is a bit moot.

It is a shame gamers are not demanding faster display devices, but it seems
most of what gamers demand (watercooled quad SLi kw power supply etc) is


That is because they don't see what you are seeing. And with the new
cooling systems you don't have to go with water, but it is quiet. :-))
My latest system is all fans and it's the quietest I've had in years. I
do have to admit the vast majority of the noise comes from the video
card(s), and that does make water cooling a bit tempting. There's not
much to gain by going quad unless you're running multiple apps. A
number of programs do take advantage of dual-processor or dual-core
setups, but I don't recall any using quad as yet. On top of that you
then have to find an OS that will run more than two processors or
cores, and the game of choice... or the app of choice.

willy waving bull**** which allows them to have bump mapped HDR x16
antialiased nuts on mosquitoes in their games rather than fun running round
fragging people.


OTOH, running two or three monitors off a card can pretty well use up
that extra capacity in a hurry.

There are two gaming camps: those running the shoot-'em-up speed demons
and those running the flight sims demanding detail. Of course the
second camp is crippled by MS giving them a CPU-bound program that only
runs on one processor or core. I like 'em both, but I wish someone
would write a new flight sim for the MS platform that isn't CPU-bound.
But maybe the next generation of computers will be powerful enough.
Maybe the one I'm building will handle it?
Roger Halstead (K8RI & ARRL life member)
(N833R, S# CD-2 Worlds oldest Debonair)
www.rogerhalstead.com
 






