A computer components & hardware forum. HardwareBanter

8800 GTX or not?



 
 
  #21
Old January 20th 07, 05:20 AM posted to alt.comp.periphs.videocards.nvidia
Roger

On Fri, 12 Jan 2007 09:03:57 GMT, "Bobo" wrote:

It's not the 210 fps that you need, it's the 40 - 50 fps you'll need in the
future.

But anyway, top of the line cards are for enthusiasts. Fast cars, all that
kind of stuff. Highest resolutions...

Plus it's DirectX 10.

But if you don't have any preference, I tend to wait to see how the next
AMD/ATI card compares. I can go either way easily, and if the next gen card
for AMD comes out behind, 8800gtx it is!

I'm hoping for fast with a lot of detail, hence the need for the
graphics HP. FSX with everything turned up and a lot of detail. Then
bring Combat FS up to the same specs. Unfortunately right now FSX is
CPU bound instead of graphics bound.

Now if I could figure out how to fit the silent square cooler onto my
ASUS M2N SLI Deluxe Socket AM2 (which the chart says is compatible),
but the screws for the adapter plate are over a half inch off on each
side. Sigh.
It would be relatively easy to make an adapter for the adapter that
would work and work well, but according to the literature I shouldn't
have to make one. Sheesh.
Roger Halstead (K8RI & ARRL life member)
(N833R, S# CD-2 Worlds oldest Debonair)
www.rogerhalstead.com
  #22
Old January 20th 07, 01:00 PM posted to alt.comp.periphs.videocards.nvidia
nospam

Roger wrote:

Video games also generate an image from a sequence of samples, and the
artifacts created trying to depict moving images with a series of samples
are easily understood and predictable just the same.


Which means we should remove the artifacts.
Remove those artifacts and we'd see smooth movement without needing to
increase the frame rates.


You can't remove artifacts which are created by missing information. You
can't add information when the information rate is already bound by the
display frame rate.

At the speeds people would like to see movement in FPS twitchers, I know
(without needing studies or evidence) that any human will be able to see a
difference with increasing frame rates up to several thousand FPS.


I seriously doubt any human can discern between images only a 1/2
millisecond apart unless aided by something.


They are aided (or rather hindered) by the display system sampling rate,
just like a strobe.

Consider you are playing an FPS on a 24" 60Hz LCD and the graphics card is
powerful enough to always keep up with the display.

You are being shot from behind so you turn through 180 degrees; let's say
you are a bit crap at FPSes and take 0.5 seconds to turn. During that 0.5
seconds the display system shows you 30 frames. Each of those frames is 6
degrees apart. If your FOV is 90 degrees on a 24" widescreen 6 degrees is
about 1.1" across the screen.

As you turn you see 6 or 7 images of the guy shooting you spaced 1.1"
apart. Your eye tries to track the guy to aim at him and actually sees 2
flickering guys spaced 1.1" apart or if he is closer a fuzzy flickering guy
1.1" fatter than he really is.

That's at a solid 60 fps (not the 30fps the OP claims is enough for anyone)
and 180 degrees in 0.5 seconds is like slow motion to an FPS twitcher.

The effect will be visible until the double image spacing gets down to a
couple of pixels, which requires around 3600fps; then you just turn twice as
fast and see it again.
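The arithmetic above is easy to reproduce. A short Python sketch (my own illustration, not nospam's, assuming a 24" 16:10 panel roughly 20.4 inches wide and a simple perspective projection) computes the on-screen spacing between successive images of a tracked object:

```python
import math

def double_image_spacing_inches(turn_deg, turn_secs, fps,
                                fov_deg=90.0, screen_width_in=20.4):
    """On-screen distance between successive images of a tracked object.

    Models the strobe-like artifact: during a turn, each displayed frame
    shows the object shifted by a fixed angular step.
    """
    deg_per_frame = turn_deg / (fps * turn_secs)
    # Viewer distance implied by mapping the horizontal FOV onto the screen.
    eye_to_screen = (screen_width_in / 2) / math.tan(math.radians(fov_deg / 2))
    return eye_to_screen * math.tan(math.radians(deg_per_frame))

# 180 degree turn in 0.5 s on a 60 Hz panel:
spacing_60 = double_image_spacing_inches(180, 0.5, 60)    # about 1.07"
spacing_240 = double_image_spacing_inches(180, 0.5, 240)  # about a quarter of that
```

At 60 fps this gives about 1.07 inches, matching the 1.1" figure above, and the spacing shrinks roughly in proportion as the frame rate rises.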

It is a shame gamers are not demanding faster display devices, but it seems
most of what gamers demand (watercooled quad SLi kw power supply etc) is


That is because they don't see what you are seeing.


There is nothing special about my eyes. I guess newer LCD gamers have never
seen faster frame rates to know what they are missing. People who have been
able to look know they can see a difference; they usually describe it as
'fluidity', probably without really understanding what the difference they
are seeing is.

--
  #23
Old January 20th 07, 03:13 PM posted to alt.comp.periphs.videocards.nvidia
McG.


"nospam" wrote in message ...
[quoted post snipped]

"Twitch" games. Yes, what a description. I've played the FPS games
for a long time now. And even early in the QuakeWorldCTF days, we who
played regularly understood clearly why it was a crying necessity for
the game to be able to run faster than 30 or 60 frames per second.
FPSs are why I disable VSync. While the refresh rate is really a moot
issue with DVI and a fast LCD, it is not a moot issue in game speed
itself.
I run HL2DM in 1600x1200 with all the goodies on full. My framerates in
game in open areas with a few players in the scene run from the 70's to
90's. This is in single display performance mode; it's quite a lot
faster in SLI. However, the Samsung 204B's are set to 60 Hz refresh.
There is no 'ghosting' at all with these LCD flat panels.
McG.


  #24
Old January 20th 07, 04:36 PM posted to alt.comp.periphs.videocards.nvidia
heycarnut

nospam wrote:
blah blah blah....


Christ, this numbnut is still babbling anecdotal bull**** here about
uber-framerates?

Show us some real, scientific studies, not your pseudo-science
bs/anecdotal crap that gamers can see the difference up to hundreds of
FPS. There is none, because they can't. And your haughty missive about
others not understanding the reasons for artifacts, aliasing, etc.
simply demonstrated your lack of knowledge (as pointed out correctly by
the other poster) of the causes and effects of artifacts in images.

Please, do us a favor, stop with the 'Well, I can see the difference,
blah blah blah' crap and give us proof. Extraordinary claims require
extraordinary evidence. You've made them, and provided none. Otherwise,
STFU.

R

  #25
Old January 20th 07, 08:39 PM posted to alt.comp.periphs.videocards.nvidia
nospam

"McG." wrote:

[snip]
I run HL2DM in 1600x1200 with all the goodies on full. My framerates in
game in open areas with a few players in the scene run from 70's to
90's.


And you won't see any more than 60 fps. Why do so many have difficulty
comprehending this, especially with DVI? Your DVI link runs at a fixed
frequency, exactly enough to transfer 60 frames per second at your chosen
resolution. If you render more than 60 frames per second then parts of
frames, or whole frames, get thrown away.
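The fixed-rate claim is easy to sanity-check against the published specs: single-link DVI tops out at a 165 MHz pixel clock, and the standard VESA timing for 1600x1200 at 60 Hz (2160 x 1250 total pixels per frame, including blanking) uses almost all of it. A rough sketch:

```python
def required_pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock a video mode needs: total pixels per frame times frames per second."""
    return h_total * v_total * refresh_hz / 1e6

# VESA timing for 1600x1200@60: 2160 x 1250 total, including blanking intervals.
clock = required_pixel_clock_mhz(2160, 1250, 60)   # 162.0 MHz
SINGLE_LINK_DVI_MAX_MHZ = 165.0
headroom = SINGLE_LINK_DVI_MAX_MHZ - clock         # only 3 MHz to spare
```

So at this resolution a single-link DVI display physically cannot accept much more than 60 full frames per second; whatever the card renders beyond that never crosses the cable.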
--
  #26
Old January 20th 07, 08:57 PM posted to alt.comp.periphs.videocards.nvidia
Mr.E Solved!

heycarnut wrote:

[snip]
Please, do us a favor, stop with the 'Well, I can see the difference,
blah blah blah' crap and give us proof. Extraordinary claims require
extraordinary evidence. You've made them, and provided none. Otherwise,
STFU.


As in many things, differences only matter to those who discern them. I
am loath to interject in this tired discussion, since it's been asked
and answered so many times in the past, and the facts and results are
available for all to see and understand.

Nospam was basically correct in his description of the events of
multi-player gaming and how sustained FPS assists a player.

What he failed to mention, and is relevant, is that games have an
internal clock, a heartbeat: the game state is sampled x times a second
and all actors and objects get refreshed each and every beat.

So if you are playing a game with a 1/60th second tick, if you always,
consistently have a 60FPS rate, you are in gaming heaven. You are
synchronized perfectly to the events in that world. You 'see' what's
happening as it is happening.

However, that doesn't happen too often; moreover, new games like BF2 have
a 100FPS tick, and few people have CRTs (LCDs are not for power
gaming) that refresh at 100Hz, let alone PCs that can provide 100FPS
consistently. I do happen to have a CRT that provides 100Hz refresh and
I can get 100FPS sustained on BF2, and yes, it is smoother and more
fluid than a 60FPS capped game. An anecdote? Maybe so, but it's real,
reproducible, describable, and consistent.

It's not a question of "seeing", it's a matter of results. Higher
refresh rates give the client machine more time to draw gamestate
changes, which provide a more accurate and timely picture of the virtual
environment. Which is ultimately what you want: to see real-time events
in the virtual world.

If the refresh rate is faster than the rate of change in the
application, then yes, you might not SEE any state change: that rocket
hasn't moved on the game server, so it's not going to move, yet, on your
client machine. But you will redraw that rocket in place as soon as the
server says to, without any delay on your end, if your FPS and refresh
rate is as fast as, or faster than, the game tick. If your FPS or refresh
rate is too slow, that rocket will move on its path, uncaring that your
PC can't keep up.

If your client machine has a sustained 30FPS refresh rate, and the game
has a 60FPS tick, and another player has a sustained 60FPS frame rate, be
assured, he is getting more accurate positional data than you are.
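A back-of-envelope model of that accuracy gap (my own illustration with hypothetical numbers, not taken from any particular engine): the newest snapshot a client can show you is stale by up to one server tick plus one client frame interval.

```python
def worst_case_staleness_ms(tick_hz, client_fps):
    """Upper bound on how old the displayed game state can be, in milliseconds.

    The state shown was sampled up to one tick interval before the frame was
    drawn, and that frame then stays on screen for one client frame interval.
    """
    return 1000.0 / tick_hz + 1000.0 / client_fps

# 60 Hz server tick:
fast_client = worst_case_staleness_ms(60, 60)   # about 33 ms
slow_client = worst_case_staleness_ms(60, 30)   # about 50 ms
```

On a 60Hz tick the 30FPS player's picture can lag roughly 17 ms further behind the server than the 60FPS player's, which at typical in-game movement speeds is a real positional error, not just a cosmetic one.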

In fact, this is such a well known problem that games have been
compensating for positional inaccuracies due to this phenomenon (and
built-in network latencies, which make it worse) with "positional
predictions" for years and years.

Serious Sam was one of the first games that gave the end user very
specific settings to adjust positional assumptions based on measured
latencies.

So, people can go on and claim high refresh rates aren't important, and
the "eye can't see this or that", when it's they who are ignoring
empirical data and the facts right in front of them.

I enjoy playing on-line against such ignorant and untalented people; in
fact, educating them about how to get a better gaming experience is not
in my best interest. Luckily, there are a large number of people who,
like heycarnut, would rather stay aggressively misinformed, and forever
tied to inferior results.

Heycarnut, prove me wrong!





  #27
Old January 20th 07, 09:02 PM posted to alt.comp.periphs.videocards.nvidia
nospam

"heycarnut" wrote:

nospam wrote:
blah blah blah....


Christ, this numbnut is still babbling anecdotal bull**** here about
uber-framerates?


There was nothing anecdotal in my post. I gave a simple (obviously not
simple enough for you) example of how artifacts occur, how they relate to
movement speed and frame rate, and how they will appear to a viewer.

If it read as blah blah blah... to you, well, it's not my fault you are too
dumb to understand.

And you are the one calling me a numbnut - lol.
--
  #28
Old January 20th 07, 09:10 PM posted to alt.comp.periphs.videocards.nvidia
McG.


"nospam" wrote in message ...
[snip]

And you won't see any more than 60 fps. Why do so many have difficulty
comprehending this, especially with DVI? Your DVI link runs at a fixed
frequency, exactly enough to transfer 60 frames per second at your chosen
resolution. If you render more than 60 frames per second then parts of
frames, or whole frames, get thrown away.
--


Some games, especially older ones like Quakes 1-3, would be frame capped,
and some things were not possible to do if vsync was enabled in game and
refresh was other than 125. We turned vsync off and ran with the variable
Max_FPS_125 to achieve the gameplay speed that allowed specific jumps. THAT
is what so many get stuck on, I think. I don't think I can discern flicker
at 85 Hz. It is definitely evident at 60 Hz on the CRTs I used to use. On
these LCD flat panels I've used for the last 5 years, no flicker whatsoever,
and the refresh rate for native resolution is 60 Hz. But they don't
function the same way a CRT does. With the more modern games, that
particular little limitation is gone. Now I just like HL2DM to be as smooth
as possible and to be able to whip around and shoot with no lag. Game
framerates in the 90s make little or no difference to our eyes, but they
sure do make a difference in the performance of the game itself.
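That frame-rate dependence is not superstition: Quake-engine games integrate player physics once per rendered frame, so the step size changes the trajectory. A toy sketch (plain explicit Euler using Quake 3's default jump velocity of 270 and gravity of 800; illustrative only, not the engine's actual code, which also rounds frame times to whole milliseconds):

```python
def jump_apex(frame_ms, jump_velocity=270.0, gravity=800.0):
    """Peak height of a jump integrated with fixed per-frame Euler steps.

    Toy model only: real Quake 3 physics differs in detail, but it shares
    the property that the result depends on the frame time.
    """
    dt = frame_ms / 1000.0
    v, z, apex = jump_velocity, 0.0, 0.0
    while z >= 0.0:
        z += v * dt          # move with the velocity from the previous frame
        v -= gravity * dt    # then apply this frame's gravity
        apex = max(apex, z)
    return apex

# 8 ms frames (125 fps) vs 13 ms frames (~77 fps) land at different heights:
h125 = jump_apex(8)
h77 = jump_apex(13)
```

The difference is only fractions of a unit, but in Quake-engine maps that was enough to make certain jumps possible only at specific frame rates, which is why players pinned the frame cap instead of trusting vsync.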

McG.


  #29
Old January 21st 07, 01:59 AM posted to alt.comp.periphs.videocards.nvidia
heycarnut

Mr.E Solved! wrote:

Heycarnut, prove me wrong!


Don't have to - you, and the other clown here, are making the
extraordinary claims. You have to provide the proof, same as someone
who claims to be friends with bigfoot, or says they can hear the
differences in power cords on their stereo - both in the same league as
the claims you both make. On the other hand, I'll put my money where my
mouth is. Either of you uber-gamer-super-eyes find your way out in the
SF bay area, we'll go out to the Stanford neuro labs, and you can try
to prove your claims. If I'm wrong, and you can target better at say
200FPS than 100FPS, I'll donate $1000.00 to the charity of your choice.
When you can't, you agree to do the same. I'll keep this offer open
until, say, the end of March 2007.

You've got my email from my profile, if you're so sure of yourselves.
Until then, onto my ignore list, along with the bigfoot believers....

r

  #30
Old January 21st 07, 02:20 AM posted to alt.comp.periphs.videocards.nvidia
heycarnut


nospam wrote:

And you won't see any more than 60 fps. Why do so many have difficulty
comprehending this, especially with DVI? Your DVI link runs at a fixed
frequency, exactly enough to transfer 60 frames per second at your chosen
resolution. If you render more than 60 frames per second then parts of
frames, or whole frames, get thrown away.


This truly shows your level of ignorance. Ever even read the spec, much
less watch a signal on this with a logic probe? Geez - go away and go
shoot at backwards running wagon wheels or something.

 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.