HardwareBanter: a computer components & hardware forum


8800 GTX or not?



 
 
  #41  
Old January 22nd 07, 02:10 AM posted to alt.comp.periphs.videocards.nvidia
goPostal
external usenet poster
 
Posts: 46
Default 8800 GTX or not?


"Mr.E Solved!" wrote in message
. ..
nospam wrote:
"Mr.E Solved!" wrote:

What he failed to mention, and is relevant, is that games have an
internal clock, a heartbeat: the game state is sampled x times a second,
and all actors and objects get refreshed on each and every beat.


I didn't mention such things because they are game specific and I
understand them less well. They are not relevant to my assertion that
visible artifacts will be present until you reach a few thousand fps,
except that the objects and viewpoint of what is being rendered must be
updated at a similar rate. I accept that games need some kind of
internal timebase, and I agree that aliasing between the internal
timebase and the achieved display frame rate can lead to additional
unpleasant effects.

Interesting that you still use a CRT display. I suspect they are going
to be hard to find and expensive in the future. I don't know what the
technical limitations determining maximum frame rates within LCD
displays are, i.e. whether demand for higher frame rates could easily be
met. Disappointingly, a DVI interface puts a hard limit on frame rates:
for example, about 160 Hz at 1600x1200.
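
That 160 Hz figure can be sanity-checked with simple arithmetic: the
achievable refresh rate is the link's pixel clock divided by the total
pixels per frame, blanking included. A rough sketch in Python (assuming
dual-link DVI's 330 MHz combined pixel clock; the blanking numbers below
are assumptions, and real timings vary):

    # Rough ceiling on refresh rate over DVI at 1600x1200.
    # Assumes dual-link DVI (2 x 165 MHz = 330 MHz pixel clock) and
    # reduced-blanking-style overhead; exact timings vary.
    PIXEL_CLOCK_HZ = 330e6            # dual-link DVI maximum
    H_ACTIVE, V_ACTIVE = 1600, 1200   # visible pixels
    H_BLANK, V_BLANK = 160, 35        # assumed blanking overhead

    pixels_per_frame = (H_ACTIVE + H_BLANK) * (V_ACTIVE + V_BLANK)
    max_refresh_hz = PIXEL_CLOCK_HZ / pixels_per_frame
    print(f"max refresh: ~{max_refresh_hz:.0f} Hz")   # ~152 Hz

That lands near 150 Hz, in the same neighborhood as the 160 Hz quoted
above; the exact ceiling depends on how aggressive the blanking is.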


I'll gladly recap for someone who doesn't insult, or claim to "think they
know" when in fact they just "feel like they do". Thank you for your
civility. Authoritatively and for your amusement:

Ignoring network latencies, an accurate representation of a gamestate is
only achieved when the refresh rate of the display device matches or
exceeds the fixed update rate of the game. This allows for perfect
positional information, which is the ideal condition. Otherwise, as
mentioned, a rocket will travel x pixels in a game tick, and you will
not know it until your client PC is able to redraw its screen. If you
happen to be that target and you are within that x-pixel range, you get
hit before you know it. A modern game with a 100-tick-per-second update
rate can have five updates before your 20 FPS rig can display them.
Hence it affects "what you see".

(N.b. this issue is fatal to net play; that is why measures such as
positional prediction algorithms have been created to compensate for
latencies, visual or network.)
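
The arithmetic behind that five-updates claim is easy to make concrete.
A toy simulation (illustrative numbers only: a fixed 100 Hz tick, a
20 FPS display, and a made-up rocket speed; nothing here comes from any
particular engine):

    # Toy model: a rocket advancing on every 100 Hz game tick, watched
    # on a display that only draws 20 frames per second.
    TICK_RATE = 100           # game-state updates per second
    FRAME_RATE = 20           # frames actually displayed per second
    SPEED = 8                 # pixels the rocket moves per tick (made up)

    ticks_per_frame = TICK_RATE // FRAME_RATE   # 5 ticks between frames
    position = 0
    for tick in range(1, TICK_RATE + 1):        # simulate one second
        position += SPEED                       # state advances every tick
        if tick % ticks_per_frame == 0:         # ...but you only see these
            print(f"frame at tick {tick}: rocket at x={position}")

Between any two frames you actually see, the state has already advanced
five times, and the rocket has silently covered
SPEED * ticks_per_frame = 40 pixels.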

Of course, such a vivid and indisputable example is useless in a
newsgroup discussion, since it leaves no room for misunderstanding or
meaningless rebuttal. Let me then provide less rigorous examples, rife
with possibilities for false interpretation, to appease contrarians'
need to fume:

I am a virtual paintball player, and I see my opponent on the other side
of a fence; it's a combination fence, with slats and some chain-link
parts. I start running on one side of the fence, and he starts running
too... we both start shooting at each other... through the fence...
assuming perfect aim, who hits whom?

Laser beam weapons are not so new in video games: super fast, one shot
one kill, very dangerous! One of their drawbacks is that they require
line of sight to operate... but you can be tricky and bounce the rays
off certain reflective materials to get a corner shot. You line up this
great shot, bouncing your beam off a weather vane and onto your target.
Alas, your target knows this and creates wind to spin the weather vane
in circles; faster and faster it spins, and you take your shot... where
does it go? Can you predict where it will go if you can't accurately
determine which direction it is pointing? Are you willing to bet your
game life it won't reflect back at you?

Those two scenarios are symbolic of "sampling error due to insufficient
frequency". When events happen faster than they can be displayed, you
lose data, or the data becomes meaningless because it loses its
timeliness.

The key to all of this is that the scene or events must be in motion,
and objects in motion on available displays (all of them, every one) are
only approximations of the object's position. With this approximation
comes inherent error. This error can be minimized by matching the
refresh rate to the highest possible rate of change in the application,
which can be 100 FPS, as in BF2.
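
The weather-vane scenario above is textbook aliasing, and a few lines of
code make it concrete. A minimal sketch (invented numbers: a vane
spinning at 30 revolutions per second, sampled by a 20 FPS display):

    # Aliasing demo: a weather vane spinning at 30 rev/s, sampled at
    # 20 frames per second. All numbers are invented for illustration.
    SPIN_HZ = 30.0     # true rotation rate (revolutions per second)
    FRAME_HZ = 20.0    # display refresh rate (frames per second)

    for frame in range(6):
        t = frame / FRAME_HZ                    # time of this frame
        angle = (360.0 * SPIN_HZ * t) % 360.0   # true vane heading
        print(f"frame {frame}: vane at {angle:.0f} deg")

The output alternates 0, 180, 0, 180... degrees: sampled below its
rotation rate, the vane appears to flip between two headings even though
it is really spinning steadily in one direction, which is exactly why
you can't bet your game life on where the beam will go.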

Notice, I never mention anything about "the limits of the human eye" or
"thousands of frames per second". Nor do I mention the obvious fact that
static images can have a frame rate of 1 and look picture perfect. Ask
Mona.

Lastly, why a CRT? I think it's obvious: CRTs have refresh rates that
can match those of certain applications, which minimizes not just "where
things are" but "how they appear". That is a whole separate issue of
"texture tearing", which only further demands high refresh rates.

LCDs have different concerns, but being digital, they are fixed
frequency, and that's that. Once new tech such as SED is available,
flat-panel displays will finally be as flexible as CRTs in this specific
feature.

http://en.wikipedia.org/wiki/Surface...mitter_display






Well done Mr.E. This should be the end of the thread (but it won't.)


  #42  
Old February 6th 07, 07:06 AM posted to alt.comp.periphs.videocards.nvidia
jadavis01
external usenet poster
 
Posts: 16
Default 8800 GTX or not?

I may be very slow (read: stupid), but based on all this, do we go with
an 8800 GTX or not?


"goPostal" wrote in message
...

"Mr.E Solved!" wrote in message
. ..
nospam wrote:
"Mr.E Solved!" wrote:

What he failed to mention, and is relevant, is that games have an
internal cock, a heartbeat: the game state is sampled x times a second
and all actors and objects get refreshed each and every beat.

I didn't mention such things because they are game specific and I
understand them less well. They are not relevant to my assertion that
visible artifacts will be
present until you reach a few thousand fps except that the objects and
viewpoint of what is being rendered must be updated at a similar rate. I
accept that games need some kind of internal timebase and agree that
aliasing between the internal timebase and achieved display frame rate
can
lead to additional unpleasant effects. Interesting you still use a CRT
display. I suspect they are going to be
hard to find and expensive in the future. I don't know what the
technical limitations determining maximum frame rates
are within LCD displays, i.e. if demand for higher frame rates could
easily
be met. Disappointingly a DVI interface puts a hard limit on frame rates
of
for example about 160Hz at 1600x1200.


I'll gladly recap for someone who doesn't insult, or claim to "think they
know" when in fact they just "feel like they do". Thank you for your
civility. Authoritatively and for your amusement:

Ignoring network latencies, an accurate representation of a gamestate is
only achieved when the refresh rate of the display device matches or
exceeds the fixed refresh rate of the game. This allows for perfect
positional information, which is the ideal condition. Otherwise, as
mentioned, a rocket will travel x amount of pixels in a game tick, and
you will not know it until your client pc is able to redraw its screen.
If you happen to be that target and you are within that x-pixel range,
you get hit before you know it. A modern game with a 100 tick per second
refresh rate, can have five updates before your 20FPS rig can display
them. Hence affecting "what you see".

(n.b. this issue is fatal to net-play, that is why measures such as
positional prediction algorithms have been created to compensate for
latencies, visual or network)

Of course, such a vivid and indisputable example is useless in a ng
discussion, since it leaves no room for misunderstanding or meaningless
rebuttal. Let me then provide less rigorous examples, rife with
possibilities for false interpretation, to appease contrarians' need to
fume:

I am a virtual paintball player, and I see my opponent on the other side
of a fence, it's a combination fence, with slats and some chain parts. I
am running now on one side of the fence he starts running too...we both
start shooting at each other...through the fence....assuming perfect
aim...who hits who?

Laser beams weapons are not so new in video games, super fast, one shot
one kill, very dangerous! One of their drawbacks is that they require
line of sight to operate...but you can be tricky and bounce the rays off
certain reflective materials, to get a corner shot. You lined up this
great shot, bouncing your beam off a weather vane and to your target.
Alas, your target knows this and creates wind to spin the weather vane in
circles, faster and faster it spins and you take your shot....where does
it go? Can you predict where it will go if you can't accurately determine
which direction it is pointing in? Are you willing to bet your game life
it won't reflect back to you?

Those two scenarios are symbolic of "sampling error due to insufficient
frequency" When events happen faster than can be displayed, you lose
data, or the data becomes meaningless since it loses it's timeliness.

The key to all of this is the scene or events must be in motion, and
objects in motion on available displays, all of them, every one, only
approximate the objects position. With this approximation comes inherent
error. This error can be minimized by matching refresh rates to the
highest possible rate of change in the application, which can be 100FPS,
as in BF2.

Notice, I never mention anything about "the limits of the human eye" or
"thousands of frames per second". Nor do I mention, the obvious fact that
static images can have a frame rate of 1, and look picture perfect. Ask
Mona.

Lastly, why a CRT? I think it's obvious, CRT's have refresh rates that
can match those of certain applications. Which minimizes not just "where
are things" but "how do they appear". That is a whole separate issue of
"texture tearing" which only further demands high refresh rates.

LCD's have different concerns, but being digital, they are fixed
frequency and that's that. Once new tech, such as SED is available, LCD's
will finally be as flexible as CRT's in this specific feature.

http://en.wikipedia.org/wiki/Surface...mitter_display






Well done Mr.E. This should be the end of the thread (but it won't.)



  #43  
Old February 6th 07, 03:51 PM posted to alt.comp.periphs.videocards.nvidia
bloomer
external usenet poster
 
Posts: 1
Default 8800 GTX or not?


"jadavis01" wrote in message
. ..
I may be very slow (read stupid) but based on all this do we go with a 8800
GTX or not?


"goPostal" wrote in message
...

"Mr.E Solved!" wrote in message
. ..
nospam wrote:
"Mr.E Solved!" wrote:

What he failed to mention, and is relevant, is that games have an
internal cock, a heartbeat: the game state is sampled x times a second
and all actors and objects get refreshed each and every beat.

I didn't mention such things because they are game specific and I
understand them less well. They are not relevant to my assertion that
visible artifacts will be
present until you reach a few thousand fps except that the objects and
viewpoint of what is being rendered must be updated at a similar rate.
I accept that games need some kind of internal timebase and agree that
aliasing between the internal timebase and achieved display frame rate
can
lead to additional unpleasant effects. Interesting you still use a CRT
display. I suspect they are going to be
hard to find and expensive in the future. I don't know what the
technical limitations determining maximum frame rates
are within LCD displays, i.e. if demand for higher frame rates could
easily
be met. Disappointingly a DVI interface puts a hard limit on frame
rates of
for example about 160Hz at 1600x1200.

I'll gladly recap for someone who doesn't insult, or claim to "think
they know" when in fact they just "feel like they do". Thank you for
your civility. Authoritatively and for your amusement:

Ignoring network latencies, an accurate representation of a gamestate is
only achieved when the refresh rate of the display device matches or
exceeds the fixed refresh rate of the game. This allows for perfect
positional information, which is the ideal condition. Otherwise, as
mentioned, a rocket will travel x amount of pixels in a game tick, and
you will not know it until your client pc is able to redraw its screen.
If you happen to be that target and you are within that x-pixel range,
you get hit before you know it. A modern game with a 100 tick per second
refresh rate, can have five updates before your 20FPS rig can display
them. Hence affecting "what you see".

(n.b. this issue is fatal to net-play, that is why measures such as
positional prediction algorithms have been created to compensate for
latencies, visual or network)

Of course, such a vivid and indisputable example is useless in a ng
discussion, since it leaves no room for misunderstanding or meaningless
rebuttal. Let me then provide less rigorous examples, rife with
possibilities for false interpretation, to appease contrarians' need to
fume:

I am a virtual paintball player, and I see my opponent on the other side
of a fence, it's a combination fence, with slats and some chain parts. I
am running now on one side of the fence he starts running too...we both
start shooting at each other...through the fence....assuming perfect
aim...who hits who?

Laser beams weapons are not so new in video games, super fast, one shot
one kill, very dangerous! One of their drawbacks is that they require
line of sight to operate...but you can be tricky and bounce the rays off
certain reflective materials, to get a corner shot. You lined up this
great shot, bouncing your beam off a weather vane and to your target.
Alas, your target knows this and creates wind to spin the weather vane
in circles, faster and faster it spins and you take your shot....where
does it go? Can you predict where it will go if you can't accurately
determine which direction it is pointing in? Are you willing to bet your
game life it won't reflect back to you?

Those two scenarios are symbolic of "sampling error due to insufficient
frequency" When events happen faster than can be displayed, you lose
data, or the data becomes meaningless since it loses it's timeliness.

The key to all of this is the scene or events must be in motion, and
objects in motion on available displays, all of them, every one, only
approximate the objects position. With this approximation comes inherent
error. This error can be minimized by matching refresh rates to the
highest possible rate of change in the application, which can be 100FPS,
as in BF2.

Notice, I never mention anything about "the limits of the human eye" or
"thousands of frames per second". Nor do I mention, the obvious fact
that static images can have a frame rate of 1, and look picture perfect.
Ask Mona.

Lastly, why a CRT? I think it's obvious, CRT's have refresh rates that
can match those of certain applications. Which minimizes not just "where
are things" but "how do they appear". That is a whole separate issue of
"texture tearing" which only further demands high refresh rates.

LCD's have different concerns, but being digital, they are fixed
frequency and that's that. Once new tech, such as SED is available,
LCD's will finally be as flexible as CRT's in this specific feature.

http://en.wikipedia.org/wiki/Surface...mitter_display






Well done Mr.E. This should be the end of the thread (but it won't.)


I've got two in SLI and have had no problems with them. They are superb
cards, but at a price.

My moan right now is the poor development of Vista drivers for these
cards; Nvidia has still to release anything past a public beta.

However, Vista is a new OS, and that should be taken into consideration.
The true test of these cards will be in the DX10 games for which they
were designed, so we'll wait and see.

In terms of performance, they trump anything else out there, whether SLI
setups or the ATI equivalent.

  #44  
Old February 6th 07, 11:52 PM posted to alt.comp.periphs.videocards.nvidia
Roger
external usenet poster
 
Posts: 76
Default 8800 GTX or not?

On Sun, 21 Jan 2007 19:28:33 -0500, "Mr.E Solved!"
wrote:

[full recap snipped]



Thank you for that explanation. I have dealt with displays a lot, but in
an entirely different arena, and this puts the gamers' desire for high
frame rates into an easily understandable presentation for those of us
who normally see frame rates from an entirely different perspective.




Roger Halstead (K8RI & ARRL life member)
(N833R, S# CD-2 Worlds oldest Debonair)
www.rogerhalstead.com
  #45  
Old February 7th 07, 01:39 AM posted to alt.comp.periphs.videocards.nvidia
Mr.E Solved!
external usenet poster
 
Posts: 888
Default 8800 GTX or not?

Roger wrote:

Thank you for that explanation. I have dealt with displays a lot, but in
an entirely different arena, and this puts the gamers' desire for high
frame rates into an easily understandable presentation for those of us
who normally see frame rates from an entirely different perspective.


Since that was the intent of the post, you are very welcome. It's
sometimes very difficult to express visual phenomena with the written
word. Oh, the number of times I begged for a link so I didn't have to
write something that has been asked and answered countless times before.
In that vein, I present to you, Roger, a link I hold in high regard;
regrets if you know it already...

http://www.theeldergeek.com

Use it wisely.
  #46  
Old February 7th 07, 06:33 AM posted to alt.comp.periphs.videocards.nvidia
Roger
external usenet poster
 
Posts: 76
Default 8800 GTX or not?

On Tue, 06 Feb 2007 20:39:52 -0500, "Mr.E Solved!"
wrote:

[quoted exchange snipped]

Thanks. I need to "repair" this OS, and I've already found several items
I can use to make life easier. This is an old OEM install I built up, so
running a repair means reinstalling SP2 as well. (The system has lost
icons and .NET, and won't print from OE due to a script error.)
Roger Halstead (K8RI & ARRL life member)
(N833R, S# CD-2 Worlds oldest Debonair)
www.rogerhalstead.com
  #47  
Old February 7th 07, 06:08 PM posted to alt.comp.periphs.videocards.nvidia
McG.
external usenet poster
 
Posts: 65
Default 8800 GTX or not?

You've just read a lot of e-pinionatedness from someone(s).

I own and use a number of LCD flat-panel displays from 17" through
20.1". If I had the money right now, *I* would be using an 8800 GTX this
instant.

If you can afford it, and are shopping for the best on the consumer
market right now, this is it.

Hope this helps,

McG.


"jadavis01" wrote in message
. ..
I may be very slow (read stupid) but based on all this do we go with a
8800 GTX or not?


"goPostal" wrote in message
...

"Mr.E Solved!" wrote in message
. ..
nospam wrote:
"Mr.E Solved!" wrote:

What he failed to mention, and is relevant, is that games have an
internal cock, a heartbeat: the game state is sampled x times a
second and all actors and objects get refreshed each and every
beat.

I didn't mention such things because they are game specific and I
understand them less well. They are not relevant to my assertion
that visible artifacts will be
present until you reach a few thousand fps except that the objects
and
viewpoint of what is being rendered must be updated at a similar
rate. I accept that games need some kind of internal timebase and
agree that
aliasing between the internal timebase and achieved display frame
rate can
lead to additional unpleasant effects. Interesting you still use a
CRT display. I suspect they are going to be
hard to find and expensive in the future. I don't know what the
technical limitations determining maximum frame rates
are within LCD displays, i.e. if demand for higher frame rates
could easily
be met. Disappointingly a DVI interface puts a hard limit on frame
rates of
for example about 160Hz at 1600x1200.

I'll gladly recap for someone who doesn't insult, or claim to "think
they know" when in fact they just "feel like they do". Thank you for
your civility. Authoritatively and for your amusement:

Ignoring network latencies, an accurate representation of a
gamestate is only achieved when the refresh rate of the display
device matches or exceeds the fixed refresh rate of the game. This
allows for perfect positional information, which is the ideal
condition. Otherwise, as mentioned, a rocket will travel x amount of
pixels in a game tick, and you will not know it until your client pc
is able to redraw its screen. If you happen to be that target and
you are within that x-pixel range, you get hit before you know it. A
modern game with a 100 tick per second refresh rate, can have five
updates before your 20FPS rig can display them. Hence affecting
"what you see".

(n.b. this issue is fatal to net-play, that is why measures such as
positional prediction algorithms have been created to compensate for
latencies, visual or network)

Of course, such a vivid and indisputable example is useless in a ng
discussion, since it leaves no room for misunderstanding or
meaningless rebuttal. Let me then provide less rigorous examples,
rife with possibilities for false interpretation, to appease
contrarians' need to fume:

I am a virtual paintball player, and I see my opponent on the other
side of a fence, it's a combination fence, with slats and some chain
parts. I am running now on one side of the fence he starts running
too...we both start shooting at each other...through the
fence....assuming perfect aim...who hits who?

Laser beams weapons are not so new in video games, super fast, one
shot one kill, very dangerous! One of their drawbacks is that they
require line of sight to operate...but you can be tricky and bounce
the rays off certain reflective materials, to get a corner shot. You
lined up this great shot, bouncing your beam off a weather vane and
to your target. Alas, your target knows this and creates wind to
spin the weather vane in circles, faster and faster it spins and you
take your shot....where does it go? Can you predict where it will go
if you can't accurately determine which direction it is pointing in?
Are you willing to bet your game life it won't reflect back to you?

Those two scenarios are symbolic of "sampling error due to
insufficient frequency" When events happen faster than can be
displayed, you lose data, or the data becomes meaningless since it
loses it's timeliness.

The key to all of this is the scene or events must be in motion, and
objects in motion on available displays, all of them, every one,
only approximate the objects position. With this approximation comes
inherent error. This error can be minimized by matching refresh
rates to the highest possible rate of change in the application,
which can be 100FPS, as in BF2.

Notice, I never mention anything about "the limits of the human eye"
or "thousands of frames per second". Nor do I mention, the obvious
fact that static images can have a frame rate of 1, and look picture
perfect. Ask Mona.

Lastly, why a CRT? I think it's obvious, CRT's have refresh rates
that can match those of certain applications. Which minimizes not
just "where are things" but "how do they appear". That is a whole
separate issue of "texture tearing" which only further demands high
refresh rates.

LCD's have different concerns, but being digital, they are fixed
frequency and that's that. Once new tech, such as SED is available,
LCD's will finally be as flexible as CRT's in this specific feature.

http://en.wikipedia.org/wiki/Surface...mitter_display






Well done Mr.E. This should be the end of the thread (but it won't.)





  #48  
Old February 7th 07, 09:26 PM posted to alt.comp.periphs.videocards.nvidia
[email protected]
external usenet poster
 
Posts: 40
Default 8800 GTX or not?

On Jan 7, 1:32 am, wrote:
Hi,
I've been thinking about upgrading my video card...
The top card today is the 8800 GTX (money is not an issue). I looked at
how it performs on http://www.tomshardware.com/ and even at 2560x1600,
4x AA, 4x AF, the Doom 3 result is 60 fps!!!
It's working very well... too well...

BUT, who uses this resolution??? And LCD monitors don't even support
such high resolutions...
At 1024x768, 4x AA, 4x AF, the Doom 3 result is 210 fps.

So why should anyone buy this card? No one needs 210 fps! Your brain
can't conceive more than 30 fps...

Thanks,
Gil



Only 60 FPS with that card? I guess you didn't turn the 60 FPS lock off.

 



