PDA

View Full Version : Nvidia's 9800 GTX and 9800 GX2 seem to be a waste of time & money


NV55
April 16th 08, 08:21 AM
If you have an 8800 GTX, 8800 Ultra or 8800 GT, you might as well save
your money; don't waste it on the 9800 GTX or 9800 GX2. These cards are
more or less what you already have. In some important areas, actually
less. Save it for the upcoming GT200-based 9900 GTX or 9900 GX2, which
should arrive in Q3 2008.


http://www.techradar.com/news/computing-components/graphics-cards/our-verdict-nvidias-9800-gtx-and-gx2-on-test-318825




Our verdict: Nvidia's 9800 GTX and GX2 on test
Are these cards a technological leap?


The new Nvidia GeForce 9800 range is not as good as we'd hoped


Just over eighteen months ago the much-heralded age of the DirectX10-
capable graphics card dawned, as the supreme G80-powered GeForce
8800GTX dropped into the TechRadar office. Six months later came the
updated 8800 Ultra, a card that has remained Nvidia's top end
offering... until now.

We've had to wait 12 long months for the refresh, during which time we
have been treated to a mass of mid-range cards, including the
excellent 8800GT - Nvidia's first card with a 65nm core.

But still, it's been a long time coming for the 9800 GTX and GX2.

Long time passing

Both new cards are powered by the same 65nm G92, a core that is itself
now six months old. And it's the first time that Nvidia has released a
brand new family of top-end cards based on old architecture. Replacing
the 8800GTX and Ultra is a necessity as far as furthering the Nvidia
brand is concerned; competition-wise, though, it's less of an issue. AMD
still hasn't managed to create anything to seriously outperform these
year-old cards, so is the lack of a new core an acknowledgment that
Nvidia only has to turn up at the track to win the race?

The GTX version of the 9800 card is a straight, beefed up version of
the G92 with higher clock speeds across the board. While it shares the
number of Raster Operators (ROPs) with the 8800GT, it does have the
old GTX's complement of shader units at 128, giving it the necessary
speed boost.

The GX2 follows the example of the old 7950GX2, strapping two G92-
stuffed PCBs together. But this time both PCBs face into the same
heatsink and are housed in a vaguely coffin-like surround. The clock
speeds are slightly slower than the GTX, but a fair bit of optimising
has gone into making this single-card SLI offering an impressive piece
of engineering.

Swiss cheese memory

The first difference you'll notice when comparing the two new cards
with their predecessors is the change in memory capacity. Both the
8800 GTX and Ultra had a 384-bit memory bus with 768MB of GDDR3, while
the 9800s make do with the same 256-bit 512MB of memory that resides
on the GTS and GT iterations of the G92-based 8800s.

Due to its two cores, the GX2 comes out tops in the memory bandwidth
stakes at 128GB/s compared with the Ultra's 103.7GB/s, but the 9800
GTX lags well behind both of the previous cards. What this all means
in real terms is that at the higher resolutions, and most especially
with full screen anti-aliasing turned on, the new cards take quite a
hit at the levels we were hoping these big-panel pixel pushers would
excel at.
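A rough sanity check on those bandwidth figures: GDDR3 bandwidth is simply bus width (in bytes) times the effective transfer rate. The memory speeds below are the commonly quoted reference specs for each card, an assumption on our part rather than figures from this review:

```python
def bandwidth_gb_s(bus_bits: int, effective_mt_s: float, gpus: int = 1) -> float:
    # Bus width in bytes x effective data rate (MT/s), summed across GPUs.
    return bus_bits / 8 * effective_mt_s / 1000 * gpus

# Assumed reference memory speeds: 8800 Ultra 2160 MT/s,
# 9800 GTX 2200 MT/s, 9800 GX2 2000 MT/s per GPU.
ultra = bandwidth_gb_s(384, 2160)          # ~103.7 GB/s
gtx_9800 = bandwidth_gb_s(256, 2200)       # ~70.4 GB/s
gx2 = bandwidth_gb_s(256, 2000, gpus=2)    # 128 GB/s combined
print(round(ultra, 1), round(gtx_9800, 1), round(gx2, 1))
```

The narrower 256-bit bus is exactly why the 9800 GTX trails the year-old Ultra, and why the GX2 only wins by adding a whole second GPU's worth of memory channels.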

The differences between the GTX and GX2, and indeed the 8800GT, are
slight, with the GX2 simply relying on the brute force effect of the
single-card SLI factor. Where the difference between the two new G92
parts is most obvious though is the number of ROPs. The GTX is still
hobbling along with 16, less than both the 8800GTX and Ultra at 24, but
due to the doubling up, the 9800 GX2 has 32 ROPs. The difficulty is in
knowing how much of a benefit the multi-GPU card's extra ROPs give us as
opposed to the single card with 24.
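To get a feel for what the ROP counts mean, theoretical pixel fill rate is ROPs times core clock. The core clocks here are the reference specs (again an assumption, not review data), and real-world SLI scaling on the GX2 falls well short of this on-paper total:

```python
def pixel_fill_gpixels(rops: int, core_mhz: float, gpus: int = 1) -> float:
    # Theoretical pixel fill rate: ROP count x core clock, summed across GPUs.
    return rops * core_mhz / 1000 * gpus

# Assumed reference core clocks: 8800 Ultra 612 MHz, 9800 GTX 675 MHz,
# 9800 GX2 600 MHz per GPU.
ultra = pixel_fill_gpixels(24, 612)        # ~14.7 GPixels/s
gtx_9800 = pixel_fill_gpixels(16, 675)     # ~10.8 GPixels/s
gx2 = pixel_fill_gpixels(16, 600, gpus=2)  # ~19.2 GPixels/s on paper
print(round(ultra, 1), round(gtx_9800, 1), round(gx2, 1))
```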

Bigger, faster, stronger

So where do we find ourselves with the two new top-end cards? Well,
mostly in the same place we were before to be honest. There's very
little difference between this new set and the old, with the 9800 GTX
being the biggest disappointment.

It struggles to find any space between itself and the 8800 GTX (which
it's supposed to be replacing), and there's also the fact that you can
still pick up the older card - with the extra memory, bandwidth and
ROPs - for less than £200. In some places you can save yourself around
£50 and come out with an equivalent, and in some cases faster, card.
The march of progress seems to have stomped right past this iteration
of the 9800, and here at TechRadar we might just have to plump for the
original DX10 monster.

With regards to the GX2, Nvidia had to go down the multi-GPU route,
not just to prove it could produce a functional version like AMD, but
also to create a card that it could legitimately call the fastest
graphics card around.

The final verdict...

Still, the memory constraints hold the GX2 back from being the
superlative, stand-out, top-end card du jour. On lower-res panels
without silicon-melting anti-aliasing, it speeds ahead of the
competition. Yet with all the bells and whistles cranked up to a
deafening roar it struggles to break even with the old 8800 Ultra.
Again, if you shop around you can pick up an Ultra for around £350,
and be fairly sure that your card will have drivers mature enough to
cater for whatever you throw down its graphics tubes.

The long and short of it is that if you've got yourself an 8800GTX or
Ultra, and felt that twinge of envy at the announcement of this new
generation of top-end cards, then you can stop worrying. In fact, you
can probably be downright smug, as your slightly geriatric cards are
still more than capable of holding their own against these
youngbloods. 'Til the GT200 comes out, that is...

The full version of this review will appear in PC Format magazine
issue 214 and will go on sale on 4 May.
By Dave James and James Rivington

Tim O[_2_]
April 16th 08, 01:10 PM
On Wed, 16 Apr 2008 00:21:44 -0700 (PDT), NV55 >
wrote:

>If you have a 8800 GTX, 8800 Ultra or 8800 GT, you might as well save
>your money, don't waste it on 9800 GTX or 9800 GX2. These cards are
>more or less what you already have. In some important areas, really
>less. Save it for the upcoming GT200 based 9900GTX or 9900GX2. Which
>will should be arriving Q3 2008.

Save my money for Q3? I bought an 8800GT in December. It'll be long
after Q3 that I upgrade again. I suspect that's true for a lot of
people, since there are probably only going to be two more games that
can even use the 8800's horsepower by then.

Tim

Augustus
April 16th 08, 03:23 PM
You're preaching to the choir in this forum.....for $175 I can add a second
8800GT OC unit that will outperform any 9800GX2.

JLC
April 16th 08, 03:44 PM
"Tim O" > wrote in message
...
> On Wed, 16 Apr 2008 00:21:44 -0700 (PDT), NV55 >
> wrote:
>
>>If you have a 8800 GTX, 8800 Ultra or 8800 GT, you might as well save
>>your money, don't waste it on 9800 GTX or 9800 GX2. These cards are
>>more or less what you already have. In some important areas, really
>>less. Save it for the upcoming GT200 based 9900GTX or 9900GX2. Which
>>will should be arriving Q3 2008.
>
> Save my money for Q3? I bought an 8800GT in December. It'll be long
> after Q3 that I upgrade again. I suspect thats true for a lot of
> people since there are probably going to be 2 more games that can even
> use the 8800's horsepower by then.
>
> Tim

I agree. My 8800GT kicks the crap out of any game I throw at it (except
Crysis of course). I'm hanging on to this card for awhile. JLC

Trimble Bracegirdle
April 19th 08, 03:14 AM
>>>"If you have a 8800 GTX, 8800 Ultra or 8800 GT, you might as well save
your money, don't waste it on 9800 GTX or 9800 GX2. These cards are
more or less what you already have. "<<

Well thanks for that comfort, as I'm still trying to feel my 1-year-old
£300 8800GTX is / was worth it.
It all divides us up on this issue, depending on what resolution a Bunny
thinks it needs to play in.
With my 21" CRT Monitor all is looking good at 1024 x 768, Max everything.
(\__/)
(='.'=)
(")_(") mouse(it all just dots u no ?)

Augustus
April 19th 08, 04:39 AM
> Well thanks for that comfort as I'm still trying to feel my 1 year old
> 300 8800GTX
> is / was worth it.
> All divides us up on this issue depending on what resolution a Bunny
> thinks it needs
> must play in.
> With my 21" CRT Monitor all is looking good at 1024 x 768 Max everything.

You're joking, right? It should be looking just fine at 1600x1200 running
everything unless it's Crysis or one or two other titles. And these should be
more than just fine at 1280x1024 maxed. Running an 8800GTX at 1024x768 with
everything maxed on a 21" CRT? You're using an 8800GTX 768Mb card to do what
an 8600GT would do at 1024x768.

Sorrid User
April 19th 08, 11:07 AM
"Augustus" > wrote in message
news:[email protected]
>> Well thanks for that comfort as I'm still trying to feel my 1 year old
>> 300 8800GTX
>> is / was worth it.
>> All divides us up on this issue depending on what resolution a Bunny
>> thinks it needs
>> must play in.
>> With my 21" CRT Monitor all is looking good at 1024 x 768 Max everything.
>
> You're joking, right? It should be looking just fine at 1600x1200 running
> everything unless it's Crysis or one two other titles. And these should be
> more than just fine at 1280x1024 maxed. Running an 8800GTX at 1024x768
> with everything maxed on a 21" CRT? You're using an 8800GTX 768Mb card to
> do what an 8600GT would do at 1024x768.

An 8600GT is not good enough for Assassin's Creed and World in Conflict IME.
Even at 1024x768. Crysis looks and runs well enough on medium, but those two
games look like DX7 titles by the time I get playable framerates on my
8600GT.

Augustus
April 19th 08, 03:39 PM
> An 8600GT is not good enough for Assasin's Creed and World of Conflict
> IME. Even at 1024x768. Crysis looks and runs well enough on medium, but
> those two games looks like DX7 titles by the time I get playable
> framerates on my 8600GT.

I'm aware of that. You missed the point.

Trimble Bracegirdle
April 20th 08, 01:05 AM
Wise man (?) say:
>>>"You're joking, right? ...... Running an 8800GTX at 1024x768 with
everything maxed on a 21" CRT? You're using an 8800GTX 768Mb card to do what
an 8600GT would do at 1024x768.">>>>

Exactly... a year ago the 8800GTX (DX10 etc) looked to make sense & those
other
cheaper, lower-spec 8xxx's were not out.
The CRT won't last forever anyway.
(\__/)
(='.'=)
(")_(") mouse(Yes we did rather get that purchase wrong did we not)

JLC
April 20th 08, 03:35 AM
"Trimble Bracegirdle" > wrote in message
...
> Wise man (?) say:
>>>>"You're joking, right? ...... Running an 8800GTX at 1024x768 with
> everything maxed on a 21" CRT? You're using an 8800GTX 768Mb card to do
> what
> an 8600GT would do at 1024x768.">>>>
>
> Exactly...a year ago the 8800GTX (DX10 etc) looked to make sense & those
> other
> cheaper lower spec 8xxx's were not out .
> The CRT won't last forever anway.
> (\__/)
> (='.'=)
> (")_(") mouse(Yes we did rather get that purchses wrong did we not)
>
>
I used to run all my games at 1024x768 until around this time last year when
I upgraded to an ATI X1900XT. I then discovered 1280x1024 to be a lot better
looking on my 19" CRT. I usually run 4x AA and 16x AF and the only game that
has problems with that is Crysis. Now that I have a 8800GT I still run my
games at 1280 because if I go any higher on this monitor I have to run at
60Hz which bugs my eyes. I run everything at 85Hz. Someday I'll take the
plunge and get a nice big LCD, but for now I'm happy with what I have. JLC

John Lewis
April 20th 08, 04:45 AM
On Sun, 20 Apr 2008 01:05:44 +0100, "Trimble Bracegirdle"
> wrote:

>Wise man (?) say:
>>>>"You're joking, right? ...... Running an 8800GTX at 1024x768 with
>everything maxed on a 21" CRT? You're using an 8800GTX 768Mb card to do what
>an 8600GT would do at 1024x768.">>>>
>
>Exactly...a year ago the 8800GTX (DX10 etc) looked to make sense & those
>other
>cheaper lower spec 8xxx's were not out .
>The CRT won't last forever anway.

It sure will last lots longer than the backlight on current-generation
LCDs, both in total lifetime and in (lack of) intensity loss per
year. Take a look at the lumens-vs-time curve for a typical LCD
backlight. You will not like what you see... at all. Both of these
problems with LCD backlights will only be solved with the universal
availability of LED backlighting (at a decent price). Oh, and getting
a replacement backlight for your LCD -- forget it.....

John Lewis


JLC
April 20th 08, 07:48 AM
"John Lewis" > wrote in message
...
> On Sun, 20 Apr 2008 01:05:44 +0100, "Trimble Bracegirdle"
> > wrote:
>
>>Wise man (?) say:
>>>>>"You're joking, right? ...... Running an 8800GTX at 1024x768 with
>>everything maxed on a 21" CRT? You're using an 8800GTX 768Mb card to do
>>what
>>an 8600GT would do at 1024x768.">>>>
>>
>>Exactly...a year ago the 8800GTX (DX10 etc) looked to make sense & those
>>other
>>cheaper lower spec 8xxx's were not out .
>>The CRT won't last forever anway.
>
> It sure will last lots longer than the backlight on current generation
> LCDs. For both the total lifetime and the (lack of) intensity loss per
> year. Take a look at the lumens vs time curve for a typical LCD
> backlight. You will not like what you see... at all. Both of these
> problems with LCD backlights will only be solved with the universal
> availability of LED backlighting (at a decent price). Oh, and getting
> a replacement backlight for your LCD -- forget it.....
>
> John Lewis
>

I guess I'm like a guy that likes vinyl over CD. Sure the CD sounds great,
but there's just something about an old record that sounds better to some
people. That's the way I feel about CRTs. They just have a great picture and
are easier on my eyes than LCDs. I also have an HD CRT. Sure it's not a
mammoth set, but what there is to see is amazing. I haven't yet seen an LCD
HD set even come close to having the rich color my WEGA has. JLC

pg[_2_]
April 26th 08, 06:20 AM
That's the game those video card people play on us all the time.

ATI rolled out their HD 3870, and charged a premium for it. Turns out
that the 3870 is just a rehash of the 2600! Both have 120 stream
processors in them.

The 2600 XT runs at 1.4 GHz, while the 3870 runs at 1.7 GHz, a 20%
difference, but they charge a whole lot more than the 20% improvement
in speed.
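Taking the poster's clock figures at face value, the gap works out to a bit over 20%:

```python
slower, faster = 1.4, 1.7  # GHz, the clocks quoted above
pct_gain = (faster - slower) / slower * 100
print(f"{pct_gain:.1f}%")  # 21.4%
```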


On Apr 16, 12:21 am, NV55 > wrote:
> [full quote of the original post and TechRadar article trimmed; it
> duplicates the text at the top of the thread]