September 28th 03, 02:36 PM
Radeon350

Tony Hill wrote in message ...
On Sat, 27 Sep 2003 07:36:59 GMT, "graphics processing unit"
wrote:
While not directly related to Nvidia or ATI, the fact that XGI is entering
the consumer graphics industry with its range of Volari GPUs may affect both
of the current leaders. Hopefully in a positive way, for the end user. God
knows we could use some more competition here.


"I'll believe it when I see it". There have been a LOT of graphics
cards that were supposed to be the next big thing to come along. S3
has done it a handful of times (and again just recently with Delta
Chrome), Matrox has done it, BitBoys did it several times without ever
having a product, and now we've got XGI. So far none of these cards
have managed to compete very effectively with the low-end chips from
ATI or nVidia, let alone their high-end stuff.

The real key is getting decent drivers. That is how nVidia took
over the graphics world, not through their hardware. nVidia managed to
get fast and *stable* drivers out for all of their products while 3dfx
and ATI were floundering with buggy drivers that were missing features
and had either very poor performance or, at best, uneven performance.
ATI has since learned from their mistakes and really improved the
quality of their drivers, but they are about the only one.

Right now there are three players in the graphics market: ATI, nVidia
and Intel (with Intel actually being the largest supplier). Most of
the world's computer users do VERY well with integrated graphics, and
have absolutely ZERO reason to buy an add-in card. That just leaves
an extremely small market at the very high end and a decent-sized but
very low-margin market in the mid range. If XGI wants to succeed,
they need to get a graphics card out for $100 that has stable drivers
and that can match or beat whatever nVidia and ATI are selling for
~$125 at the time (right now that would be the GeForceFX 5600 and the
Radeon 9600).

I ain't holding my breath. I'll be surprised if they ever get stable
drivers, let alone within six months of the card's release. And
that's just talking about Windows drivers; the situation is likely to
be even worse for their Linux drivers, if they even bother to make
those at all.

Personally, I am most excited about the Volari V8 Duo - the first *consumer*
graphics card configuration to sport twin Graphics Processing Units.


I'm not. I doubt that it will manage to match a GeForceFX 5600 or ATI
Radeon 9600, yet it will likely cost a LOT more. It all comes back to
drivers, especially for a more complicated design with two graphics
processors.

Besides that, their claim of being the first consumer card with dual
GPUs is REALLY stretching things. They're taking a very narrow view
of just what it means to be a consumer card and what it takes to be
considered a GPU. Marketing at its best/worst here.

-------------
Tony Hill
hilla underscore 20 at yahoo dot ca



I don't see why it is such a stretch. First of all, there are not many
companies that make consumer GPUs to begin with. They can be counted
on one hand, I believe. And as far as I am aware, none have released a
card with more than one GPU for consumer use. Yeah, there are dozens
of cards that use 2 or more GPUs, from a number of companies, for all
kinds of high-end, non-consumer applications. Many of them predate
Nvidia's NV10/GeForce256, which was the first working consumer GPU -
that is, a chip with T&L on-chip - but *certainly* not the first-ever
GPU.

Actually, I don't just use the term 'GPU' the way Nvidia uses it. To
me, and to many who use graphics processors, something that takes the
geometry processing load OFF the CPU and puts it on the graphics chip,
that's a 'graphics processor', or graphics processing unit / GPU as
Nvidia coined it. The 3dfx Voodoo chips, including the VSA-100s used
in the Voodoo5 5500 and 6000, did NOT do that at all. Neither did any
of the pre-Radeon ATI chips, including the dual Rage Fury chips on the
MAXX card, nor basically any consumer 3D PC chip before the
GeForce256. Any graphics chip that lacks what used to be called
'geometry processing' - what was commonly called T&L in late 1999 when
the GeForce came out, and is now called vertex shading - is usually
considered a 3D accelerator or rasterizer, rather than a complete
'graphics processor' or GPU. At least that is the way I have
understood things for a long time.
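
To make the distinction concrete, here is a rough OpenGL 1.x sketch -
my own illustration, nothing from XGI's or Nvidia's materials. The app
hands the driver untransformed vertices plus a matrix, and where the
transform actually runs depends on the chip:

    #include <GL/gl.h>

    #define NUM_VERTS 3

    /* Untransformed, object-space triangle. */
    static GLfloat verts[NUM_VERTS * 3] = {
         0.0f,  1.0f, 0.0f,
        -1.0f, -1.0f, 0.0f,
         1.0f, -1.0f, 0.0f,
    };

    void draw_triangle(const GLfloat modelview[16])
    {
        /* Load the transform and hand the raw vertices to the driver.
           On a chip with on-board T&L (GeForce256, Radeon) the
           matrix multiply happens on the graphics chip. On a pure
           rasterizer (Voodoo, Rage) the driver grinds through the
           same per-vertex multiply on the host CPU before the chip
           ever sees a triangle. Same API calls either way. */
        glMatrixMode(GL_MODELVIEW);
        glLoadMatrixf(modelview);

        glEnableClientState(GL_VERTEX_ARRAY);
        glVertexPointer(3, GL_FLOAT, 0, verts);
        glDrawArrays(GL_TRIANGLES, 0, NUM_VERTS);
        glDisableClientState(GL_VERTEX_ARRAY);
    }

That per-vertex CPU work on the rasterizer-only cards is exactly the
geometry load I'm talking about moving onto the graphics chip.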

On the other hand, I suppose one can argue that any graphics chip, be
it 2D or 3D, is a 'GPU' - anything from a 1990 VGA chip, to a Voodoo1,
to the Graphics Synthesizer in the PS2. However, it is common practice
in the graphics industry to differentiate between a rasterizer and a
complete graphics processor with geometry & lighting (now vertex
shading) on board.

So therefore, I do not find XGI's marketing outrageous in claiming the
first dual-GPU card for consumer use. Of course, they *will* have to
bring Volari to market, and it will have to work - in other words,
"believe it when we see it" still applies. But the specific claim of
having the first dual GPU card is not a stretch in and of itself, in
my book.