
View Full Version : Is it possible to mod a GeForce 8800 into a Quadro?


g. bon
March 10th 08, 08:09 PM
Is there any hardware or software method to turn a GeForce 8800 (GTS,
512 MB) into a Quadro?
Where can we find the method to do it?

thanks,

GB

DaveW[_5_]
March 10th 08, 11:52 PM
Wrong. The hardware and GPU are different. No can do.

--
--DaveW


"g. bon" > wrote in message
...
> Is there any hardware or software method to turn a GeForce 8800
> (GTS, 512 MB) into a Quadro?
> Where can we find the method to do it?
>
> thanks,
>
> GB
>

Augustus
March 11th 08, 12:09 AM
"DaveW" > wrote in message
. ..
> Wrong. The hardware and GPU are different. No can do.

Wrong again, Dave W. I guess that's why the Quadro FX 4600 and 5600 use the
G80 90nm GPU which, wonder of wonders, is the exact same G80 GPU as the
8800GTX and 8800 Ultra. It can be done via softmod, but it's messy and does
not work as well as the softmods of older cards did. I would not recommend
it. Tons of threads out there, but this one is typical:

http://forums.guru3d.com/showthread.php?t=220942&highlight=8800+quadro
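
For what it's worth, the RivaTuner/NVStrap-style softmods mentioned above
boiled down to one trick: make the driver see a Quadro PCI device ID instead
of the GeForce one, so it enables the Quadro code paths. A toy sketch of that
remapping idea follows; the device IDs below are made-up placeholders, not
real NVIDIA IDs, and this is a conceptual illustration, not any actual tool's
code:

```python
# Toy illustration of what a GeForce-to-Quadro softmod does conceptually:
# intercept the PCI device ID the driver reads and substitute the ID of
# the matching Quadro board. The IDs here are placeholders, NOT real
# NVIDIA device IDs.

GEFORCE_TO_QUADRO = {
    0x1111: 0x2222,  # placeholder: "8800 GTX"-class -> "Quadro FX 5600"-class
    0x3333: 0x4444,  # placeholder: "8800 GTS"-class -> "Quadro FX 4600"-class
}

def spoofed_device_id(real_id: int) -> int:
    """Return the Quadro ID the driver should see, or the real ID unchanged."""
    return GEFORCE_TO_QUADRO.get(real_id, real_id)

print(hex(spoofed_device_id(0x1111)))  # 0x2222 (spoofed)
print(hex(spoofed_device_id(0x9999)))  # 0x9999 (unknown ID passes through)
```

On older cards this was enough, because the driver branched purely on the ID;
the thread's point is that newer hardware and drivers make this substitution
much harder to pull off cleanly.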

deimos[_2_]
March 11th 08, 02:02 AM
Augustus wrote:
> "DaveW" > wrote in message
> . ..
>> Wrong. The hardware and GPU are different. No can do.
>
> Wrong again, Dave W. I guess that's why the Quadro FX 4600 and 5600 use the
> G80 90nm GPU which, wonder of wonders, is the exact same G80 GPU as the
> 8800GTX and 8800 Ultra. It can be done via softmod, but it's messy and does
> not work as well as the softmods of older cards did. I would not recommend
> it. Tons of threads out there, but this one is typical:
>
> http://forums.guru3d.com/showthread.php?t=220942&highlight=8800+quadro
>
>

NVIDIA's chip designation is based on the same cores (G80, G84, etc.),
but they seem to be adding "GL" onto them to differentiate Quadros, and I
doubt this is merely marketing. Older cards (GF6 series and earlier)
were easily modded to Quadros with RivaTuner, but newer ones are a
total pain in the ass, and professional applications like SolidWorks may
not detect them as a Quadro.

Augustus
March 11th 08, 02:55 AM
> NVIDIA's chip designation is based on the same cores (G80, G84, etc.), but
> they seem to be adding "GL" onto them to differentiate Quadros, and I doubt
> this is merely marketing. Older cards (GF6 series and earlier) were easily
> modded to Quadros with RivaTuner, but newer ones are a total pain in the
> ass, and professional applications like SolidWorks may not detect them as a
> Quadro.

It's more the case that the GL extensions and functions are enabled by the
Quadro BIOS and Quadro drivers than that the G80 architecture is
physically different. The GPU, shaders and memory have different
clock settings. Memory size is larger, as is the PCB. The NVIDIA CUDA
Programming Guide 1.1 has most of this stuff in it.

Benjamin Gawert
March 11th 08, 04:48 AM
* DaveW:

> Wrong. The hardware and GPU are different. No can do.

Still no clue what you're talking about, ******? FYI: the GPUs of Quadro
and Geforce are identical.

How about a nice cup of shut the **** up?

Benjamin

Benjamin Gawert
March 11th 08, 04:50 AM
* deimos:

> NVIDIA's chip designation is based on the same cores (G80, G84, etc.),
> but they seem to be adding "GL" onto them to differentiate Quadros, and I
> doubt this is merely marketing.

The GPUs are 100% identical, that's a fact. The difference lies in the
functionality that is activated by the driver.

> Older cards (GF6 series and earlier)
> were easily modded to Quadros with RivaTuner, but newer ones are a
> total pain in the ass, and professional applications like SolidWorks may
> not detect them as a Quadro.

That's because changes in the hardware made it more difficult to change
the GPU ID from GeForce to Quadro.

Benjamin

deimos[_2_]
March 12th 08, 01:03 AM
Benjamin Gawert wrote:
> * deimos:
>
>> NVIDIA's chip designation is based on the same cores (G80, G84, etc.),
>> but they seem to be adding "GL" onto them to differentiate Quadros, and
>> I doubt this is merely marketing.
>
> The GPUs are 100% identical, that's a fact. The difference lies in the
> functionality that is activated by the driver.

Do you have a source on this? I used to think the same, but I've seen
anecdotal evidence that the cores have different transistor counts, and
there are certainly different surface components (accounting for the
12-bit color output precision and other media-production-centric features).

>> Older cards (GF6 series and earlier) were easily modded to Quadros
>> with RivaTuner, but newer ones are a total pain in the ass, and
>> professional applications like SolidWorks may not detect them as a
>> Quadro.
>
> That's because changes in the hardware made it more difficult to change
> the GPU ID from GeForce to Quadro.
>
> Benjamin

Many professional applications, like the ones I work with, actually have
some code for Quadro-specific detection that goes beyond just the driver
or device ID, so there must be an API method of detecting Quadro
features other than just DXCaps or GL extensions.
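
As an illustration of the simplest detection approach applications can use:
on NVIDIA's drivers the OpenGL renderer string (from glGetString(GL_RENDERER))
contains "Quadro" on Quadro boards. A minimal sketch of that check follows;
the helper name and the sample strings are assumptions for illustration, not
code from any real application, and in a real program the string would come
from a live OpenGL context:

```python
# Minimal sketch: classify a GPU as Quadro or not from its OpenGL
# renderer string, i.e. the value glGetString(GL_RENDERER) would return.
# The string is passed in directly here so the logic stands on its own.

def is_quadro(renderer: str) -> bool:
    """Return True if the renderer string identifies a Quadro board."""
    return "quadro" in renderer.lower()

# Illustrative renderer strings (not captured from real drivers):
print(is_quadro("Quadro FX 4600/PCI/SSE2"))    # True
print(is_quadro("GeForce 8800 GTS/PCI/SSE2"))  # False
```

A softmod that changes only the device ID can still fail this kind of check
if the driver does not report a Quadro renderer string, which may be part of
why some applications refuse to recognize modded cards.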

Benjamin Gawert
March 12th 08, 03:11 PM
* deimos:

>> The GPUs are 100% identical, that's a fact. The difference lies in the
>> functionality that is activated by the driver.
>
> Do you have a source on this?

Yes, but none that is publicly available.

> I used to think the same, but I've seen
> anecdotal evidence that the cores have different transistor counts and
> certainly there are different surface components (accounting for the
> 12-bit color output precision and other media production centric features).

These "evidences" are, as you said already, anecdotal.

> Many professional applications like the ones I work with actually have
> some code for Quadro specific detection that goes beyond just the driver
> or device ID,

Which applications are those, and how exactly do they try to detect a
Quadro beyond driver information or device ID?

> so there must be an API method of detecting Quadro
> features other than just DXCaps or GL extensions.

Nope, there isn't.

Benjamin

Mr.E Solved!
March 12th 08, 06:32 PM
deimos wrote:

> Many professional applications, like the ones I work with, actually have
> some code for Quadro-specific detection that goes beyond just the driver
> or device ID, so there must be an API method of detecting Quadro
> features other than just DXCaps or GL extensions.

No, but you are right that it is the GL extension recognition that
makes up the bulk of the performance difference betwixt the two cards.

The chippies are the same: both CUDA GPUs. It's the drivers that are
different, with different capabilities and ways of using the circuitry.

Also, a minor but interesting point: Quadros are severely stress-tested
before being put into the channel. The blue screens and nv4_disp.dll
errors that GeForce users see with too much frequency (for any reason)
just don't happen with Quadros; they are designed to run flawlessly 24/7,
like a CPU.