Old August 21st 12, 03:31 AM posted to alt.comp.hardware.pc-homebuilt,alt.comp.periphs.videocards.nvidia,comp.arch,sci.electronics.design
Skybuck Flying[_7_]
Default (NVIDIA) Fan-Based-Heatsink Designs are probably wrong. (suck, don't blow ! heatfins direction)

Well, if it really comes down to maximizing contact area, then
here is an idea for a chip/gpu:

The gpu is cut up into many tiny little pieces.

The tiny little pieces are distributed over the entire graphics card.

Tiny little heatsinks, each larger than the gpu piece it sits on, are stuck
on top.

This should increase the contact area a bit more... and give a better distribution of heat.
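
As a rough sanity check on the area argument, here is a minimal Python sketch
using Newton's law of cooling, P = h * A * dT. All of the numbers (power,
convection coefficient, fin areas) are assumed purely for illustration, not
taken from any real card:

# Toy model: temperature rise above ambient for a given fin area.
# dT = P / (h * A); all values below are illustrative assumptions.
P = 150.0          # total gpu power to dissipate, watts (assumed)
h = 50.0           # forced-air convection coefficient, W/(m^2*K) (assumed)
A_single = 0.05    # total fin area of one central heatsink, m^2 (assumed)
A_spread = 0.08    # combined fin area of many small heatsinks, m^2 (assumed)

dT_single = P / (h * A_single)
dT_spread = P / (h * A_spread)
print(f"one big heatsink: dT ~ {dT_single:.0f} K above ambient")
print(f"many small sinks: dT ~ {dT_spread:.0f} K above ambient")

So the spread-out version only wins if the many little heatsinks actually add
up to more effective fin area (and airflow) than the single big one.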

Since it's a parallel chip consisting of multiple cores... it should be
possible to cut up those cores and distribute them
across the graphics card...

An added benefit is more lanes towards all the tiny little cores... for more
bandwidth and more memory lookup power.

These tiny little gpu pieces could be stuck between capacitors... or maybe
even on top of them... or vice versa...

Not sure if that's a good idea... or where best to place them... but some
spreading out seems nice.

Whether this would be any better than the current situation remains to be
seen... current heatsinks are already pretty massive
across the graphics board... so maybe it doesn't matter, or only very
little...

Or maybe it does matter... maybe having everything in one small area
prevents optimal heat transfer...

Thus cutting the chip up into multiple pieces might make it better.
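
As another toy comparison, here is a Python sketch of the junction-to-air
thermal resistance view (dT = P * R). The resistance values and the number of
pieces are hypothetical, just to show when splitting the die could come out
ahead:

# Toy thermal-resistance comparison; all numbers are assumed.
P_total = 150.0   # watts for the whole gpu (assumed)

# Case 1: one monolithic die under one large heatsink.
R_big = 0.3       # K/W for a big cooler (assumed)
dT_big = P_total * R_big

# Case 2: die cut into N pieces, each with its own tiny heatsink.
N = 16            # hypothetical number of pieces
R_small = 3.0     # K/W per tiny heatsink, much worse individually (assumed)
dT_piece = (P_total / N) * R_small   # each piece only dissipates P/N

print(f"monolithic die: dT ~ {dT_big:.0f} K")
print(f"each small piece: dT ~ {dT_piece:.0f} K")

In this made-up example the pieces run cooler, because 16 tiny sinks working
in parallel (3.0/16 ~ 0.19 K/W) beat the single 0.3 K/W cooler... with worse
little sinks or fewer pieces it flips the other way.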

Maybe the entire chip design should be more like a building with windows in
it... so air could be blown directly through the chip... instead of through an
additional heatsink.

Bye,
Skybuck.