(NVIDIA) Fan-Based-Heatsink Designs are probably wrong. (suck, don't blow ! heatfins direction)



 
 
  #41
August 20th 12, 06:59 PM
Flasherly

On Aug 20, 12:48 pm, John Larkin wrote:

I wonder if CPU chip layouts include hot-spot distribution, like
putting the hottest bits into the corners or something.


Not as such; in practice it's handled more directly. Rather than relying on
the floorplan to spread hot spots around, designers layer on dynamic thermal
management, such as clock gating and instruction throttling driven by on-die
temperature sensors. It gets sold as a safety feature, another preventative
layer against failure conditions, but the real intent is to save the core
processor when a $2 heatsink or fan fails or gets installed in some way its
packaging and distribution never intended.
  #42
August 20th 12, 10:28 PM
John Larkin

On Mon, 20 Aug 2012 10:59:07 -0700 (PDT), Flasherly wrote:

[quoted post #41 snipped]



Word salad. You must be AlwaysWrong.


  #43
August 21st 12, 12:14 AM
Flasherly

On Aug 20, 5:28 pm, John Larkin wrote:


Word salad. You must be AlwaysWrong.


Lex parsimoniae -- irrespective of the older processors I own, I can't call
attributing hot spots to the ordering of chip density either right or wrong;
in practice the question hardly seems to matter. Well, almost... I did elect
not to turn on heat throttling in my CMOS setup.
  #44
August 21st 12, 02:12 AM
Skybuck Flying



"Jeff Liebermann" wrote in message
...

On Sat, 18 Aug 2012 12:59:41 -0700, John Larkin wrote:

I have this theory that the fins of a heat sink should reduce a fan's
free-flow rate by 50% for optimum heat transfer.


"
On the original assertion, that it's better to suck than to blow,
methinks that's wrong.
"

I will draw a more detailed diagram of what I meant by "suck". There is also
some "blow" involved.

Side view of proposed heat sink design by skybuck:


+--------------------------------------+
out----------- airflow -------------in FAN or CASE FAN
------------- airflow ---------------
|^|^|^|^|^|^|^|^|^|^|^|^| - suckage effect going up
|S|S|S|S|S|S|S|S|S|S|S|S|
|U|U|U|U|U|U|U|U|U|U|U|U| - heatfins
|C|C|C|C|C|C|C|C|C|C|C|C|
|K|K|K|K|K|K|K|K|K|K|K|K|
+--------------------------------------+

By blowing air over the heat fins as proposed, the design will hopefully
create a suction effect, pulling any dust out from between the heatfins.

I do see some problems with this design... the tunnel will be small, and a
big fan will have trouble blowing air into it... maybe a small, low-RPM fan
will be enough.
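
As a rough illustration of the airflow concern above, and of the "reduce a
fan's free-flow rate by 50%" theory quoted at the top of this post, here is a
minimal Python sketch that finds the operating point where an idealized fan
curve meets a heatsink's flow impedance. All numbers (maximum static pressure,
free-flow rate, impedance coefficient) are assumed ballpark values, not
measurements of any real fan or heatsink.

P_MAX = 30.0      # Pa, assumed maximum static pressure of a small axial fan
Q_MAX = 0.03      # m^3/s, assumed free-flow rate of the same fan
K_SINK = 1.0e5    # Pa/(m^3/s)^2, assumed flow impedance of the finned tunnel

def fan_pressure(q):
    """Idealized fan curve: static pressure falls off toward free flow."""
    return P_MAX * (1.0 - (q / Q_MAX) ** 2)

def sink_pressure(q):
    """Heatsink pressure drop, growing roughly with the square of flow."""
    return K_SINK * q * q

# March up the flow axis until the curves cross: that is the operating point.
q, dq = 0.0, 1e-6
while fan_pressure(q) > sink_pressure(q):
    q += dq

print(f"Operating point: {q * 1000:.1f} L/s at {sink_pressure(q):.1f} Pa")
print(f"Fraction of free-flow rate: {q / Q_MAX:.0%}")

With these assumed numbers the crossing lands near half the free-flow rate; a
tighter tunnel (larger K_SINK) pushes the operating point toward lower flow
and higher pressure, which is exactly the worry about a big fan feeding a
small duct.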

Bye,
Skybuck.





  #45
August 21st 12, 02:16 AM
Skybuck Flying



"Timothy Daniels" wrote in message
...

"Jeff Liebermann" wrote:
[.............] On the original assertion, that it's better to suck than to
blow, methinks that's wrong. .....


"
Given the same mass/sec flow of air over the fins of a heatsink,
the best heat transfer is by blowing due to the greater turbulence -
which disturbs the boundary layer of air that lies in contact with
the fins and puts more flowing air in direct contact with the surface
of the fins. In the case where the fins rise up away from the source
of the heat, it's best to blow downward from the ends of the fins
toward the source of the heat. IOW, the air should move in a
direction opposite to the heat flow.

This principle is not only used in heat transfer systems, but also in
biological systems in oxygen transfer through membranes - as in
fish gills where the blood moves across the gill membrane in a
direction opposite to the flow of water. The basis of this principle
lies in the finite heat (or gas) capacity of a fluid and that greatest
heat (or gas) flow occurs as a linear function of the difference of
temperature (or gas concentration) between 2 bodies. Apply a little
calculus, and the principle of opposing flows results. This design
principle was recently seen when I opened up the case of a friend's
PC to clean it out: The cooling fins for the CPU rose up from the CPU,
and the cooling fan blew air down along the fins toward the CPU.
Obviously, the designer had paid attention during college freshman
physics.
"

I'd love to see a simulation that actually includes dust particles and hair,
to see how much of this theory's effectiveness remains.

I suspect the simulation software used at the time did not include these
factors, and therefore all designs might be totally wrong for dusty/hairy
environments.
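
The counterflow argument quoted above ("the air should move in a direction
opposite to the heat flow") can be put into numbers with the standard
effectiveness-NTU relations for a two-stream heat exchanger. This is a
textbook sketch, not something posted in the thread, and the NTU and
capacity-ratio values below are arbitrary assumptions.

import math

def effectiveness(ntu, cr, counterflow=True):
    """Heat-exchanger effectiveness for capacity ratio cr = Cmin/Cmax < 1."""
    if counterflow:
        e = math.exp(-ntu * (1.0 - cr))
        return (1.0 - e) / (1.0 - cr * e)
    e = math.exp(-ntu * (1.0 + cr))
    return (1.0 - e) / (1.0 + cr)

ntu, cr = 2.0, 0.5  # assumed: moderately sized exchanger, unbalanced streams
for cf in (True, False):
    label = "counterflow" if cf else "parallel flow"
    print(f"{label:>13}: effectiveness = {effectiveness(ntu, cr, cf):.2f}")

For the same surface area and flow rates, the counterflow arrangement moves
noticeably more heat (about 0.77 versus 0.63 here), which is the same reason
fish gills run blood against the water flow.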

Bye,
Skybuck.

  #46
August 21st 12, 02:19 AM
Skybuck Flying



wrote in message ...

On Sun, 19 Aug 2012 01:56:38 -0500, "Tim Williams" wrote:

Note that heat transfer by volume isn't usually the goal, so much as
minimum temperature is. In a counterflow setup, the hottest part of the
heatsink is cooled by the hottest air. If you flip it around, the hottest
part of the heatsink gets cooled by the coolest air, achieving the highest
heat flux for a given surface area and temperature difference -- more
power density, at some expense to mass flow and pumping loss. You might
avoid this, for example, if you had to use pure nitrogen (or helium, for
that matter) for some process, minimizing the gas flow to keep operating
cost down.


"
Why not use compressed/expanded air for this purpose ? Using a piston
compressor to compress the air to a few bars, the air gets quite hot,
then let it go through a heat exchanger to get rid of most of the heat
and cool the pressurized air closer to ambient temperature.

Let the air expand to normal ambient pressure and the air temperature
is now well below ambient temperature and let it flow through
semiconductor heatsinks to the environment.

To avoid problems with dust and condensation, a closed loop might make
sense, but of course, now the heat exchanger would also have to
dissipate the heat from the semiconductors. However, the heat exchanger
can be remotely located and can run at much higher temperatures than the
semiconductors, so getting rid of the heat into the environment would be
easier.
"

I like this idea of a closed air system very much...

Maybe a case built entirely out of "heatsinks" or something... to get as much
heat as possible from inside the case to the outside... without actually
sucking in any dust or hair.
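
For scale, the compressed-air suggestion quoted above can be bounded with the
ideal-gas isentropic relations. The sketch below is not from the thread: it
assumes a lossless compressor and a work-recovering expander (a plain throttle
valve would cool air by only a fraction of a kelvin per bar), plus an assumed
3:1 pressure ratio and 300 K ambient.

GAMMA = 1.4        # ratio of specific heats for air
T_AMBIENT = 300.0  # K, assumed room temperature
P_RATIO = 3.0      # compress to "a few bars", as suggested

def isentropic_temp(t_in, pressure_ratio):
    """Temperature after isentropic compression or expansion of an ideal gas."""
    return t_in * pressure_ratio ** ((GAMMA - 1.0) / GAMMA)

t_hot = isentropic_temp(T_AMBIENT, P_RATIO)         # right after the compressor
t_cold = isentropic_temp(T_AMBIENT, 1.0 / P_RATIO)  # cooled to ambient, then expanded

print(f"After compression:              {t_hot:.0f} K ({t_hot - 273:.0f} C)")
print(f"After intercooling + expansion: {t_cold:.0f} K ({t_cold - 273:.0f} C)")

Even in this idealized case, a closed loop's heat exchanger ends up rejecting
both the chips' heat and the compressor work, which is the point the quoted
paragraph makes about locating that exchanger remotely where it can run hot.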

Bye,
Skybuck.

  #47
August 21st 12, 02:24 AM
John Larkin

On Tue, 21 Aug 2012 03:19:55 +0200, "Skybuck Flying" wrote:



[quoted post #46 snipped]

I like this idea of a closed air system very much...


It's been done, with better working fluids. You have one in your
kitchen.


  #48
August 21st 12, 03:22 AM
Skybuck Flying

Also, perhaps a vacuum would be created in the suck sections.

So then little holes would need to be made in the bottom to create little
openings to let air in...

So then it starts to seem a little bit more like the blow-through design...
but this would be some kind of hybrid design.

Some blow-through and some suckage.

Hopefully dust won't be sucked in through those tiny little holes... or at
least a whole lot less than with the other designs... otherwise it would be
pointless.

Bye,
Skybuck.

  #49
August 21st 12, 03:31 AM
Skybuck Flying

Well, if it really is a matter of maximizing contact area, then here is an
idea for a chip/gpu:

The gpu is cut up into many tiny little pieces.

The tiny little pieces are distributed over the entire graphics card.

Tiny little heatsinks, which are larger than the gpu pieces, are stuck on top
of them.

This should maximize the area a bit more... better distribution of heat.

Since it's a parallel chip consisting of multiple cores... it should be
possible to cut up those cores and distribute them across the graphics
card...

An added benefit is more lanes towards all the tiny little cores... for more
bandwidth and more memory lookup power.

These tiny little gpu pieces could be stuck between capacitors... or maybe
even on top of them... or vice versa...

Not sure if that's a good idea... or where to best place them... but some
spreading out seems nice.

Whether this would be any better than the current situation remains to be
seen... current heatsinks are also pretty massive across the graphics
board... so maybe it doesn't matter, or only very little...

Or maybe it does matter... maybe having everything on a small little area
prevents optimal heat transfer...

Thus cutting the chip up into multiple pieces might make it better.

Maybe the entire chip design should be more like a building with windows in
it... and blow air directly through the chip... instead of relying on an
additional heatsink.
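
For what it's worth, the potential gain from spreading the silicon out is
mostly about the extra fin area such a layout makes room for. A toy comparison
(all values assumed; spreading resistance, fin efficiency and airflow
interaction are ignored):

H_CONV = 25.0  # W/(m^2*K), assumed forced-convection coefficient

def delta_t(power_w, fin_area_m2):
    """Rough temperature rise above ambient for a fin array."""
    return power_w / (H_CONV * fin_area_m2)

# One 150 W GPU under a single 0.30 m^2 fin array...
print(f"single die: {delta_t(150.0, 0.30):.0f} K above ambient")

# ...versus ten 15 W tiles, each under its own 0.05 m^2 sink (0.50 m^2 total).
print(f"ten tiles : {delta_t(15.0, 0.05):.0f} K above ambient")

If the ten small sinks could only fit the same 0.30 m^2 of total fin area as
the single big one, this crude model predicts no improvement at all; the split
helps only to the extent that it buys more total area or shorter conduction
paths.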

Bye,
Skybuck.

  #50
August 21st 12, 03:37 AM
Skybuck Flying



"John Larkin" wrote in message
...

[earlier quoting snipped]


"
It's been done, with better working fluids. You have one in your
kitchen.
"

Yeah, in case such a special case does not exist, the next best thing might
simply be a mini/tiny refrigerator: place the entire PC inside of it...

My fridge actually has small little holes in the back side... so some cables
could go through them...

But it's a scary idea... electronics and moisture... hmm, I'll have to look
into this some more...

For now the biggest drawback could be the noise of the fridge... or maybe the
fridge can't handle the PC's heat at all...
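
A quick reality check on the fridge idea (the numbers are assumptions, not the
specs of any particular fridge): a household refrigerator is sized to remove
on the order of 100 W continuously, while a gaming PC under load can dissipate
several hundred watts, and the compressor's own input power ends up in the
room as well.

PC_HEAT_W = 300.0          # assumed PC dissipation under load
FRIDGE_CAPACITY_W = 100.0  # assumed steady cooling capacity of a small fridge
FRIDGE_INPUT_W = 80.0      # assumed compressor electrical draw while running

if PC_HEAT_W > FRIDGE_CAPACITY_W:
    print("Fridge cannot keep up; the interior temperature will climb.")
print(f"Heat dumped into the room: ~{PC_HEAT_W + FRIDGE_INPUT_W:.0f} W")

Condensation is the other catch: chilling hardware below the room's dew point
invites moisture on the boards, which matches the "electronics and moisture"
worry above.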

Hmm..

Bye,
Skybuck.









 



