A computer components & hardware forum. HardwareBanter


Temperature range



 
 
  #11  
Old July 7th 08, 07:38 PM posted to alt.comp.periphs.videocards.nvidia
Augustus[_3_]
external usenet poster
 
Posts: 266
Default Temperature range


"Mhaxx" wrote in message
...
How do you know it's the videocard temperature that is causing lockups? I


I'm only supposing; I think excessive CPU or GPU temperature could be the cause..

highly doubt a 6200LE card is doing duty running Crysis or other strenuous
3D activity. What is the CPU temp and the m/b temp under load? Ambient room temperature?


I don't know. Could you suggest a good program to monitor CPU and GPU temps, fan speed, etc.?

Thanks,

Mhaxx


RivaTuner for the graphics card; Sandra or MBM for the m/b and CPU. I'm not
even sure a 6200LE has a thermal diode to monitor though.....


  #12  
Old July 7th 08, 08:47 PM posted to alt.comp.periphs.videocards.nvidia
Les Matthew

Mhaxx wrote:
How do you know it's the videocard temperature that is causing lockups? I


I'm only supposing; I think excessive CPU or GPU temperature could be the cause..

highly doubt a 6200LE card is doing duty running Crysis or other strenuous
3D activity. What is the CPU temp and the m/b temp under load? Ambient room temperature?


I don't know. Could you suggest a good program to monitor CPU and GPU temps, fan speed, etc.?

Thanks,

Mhaxx


I use SpeedFan from here: http://www.almico.com/sfdownload.php

It monitors CPU and GPU temps and gives voltages and fan speeds.


les...
  #13  
Old July 8th 08, 08:46 AM posted to alt.comp.periphs.videocards.nvidia
Mhaxx[_2_]

I use SpeedFan from here: http://www.almico.com/sfdownload.php

It monitors CPU and GPU temps and gives voltages and fan speeds.


SpeedFan reports my CPU temp ranging from about 45°C to 60°C (that's okay) and the GPU
at about 85°C: in your opinion, is that the right temp for a GeForce 6200 LE?

Massimo


  #14  
Old July 8th 08, 08:48 AM posted to alt.comp.periphs.videocards.nvidia
Mhaxx[_2_]

Rivatuner for the graphics card,

I'm downloading it, but I've read that it optimizes the NVIDIA card rather than reading
the GPU temperature.. and since I installed the latest driver for my card yesterday,
I suppose it's already well optimized. Are you sure it can read the temperature?

Sandra or MBM for the m/b and CPU. I'm not
even sure a 6200LE has a thermal diode to monitor though.....


Oh..

Anyway, I can't find an official page (from nvidia.com) where they declare
the acceptable temp range.. can you?

Massimo


  #15  
Old July 8th 08, 01:41 PM posted to alt.comp.periphs.videocards.nvidia
Les Matthew

Mhaxx wrote:
I use speedfan from he http://www.almico.com/sfdownload.php

monitors cpu and gpu temps and gives voltages and fan speeds.


SpeedFan reports my CPU temp ranging from about 45°C to 60°C (that's okay) and the GPU
at about 85°C: in your opinion, is that the right temp for a GeForce 6200 LE?

Massimo



85°C for the GPU does seem a bit high for regular desktop use. I have an 8800GT
that idles around the 50°C mark, and that will be sucking way more power than
a 6200.


les...
  #16  
Old July 8th 08, 03:31 PM posted to alt.comp.periphs.videocards.nvidia
Mhaxx[_2_]

85°C for the GPU does seem a bit high for regular desktop use. I have an 8800GT

Mmm..

that idles around the 50°C mark, and that will be sucking way more power than
a 6200.


I can't find an official page (from nvidia.com) where they declare the
acceptable temp range! Can you?

Massimo


  #17  
Old July 8th 08, 03:47 PM posted to alt.comp.periphs.videocards.nvidia
Patrick Vervoorn

In article , Mhaxx wrote:
85°C for the GPU does seem a bit high for regular desktop use. I have an 8800GT


Mmm..

that idles around the 50°C mark, and that will be sucking way more power than
a 6200.


I can't find an official page (from nvidia.com) where they declare the
acceptable temp range! Can you?


No, though I haven't tried very hard either. But we're telling you 85 deg C is
certainly on the high side. Even my 8800GTX, after running a heavy game
like Crysis for a few hours on end, doesn't get much higher than 75 deg C,
and quickly drops back to ~55-60 deg C when I quit that.

While 85 deg C won't immediately kill an NVidia GPU, it is pretty high,
especially if this is the temperature when the GPU is simply idling (or
running your desktop).

Check your fans, check your airflow, try checking it again with the case
open, and perhaps find some kind of fan (a house-fan would even do) to
blow some forced air into the opened case, to see if that lowers the
temperature readings, and perhaps even solves your problems...

Regards,

Patrick.
  #18  
Old July 8th 08, 03:53 PM posted to alt.comp.periphs.videocards.nvidia
Mr.E Solved!

Mhaxx wrote:

I can't find an official page (from nvidia.com) where they declare the
acceptable temp range! Can you?

Massimo



Many newer GeForce cards have temperature-sensitive throttling circuits;
if they get too hot, they slow down or shut off. That is the only documented
unacceptable temperature range, and it is well over 90°C.

Your biggest concern should be how much over ambient your card is idling,
and then how much over that the load temperature is. If you notice a
larger-than-normal jump over time, you need to clean the dust out of
the HSF.

Each case and configuration is different: someone might get 50°C at idle
with one card, another might get 45°C, while yet another gets 60°C.

Higher temperatures can shave card life off the far end, but do not
worry too much about high temps and card life: I have had a fanless
7950GT that typically runs at 90°C under load, and has done so for two years.
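That rule of thumb (watch the idle-over-ambient and load-over-idle deltas, and clean the HSF when they drift) can be sketched as a quick check. The baseline deltas and the 10°C tolerance below are hypothetical numbers for illustration, not NVIDIA specs:

```python
# Sketch of the delta rule of thumb: compare today's temperature deltas
# against the baselines you recorded when the card was new and clean.
# The tolerance is an arbitrary illustrative value, not a spec.

def heatsink_looks_clogged(ambient_c, idle_c, load_c,
                           baseline_idle_delta_c, baseline_load_delta_c,
                           tolerance_c=10):
    """True if either delta has drifted well past its clean-card baseline."""
    idle_delta = idle_c - ambient_c   # how far over ambient the card idles
    load_delta = load_c - idle_c      # how much load adds on top of idle
    return (idle_delta > baseline_idle_delta_c + tolerance_c or
            load_delta > baseline_load_delta_c + tolerance_c)

# A card that used to idle 20 C over ambient and add 15 C under load:
print(heatsink_looks_clogged(25, 45, 60, 20, 15))  # deltas unchanged
print(heatsink_looks_clogged(25, 60, 85, 20, 15))  # idle delta has grown to 35 C
```

The second call flags the card because the idle delta has crept well past its baseline, which is the "larger-than-normal jump over time" symptom described above.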
  #19  
Old July 8th 08, 11:03 PM posted to alt.comp.periphs.videocards.nvidia
Phil Weldon[_2_]

'Mhaxx' wrote:
Rivatuner for the graphics card,


I'm downloading it, but I've read that it optimizes the NVIDIA card rather than reading
the GPU temperature.. and since I installed the latest driver for my card yesterday,
I suppose it's already well optimized. Are you sure it can read the temperature?

Sandra or MBM for the m/b and CPU. I'm not
even sure a 6200LE has a thermal diode to monitor though.....


Oh..

Anyway, I can't find an official page (from nvidia.com) where they declare
the acceptable temp range.. can you?

_____

For RivaTuner, "optimize" mainly means increasing clock speeds to get
better-than-factory performance. nVidia drivers, no matter how new, don't
do that. RivaTuner was recommended because it reports information like
GPU temperature and clock speeds, as well as providing the capability to
adjust clock speeds on the graphics board (there may be as many as three
different clocks, depending on the card).

Does your 6200 LE even HAVE a cooling fan? A 6200 LE is not a very capable
graphics card, and doesn't use much power. Also, there are many different
6200 LE graphics cards with many different collections of components, memory
size, memory bus width. If you still require help with your problem, post
the exact model and manufacturer of your GeForce 6200 LE, whether it has a
fan or not, the room ambient temperature, the air temperature inside the
system case, and the specifications for the rest of your system, including
the number of case fans. With such a low-power graphics card, any heat
problems would have to come from one or more of: a poor interface between the
heatsink and the GPU, insufficient system case ventilation, a failed GPU
heatsink fan, and/or dust- and grease-clogged GPU heatsink fins.

Phil Weldon


  #20  
Old July 9th 08, 04:42 AM posted to alt.comp.periphs.videocards.nvidia
DRS

"Mhaxx" wrote in message

85°C for the GPU does seem a bit high for regular desktop use. I have an 8800GT


Mmm..

that idles around the 50°C mark, and that will be sucking way more power
than a 6200.


I can't find an official page (from nvidia.com) where they declare the
acceptable temp range! Can you?


The GPU's threshold value is accessible from the chip itself. Utilities
like RivaTuner should be able to extract it via the NVCPL API.
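For what it's worth, on later cards the same on-chip threshold became queryable through NVML, the library behind nvidia-smi. That interface postdates the 6200 LE, and the sketch below assumes the optional pynvml Python bindings, so treat it as an illustration of the idea rather than a recipe for that card:

```python
# Hedged sketch: query the GPU's current temperature and its slowdown
# threshold via NVML (the library that nvidia-smi uses).  Assumes the
# optional pynvml package and a working NVIDIA driver; returns None
# gracefully when neither is present.

def read_gpu_temps(index=0):
    """Return (current_temp_c, slowdown_threshold_c) for GPU `index`,
    or None if NVML is unavailable on this system."""
    try:
        import pynvml
        pynvml.nvmlInit()
        try:
            handle = pynvml.nvmlDeviceGetHandleByIndex(index)
            temp = pynvml.nvmlDeviceGetTemperature(
                handle, pynvml.NVML_TEMPERATURE_GPU)
            threshold = pynvml.nvmlDeviceGetTemperatureThreshold(
                handle, pynvml.NVML_TEMPERATURE_THRESHOLD_SLOWDOWN)
            return temp, threshold
        finally:
            pynvml.nvmlShutdown()
    except Exception:
        return None

if __name__ == "__main__":
    result = read_gpu_temps()
    if result is None:
        print("NVML not available on this system")
    else:
        print("GPU: %d C, slowdown threshold: %d C" % result)
```

The threshold this reports is the same temperature-sensitive throttle point discussed earlier in the thread: the card itself knows when it will start slowing down, even when the vendor never publishes a spec page for it.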


 



