Bloody heat ?



 
 
#1 - Phil - July 25th 05, 09:59 PM

Nutcracker,
You're the expert: just how hot can the ambient temperature get with these
servers and RAID enclosures before something starts crapping out?

We lost the HVAC to the server room. We have the doors open (NO windows in the room)
and multiple fans going, and it's still hot as hell.

Hopefully we'll have it fixed tomorrow, but this heat concerns me. I had
backups for what I thought was everything BUT the cooling. For some reason I
did not take that failure into consideration.

Thanks,
Phil


#2 - Nut Cracker - July 25th 05, 10:24 PM

Hello Phil,

To an extent, it depends on the models of the servers. I have never had a
problem with storage shutting down, and I have had that stuff running in
some very HOT environments.

There are two parts to my answer.

First, most of the systems will go into thermal shutdown around 105 degrees
F (internal temperature). They will shut down (gracefully) until they cool off. At
5-minute intervals, the servers power back on to see if they have cooled
down enough to start running again.

As long as you have *some* airflow going into the front of those machines,
and there is ample airflow to exhaust the heat that comes out of the backs,
you should be OK until you get the HVAC system repaired.

Second, setting up the servers for Thermal ASR (automatic server reset) is
an OPTION. It doesn't have to be enabled for the servers to operate. Of
course, you do so at your own risk.

If you have the System Management Homepage installed on your machines, you
should be able to go into it and tell the server whether or not you want
Thermal Reset (or something like that) enabled.
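
A rough sketch of that shutdown-and-retry cycle, for anyone who wants to
picture it. Only the 105 F trip point and the 5-minute retry interval come
from the description above; the warm-up/cool-down rates, the restart margin,
and the asr_enabled flag are invented for illustration, so treat it as a toy
model rather than anything resembling the actual firmware.

# Toy simulation of the thermal shutdown / retry cycle described above.
# Only the 105 F trip point and 5-minute retry come from the post;
# the heating/cooling rates and the restart margin are made up.

TRIP_POINT_F = 105          # internal temp at which the server shuts down
RETRY_INTERVAL_MIN = 5      # power-on retries happen every 5 minutes

def simulate(room_temp_f, hours=2, asr_enabled=True):
    internal = room_temp_f + 20.0   # assume internals run hotter than ambient
    powered_on = True
    for minute in range(hours * 60):
        if powered_on:
            internal = min(internal + 0.5, room_temp_f + 30)   # heats up under load
            if internal >= TRIP_POINT_F:
                powered_on = False
                print(f"t={minute:4d} min: {internal:.0f} F -> graceful thermal shutdown")
        else:
            internal = max(internal - 1.0, room_temp_f)        # cools while powered off
            if asr_enabled and minute % RETRY_INTERVAL_MIN == 0:
                if internal < TRIP_POINT_F - 5:                # arbitrary restart margin
                    powered_on = True
                    print(f"t={minute:4d} min: {internal:.0f} F -> powered back on")

simulate(room_temp_f=95)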

If you have any other questions, I will be happy to answer them for you.

- LC

"Phil" wrote in message
...
Nutcracker,
You are the expert, just how hot can the ambient temperature be with
theses
servers and raid enclosures be before something starts crapping out?

We lost the HVAC to the server room, we have doors open (NO windows in
room)
multiple fans going and it's still hot as hell.

Hopefully tomorrow will have it fixed, but this heat concerns me. I had
backups to what I thought was everything BUT the cooling. For some reason
I
did not take that failure into consideration.

Thanks,
Phil




#3 - Phil - July 26th 05, 03:04 AM

LC,
That 105 may be the magic number, but that room was a lot hotter than that.

I'm using about 25 servers, 1850Rs and DL380s with a few more oddballs:
10 of the EU's with 12 drives each and 4 newer arrays with 14 drives each. Add the
switches, KVMs, KVM extenders, IP control, multiple 3 and 6 KVA UPSs,
satellite equipment, routers, phone equipment, alarm equipment, CCTV
equipment, monitors and lighting, and that's a lot of heat. There is also a
bank of transformers in a big steel cabinet that generates heat too; you can
hardly touch them. They tie into the panels and backup generators.

I didn't realize it until I started counting, but there are over 200 SCSI drives in
that mess. Nowhere else do I have a concentration of more than about 5 computers.
I've already switched all my monitors out to flat screens, which saved a few bucks
on power and heat.

We cut a 5-foot hole in the roof of that small building, pulled the drop
ceiling tiles out above the racks and temporarily set a 5-foot ceiling fan
over the cutout in the roof. That has dropped the temp considerably and is
moving a hurricane-like flow of air through now.

So far nothing has shut down from heat (that I know of), but this morning I
had 6 big box fans blowing directly on the racks. I had already shut some
stuff down, but some stuff has to operate regardless.

Heat and power are one reason I was thinking about upgrading to more modern
servers, but I question what dependability I would gain with the newer, more
cheaply built equipment. This stuff has worked trouble-free and does everything I
need; it's just not cheap to operate, and it's probably more dependable than new
equipment. I thought I had it set up so that any given failure would not
stop any process, but 95 degrees outside is not helping. Damn, it's been hot.

Time to redesign my layout after this. I've learned a lot about server
operation, but it has been a slow evolution and a continuous learning
experience, most of it the hard way. We never planned to have that much "stuff" in
there; it's been a slow process of adding equipment, so the room simply wasn't
designed for it.

Thanks again, for the time,
Phil

"Nut Cracker" wrote in message
...
Hello Phil,

To an extent, it depends on the models of the servers. I have never had a
problem with storage shutting down, and I have had that stuff running in
some very HOT environments.

There are parts to my answer.

First, most of the systems will go into Thermal Shutdown around 105

degree's
F (internal temp). They will shut down (gracefully) until they cool off.

At
5 minute intervals, the servers power back on to see if they have cooled
down enough to power up and start running again.

As long as you have *some* airflow going into the front of those machines,
and there is ample airflow to exhaust the heat that comes out of the

backs,
you should be OK until you get the HVAC system repaired.

Second, setting up the servers for a Thermal ASR (automatic server reset)

is
an OPTION. It doesnt have to be enabled for the servers to operate. Of
course, you do so at your own risk.

If you have the system management hompage installed on your amchines, then
you should be able go into it and tell the server whether or not you want
Thermal Reset (or something like that) enabled.

If you have any other quetsions, I will be happy to answer them for you.

- LC

"Phil" wrote in message
...
Nutcracker,
You are the expert, just how hot can the ambient temperature be with
theses
servers and raid enclosures be before something starts crapping out?

We lost the HVAC to the server room, we have doors open (NO windows in
room)
multiple fans going and it's still hot as hell.

Hopefully tomorrow will have it fixed, but this heat concerns me. I had
backups to what I thought was everything BUT the cooling. For some

reason
I
did not take that failure into consideration.

Thanks,
Phil






#4 - Phil - July 26th 05, 03:26 AM

LC, yes, I know what I have: UE arrays! My daughter is in Europe, and I
guess that was somewhat on my mind (EU).
Long, hot, aggravating day. A total loss of time.
Phil


#5 - NuTCrAcKeR - July 26th 05, 03:30 AM

hehe ... nice.

"Phil" wrote in message
...
LS, yes I know what I have, UE, arrays !, my daughter is in Europe and
guess that was somewhat on my mind ( EU)
Long hot aggravating day. A total loss of time.
Phil




#6 - Kevin Childers - July 26th 05, 05:04 AM

Interesting you should bring this up. I am currently working for a
company that is suffering through an A/C failure for the entire building,
and it has the server room shut down. Some joker brought in an old gas
station thermometer that was scaled to 135 degrees F. It topped out in
under 15 minutes. As a temporary solution we have moved the servers out and
distributed them around the network, in what has to be one of the most
physically unsecured environments possible.

Two other sites I've worked at have had heat issues, and in both cases
the key was airflow, just as you have found out. In both instances heat was
not considered during the initial establishment of the server room. Here
are the two long-term solutions they came up with, and the solution I have
put forward for my current position.

1. Location: a one-story office complex with the server room in a windowless
former storage room. Just like you, we cut a large hole out, but in the back
wall. We then had an independent A/C unit installed (the same kind some cheap
motels use) with a thermal switch set at 80 degrees. That setting was found
through trial and error, as we noted the A/C unit needed a bit of a head
start to keep the server room temperature below 90 degrees. We realized that
the ambient air temperature needed to be cooler than the maximum operating
temperature of the servers to provide the necessary heat transfer as the air
was pulled through the cases (see the airflow sketch just after this list).

2. Location: at the top of a stairwell in a split-level building. The server
room has its own HVAC unit, but no backup. The HVAC was running 24/7/365
and the room temperature averaged around 80 degrees regardless of the season
or the thermostat's setting. Due to power availability and structural
issues, we cut a hole in the ceiling and installed a large drum fan with
ducting to carry the hot air out of the building and provide a weatherproof
system. The site is in a hurricane-prone area, and to date the ducting has kept
the rain out. The only drawback is that the fan is very noisy, but then it
is an emergency fallback system, so it is not on that often. Later we
redirected an HVAC vent from a storage closet on the first floor into the
server room as a backup. The fan kept the temperature within the
operational range, but the front office didn't like the techs working in
shorts and t-shirts.

3. Location: a former common/meeting room in the center (20 feet from the nearest
accessible outside wall) of a multi-story office building. There are now
high-efficiency A/C units for home/office room additions that only
require a 3-inch opening to connect the inside and outside units, and I am
proposing we install one. The local climate control folks say that the piping
can be run with a bit of custom work on their part. With proper insulation
there should be no problem with condensation, and the unit can support the
expected heat load. This will make the server room independent of the
building's HVAC should it fail in the future. One advantage the climate
control folks mentioned is that the replacement for the building's HVAC unit would
otherwise need to be scaled up, as the original unit was never intended to support the
sort of heat load the server room adds to the building. Cost-wise it
comes out about even, and should the server room system fail in the future, we
have the option to open up both ends of the server room and sponge off the
building's ambient air until it can be fixed.
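
On the airflow point in item 1: the heat the air can carry off is
proportional to the airflow times the temperature rise from inlet to
exhaust, which is why the room air has to stay well below the servers'
limit. A quick back-of-the-envelope sketch using the standard sea-level rule
of thumb (the example heat load and temperatures below are made up, not
taken from the thread):

# Rough airflow sizing using the common rule of thumb
# CFM = BTU/hr / (1.08 * delta_T_F) for air near sea level.
# The example numbers at the bottom are invented for illustration.

BTU_PER_WATT_HR = 3.412   # 1 watt of IT load = 3.412 BTU/hr of heat

def required_cfm(heat_load_watts, inlet_temp_f, max_exhaust_temp_f):
    """Airflow needed for the air to absorb the heat load within the
    allowed inlet-to-exhaust temperature rise."""
    delta_t = max_exhaust_temp_f - inlet_temp_f
    if delta_t <= 0:
        raise ValueError("inlet air must be cooler than the allowed exhaust temp")
    return heat_load_watts * BTU_PER_WATT_HR / (1.08 * delta_t)

# e.g. 15 kW of gear, 80 F room air, 105 F allowed exhaust:
print(f"{required_cfm(15_000, 80, 105):,.0f} CFM needed")   # roughly 1,900 CFM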

As a point of minor interest, the local community college uses its
former mainframe server room as a mortuary. They long ago built a
"Technology Center" to hold their server rooms and computer labs. Until
they began teaching "Mortuary Science," the room was of little use: the
custom HVAC system was too cold for classrooms or offices, and there was no
heating system, since the heat from the servers had been ample.

KC


"Phil" wrote in message
...
LC,
That 105 may be the magic number, but that room was allot hotter than

that.

I'm using about 25 servers, 1850R's and dl380's with a few more odd balls.
10 of the EU's 12 drives ea and 4 newer arrays 14 dives ea. Add the
switches, kvm, kvm extenders, ip control, multiple 3, and 6, KVA UPS's,
satellite equipment, routers, phone equipment, alarm equipment, CCTV
equipment, monitors and lighting and that's allot of heat. There is also a
bank of transformers in a big steel cabinet that generate heat too, can
hardly touch them, they tie into the panels and backup generators.

I didn't realize it until I started counting, but over 200 scsi drives in
that mess. Nowhere else do I have a concentration of over about 5

computers.
Already switched all my monitors out to flat screens which saved a few

bucks
on power and heat.

We cut a 5-foot hole in the roof of that small building, pulled the drop
ceiling tiles out above the racks and temporarily set a 5-foot ceiling fan
over the cutout in the roof. That has dropped the temp considerably, and

is
moving a hurricane like flow of air through now.

So far nothing has shut down from heat (that I know of), but this morning

I
had 6 bix box fans blowing directly on the racks. I had already shut some
stuff down, but some stuff has to operate regardless.

Heat, and power is one reason I was thinking about upgrading to more

modern
servers, but I question what dependability I would gain with the new

cheaper
built equipment. This stuff has worked trouble free, and does everything I
need, just not cheap to operate and probably more dependable than new
equipment. I thought I had it set up so that any given failure would not
stop any process but 95 degrees outside is not helping. Damn, it's been

hot.

Time to redesign my layout after this. I've learned allot about server
operation, but it has been a slow evolution and a continuous learning
experience, most the hard way. We never planed to have that much "stuff"

in
there, it's been a slow process of adding equipment, so it simply wasn't
designed for.

Thanks again, for the time,
Phil

"Nut Cracker" wrote in message
...
Hello Phil,

To an extent, it depends on the models of the servers. I have never had

a
problem with storage shutting down, and I have had that stuff running in
some very HOT environments.

There are parts to my answer.

First, most of the systems will go into Thermal Shutdown around 105

degree's
F (internal temp). They will shut down (gracefully) until they cool off.

At
5 minute intervals, the servers power back on to see if they have cooled
down enough to power up and start running again.

As long as you have *some* airflow going into the front of those

machines,
and there is ample airflow to exhaust the heat that comes out of the

backs,
you should be OK until you get the HVAC system repaired.

Second, setting up the servers for a Thermal ASR (automatic server

reset)
is
an OPTION. It doesnt have to be enabled for the servers to operate. Of
course, you do so at your own risk.

If you have the system management hompage installed on your amchines,

then
you should be able go into it and tell the server whether or not you

want
Thermal Reset (or something like that) enabled.

If you have any other quetsions, I will be happy to answer them for you.

- LC

"Phil" wrote in message
...
Nutcracker,
You are the expert, just how hot can the ambient temperature be with
theses
servers and raid enclosures be before something starts crapping out?

We lost the HVAC to the server room, we have doors open (NO windows in
room)
multiple fans going and it's still hot as hell.

Hopefully tomorrow will have it fixed, but this heat concerns me. I

had
backups to what I thought was everything BUT the cooling. For some

reason
I
did not take that failure into consideration.

Thanks,
Phil








#7 - Phil - July 26th 05, 03:57 PM

LC, Kevin,
In my case the equipment is in a small (1,500 sq ft) separate concrete block building
with foam-insulated, sheetrocked walls and no windows. We have a 4-foot
space between the drop ceiling and the roof. The roof is flat, so I thought
cutting rafters would be easier, cheaper, and quicker than going through the
walls. I'm going to leave that fan on the roof, have a weatherproof enclosure
with louvers built over it, and install steel bars for security. It's a temporary
solution, but in light of a total loss of operation, it is working.

I'm up in the mountains of North Carolina; we rarely see these high
outside temperatures and can hardly justify installing two HVAC systems
(perhaps I need to reevaluate that too). Our compressor took a nosedive and
it wasn't a simple quick fix. I've found that, for the most part, if you build a backup
to the primary "whatever," the primary won't fail; don't prepare, and
look out. I'll leave that fan on the roof just in case, but it is working
(fingers crossed) and nothing is shutting down now, although it's still too warm to
comfortably work in there with clothes on!

Yesterday, prior to the hole/fan, we did start having failures in some
servers. Luckily it was not all servers, and the amazing thing was that their
respective backups kept going, so I didn't lose one file.

I have too much equipment to temporarily relocate it. Our operation is not
just file servers. We pull data from various satellites and process it through
multiple machines; some of it is archived, and it's also routed to multiple other
machines. Data routing is so intertwined among the various processes as to
make a "system," not just a collection of file servers that I could
distribute on the network.

I did what most of us do in these situations: improvised. I will say, LC, that while
some servers may shut down at 105, with a high, hurricane-like volume of air
moving through they seem to hang in there. Just fans on the racks and fans
in the building's open doors did not help much; that was more heat than we could
push out the doors.

The only reason I outline my "mountain" solution is that it keeps us going until
the primary problem is resolved. We had that ceiling cut open and a big fan
sitting on the hole in under 3 hours. I'm sure it will take days to
weatherproof and secure it.

Wow, who would have ever thought all that is involved just to keep a computer
running!
I guess the lesson here is: plan ahead, or you might get caught with your
pants down and lose your shirt, literally, and not with the new secretary!

Stay cool,
Phil

Oh, by the way, LC, that fellow contacted me directly about Fedora on the
older Compaq servers. I gave him the info to get him fixed up, and he has it
going with no problems in graphics mode. If anyone needs to load Fedora on
any of the older Compaq servers, I'll be happy to share the tricks.
Have a good day,
Phil



"Kevin Childers" wrote in message
...
Interesting you should bring this up, I am currently working for a
company that is suffering through to an A/C failure for the entire

building
that has the server room shut down. Some joker brought in an old gas
station thermometer that was scaled to 135(F) degrees. It topped out in
under 15 minutes. As a temp solution we have moved the servers out and
distributed them around the network in what has to be one of the most
physically unsecured environments possible.

Two other sites I've worked at have had heat issues and in both cases
the key was air flow just as you have found out. In both instances heat

was
not considered during the initial establishment of the server room. Here
are the two long term solutions they came up with and the solution I have
put forward for my current position.

1. Location, one story office complex with server room in a windowless
former storage room. Just like you we cut a large hole out, but in the

back
wall. We then had installed an independent A/C unit (the same as some

cheap
motels use) with a thermal switch set at 80 degrees. Setting was found
through trial and error as we noted the A/C unit needed a bit of a head
start to keep the server room temperature below 90 degrees. Realizing

that
the ambient air temperature needed to be cooler than the max operating
temperature of the servers to provide the necessary heat transfer as it

was
pulled through the cases.

2. Location, at the top of a stair well in a split level building.

Server
room has it's own HVAC unit, but no back up. The HVAC was running 24/7/365
and the room temperature averaged around 80 degrees regardless of the

season
or the thermostats settings. Due to power availability and structural
issues, we cut a hole in the ceiling and installed a large drum fan with
ducting to carry the hot air out of the building and provide a weather

proof
system. Site is in a hurricane prone area and to date the ducting has

kept
the rain out. The only draw back is that the fan is very noisy, but then

it
is an emergency fall back system so it is not on that often. Later we
redirected an HVAC vent from a storage closet on the first floor into the
server room as a back up. The fan kept the temperature within the
operational range, but the front office didn't like the techs working in
shorts and t-shirts.

3. Location, former common/meeting room in the center (20 feet from

nearest
accessible outside wall) of a multi-story office building. They now have
some high efficiency A/C unit for home/office room additions that only
require a 3 inch opening to connect the inside and outside units. Am
proposing we install same. Local climate control folks say that the

piping
can be run with a bit of custom work on their part. With proper

insulation
there should be no problem with condensation and the unit can support the
expected heat load. This will make the server room independent of the
buildings HVAC should it fail in the future. One advantage the climate
control folks mentioned is that the replacement unit for the HVAC unit

would
need to be scaled up as the original unit was never intended to support

the
sort of heat load that the server room adds to the building. Cost wise it
comes out about even and should the server room system fail in the future

we
have the option to open up both ends of the server room and sponge off the
buildings ambient air until it can be fixed.

As a point of minor interest, the local community college uses it's
former mainframe server room as a mortuary. They long ago built a
"Technology Center" to hold their server rooms and computer labs. Until
they began teaching "Mortuary Science" the room was of little use due to

the
custom HVAC system being to cold for classrooms or offices and the being

no
heating system, the heat from the servers was ample enough.

KC


"Phil" wrote in message
...
LC,
That 105 may be the magic number, but that room was allot hotter than

that.

I'm using about 25 servers, 1850R's and dl380's with a few more odd

balls.
10 of the EU's 12 drives ea and 4 newer arrays 14 dives ea. Add the
switches, kvm, kvm extenders, ip control, multiple 3, and 6, KVA UPS's,
satellite equipment, routers, phone equipment, alarm equipment, CCTV
equipment, monitors and lighting and that's allot of heat. There is also

a
bank of transformers in a big steel cabinet that generate heat too, can
hardly touch them, they tie into the panels and backup generators.

I didn't realize it until I started counting, but over 200 scsi drives

in
that mess. Nowhere else do I have a concentration of over about 5

computers.
Already switched all my monitors out to flat screens which saved a few

bucks
on power and heat.

We cut a 5-foot hole in the roof of that small building, pulled the drop
ceiling tiles out above the racks and temporarily set a 5-foot ceiling

fan
over the cutout in the roof. That has dropped the temp considerably,

and
is
moving a hurricane like flow of air through now.

So far nothing has shut down from heat (that I know of), but this

morning
I
had 6 bix box fans blowing directly on the racks. I had already shut

some
stuff down, but some stuff has to operate regardless.

Heat, and power is one reason I was thinking about upgrading to more

modern
servers, but I question what dependability I would gain with the new

cheaper
built equipment. This stuff has worked trouble free, and does everything

I
need, just not cheap to operate and probably more dependable than new
equipment. I thought I had it set up so that any given failure would not
stop any process but 95 degrees outside is not helping. Damn, it's been

hot.

Time to redesign my layout after this. I've learned allot about server
operation, but it has been a slow evolution and a continuous learning
experience, most the hard way. We never planed to have that much "stuff"

in
there, it's been a slow process of adding equipment, so it simply wasn't
designed for.

Thanks again, for the time,
Phil

"Nut Cracker" wrote in message
...
Hello Phil,

To an extent, it depends on the models of the servers. I have never

had
a
problem with storage shutting down, and I have had that stuff running

in
some very HOT environments.

There are parts to my answer.

First, most of the systems will go into Thermal Shutdown around 105

degree's
F (internal temp). They will shut down (gracefully) until they cool

off.
At
5 minute intervals, the servers power back on to see if they have

cooled
down enough to power up and start running again.

As long as you have *some* airflow going into the front of those

machines,
and there is ample airflow to exhaust the heat that comes out of the

backs,
you should be OK until you get the HVAC system repaired.

Second, setting up the servers for a Thermal ASR (automatic server

reset)
is
an OPTION. It doesnt have to be enabled for the servers to operate. Of
course, you do so at your own risk.

If you have the system management hompage installed on your amchines,

then
you should be able go into it and tell the server whether or not you

want
Thermal Reset (or something like that) enabled.

If you have any other quetsions, I will be happy to answer them for

you.

- LC

"Phil" wrote in message
...
Nutcracker,
You are the expert, just how hot can the ambient temperature be with
theses
servers and raid enclosures be before something starts crapping out?

We lost the HVAC to the server room, we have doors open (NO windows

in
room)
multiple fans going and it's still hot as hell.

Hopefully tomorrow will have it fixed, but this heat concerns me. I

had
backups to what I thought was everything BUT the cooling. For some

reason
I
did not take that failure into consideration.

Thanks,
Phil










#8 - Kevin Childers - July 26th 05, 06:13 PM

As long as it works for you and the price is right, it's a good
solution. Airflow around the servers is the key. Each server can cool
itself as long as the server room's ambient air temperature is kept cool
enough for the server to dump its heat load into it without creating a rising
temperature spiral.

I was in Fayetteville, NC for the first two situations I mentioned. Due
to geography, the weather there is not so kind in the summer: too low to get
any benefit from elevation cooling, but low enough to get fog and high
humidity from the mixed airflow off the mountains and the ocean, yet too
far from the coast to get any true onshore effect. Fortunately the sand
hills drain well and are high enough to minimize storm flooding.

Best of luck

KC

"Phil" wrote in message
...
LC, Kevin,
In my case equipment is in a small (1500sf) separate concrete block

building
with foam insulated, and sheet rocked walls and no windows. We have a

4-foot
space between the drop ceiling and the roof. Roof is flat so I thought
cutting rafters would be easier, cheaper, and quicker than walls. I'm

going
to leave that fan on the roof, have a weatherproof enclosure built over

the
fan with louvers and install steel bars for security. It's a temporary
solution, but in light of a total loss of operation, it is working.

I'm up in the mountains of North Carolina and we rarely see these high
outside temperatures and can hardly justify installing two HVAC systems.
(perhaps I need to reevaluate that too) Our compressor took a nosedive and
it wasn't a simple quick fix. I've found for the most part, build a backup
to the primary "whatever" and the primary won't fail or don't prepare and
look out. I'll leave that fan on the roof, just in case, but it is working
(cross fingers) and nothing is shutting down now, although still too warm

to
comfortably work in there with clothes on!

Yesterday, prior to the hole/fan we did start having failures in some
servers. Luckily it was not all servers and the amazing thing was the

their
respective backups kept going so I didn't loose one file.

I have too much equipment to temporarily relocate it. Our operation is not
just file servers We pull data from various satellites process it through
multiple machines, while archiving some, it's also routed to multiple

other
machines. Data routing is very intertwined among various processes as to
make a "system", not just a collection of file servers that I could
distribute on the network.

I did like most of do in these situations, improvised. I will say LC while
some servers may shut down at 105, with a high, hurricane like volume of

air
moving through they seem to hang in there. Just fans on the racks and fans
in the buildings open doors did not help much, just more heat than we

could
push out the doors.

The only reason I outline my "mountain" solution, it keeps us going until
the primary problem is resolved. We had that ceiling cut open and a big

fan
setting on the hole in under 3 hours. I'm sure it will take days to
weatherproof and secure it.

Wow, who would ever thought all that is involved to just keep a computer
running!
I guess the lesson here is Plan ahead, or you might get caught with your
pants down, and loose your shirt, literally, and not with the new

secretary
!

Stay cool,
Phil

Oh, By the way LC, that fellow contacted me directly about fedora on the
older compact servers. I gave him the info to get him fixed up and he has

it
going with no problems in graphics mode. If anyone needs to load fedora on
any of the older Compaq servers I'll be happy to share the tricks.
Have a good day,
Phil



"Kevin Childers" wrote in message
...
Interesting you should bring this up, I am currently working for a
company that is suffering through to an A/C failure for the entire

building
that has the server room shut down. Some joker brought in an old gas
station thermometer that was scaled to 135(F) degrees. It topped out in
under 15 minutes. As a temp solution we have moved the servers out and
distributed them around the network in what has to be one of the most
physically unsecured environments possible.

Two other sites I've worked at have had heat issues and in both

cases
the key was air flow just as you have found out. In both instances heat

was
not considered during the initial establishment of the server room.

Here
are the two long term solutions they came up with and the solution I

have
put forward for my current position.

1. Location, one story office complex with server room in a windowless
former storage room. Just like you we cut a large hole out, but in the

back
wall. We then had installed an independent A/C unit (the same as some

cheap
motels use) with a thermal switch set at 80 degrees. Setting was found
through trial and error as we noted the A/C unit needed a bit of a head
start to keep the server room temperature below 90 degrees. Realizing

that
the ambient air temperature needed to be cooler than the max operating
temperature of the servers to provide the necessary heat transfer as it

was
pulled through the cases.

2. Location, at the top of a stair well in a split level building.

Server
room has it's own HVAC unit, but no back up. The HVAC was running

24/7/365
and the room temperature averaged around 80 degrees regardless of the

season
or the thermostats settings. Due to power availability and structural
issues, we cut a hole in the ceiling and installed a large drum fan with
ducting to carry the hot air out of the building and provide a weather

proof
system. Site is in a hurricane prone area and to date the ducting has

kept
the rain out. The only draw back is that the fan is very noisy, but

then
it
is an emergency fall back system so it is not on that often. Later we
redirected an HVAC vent from a storage closet on the first floor into

the
server room as a back up. The fan kept the temperature within the
operational range, but the front office didn't like the techs working in
shorts and t-shirts.

3. Location, former common/meeting room in the center (20 feet from

nearest
accessible outside wall) of a multi-story office building. They now

have
some high efficiency A/C unit for home/office room additions that only
require a 3 inch opening to connect the inside and outside units. Am
proposing we install same. Local climate control folks say that the

piping
can be run with a bit of custom work on their part. With proper

insulation
there should be no problem with condensation and the unit can support

the
expected heat load. This will make the server room independent of the
buildings HVAC should it fail in the future. One advantage the climate
control folks mentioned is that the replacement unit for the HVAC unit

would
need to be scaled up as the original unit was never intended to support

the
sort of heat load that the server room adds to the building. Cost wise

it
comes out about even and should the server room system fail in the

future
we
have the option to open up both ends of the server room and sponge off

the
buildings ambient air until it can be fixed.

As a point of minor interest, the local community college uses it's
former mainframe server room as a mortuary. They long ago built a
"Technology Center" to hold their server rooms and computer labs. Until
they began teaching "Mortuary Science" the room was of little use due to

the
custom HVAC system being to cold for classrooms or offices and the being

no
heating system, the heat from the servers was ample enough.

KC


"Phil" wrote in message
...
LC,
That 105 may be the magic number, but that room was allot hotter than

that.

I'm using about 25 servers, 1850R's and dl380's with a few more odd

balls.
10 of the EU's 12 drives ea and 4 newer arrays 14 dives ea. Add the
switches, kvm, kvm extenders, ip control, multiple 3, and 6, KVA

UPS's,
satellite equipment, routers, phone equipment, alarm equipment, CCTV
equipment, monitors and lighting and that's allot of heat. There is

also
a
bank of transformers in a big steel cabinet that generate heat too,

can
hardly touch them, they tie into the panels and backup generators.

I didn't realize it until I started counting, but over 200 scsi drives

in
that mess. Nowhere else do I have a concentration of over about 5

computers.
Already switched all my monitors out to flat screens which saved a few

bucks
on power and heat.

We cut a 5-foot hole in the roof of that small building, pulled the

drop
ceiling tiles out above the racks and temporarily set a 5-foot ceiling

fan
over the cutout in the roof. That has dropped the temp considerably,

and
is
moving a hurricane like flow of air through now.

So far nothing has shut down from heat (that I know of), but this

morning
I
had 6 bix box fans blowing directly on the racks. I had already shut

some
stuff down, but some stuff has to operate regardless.

Heat, and power is one reason I was thinking about upgrading to more

modern
servers, but I question what dependability I would gain with the new

cheaper
built equipment. This stuff has worked trouble free, and does

everything
I
need, just not cheap to operate and probably more dependable than new
equipment. I thought I had it set up so that any given failure would

not
stop any process but 95 degrees outside is not helping. Damn, it's

been
hot.

Time to redesign my layout after this. I've learned allot about server
operation, but it has been a slow evolution and a continuous learning
experience, most the hard way. We never planed to have that much

"stuff"
in
there, it's been a slow process of adding equipment, so it simply

wasn't
designed for.

Thanks again, for the time,
Phil

"Nut Cracker" wrote in message
...
Hello Phil,

To an extent, it depends on the models of the servers. I have never

had
a
problem with storage shutting down, and I have had that stuff

running
in
some very HOT environments.

There are parts to my answer.

First, most of the systems will go into Thermal Shutdown around 105
degree's
F (internal temp). They will shut down (gracefully) until they cool

off.
At
5 minute intervals, the servers power back on to see if they have

cooled
down enough to power up and start running again.

As long as you have *some* airflow going into the front of those

machines,
and there is ample airflow to exhaust the heat that comes out of the
backs,
you should be OK until you get the HVAC system repaired.

Second, setting up the servers for a Thermal ASR (automatic server

reset)
is
an OPTION. It doesnt have to be enabled for the servers to operate.

Of
course, you do so at your own risk.

If you have the system management hompage installed on your

amchines,
then
you should be able go into it and tell the server whether or not you

want
Thermal Reset (or something like that) enabled.

If you have any other quetsions, I will be happy to answer them for

you.

- LC

"Phil" wrote in message
...
Nutcracker,
You are the expert, just how hot can the ambient temperature be

with
theses
servers and raid enclosures be before something starts crapping

out?

We lost the HVAC to the server room, we have doors open (NO

windows
in
room)
multiple fans going and it's still hot as hell.

Hopefully tomorrow will have it fixed, but this heat concerns me.

I
had
backups to what I thought was everything BUT the cooling. For some
reason
I
did not take that failure into consideration.

Thanks,
Phil













#9 - Aidan Grey - July 27th 05, 01:49 AM

Although your servers may not fail right now, running them at a higher
temperature stresses them quite a lot.

You may see higher-than-usual drive failures, etc., for the next few months.
If you were thinking of buying new equipment, you should probably do so
sooner rather than later.


Aidan Grey



#10 - Guy Macon - July 27th 05, 12:30 PM




Keep in mind that the temperature of your server room has
to be lower at higher altitudes to get the same cooling.

Altitude derating factor:

METERS    FEET     COOLING
     0        0       100%
 1,000    3,000        95%
 1,500    5,000        90%
 2,000    7,000        86%
 3,000   10,000        80%
 3,500   12,000        75%
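
To turn that into a quick capacity estimate, here is a small sketch that
linearly interpolates the table above; the table values are the only real
data in it, and the unit rating and altitude in the example are made up.

# Derate a cooling unit's capacity by altitude, interpolating the table above.

import bisect

DERATING = [(0, 1.00), (1000, 0.95), (1500, 0.90),     # (altitude in meters,
            (2000, 0.86), (3000, 0.80), (3500, 0.75)]  #  fraction of rated capacity)

def cooling_factor(altitude_m):
    alts = [a for a, _ in DERATING]
    if altitude_m <= alts[0]:
        return DERATING[0][1]
    if altitude_m >= alts[-1]:
        return DERATING[-1][1]
    i = bisect.bisect_right(alts, altitude_m)
    (a0, f0), (a1, f1) = DERATING[i - 1], DERATING[i]
    return f0 + (f1 - f0) * (altitude_m - a0) / (a1 - a0)

# e.g. a unit rated for 10 kW of cooling, installed at 1,200 m (about 4,000 ft):
print(f"{10_000 * cooling_factor(1200):.0f} W effective cooling")   # about 9,300 W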


 



