On 10/13/2017 2:42 PM, Naslund, Steve wrote:
Funny how NANOG posts seem to precede actual attention from vendors, isn't it?
Squeaky wheel, grease. Same reason why it takes me berating companies
on Twitter publicly before things actually get done.
*Stares directly at Verizon for a previous
Steven Naslund
Chicago IL
-----Original Message-----
From: NANOG [mailto:nanog-boun...@nanog.org] On Behalf Of David Hubbard
Sent: Friday, October 13, 2017 3:38 PM
To: nanog@nanog.org
Subject: Re: Temp at Level 3 data centers
with the situation,
so hopefully things are headed in the right direction now.
David
From: Martin Hannigan <hanni...@gmail.com>
Date: Friday, October 13, 2017 at 4:05 PM
To: David Hubbard <dhubb...@dino.hostasaurus.com>
Cc: "nanog@nanog.org" <nanog@nanog.org>
Subject: Re: Temp at Level 3 data centers
On 2017-10-13 14:10, Roy wrote:
The IBM 308x and 309x series mainframes were water cooled.
The bank I worked for had just installed one. A big change was the noise
level; the thing was really quiet. But servicing now required a plumber
too. (there was a separate cabinet for the water pumps as
Hi David,
80F seems ~reasonable to me. What is the inlet temp, the temperature the air
is going in at? What kind of gear are you operating? Routers and switches?
Servers? Disk? Is the cabinet top fan working? Most modern equipment should
be able to handle those temps. As another poster noted, are these
The IBM 308x and 309x series mainframes were water cooled. They did
have Thermal Conduction Modules, which had a helium-filled metal cap
containing one piston per chip; the piston pressed against the back
of each chip to provide a heat-conduction path from the chip to the
cap. The cap
Once upon a time, b...@theworld.com said:
> Also, the IBM 3090 at least, was cooled via helium-filled pipes kind
> of like today's liquid cooled systems. It was full of plumbing. If you
> opened it up some chips were right on copper junction boxes (maybe
> they were just
On October 12, 2017 at 19:56 jfmezei_na...@vaxination.ca (Jean-Francois Mezei)
wrote:
> back in the early 1990s, Tandem had a computer called "Cyclone". (these
> were mission critical, fault tolerant machines).
ok old fart stories...tho maybe current.
IBM's big mainframes would repeat
back in the early 1990s, Tandem had a computer called "Cyclone". (these
were mission critical, fault tolerant machines).
The reason for the "Cyclone" name was that the cabinets had huge fan
capacity, and that was to deal with air conditioning failure by
increasing the air flow over the electronics to
If you are using hot/cold aisles and don't fill the rack, don't forget you have
to put in blank panels.
--
Keith Stokes
> On Oct 12, 2017, at 5:45 PM, William Herrin wrote:
>
> On Wed, Oct 11, 2017 at 8:31 AM, David Hubbard <
> dhubb...@dino.hostasaurus.com> wrote:
>
>>
On Wed, Oct 11, 2017 at 8:31 AM, David Hubbard <
dhubb...@dino.hostasaurus.com> wrote:
> Curious if anyone on here colo’s equipment at a Level 3 facility and has
> found the temperature unacceptably warm? I’m having that experience
> currently, where ambient temp is in the 80’s, but they tell me
On Thu, Oct 12, 2017 at 3:39 AM, Naslund, Steve
wrote:
> If the ambient temperature is higher, it means the temperatures throughout
> the device would be higher and the temp at those points is what really
> matters. I would also be concerned because if they lose one of the
I'm a few years removed from having direct involvement in our DCs now, so I
don't have an example on hand to look at. Is cooling (and in-cabinet
temperature) not a part of the SLA? If it is, then there shouldn't be a
question of the DC staff brushing off complaints about the
temperature–either
the facility manager/personnel very useful.
They gave straight-up answers and have done what they could to assist.
Original message
From: Chuck Anderson <c...@wpi.edu>
Date: 10/11/17 22:13 (GMT-05:00)
To: nanog@nanog.org
Subject: Re: Temp at Level 3 data centers
Install an air conditioner in your rack.
On Wed, Oct 11, 2017 at 02:39:19PM -0500, Andrew Latham wrote:
> David
>
> The issue has several components and is vendor agnostic.
>
> Set Point: The systems are specifically set at a temperature
> Capacity Ability: The systems can maintain a
On 11 Oct 2017 at 22:47, "William Herrin" wrote:
On Wed, Oct 11, 2017 at 4:32 PM, Jörg Kost wrote:
> Do you guys still at least have biometric access control devices at your
> Level3 dc? They even removed these things at our site, because there is no
> budget
On Wed, Oct 11, 2017 at 4:32 PM, Jörg Kost wrote:
> Do you guys still at least have biometric access control devices at your
> Level3 dc? They even removed these things at our site, because there is no
> budget for a successor for the failing unit. And to be consistent, they
>
Hi there,
been there, done that, rocky way ahead.
In Europe the standard Level3 temperature SLA is 26°C (about 79°F). The
measurement is done in the cool aisle, at a distance of 45 cm from the
equipment and a height of 100 cm. You can use a Testo 625 handheld for
the measurements; that is also what Level3 uses.
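For reference, the original poster's "80s" Fahrenheit ambient would already
sit above that 26°C ceiling. A minimal Python sketch of the comparison (the
spot readings are hypothetical, and this is not Level3's measurement tooling):

# Compare Fahrenheit spot readings against the 26 C SLA ceiling above.
SLA_MAX_C = 26.0

def f_to_c(temp_f: float) -> float:
    return (temp_f - 32.0) * 5.0 / 9.0

for temp_f in (78, 80, 84):  # "ambient in the 80s" per the original post
    temp_c = f_to_c(temp_f)
    status = "over SLA" if temp_c > SLA_MAX_C else "within SLA"
    print(f"{temp_f} F = {temp_c:.1f} C -> {status}")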
On 10/11/17 9:42 AM, Sam Kretchmer wrote:
with a former employer we had a suite at the L3 facility on Canal in
Chicago. They had this exact issue for the entire time we had the suite.
They kept blaming a failing HVAC unit on our floor, but it went on for
years no matter who we complained to, or
David
The issue has several components and is vendor agnostic.
Set Point: The systems are specifically set at a temperature
Capacity Ability: The systems can maintain a temperature
Customer Desire: What you expect from sales promises.
Sales Promise: What they might carefully avoid promising.
I
As temp goes up, wire resistance increases too, increasing heat, which
increases resistance further, and so on - and I find breakers trip more
easily at hotter temps too.
/kc
On Wed, Oct 11, 2017 at 01:08:33PM -0400, Zachary Winnerman said:
>That's a good point, though if you are running your breakers that close
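To put rough numbers on that feedback loop: conductor resistance rises
roughly linearly with temperature, R(T) = R_ref * (1 + alpha * (T - T_ref)),
with alpha about 0.0039 per °C for copper. A quick Python sketch; the 12 AWG
resistance and the 16 A load are made-up example values:

ALPHA_CU = 0.00393  # copper temperature coefficient of resistance, per deg C
T_REF = 20.0        # temperature at which r_ref_ohms is specified, deg C

def wire_resistance(r_ref_ohms: float, temp_c: float) -> float:
    """Resistance of a copper conductor at temp_c, given its 20 C value."""
    return r_ref_ohms * (1.0 + ALPHA_CU * (temp_c - T_REF))

# Hypothetical example: ~0.16 ohm of 12 AWG copper carrying a 16 A load.
for t in (20, 30, 40, 50):
    r = wire_resistance(0.16, t)
    print(f"{t} C: {r:.4f} ohm, {16**2 * r:.1f} W dissipated in the wire")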
On 2017-10-11 19:09, Naslund, Steve wrote:
> I would also be concerned because if they lose one of the a/c units
> what would the ambient temperature rise to?
It doesn't matter much if the "normal" temperature in your DC is 10
or 30 degrees Celsius; if the cooling system is barely keeping up
Also worth noting that temperature tolerances for large-scale numbers of 1U
servers, Open Compute platform type high-density servers, or blade servers
are a very different thing than air intake temperatures for more sensitive
things like DWDM platforms... There are laser- and physics-related issues
My 0.041 BTC:
1) For small facilities, without separate temperature-controlled UPS zones,
the optimum temperature for lead-acid batteries may be the lower bound.
77°F is optimal, with significant reduction in battery life even 15°F above
that. Given that batteries' internal temperature will
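The common rule of thumb behind that point (hedged; check your battery
vendor's datasheet) is that lead-acid service life roughly halves for every
15°F sustained above 77°F. A tiny Python sketch of that curve:

def battery_life_factor(ambient_f: float, optimal_f: float = 77.0,
                        halving_step_f: float = 15.0) -> float:
    """Approximate fraction of rated lead-acid service life at a given
    sustained ambient temperature (rule of thumb, not a vendor spec)."""
    if ambient_f <= optimal_f:
        return 1.0
    return 0.5 ** ((ambient_f - optimal_f) / halving_step_f)

for temp in (77, 85, 92, 100):
    print(f"{temp} F ambient: ~{battery_life_factor(temp) * 100:.0f}% of rated life")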
>From: NANOG [mailto:nanog-boun...@nanog.org] On Behalf Of Zachary Winnerman
>Sent: Wednesday, October 11, 2017 11:54 AM
>To: nanog@nanog.org
>Subject: Re: Temp at Level 3 data centers
>
>I recall some evidence that 80+F temps can reduce hard drive lifetime, though
>it might be outdated as it w
>Sent: Wednesday, October 11, 2017 11:54 AM
>To: nanog@nanog.org
>Subject: Re: Temp at Level 3 data centers
>
>I recall some evidence that 80+F temps can reduce hard drive lifetime, though
>it might be outdated as it was from before SSDs were around. I would imagine
>that while it
That's a good point, though if you are running your breakers that close to
their limit, I think you have bigger problems, as a power outage, however unlikely,
could cause your equipment to not come back up at all. Software updates
that reboot several servers in quick succession could also cause a
breaker to trip
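A back-of-the-envelope way to check for that failure mode, assuming the
common practice of keeping continuous load under 80% of the breaker rating;
the per-server draw figures below are made up for illustration:

BREAKER_AMPS = 20.0
CONTINUOUS_LIMIT = 0.8 * BREAKER_AMPS  # common 80% continuous-load practice

def check_circuit(steady_amps: float, boot_amps: float, servers: int) -> None:
    """Compare steady-state and worst-case simultaneous-boot draw
    against the circuit's limits."""
    steady = steady_amps * servers
    boot = boot_amps * servers  # worst case: every server reboots at once
    print(f"steady {steady:.1f} A (limit {CONTINUOUS_LIMIT:.1f} A), "
          f"all-at-once boot {boot:.1f} A (breaker {BREAKER_AMPS:.0f} A)")

# 8 servers at 1.8 A steady fit a 20 A circuit, but 3.0 A each at boot
# (24 A total) would trip it after an outage or a rolling reboot gone wrong.
check_circuit(steady_amps=1.8, boot_amps=3.0, servers=8)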
In a message written on Wed, Oct 11, 2017 at 12:54:26PM -0400, Zachary
Winnerman wrote:
> I recall some evidence that 80+F temps can reduce hard drive lifetime,
> though it might be outdated as it was from before SSDs were around. I
This is very much a "your infrastructure may vary" situation.
http://www.datacenterknowledge.com/archives/2008/10/14/google-raise-your-data-center-temperature
On Oct 11, 2017 11:56 AM, "Zachary Winnerman"
wrote:
> I recall some evidence that 80+F temps can reduce hard drive lifetime,
> though it might be outdated as it was from
I recall some evidence that 80+F temps can reduce hard drive lifetime,
though it might be outdated as it was from before SSDs were around. I
would imagine that while it may not impact the ability for a server to
handle load, it may reduce equipment lifetime. It also could be an
indication that
My house isn't built for moving furniture; it's built for living in. I've not
moved a bed in or out of the bedroom in 8 years now. But for the 15 minutes
I did move a bed, the door and hallway had to accommodate it.
Humans have to go into datacenters - often in an emergency. Complicating the
There are plenty of people who say 80+ is fine for equipment and data centers
aren’t built for people.
However other things have to be done correctly.
Are you sure your equipment is properly oriented for airflow (hot/cold aisles
if in use) and has no restrictions?
On Oct 11, 2017, at 9:42 AM,
with a former employer we had a suite at the L3 facility on Canal in
Chicago. They had this exact issue for the entire time we had the suite.
They kept blaming a failing HVAC unit on our floor, but it went on for
years no matter who we complained to, or what we said.
Good luck.
On 10/11/17,
Curious if anyone on here colo’s equipment at a Level 3 facility and has found
the temperature unacceptably warm? I’m having that experience currently, where
ambient temp is in the 80’s, but they tell me that’s perfectly fine because
vented tiles have been placed in front of all equipment