Re: Temp at Level 3 data centers

2017-10-13 Thread Brielle Bruns

On 10/13/2017 2:42 PM, Naslund, Steve wrote:

Funny how NANOG posts seem to precede actual attention from vendors, isn't it?



Squeaky wheel, grease.  Same reason why it takes me berating companies 
on Twitter publicly before things actually get done.


*Stares directly at Verizon for a previous incident where the rejection 
message from an e-mail block said to e-mail a support address to get 
removed, but the support address had the same filters and blocked the unblock request*



--
Brielle Bruns
The Summit Open Source Development Group
http://www.sosdg.org/ http://www.ahbl.org


RE: Temp at Level 3 data centers

2017-10-13 Thread Naslund, Steve
Funny how NANOG posts seem to precede actual attention from vendors, isn't it?

Steven Naslund
Chicago IL

-Original Message-
From: NANOG [mailto:nanog-boun...@nanog.org] On Behalf Of David Hubbard
Sent: Friday, October 13, 2017 3:38 PM
To: nanog@nanog.org
Subject: Re: Temp at Level 3 data centers

Thanks for all the opinions and experiences with this, both on and off list.  
The facility in question is not one that has a cold/hot row or containment 
concept so ambient temp plays a greater role than in other facilities.  Some 
folks from Level 3 reached out and are working to help me with the situation, 
so hopefully things are headed in the right direction now.

David

From: Martin Hannigan <hanni...@gmail.com>
Date: Friday, October 13, 2017 at 4:05 PM
To: David Hubbard <dhubb...@dino.hostasaurus.com>
Cc: "nanog@nanog.org" <nanog@nanog.org>
Subject: Re: Temp at Level 3 data centers



Hi David,

80F seems ~reasonable to me. What is the inlet temp, the temperature air is 
going in at? What kind of gear is operating? Routers and switches? Servers? 
Disk? Is the cabinet top fan working? Most modern equipment should be able to 
handle those temps. As another poster noted, are these triggers modifiable (or 
have they been)? I would refer to the manufacturer's guidelines. You haven't 
given us enough information to help. You can refer (them) to ASHRAE standards 
in your conversation. I'd be surprised if they weren't already well aware of it 
and practicing most of what it preaches. They may operate safely outside of 
some norms.

15F-20F cooler? You might be paying too much for colo if that's true.

Best,

-M<



On Wed, Oct 11, 2017 at 8:31 AM, David Hubbard 
<dhubb...@dino.hostasaurus.com<mailto:dhubb...@dino.hostasaurus.com>> wrote:
Curious if anyone on here colo’s equipment at a Level 3 facility and has found 
the temperature unacceptably warm?  I’m having that experience currently, where 
ambient temp is in the 80’s, but they tell me that’s perfectly fine because 
vented tiles have been placed in front of all equipment racks.  My equipment is 
alarming for high temps, so obviously not fine.  Trying to find my way up to 
whomever I can complain to that’s in a position to do something about it but it 
seems the support staff have been told to brush questions about temp off as 
much as possible.  Was wondering if this is a country-wide thing for them or 
unique to the data center I have equipment in.  I have equipment in several 
others from different companies and most are probably 15-20 degrees cooler.

Thanks,

David



Re: Temp at Level 3 data centers

2017-10-13 Thread David Hubbard
Thanks for all the opinions and experiences with this, both on and off list.  
The facility in question is not one that has a cold/hot row or containment 
concept so ambient temp plays a greater role than in other facilities.  Some 
folks from Level 3 reached out and are working to help me with the situation, 
so hopefully things are headed in the right direction now.

David

From: Martin Hannigan <hanni...@gmail.com>
Date: Friday, October 13, 2017 at 4:05 PM
To: David Hubbard <dhubb...@dino.hostasaurus.com>
Cc: "nanog@nanog.org" <nanog@nanog.org>
Subject: Re: Temp at Level 3 data centers



Hi David,

80F seems ~reasonable to me. What is the inlet temp, the temperature air is 
going in at? What kind of gear is operating? Routers and switches? Servers? 
Disk? Is the cabinet top fan working? Most modern equipment should be able to 
handle those temps. As another poster noted, are these triggers modifiable (or 
have they been)? I would refer to the manufacturer's guidelines. You haven't 
given us enough information to help. You can refer (them) to ASHRAE standards 
in your conversation. I'd be surprised if they weren't already well aware of it 
and practicing most of what it preaches. They may operate safely outside of 
some norms.

15F-20F cooler? You might be paying too much for colo if that's true.

Best,

-M<



On Wed, Oct 11, 2017 at 8:31 AM, David Hubbard 
<dhubb...@dino.hostasaurus.com<mailto:dhubb...@dino.hostasaurus.com>> wrote:
Curious if anyone on here colo’s equipment at a Level 3 facility and has found 
the temperature unacceptably warm?  I’m having that experience currently, where 
ambient temp is in the 80’s, but they tell me that’s perfectly fine because 
vented tiles have been placed in front of all equipment racks.  My equipment is 
alarming for high temps, so obviously not fine.  Trying to find my way up to 
whomever I can complain to that’s in a position to do something about it but it 
seems the support staff have been told to brush questions about temp off as 
much as possible.  Was wondering if this is a country-wide thing for them or 
unique to the data center I have equipment in.  I have equipment in several 
others from different companies and most are probably 15-20 degrees cooler.

Thanks,

David



Re: Temp at Level 3 data centers

2017-10-13 Thread Roy



> On 2017-10-13 14:10, Roy wrote:
>> The IBM 308x and 309x series mainframes were water cooled.
>
> The bank I worked for had just installed one. A big change was the noise
> level; the thing was really quiet. But servicing now required a plumber
> too. (There was a separate cabinet for the water pumps, as I recall.)
>
> But in all cases, the issue is how long you can survive when your "heat
> dump" is not available. If nobody is removing heat from your water loop,
> it will eventually fail too.
>
> In the end, it is a lot easier to provide redundancy for HVAC in one
> large room than splitting the DC into small suites that each have their
> own unit. Redundancy there would require 2 units per suite. And the
> problem with having AC units that are capable of twice the load (in case
> the other one fails) is that it increases the on-off cycles and thus reduces
> lifetime (increases likelihood of failure).


The separate box was a heat exchanger. In the "old" days, buildings 
had central systems that provided chilled water. It's similar to your 
house HVAC, where an outside unit cools Freon and you have a heat 
exchanger that cools the inside air. In the case of the water-cooled 
mainframe, the same chilled water was connected to the exchanger and 
not directly to the computer. The water running through the computer 
was a closed system.


Re: Temp at Level 3 data centers

2017-10-13 Thread Martin Hannigan
Hi David,

80F seems ~reasonable to me. What is the inlet temp, the temperature air is
going in at? What kind of gear is operating? Routers and switches?
Servers? Disk? Is the cabinet top fan working? Most modern equipment should
be able to handle those temps. As another poster noted, are these triggers
modifiable (or have they been)? I would refer to the manufacturer's
guidelines. You haven't given us enough information to help. You can refer
(them) to ASHRAE standards in your conversation. I'd be surprised if they
weren't already well aware of it and practicing most of what it preaches.
They may operate safely outside of some norms.

15F-20F cooler? You might be paying too much for colo if that's true.

Best,

-M<
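Martin's pointer to ASHRAE can be made concrete. Below is a minimal sketch that checks an inlet-temperature reading against the envelopes commonly cited from the ASHRAE TC 9.9 thermal guidelines (18-27 C recommended; 15-32 C allowable for class A1; 10-35 C for A2). The exact figures should be verified against the current ASHRAE publication; they are stated here from memory of the commonly quoted 2011 edition.

```python
# Hedged sketch: classify a cold-aisle/inlet reading against the ASHRAE
# TC 9.9 envelopes as commonly cited. Figures are assumptions to verify
# against the current edition of the guidelines.

ENVELOPES_C = {
    # name: (low, high) inlet temperature in Celsius
    "recommended": (18.0, 27.0),   # recommended envelope for the A classes
    "A1_allowable": (15.0, 32.0),
    "A2_allowable": (10.0, 35.0),
}

def f_to_c(temp_f: float) -> float:
    """Convert Fahrenheit to Celsius."""
    return (temp_f - 32.0) * 5.0 / 9.0

def classify(inlet_f: float) -> list[str]:
    """Return the names of the envelopes the reading falls inside."""
    inlet_c = f_to_c(inlet_f)
    return [name for name, (lo, hi) in ENVELOPES_C.items()
            if lo <= inlet_c <= hi]

if __name__ == "__main__":
    for reading_f in (80.0, 85.0):  # readings like those discussed in the thread
        print(reading_f, "F ->", classify(reading_f))
```

By this check, 80F (26.7C) sits just inside the recommended envelope, which is consistent with Martin's "80F seems ~reasonable", while the high 80s fall outside recommended but still inside the allowable classes.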



On Wed, Oct 11, 2017 at 8:31 AM, David Hubbard <
dhubb...@dino.hostasaurus.com> wrote:

> Curious if anyone on here colo’s equipment at a Level 3 facility and has
> found the temperature unacceptably warm?  I’m having that experience
> currently, where ambient temp is in the 80’s, but they tell me that’s
> perfectly fine because vented tiles have been placed in front of all
> equipment racks.  My equipment is alarming for high temps, so obviously not
> fine.  Trying to find my way up to whomever I can complain to that’s in a
> position to do something about it but it seems the support staff have been
> told to brush questions about temp off as much as possible.  Was wondering
> if this is a country-wide thing for them or unique to the data center I
> have equipment in.  I have equipment in several others from different
> companies and most are probably 15-20 degrees cooler.
>
> Thanks,
>
> David
>


Re: Temp at Level 3 data centers

2017-10-13 Thread Jean-Francois Mezei
On 2017-10-13 14:10, Roy wrote:
> 
> 
> The IBM 308x and 309x series mainframes were water cooled. 


The bank I worked for had just installed one. A big change was the noise
level; the thing was really quiet. But servicing now required a plumber
too. (There was a separate cabinet for the water pumps, as I recall.)

But in all cases, the issue is how long you can survive when your "heat
dump" is not available. If nobody is removing heat from your water loop
it will eventually fail too.


In the end, it is a lot easier to provide redundancy for HVAC in one
large room than splitting the DC into small suites that each have their
own unit. Redundancy there would require 2 units per suite. And the
problem with having AC units that are capable of twice the load (in case
the other one fails) is that it increases the on-off cycles and thus reduces
lifetime (increases likelihood of failure).
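Jean-Francois's sizing argument can be put in numbers with a toy model. This is only a sketch; the one-unit-per-suite sizing rule and the figures in the usage example are assumptions for illustration, not from the thread:

```python
def spare_capacity_overhead(suites: int, per_suite_load_kw: float,
                            shared: bool) -> float:
    """Fraction of installed cooling capacity that is spare.

    shared=True: one large room with N+1 equal units, where N units carry
    the whole load (toy rule: one unit per suite's worth of load).
    shared=False: per-suite 2N, each suite holding two units, either one
    able to carry that suite alone.
    """
    total_load_kw = suites * per_suite_load_kw
    if shared:
        installed_kw = (suites + 1) * per_suite_load_kw  # N+1 shared units
    else:
        installed_kw = 2 * total_load_kw                 # 2N, per suite
    return installed_kw / total_load_kw - 1.0

# Ten hypothetical 50 kW suites: shared N+1 carries 10% spare capacity,
# while per-suite 2N carries 100% spare capacity.
print(round(spare_capacity_overhead(10, 50, shared=True), 3))
print(round(spare_capacity_overhead(10, 50, shared=False), 3))
```

The oversized, mostly idle per-suite units are also the ones that short-cycle, which is the lifetime concern raised above.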



Re: Temp at Level 3 data centers

2017-10-13 Thread Roy



The IBM 308x and 309x series mainframes were water cooled.  They did 
have Thermal Conduction Modules which had a helium-filled metal cap, 
which contains one piston per chip; the piston presses against the back 
of each chip to provide a heat conduction path from the chip to the 
cap.  The cap was connected to the chilled water supply.


On 10/13/2017 10:51 AM, Chris Adams wrote:

Once upon a time, b...@theworld.com  said:

Also, the IBM 3090 at least, was cooled via helium-filled pipes kind
of like today's liquid cooled systems. It was full of plumbing. If you
opened it up some chips were right on copper junction boxes (maybe
they were just sensors but it looked cool.)

Cray supercomputers had Freon lines through them for cooling, up until
the last generation of the "old school" supercomputer.  That was not
sufficient to keep it cool, so they sealed the chassis (which was huge)
and pumped it full of 4 tons of Fluorinert.




Re: Temp at Level 3 data centers

2017-10-13 Thread Chris Adams
Once upon a time, b...@theworld.com  said:
> Also, the IBM 3090 at least, was cooled via helium-filled pipes kind
> of like today's liquid cooled systems. It was full of plumbing. If you
> opened it up some chips were right on copper junction boxes (maybe
> they were just sensors but it looked cool.)

Cray supercomputers had Freon lines through them for cooling, up until
the last generation of the "old school" supercomputer.  That was not
sufficient to keep it cool, so they sealed the chassis (which was huge)
and pumped it full of 4 tons of Fluorinert.
-- 
Chris Adams 


Re: Temp at Level 3 data centers

2017-10-13 Thread bzs

On October 12, 2017 at 19:56 jfmezei_na...@vaxination.ca (Jean-Francois Mezei) 
wrote:
 > back in the early 1990s, Tandem had a computer called "Cyclone". (these
 > were mission critical, fault tolerant machines).

ok old fart stories...tho maybe current.

IBM's big mainframes would repeat calculations as a way to detect
hardware errors.

Above a certain temperature they would do more repeating.

If there was any disagreement it would be reported and they had some
complex statistical formula to determine how many repetitions to try
next and what to accept.

I assume this was analogous to the various time sync game theoretic
formulas to decide which time reference to believe when they
conflict. It's not as simple as majority vote, the majority could be
wrong (e.g., same stuck bit.)

So, at least as it was explained to me, as it got warmer (e.g., A/C
failure) the machine would get slower and slower, potentially to a
crawl.
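The mechanism Barry describes, more redundant recomputation as the room warms so the machine degrades instead of crashing, can be sketched as follows. This is purely illustrative: the thresholds and repetition counts are invented, and real machines used a statistical acceptance rule rather than the simple unanimity check shown here.

```python
def repetitions_for(inlet_c: float) -> int:
    """Invented policy: rerun each unit of work more often as temp rises."""
    if inlet_c < 30:
        return 2
    if inlet_c < 40:
        return 3
    return 5

def checked_compute(fn, inlet_c: float):
    """Run fn() repeatedly and accept the result only if all runs agree.

    Unanimity keeps the sketch short; as noted in the thread, majority
    vote alone is unsafe because the majority can share a stuck bit.
    """
    results = [fn() for _ in range(repetitions_for(inlet_c))]
    if len(set(results)) != 1:
        raise RuntimeError(f"hardware disagreement detected: {results!r}")
    return results[0]

# Visible effect: identical work costs 2x at 25C but 5x at 45C, so the
# machine appears to slow toward a crawl as the room warms.
```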

And there was no doubt a point at which it'd just shut itself off, but it
slowed down before it got there. Since many mainframes were mission critical,
they were trying to avoid that.

That was the kind of thing which made multi-million dollar mainframes
cost multi-millions of dollars.

Also, the IBM 3090 at least, was cooled via helium-filled pipes kind
of like today's liquid cooled systems. It was full of plumbing. If you
opened it up some chips were right on copper junction boxes (maybe
they were just sensors but it looked cool.)

There was always something amusing back then when an IBM service
person would show up with one of those typical gas tanks on wheels,
like one uses for welding, to top off your mainframe.

-- 
-Barry Shein

Software Tool & Die| b...@theworld.com | http://www.TheWorld.com
Purveyors to the Trade | Voice: +1 617-STD-WRLD   | 800-THE-WRLD
The World: Since 1989  | A Public Information Utility | *oo*


Re: Temp at Level 3 data centers

2017-10-12 Thread Jean-Francois Mezei
Back in the early 1990s, Tandem had a computer called "Cyclone" (these
were mission-critical, fault-tolerant machines).

The reason for the "Cyclone" name was that the cabinets had huge fan
capacity, intended to deal with air conditioning failure by increasing
the airflow over the electronics to keep them "comfy" despite high data
centre air temperature (with the aim of having the Tandem continue to
run despite HVAC failure).

With dense computers packed into 1U, you just can't have that excess
airflow to cope with HVAC failure with tiny 1" fans.

The other difference is data centre density. Bank computer rooms were
sparse compared to today's densely packed racks, so there was lots of
space relative to heat sources.

The equivalent today would be the football-field-sized data centres from
the likes of Google, with high ceilings, where hot air from an area with
failed HVAC rises to the ceiling and is partly taken out by the others.

But when you are talking about downtown co-lo with enclosed suites that
are packed to the brim, failure of HVAC results in quick temperature
increases because the heat has nowhere to spread to, and HVAC units from
adjoining, also enclosed, suites can't provide help.

So when a tenant agrees to rent rack space in a small enclosed suite, it
should be considered that the odds of failure due to heat are greater
(and perhaps consider renting rack space in different suites to provide
some redundancy).




Re: Temp at Level 3 data centers

2017-10-12 Thread Keith Stokes
If you are using hot/cold aisles and don't fill the rack, don't forget you have 
to put in blank panels. 

--

Keith Stokes

> On Oct 12, 2017, at 5:45 PM, William Herrin  wrote:
> 
> On Wed, Oct 11, 2017 at 8:31 AM, David Hubbard <
> dhubb...@dino.hostasaurus.com> wrote:
> 
>> Curious if anyone on here colo’s equipment at a Level 3 facility and has
>> found the temperature unacceptably warm?  I’m having that experience
>> currently, where ambient temp is in the 80’s, but they tell me that’s
>> perfectly fine because vented tiles have been placed in front of all
>> equipment racks.
> 
> 
> Hi David,
> 
> The thing I'm not understanding in this thread is that the last time I
> checked Level 3 was a premium player not a cost player. Has that changed?
> 
> If a premium data center vendor is asking you to swallow 80F in the cold
> aisle, something is very wrong. But realize I just said 80F in the *cold
> aisle*. DC cooling is not about "ambient" or "sensible cooling" or similar
> terms bandied about by ordinary HVAC professionals. In a data center, air
> doesn't really stack up anywhere. It flows.
> 
> If you haven't physically checked your racks, it's time to do that. There
> are lots of reasons for high temps in the cabinet which aren't the DC's
> fault.
> 
> Is all the air flow in your cabinet correctly moving from the cold aisle to
> the hot aisle? Even those side-venting Cisco switches? You're sure? If
> you're looping air inside the cabinet, that's your fault.
> 
> Have you or your rack neighbors exceeded the heat density that the DC's
> HVAC system supports? If you have, the air in the hot aisle may be looping
> over the top of the cabinets and back in to your servers. You can't
> necessarily fill a cabinet with equipment. When you reach the allowable
> heat density, you have to start filling the next cabinet. I've seen DC
> cabinets left half empty for exactly this reason.
> 
> Regards,
> Bill Herrin
> 
> 
> -- 
> William Herrin  her...@dirtside.com  b...@herrin.us
> Dirtside Systems . Web: 


Re: Temp at Level 3 data centers

2017-10-12 Thread William Herrin
On Wed, Oct 11, 2017 at 8:31 AM, David Hubbard <
dhubb...@dino.hostasaurus.com> wrote:

> Curious if anyone on here colo’s equipment at a Level 3 facility and has
> found the temperature unacceptably warm?  I’m having that experience
> currently, where ambient temp is in the 80’s, but they tell me that’s
> perfectly fine because vented tiles have been placed in front of all
> equipment racks.


Hi David,

The thing I'm not understanding in this thread is that the last time I
checked Level 3 was a premium player not a cost player. Has that changed?

If a premium data center vendor is asking you to swallow 80F in the cold
aisle, something is very wrong. But realize I just said 80F in the *cold
aisle*. DC cooling is not about "ambient" or "sensible cooling" or similar
terms bandied about by ordinary HVAC professionals. In a data center, air
doesn't really stack up anywhere. It flows.

If you haven't physically checked your racks, it's time to do that. There
are lots of reasons for high temps in the cabinet which aren't the DC's
fault.

Is all the air flow in your cabinet correctly moving from the cold aisle to
the hot aisle? Even those side-venting Cisco switches? You're sure? If
you're looping air inside the cabinet, that's your fault.

Have you or your rack neighbors exceeded the heat density that the DC's
HVAC system supports? If you have, the air in the hot aisle may be looping
over the top of the cabinets and back in to your servers. You can't
necessarily fill a cabinet with equipment. When you reach the allowable
heat density, you have to start filling the next cabinet. I've seen DC
cabinets left half empty for exactly this reason.
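Bill's heat-density limit can be estimated with the common rule of thumb CFM = 3.16 * watts / dT(F) for the airflow needed to carry a heat load at a given intake-to-exhaust temperature rise. The cabinet figures in the example are illustrative assumptions, not numbers from the thread:

```python
def required_cfm(watts: float, delta_t_f: float) -> float:
    """Airflow (cubic feet per minute) needed to remove `watts` of heat
    with a `delta_t_f` (degrees Fahrenheit) rise from intake to exhaust,
    using the rule of thumb CFM = 3.16 * W / dT."""
    return 3.16 * watts / delta_t_f

# Hypothetical cabinet: 6 kW of gear designed for a 20F rise must pull
# roughly 948 CFM through the front door.
print(round(required_cfm(6000, 20)))  # 948
```

If the cold aisle can't actually deliver that volume, the intakes make up the difference with recirculated hot-aisle air, which is exactly the over-the-top looping described above.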

Regards,
Bill Herrin


-- 
William Herrin  her...@dirtside.com  b...@herrin.us
Dirtside Systems . Web: 


Re: Temp at Level 3 data centers

2017-10-12 Thread Sam Silvester
On Thu, Oct 12, 2017 at 3:39 AM, Naslund, Steve 
wrote:

> If the ambient temperature is higher it means the temperatures throughout
> the device would be higher, and the temp at those points is what really
> matters.  I would also be concerned because if they lose one of the a/c
> units what would the ambient temperature rise to?  I would want them to
> tell me what the set point of the a/c actually is.
>
> Bottom line 80 F input air is too hot in my opinion and apparently the
> equipment's opinion as well.
>

My quick thoughts on the matter:

1. Above all else, know what your DC provider states in their SLA/contract.
2. It's never a bad idea to try to be on the best possible personal terms
with the DC manager(s), the better you get along the more they're inclined
to share knowledge/issues and work with you on any concerns.
3. You can't infer faults or lack of redundancy from the running
temperature. By way of example, several facilities I know run at 25 degrees
Celsius, but if a chilled-water unit in a given data hall fails, there are a
number of DX units held in standby to take over. This is where point 2
comes in handy: knowing somebody on the ground, they'll often be quite
happy to run through failure scenarios with you and help make sure
everybody is happy with the risk mitigation strategy.

Out of idle curiosity, is the equipment that is alarming configurable?
I ask because I've heard users claim environmental parameters were out of
spec before, but then it turned out it was their own environmental
monitoring they'd installed in the rack (using default parameters out of
the box, not configured to match the facility SLA) that was complaining
about a set point of 25...

Cheers,

Sam


Re: Temp at Level 3 data centers

2017-10-12 Thread Matthew Pounsett
I'm a few years removed from having direct involvement in our DCs now, so I
don't have an example on hand to look at.  Is cooling (and in-cabinet
temperature) not a part of the SLA?  If it is, then there shouldn't be a
question of the DC staff brushing off complaints about the
temperature–either L3 should fix it or pay the penalties.  If it isn't,
then I'd suggest having a look at your contract (and possibly looking at new
DCs) at renewal time.


Re: Temp at Level 3 data centers

2017-10-12 Thread Marshall, Quincy
I have equipment in several L(3) DCs. I'd say that is generally the exception; 
however, I have two notable facilities (smaller type 3) that have troubles on 
occasion, reaching into the 80s as you commented (usually during the warm 
southern summer days).

I found getting to know the facility manager/personnel very useful.
They gave straight-up answers and have done what they could to assist.

 Original message 
From: Chuck Anderson <c...@wpi.edu>
Date: 10/11/17 22:13 (GMT-05:00)
To: nanog@nanog.org
Subject: Re: Temp at Level 3 data centers

Install an air conditioner in your rack.

On Wed, Oct 11, 2017 at 02:39:19PM -0500, Andrew Latham wrote:
> David
>
> The issue has several components and is vendor agnostic.
>
> Set Point: The systems are specifically set at a temperature
> Capacity Ability: The systems can maintain a temperature
> Customer Desire: What you expect from sales promises.
> Sales Promise: What they might carefully avoid promising.
>
> I suggest you review your SLA and discuss with legal asap. You could have a
> document defining your question's answer already but it sits in a filing
> cabinet file labeled business continuity.
>
> If the set point is X then they likely would answer quickly that that is
> the case.
> If the capacity is lacking then they would likely redirect the issue.
> If they don't care about the customer that alone should be an indicator
> If a promise exists in the SLA then the ball is in your court
>
> From the emails I fear that we have confirmed that this is normal. So your
> question "Is the temperature at Level 3 Data Centers normally in the 80-90F
> range?" sounds like a Yes.
>
> Regardless of the situation always ask for names, titles, and ask vendors
> to repeat critical information like the status of cooling in a building
> designed to deal with cooling. Keep the vendors that do it well.
>
>
>
> On Wed, Oct 11, 2017 at 7:31 AM, David Hubbard <
> dhubb...@dino.hostasaurus.com> wrote:
>
> > Curious if anyone on here colo’s equipment at a Level 3 facility and has
> > found the temperature unacceptably warm?  I’m having that experience
> > currently, where ambient temp is in the 80’s, but they tell me that’s
> > perfectly fine because vented tiles have been placed in front of all
> > equipment racks.  My equipment is alarming for high temps, so obviously not
> > fine.  Trying to find my way up to whomever I can complain to that’s in a
> > position to do something about it but it seems the support staff have been
> > told to brush questions about temp off as much as possible.  Was wondering
> > if this is a country-wide thing for them or unique to the data center I
> > have equipment in.  I have equipment in several others from different
> > companies and most are probably 15-20 degrees cooler.
> >
> > Thanks,
> >
> > David


Re: Temp at Level 3 data centers

2017-10-11 Thread Chuck Anderson
Install an air conditioner in your rack.

On Wed, Oct 11, 2017 at 02:39:19PM -0500, Andrew Latham wrote:
> David
> 
> The issue has several components and is vendor agnostic.
> 
> Set Point: The systems are specifically set at a temperature
> Capacity Ability: The systems can maintain a temperature
> Customer Desire: What you expect from sales promises.
> Sales Promise: What they might carefully avoid promising.
> 
> I suggest you review your SLA and discuss with legal asap. You could have a
> document defining your question's answer already but it sits in a filing
> cabinet file labeled business continuity.
> 
> If the set point is X then they likely would answer quickly that that is
> the case.
> If the capacity is lacking then they would likely redirect the issue.
> If they don't care about the customer that alone should be an indicator
> If a promise exists in the SLA then the ball is in your court
> 
> From the emails I fear that we have confirmed that this is normal. So your
> question "Is the temperature at Level 3 Data Centers normally in the 80-90F
> range?" sounds like a Yes.
> 
> Regardless of the situation always ask for names, titles, and ask vendors
> to repeat critical information like the status of cooling in a building
> designed to deal with cooling. Keep the vendors that do it well.
> 
> 
> 
> On Wed, Oct 11, 2017 at 7:31 AM, David Hubbard <
> dhubb...@dino.hostasaurus.com> wrote:
> 
> > Curious if anyone on here colo’s equipment at a Level 3 facility and has
> > found the temperature unacceptably warm?  I’m having that experience
> > currently, where ambient temp is in the 80’s, but they tell me that’s
> > perfectly fine because vented tiles have been placed in front of all
> > equipment racks.  My equipment is alarming for high temps, so obviously not
> > fine.  Trying to find my way up to whomever I can complain to that’s in a
> > position to do something about it but it seems the support staff have been
> > told to brush questions about temp off as much as possible.  Was wondering
> > if this is a country-wide thing for them or unique to the data center I
> > have equipment in.  I have equipment in several others from different
> > companies and most are probably 15-20 degrees cooler.
> >
> > Thanks,
> >
> > David


Re: Temp at Level 3 data centers

2017-10-11 Thread Baldur Norddahl
On 11 Oct 2017 at 22:47, "William Herrin"  wrote:

On Wed, Oct 11, 2017 at 4:32 PM, Jörg Kost  wrote:

> Do you guys still at least have biometric access control devices at your
> Level3 dc? They even removed these things at our site, because there is no
> budget for a successor for the failing unit. And to be consistent, they
> even want to remove all biometric access devices, at least across Germany.
>

Hi  Jörg,

IMO, biometric was a gimmick in the first place and a bad idea when
carefully considered. All authenticators can be compromised. Hence, all
authenticators must be replaceable following a compromise. If one of your
DCs' palm vein databases is lost, what's your plan for replacing that hand?


Basic two or three factor authentication: something that you know
(password), something that you are (biometric) and something that you have
(access card).

You can tell your password to a coworker, but he cannot borrow your hand.
Hence you need both. The password is the replaceable part.


Re: Temp at Level 3 data centers

2017-10-11 Thread William Herrin
On Wed, Oct 11, 2017 at 4:32 PM, Jörg Kost  wrote:

> Do you guys still at least have biometric access control devices at your
> Level3 dc? They even removed these things at our site, because there is no
> budget for a successor for the failing unit. And to be consistent, they
> even want to remove all biometric access devices, at least across Germany.
>

Hi  Jörg,

IMO, biometric was a gimmick in the first place and a bad idea when
carefully considered. All authenticators can be compromised. Hence, all
authenticators must be replaceable following a compromise. If one of your
DCs' palm vein databases is lost, what's your plan for replacing that hand?

Regards,
Bill Herrin


-- 
William Herrin  her...@dirtside.com  b...@herrin.us
Dirtside Systems . Web: 


Re: Temp at Level 3 data centers

2017-10-11 Thread Jörg Kost

Hi there,

been there, done that, rocky way ahead.

In Europe the standard Level3 temperature SLA is 26C. The measurement is 
done in the cool aisle, at a distance of 45 cm from the equipment and a 
height of 100 cm. You can use a Testo 625 handheld for measurements, 
a device also used by Level3 staff.


Do you guys still at least have biometric access control devices at your 
Level3 dc? They even removed these things at our site, because there is 
no budget for a successor for the failing unit. And to be consistent, 
they even want to remove all biometric access devices, at least across 
Germany.


Regards
Jörg

On 11 Oct 2017, at 14:31, David Hubbard wrote:

Curious if anyone on here colo’s equipment at a Level 3 facility and 
has found the temperature unacceptably warm?  I’m having that 
experience currently, where ambient temp is in the 80’s, but they 
tell me that’s perfectly fine because vented tiles have been placed 
in front of all equipment racks.  My equipment is alarming for high 
temps, so obviously not fine.  Trying to find my way up to whomever I 
can complain to that’s in a position to do something about it but it 
seems the support staff have been told to brush questions about temp 
off as much as possible.  Was wondering if this is a country-wide 
thing for them or unique to the data center I have equipment in.  I 
have equipment in several others from different companies and most are 
probably 15-20 degrees cooler.


Thanks,

David


Re: Temp at Level 3 data centers

2017-10-11 Thread Bryan Holloway

On 10/11/17 9:42 AM, Sam Kretchmer wrote:

With a former employer, we had a suite at the L3 facility on Canal in
Chicago. They had this exact issue the entire time we had the suite.
They kept blaming a failing HVAC unit on our floor, but it went on for
years no matter whom we complained to or what we said.

Good luck.



At $dayjob-1, we had a couple cabinets at that facility, and sometime 
around 2007, if I recall correctly, they kicked out all of the customers 
that were server-farms. Only carriers were allowed to stay (with a few 
exceptions, I'm sure ...) I know because we picked up a few of their 
customers.


That facility was built out around 2000/2001, and things were a lot 
different back then (e.g. no one was really using 208/240 yet.) I think 
they just couldn't keep up when things really took off again post 
dot-com bust.




On 10/11/17, 7:31 AM, "NANOG on behalf of David Hubbard"
 wrote:


Curious if anyone on here colo’s equipment at a Level 3 facility and has
found the temperature unacceptably warm?  I’m having that experience
currently, where ambient temp is in the 80’s, but they tell me that’s
perfectly fine because vented tiles have been placed in front of all
equipment racks.  My equipment is alarming for high temps, so obviously
not fine.  Trying to find my way up to whomever I can complain to that’s
in a position to do something about it but it seems the support staff
have been told to brush questions about temp off as much as possible.
Was wondering if this is a country-wide thing for them or unique to the
data center I have equipment in.  I have equipment in several others from
different companies and most are probably 15-20 degrees cooler.

Thanks,

David




Re: Temp at Level 3 data centers

2017-10-11 Thread Andrew Latham
David

The issue has several components and is vendor agnostic.

Set Point: The systems are specifically set at a temperature
Capacity Ability: The systems can maintain a temperature
Customer Desire: What you expect from sales promises.
Sales Promise: What they might carefully avoid promising.

I suggest you review your SLA and discuss it with legal ASAP. The answer to
your question may already exist in a document sitting in a filing cabinet,
in a folder labeled "business continuity".

If the set point is X, then they would likely answer quickly that this is
the case.
If the capacity is lacking, then they would likely redirect the issue.
If they don't care about the customer, that alone should be an indicator.
If a promise exists in the SLA, then the ball is in your court.

From the emails, I fear we have confirmed that this is normal. So the answer
to your question "Is the temperature at Level 3 Data Centers normally in the
80-90F range?" sounds like a yes.

Regardless of the situation, always ask for names and titles, and ask vendors
to repeat critical information, like the status of cooling in a building
designed to deal with cooling. Keep the vendors that do it well.



On Wed, Oct 11, 2017 at 7:31 AM, David Hubbard <
dhubb...@dino.hostasaurus.com> wrote:

> Curious if anyone on here colo’s equipment at a Level 3 facility and has
> found the temperature unacceptably warm?  I’m having that experience
> currently, where ambient temp is in the 80’s, but they tell me that’s
> perfectly fine because vented tiles have been placed in front of all
> equipment racks.  My equipment is alarming for high temps, so obviously not
> fine.  Trying to find my way up to whomever I can complain to that’s in a
> position to do something about it but it seems the support staff have been
> told to brush questions about temp off as much as possible.  Was wondering
> if this is a country-wide thing for them or unique to the data center I
> have equipment in.  I have equipment in several others from different
> companies and most are probably 15-20 degrees cooler.
>
> Thanks,
>
> David
>



-- 
- Andrew "lathama" Latham -


Re: Temp at Level 3 data centers

2017-10-11 Thread Ken Chase
As temperature goes up, wire resistance increases too, which increases heat,
which increases resistance, and so on. I also find breakers trip more easily
at higher temperatures.
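A rough sketch of the first effect (my numbers, not from the post above; the coefficient is the textbook value for copper, and the feeder resistance is invented for illustration):

```python
# Copper resistance rises roughly linearly with temperature:
#   R(T) = R0 * (1 + alpha * (T - T0))
# alpha ~= 0.00393 per deg C for copper near room temperature.

ALPHA_CU = 0.00393  # 1/degC, approximate temperature coefficient of copper

def resistance_at(r0_ohms, t0_c, t_c):
    """Resistance at t_c of a copper run measuring r0_ohms at t0_c."""
    return r0_ohms * (1 + ALPHA_CU * (t_c - t0_c))

# A hypothetical feeder measuring 0.10 ohm at 20 C, with the room at 40 C:
r_hot = resistance_at(0.10, 20.0, 40.0)
print(round(r_hot, 4))                     # 0.1079 ohm
print(round((r_hot / 0.10 - 1) * 100, 1))  # 7.9 (% increase in I^2*R heating)
```

An ~8% bump in conductor losses is small on its own, but it stacks with fans spinning harder and breakers derating in the heat.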

/kc


On Wed, Oct 11, 2017 at 01:08:33PM -0400, Zachary Winnerman said:
  >That's a good point, though if you are running your breakers that close
  >I think you have bigger problems, as a power outage, however unlikely,
  >could cause your equipment to not come back up at all. Software updates
  >that reboot several servers in quick succession could also cause a
  >breaker to trip under those circumstances. Unfortunately, there's no way
  >to tell how close a breaker is to tripping without tripping it. Breakers
  >may have amp meters and a rated size, but the actual load before
  >tripping is ±20% for common models, meaning a 20A breaker may trip as
  >low as 16A.

-- 
Ken Chase - m...@sizone.org Guelph Canada


Re: Temp at Level 3 data centers

2017-10-11 Thread Thomas Bellman
On 2017-10-11 19:09, Naslund, Steve wrote:

> I would also be concerned because if they lose one of the a/c units
> what would the ambient temperature rise to?

It doesn't matter much if the "normal" temperature in your DC is 10
or 30 degrees Celsius; if the cooling system is barely keeping up
with that, and you lose half your cooling capacity, then the temperature
will rise pretty quickly, until the servers are literally cooked (i.e.
temperature reaching 100°C or more).

The spare capacity of the cooling system is the important information,
not the starting temperature.  That difference of 10-20°C in starting
temperature will just give you a few minutes extra, not *save* you, if
there is not enough spare capacity in the cooling system.

Assuming a reasonably densely packed data centre, at least; with low
power density, thin uninsulated walls, and winter outside, you might
survive even a full cooling failure. :-)
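For a feel of how quickly "pretty quickly" can be, here is a back-of-the-envelope sketch with invented numbers (200 kW of load in a 1000 m³ room), considering only the room air and ignoring the thermal mass of the hardware, which buys some extra time in practice:

```python
# Once all cooling is lost, the air heats at roughly dT/dt = P / (m * cp).
CP_AIR = 1005.0   # J/(kg*K), specific heat of air at constant pressure
RHO_AIR = 1.2     # kg/m^3, density of air at room temperature

def minutes_to_rise(p_watts, room_m3, delta_t_k):
    """Minutes for the room air alone to warm by delta_t_k under p_watts."""
    mass_kg = RHO_AIR * room_m3
    seconds = mass_kg * CP_AIR * delta_t_k / p_watts
    return seconds / 60.0

# 200 kW dumped into 1000 m^3 of air, rising 10 K:
print(round(minutes_to_rise(200e3, 1000.0, 10.0), 2))  # ~1 minute
```

Equipment and building mass stretch this out considerably, but the point stands: a 10-20°C cooler starting point buys minutes, not survival.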

Also, depending on your cooling solution, a partial failure might not
be very common.  We have district cooling providing us with cold water,
and if that stops pumping water, and we use up our 30m³ buffer tank
(which is enough for 30-40 minutes), *all* cooling stops.  But on the
other hand, we have capacity enough to survive even if they give us
16°C water instead of the normal 9°C water.
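The 30-40 minute figure checks out with basic water thermodynamics; a sketch, assuming roughly a 400 kW facility load (my assumption, the load is not stated above):

```python
CP_WATER = 4186.0    # J/(kg*K), specific heat of water
RHO_WATER = 1000.0   # kg/m^3, density of water

def tank_minutes(volume_m3, dt_k, load_watts):
    """Minutes of ride-through a chilled-water buffer tank provides."""
    joules = volume_m3 * RHO_WATER * CP_WATER * dt_k  # E = V * rho * cp * dT
    return joules / load_watts / 60.0

# 30 m^3 of water warming from 9 C to 16 C against a 400 kW load:
print(round(tank_minutes(30.0, 7.0, 400e3)))  # ~37 minutes
```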

> I would want them to tell me what the set point of the a/c actually is.

That I agree with.

> Bottom line 80 F input air is too hot in my opinion and apparently
> the equipment's opinion as well.

Unfortunately the default settings of most servers are not very well
thought through.  They will typically spin up fans *much* more than
is actually needed to protect the hardware, and often there is no way
of changing that for the user.  And if you try to get the manufacturer
to tell you what the most power-efficient inlet temperature is, they
will just tell you "oh, we support anything between 5°C and 40°C" (or
whatever their actual limits are), and absolutely refuse to answer your
actual question.


-- 
Thomas Bellman 
National Supercomputer Centre, Linköping University, Sweden





Re: Temp at Level 3 data centers

2017-10-11 Thread Eric Kuhnke
Also worth noting that temperature tolerances for large-scale numbers of 1U
servers, Open Compute platform type high-density servers, or blade servers
are a very different thing than air-intake temperatures for more sensitive
things like DWDM platforms.  There are laser- and physics-related issues
where temperature stability is important as channel sizes get narrower in
terms of optical THz.



On Wed, Oct 11, 2017 at 10:05 AM, Leo Bicknell  wrote:

> In a message written on Wed, Oct 11, 2017 at 12:54:26PM -0400, Zachary
> Winnerman wrote:
> > I recall some evidence that 80+F temps can reduce hard drive lifetime,
> > though it might be outdated as it was from before SSDs were around. I
>
> This is very much a "your infrastructure may vary" situation.
>
> The servers we're currently buying, when specced with SSD only and
> the correct network card (generally meaning RJ45 only, but there
> are exceptions), are warrantied for 105 degree inlet operation.
> While we do not do "high temperature operations" we have seen
> operations where folks run them at 90-100 degree input chasing
> efficiency.
>
> Famously, Intel ran computers outside in a tent just to prove it works
> fine:
>
> https://www.computerworld.com/article/2533138/data-center/running-servers-in-a-tent-outside--it-works.html
>
> It should be easy to purchase equipment that can tolerate 80-90
> degree input without damage.  But that's not the question here.
> The question is if the temp is within the range specified in the
> contract.  If it is, deal with it, and if it is not, hold your
> vendor to delivering what they promised.
>
> --
> Leo Bicknell - bickn...@ufp.org
> PGP keys at http://www.ufp.org/~bicknell/
>


Re: Temp at Level 3 data centers

2017-10-11 Thread Jeremy Austin
My 0.041 BTC:

1) For small facilities, without separate temperature-controlled UPS zones,
the optimum temperature for lead-acid batteries may be the lower bound.
77°F is optimal, with significant reduction in battery life even 15°F above
that. Given that batteries' internal temperature will be higher than
ambient, 80° set point is not stupid. I run cooler, FWIW.

2) Headroom. I try to have documented for each facility the climb in
degrees per hour (determined empirically) as a backup so I know required
response times when AC failure occurs.
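A common rule of thumb behind point 1 (my gloss, not Jeremy's numbers): VRLA battery life roughly halves for every 10°C (18°F) sustained above 25°C (77°F). Sketched:

```python
def battery_life_fraction(ambient_f):
    """Fraction of rated VRLA battery life at a sustained ambient temp (F)."""
    ambient_c = (ambient_f - 32.0) * 5.0 / 9.0
    excess_c = max(0.0, ambient_c - 25.0)  # degrees above the 25 C optimum
    return 0.5 ** (excess_c / 10.0)        # life halves per 10 C of excess

print(round(battery_life_fraction(77.0), 2))  # 1.0  (at the optimum)
print(round(battery_life_fraction(92.0), 2))  # 0.56 (15 F above: ~44% of life gone)
```

Which is why an 80°F room set point is defensible for the gear but marginal for any batteries sharing that air.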

On Wed, Oct 11, 2017 at 10:09 AM, Naslund, Steve 
wrote:

>
> Bottom line 80 F input air is too hot in my opinion and apparently the
> equipment's opinion as well.
>
> --
Jeremy Austin
jhaus...@gmail.com

(907) 895-2311 office
(907) 803-5422 cell

Heritage NetWorks  - Whitestone Power &
Communications - Vertical Broadband, LLC 


RE: Temp at Level 3 data centers

2017-10-11 Thread Naslund, Steve
I think the key here is that if your set point is at 80 F, you had better be 
able to hit it with a unit down, and you had better be able to react instantly 
to any environmental failure.  You just have no headroom to play with.

Steven Naslund
Chicago IL

>-Original Message-
>From: NANOG [mailto:nanog-boun...@nanog.org] On Behalf Of Zachary Winnerman
>Sent: Wednesday, October 11, 2017 11:54 AM
>To: nanog@nanog.org
>Subject: Re: Temp at Level 3 data centers
>
>I recall some evidence that 80+F temps can reduce hard drive lifetime, though 
>it might be outdated as it was from before SSDs were around. I would imagine 
>that while it may not impact the ability for a server to handle load, it may 
>reduce equipment lifetime. It also could be an indication that they lack 
>redundancy in the case of an AC failure. This could cause equipment damage if 
>the datacenter is unattended and temperatures are allowed to rise.



RE: Temp at Level 3 data centers

2017-10-11 Thread Naslund, Steve
If the ambient temperature is higher, it means the temperatures throughout the 
device are higher, and the temps at those points are what really matter.  I 
would also be concerned because if they lose one of the a/c units what would 
the ambient temperature rise to?  I would want them to tell me what the set 
point of the a/c actually is.

Bottom line 80 F input air is too hot in my opinion and apparently the 
equipment's opinion as well.


Steven Naslund
Chicago IL




  

>-Original Message-
>From: NANOG [mailto:nanog-boun...@nanog.org] On Behalf Of Zachary Winnerman
>Sent: Wednesday, October 11, 2017 11:54 AM
>To: nanog@nanog.org
>Subject: Re: Temp at Level 3 data centers
>
>I recall some evidence that 80+F temps can reduce hard drive lifetime, though 
>it might be outdated as it was from before SSDs were around. I would imagine 
>that while it may not impact the ability for a server to handle load, it may 
>reduce equipment lifetime. It also could be an indication that they lack 
>redundancy in the case of an AC failure. This could cause equipment damage if 
>the datacenter is unattended and temperatures are allowed to rise.


Re: Temp at Level 3 data centers

2017-10-11 Thread Zachary Winnerman
That's a good point, though if you are running your breakers that close,
I think you have bigger problems, as a power outage, however unlikely,
could cause your equipment to not come back up at all. Software updates
that reboot several servers in quick succession could also cause a
breaker to trip under those circumstances. Unfortunately, there's no way
to tell how close a breaker is to tripping without tripping it. Breakers
may have amp meters and a rated size, but the actual load before
tripping is ±20% for common models, meaning a 20A breaker may trip as
low as 16A.
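The tolerance math above, as a small sketch (the ±20% band and the 80% continuous-load planning rule are common generic figures, not specific to any one breaker model):

```python
def trip_band(rating_amps, tolerance=0.20):
    """(low, high) currents at which the breaker may actually trip."""
    return rating_amps * (1 - tolerance), rating_amps * (1 + tolerance)

def continuous_limit(rating_amps, derate=0.80):
    """Maximum continuous load typically planned against the rating."""
    return rating_amps * derate

low, high = trip_band(20.0)
print(round(low, 1), round(high, 1))     # 16.0 24.0
print(round(continuous_limit(20.0), 1))  # 16.0
```

So a "20A" circuit planned at the usual 80% continuous limit is already sitting at the pessimistic end of the trip band before heat-driven fan load is added.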


On 2017-10-11 12:58, Matt Harris wrote:
> Another thing to remember - and I've actually seen breakers tripped on
> PDUs due to heat before because of this - is that it's going to spin
> all of your fans harder to keep internal temps down if the ambient
> temp is higher. This will increase your power draw, which means that
> if you're paying for metered power by usage, you're going to pay more
> - those fans really do add up in terms of power. In extreme cases, you
> can draw too much power and trip a breaker on a PDU because every host
> in a rack, or especially those towards the top, are spinning full
> tilt. It's not a good condition and one that you should force them to
> correct. 
>
>
> On Wed, Oct 11, 2017 at 11:54 AM, Zachary Winnerman wrote:
>
> I recall some evidence that 80+F temps can reduce hard drive lifetime,
> though it might be outdated as it was from before SSDs were around. I
> would imagine that while it may not impact the ability for a server to
> handle load, it may reduce equipment lifetime. It also could be an
> indication that they lack redundancy in the case of an AC failure.
> This
> could cause equipment damage if the datacenter is unattended and
> temperatures are allowed to rise.
>
>
> On 2017-10-11 11:45, Keith Stokes wrote:
> > There are plenty of people who say 80+ is fine for equipment and
> data centers aren’t built for people.
> >
> > However other things have to be done correctly.
> >
> > Are you sure your equipment is properly oriented for airflow
> (hot/cold aisles if in use) and has no restrictions?
> >
> > On Oct 11, 2017, at 9:42 AM, Sam Kretchmer wrote:
> >
> > with a former employer we had a suite at the L3 facility on Canal in
> > Chicago. They had this exact issue for the entire time we had
> the suite.
> > They kept blaming a failing HVAC unit on our floor, but it went
> on for
> > years no matter who we complained to, or what we said.
> >
> > Good luck.
> >
> >
> > On 10/11/17, 7:31 AM, "NANOG on behalf of David Hubbard"
> > dhubb...@dino.hostasaurus.com> wrote:
> > Curious if anyone on here colo's equipment at a Level 3 facility and has
> > found the temperature unacceptably warm?  I'm having that experience
> > currently, where ambient temp is in the 80's, but they tell me that's
> > perfectly fine because vented tiles have been placed in front of all
> > equipment racks.  My equipment is alarming for high temps, so obviously
> > not fine.  Trying to find my way up to whomever I can complain to that's
> > in a position to do something about it but it seems the support staff
> > have been told to brush questions about temp off as much as possible.
> > Was wondering if this is a country-wide thing for them or unique to the
> > data center I have equipment in.  I have equipment in several others from
> > different companies and most are probably 15-20 degrees cooler.
> >
> > Thanks,
> >
> > David
> >
> >
> >
> > ---
> >
> > Keith Stokes
> >
> >
> >
> >
>
>
>
>
>
> -- 
> Matt Harris - Chief Security Officer
> Main: +1 855.696.3834 ext 103
> Mobile: +1 908.590.9472
> Email: m...@netfire.net 





Re: Temp at Level 3 data centers

2017-10-11 Thread Leo Bicknell
In a message written on Wed, Oct 11, 2017 at 12:54:26PM -0400, Zachary 
Winnerman wrote:
> I recall some evidence that 80+F temps can reduce hard drive lifetime,
> though it might be outdated as it was from before SSDs were around. I

This is very much a "your infrastructure may vary" situation.

The servers we're currently buying, when specced with SSD only and
the correct network card (generally meaning RJ45 only, but there
are exceptions), are warrantied for 105 degree inlet operation.
While we do not do "high temperature operations" we have seen
operations where folks run them at 90-100 degree input chasing
efficiency.

Famously, Intel ran computers outside in a tent just to prove it works
fine:

https://www.computerworld.com/article/2533138/data-center/running-servers-in-a-tent-outside--it-works.html

It should be easy to purchase equipment that can tolerate 80-90
degree input without damage.  But that's not the question here.
The question is if the temp is within the range specified in the
contract.  If it is, deal with it, and if it is not, hold your
vendor to delivering what they promised.

-- 
Leo Bicknell - bickn...@ufp.org
PGP keys at http://www.ufp.org/~bicknell/




Re: Temp at Level 3 data centers

2017-10-11 Thread Josh Reynolds
http://www.datacenterknowledge.com/archives/2008/10/14/google-raise-your-data-center-temperature

On Oct 11, 2017 11:56 AM, "Zachary Winnerman" 
wrote:

> I recall some evidence that 80+F temps can reduce hard drive lifetime,
> though it might be outdated as it was from before SSDs were around. I
> would imagine that while it may not impact the ability for a server to
> handle load, it may reduce equipment lifetime. It also could be an
> indication that they lack redundancy in the case of an AC failure. This
> could cause equipment damage if the datacenter is unattended and
> temperatures are allowed to rise.
>
>
> > On 2017-10-11 11:45, Keith Stokes wrote:
> > There are plenty of people who say 80+ is fine for equipment and data
> centers aren’t built for people.
> >
> > However other things have to be done correctly.
> >
> > Are you sure your equipment is properly oriented for airflow (hot/cold
> aisles if in use) and has no restrictions?
> >
> > On Oct 11, 2017, at 9:42 AM, Sam Kretchmer wrote:
> >
> > with a former employer we had a suite at the L3 facility on Canal in
> > Chicago. They had this exact issue for the entire time we had the suite.
> > They kept blaming a failing HVAC unit on our floor, but it went on for
> > years no matter who we complained to, or what we said.
> >
> > Good luck.
> >
> >
> > On 10/11/17, 7:31 AM, "NANOG on behalf of David Hubbard"
> > dhubb...@dino.hostasaurus.com> wrote:
> >
> > Curious if anyone on here colo's equipment at a Level 3 facility and has
> > found the temperature unacceptably warm?  I'm having that experience
> > currently, where ambient temp is in the 80's, but they tell me that's
> > perfectly fine because vented tiles have been placed in front of all
> > equipment racks.  My equipment is alarming for high temps, so obviously
> > not fine.  Trying to find my way up to whomever I can complain to that's
> > in a position to do something about it but it seems the support staff
> > have been told to brush questions about temp off as much as possible.
> > Was wondering if this is a country-wide thing for them or unique to the
> > data center I have equipment in.  I have equipment in several others from
> > different companies and most are probably 15-20 degrees cooler.
> >
> > Thanks,
> >
> > David
> >
> >
> >
> > ---
> >
> > Keith Stokes
> >
> >
> >
> >
>
>
>


Re: Temp at Level 3 data centers

2017-10-11 Thread Zachary Winnerman
I recall some evidence that 80+F temps can reduce hard drive lifetime,
though it might be outdated as it was from before SSDs were around. I
would imagine that while it may not impact the ability for a server to
handle load, it may reduce equipment lifetime. It also could be an
indication that they lack redundancy in the case of an AC failure. This
could cause equipment damage if the datacenter is unattended and
temperatures are allowed to rise.


On 2017-10-11 11:45, Keith Stokes wrote:
> There are plenty of people who say 80+ is fine for equipment and data centers 
> aren’t built for people.
>
> However other things have to be done correctly.
>
> Are you sure your equipment is properly oriented for airflow (hot/cold aisles 
> if in use) and has no restrictions?
>
> On Oct 11, 2017, at 9:42 AM, Sam Kretchmer wrote:
>
> with a former employer we had a suite at the L3 facility on Canal in
> Chicago. They had this exact issue for the entire time we had the suite.
> They kept blaming a failing HVAC unit on our floor, but it went on for
> years no matter who we complained to, or what we said.
>
> Good luck.
>
>
> On 10/11/17, 7:31 AM, "NANOG on behalf of David Hubbard"
> dhubb...@dino.hostasaurus.com> wrote:
>
> Curious if anyone on here colo's equipment at a Level 3 facility and has
> found the temperature unacceptably warm?  I'm having that experience
> currently, where ambient temp is in the 80's, but they tell me that's
> perfectly fine because vented tiles have been placed in front of all
> equipment racks.  My equipment is alarming for high temps, so obviously
> not fine.  Trying to find my way up to whomever I can complain to that's
> in a position to do something about it but it seems the support staff
> have been told to brush questions about temp off as much as possible.
> Was wondering if this is a country-wide thing for them or unique to the
> data center I have equipment in.  I have equipment in several others from
> different companies and most are probably 15-20 degrees cooler.
>
> Thanks,
>
> David
>
>
>
> ---
>
> Keith Stokes
>
>
>
>






Re: Temp at Level 3 data centers

2017-10-11 Thread Ken Chase
My house isn't built for moving furniture, it's built for living in. I've not
moved a bed in or out of the bedroom in 8 years now. But for the 15 minutes
I did move a bed, the door and hallway had to accommodate it.

Humans have to go into datacenters - often in an emergency. Complicating the
servicing of equipment by having sweat drip off you into the electronics is
not conducive to uptime.

/kc

On Wed, Oct 11, 2017 at 03:45:30PM +, Keith Stokes said:
  >There are plenty of people who say 80+ is fine for equipment and data 
centers aren't built for people.
  >
  >However other things have to be done correctly.
  >
  >Are you sure your equipment is properly oriented for airflow (hot/cold 
aisles if in use) and has no restrictions?
  >
  >On Oct 11, 2017, at 9:42 AM, Sam Kretchmer wrote:
  >
  >with a former employer we had a suite at the L3 facility on Canal in
  >Chicago. They had this exact issue for the entire time we had the suite.
  >They kept blaming a failing HVAC unit on our floor, but it went on for
  >years no matter who we complained to, or what we said.
  >
  >Good luck.
  >
  >
  >On 10/11/17, 7:31 AM, "NANOG on behalf of David Hubbard"
  >dhubb...@dino.hostasaurus.com> wrote:
  >
  >Curious if anyone on here colo's equipment at a Level 3 facility and has
  >found the temperature unacceptably warm?  I'm having that experience
  >currently, where ambient temp is in the 80's, but they tell me that's
  >perfectly fine because vented tiles have been placed in front of all
  >equipment racks.  My equipment is alarming for high temps, so obviously
  >not fine.  Trying to find my way up to whomever I can complain to that's
  >in a position to do something about it but it seems the support staff
  >have been told to brush questions about temp off as much as possible.
  >Was wondering if this is a country-wide thing for them or unique to the
  >data center I have equipment in.  I have equipment in several others from
  >different companies and most are probably 15-20 degrees cooler.
  >
  >Thanks,
  >
  >David
  >
  >
  >
  >---
  >
  >Keith Stokes
  >
  >
  >
  >

/kc
--
Ken Chase - m...@sizone.org Guelph Canada


Re: Temp at Level 3 data centers

2017-10-11 Thread Keith Stokes
There are plenty of people who say 80+ is fine for equipment and data centers 
aren’t built for people.

However other things have to be done correctly.

Are you sure your equipment is properly oriented for airflow (hot/cold aisles 
if in use) and has no restrictions?

On Oct 11, 2017, at 9:42 AM, Sam Kretchmer wrote:

with a former employer we had a suite at the L3 facility on Canal in
Chicago. They had this exact issue for the entire time we had the suite.
They kept blaming a failing HVAC unit on our floor, but it went on for
years no matter who we complained to, or what we said.

Good luck.


On 10/11/17, 7:31 AM, "NANOG on behalf of David Hubbard"
dhubb...@dino.hostasaurus.com> wrote:

Curious if anyone on here colo's equipment at a Level 3 facility and has
found the temperature unacceptably warm?  I'm having that experience
currently, where ambient temp is in the 80's, but they tell me that's
perfectly fine because vented tiles have been placed in front of all
equipment racks.  My equipment is alarming for high temps, so obviously
not fine.  Trying to find my way up to whomever I can complain to that's
in a position to do something about it but it seems the support staff
have been told to brush questions about temp off as much as possible.
Was wondering if this is a country-wide thing for them or unique to the
data center I have equipment in.  I have equipment in several others from
different companies and most are probably 15-20 degrees cooler.

Thanks,

David



---

Keith Stokes






Re: Temp at Level 3 data centers

2017-10-11 Thread Sam Kretchmer
with a former employer we had a suite at the L3 facility on Canal in
Chicago. They had this exact issue for the entire time we had the suite.
They kept blaming a failing HVAC unit on our floor, but it went on for
years no matter who we complained to, or what we said.

Good luck.


On 10/11/17, 7:31 AM, "NANOG on behalf of David Hubbard"
 wrote:

>Curious if anyone on here colo's equipment at a Level 3 facility and has
>found the temperature unacceptably warm?  I'm having that experience
>currently, where ambient temp is in the 80's, but they tell me that's
>perfectly fine because vented tiles have been placed in front of all
>equipment racks.  My equipment is alarming for high temps, so obviously
>not fine.  Trying to find my way up to whomever I can complain to that's
>in a position to do something about it but it seems the support staff
>have been told to brush questions about temp off as much as possible.
>Was wondering if this is a country-wide thing for them or unique to the
>data center I have equipment in.  I have equipment in several others from
>different companies and most are probably 15-20 degrees cooler.
>
>Thanks,
>
>David



Temp at Level 3 data centers

2017-10-11 Thread David Hubbard
Curious if anyone on here colo’s equipment at a Level 3 facility and has found 
the temperature unacceptably warm?  I’m having that experience currently, where 
ambient temp is in the 80’s, but they tell me that’s perfectly fine because 
vented tiles have been placed in front of all equipment racks.  My equipment is 
alarming for high temps, so obviously not fine.  Trying to find my way up to 
whomever I can complain to that’s in a position to do something about it but it 
seems the support staff have been told to brush questions about temp off as 
much as possible.  Was wondering if this is a country-wide thing for them or 
unique to the data center I have equipment in.  I have equipment in several 
others from different companies and most are probably 15-20 degrees cooler.

Thanks,

David