I am attempting to restrict access to the servers in my farm, and as a
test I tried the following with just one instance.
I first changed my "default" security group to include the following
rules:

user: 919814621061
group: scalr-ssh2

and

user: 919814621061
group: scalr-snmp
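For reference, assuming these were added as EC2 user/group source pairs, the equivalent authorization with the EC2 API tools would look something like this (this is just a sketch; the account ID and group names are taken from above, and "default" is my destination group):

```shell
# Sketch only: authorize the "default" group to accept traffic from
# the scalr-ssh2 and scalr-snmp groups owned by account 919814621061.
ec2-authorize default -u 919814621061 -o scalr-ssh2
ec2-authorize default -u 919814621061 -o scalr-snmp
```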

I assumed this would ensure that Scalr could still reach SSH and SNMP
even if I blocked them elsewhere in the security group. However, as
soon as I did this, Scalr lost SNMP contact with the instance, and a
"Synchronize to All" showed that SSH was blocked as well, because I got
this error:

Apr  3 22:14:45 ec2-72-44-39-234 logger: SCALR(instance-init.sh):
ec2_submit_event(hostInit): Cannot upload keys on 'i-9e3c5cf7'. Failed
to connect to '72.44.39.234:22'.

So, any ideas from the amazing Scalr devs would be excellent. I don't
mind putting a set of static IPs into my "default" security group for
Scalr. I think you guys are hosted at The Planet, right? So I assume
you have static IPs. Still, a fix that doesn't require hard-coded IPs
would be ideal, since the list would have to be updated every time you
added or removed servers.
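If static IPs turned out to be the only option, I imagine the rules would look something like this (a sketch only; 203.0.113.10 is a placeholder address, not a real Scalr IP, and the ports assume standard SSH and SNMP):

```shell
# Sketch only: open SSH (tcp/22) and SNMP (udp/161) to one hypothetical
# static Scalr address, given as a /32 CIDR block.
ec2-authorize default -P tcp -p 22 -s 203.0.113.10/32
ec2-authorize default -P udp -p 161 -s 203.0.113.10/32
```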

Any ideas?


You received this message because you are subscribed to the Google Groups 
"scalr-discuss" group.
To post to this group, send email to [email protected]