pippin wrote: 
> Hm, how is this more secure than what Meep is doing?
> 

Hi Pippin. Great to open up this discussion. It's not intended as a
replacement for (or comparison with) Meep's approach - more that there
are some (other) solutions out there which simply open up LMS directly
to the internet (or use HTTP basic auth), which is obviously a bad idea.

> 
> I mean... TLS and everything but this still requires you to open up your
> server machine to the internet and as security issues are this should at
> least involve staying up to date with known security issues in e.g. your
> TLS client on a daily basis because, you know, these big security issues
> that made the news in recent years were all about issues in security
> software like OpenSSL...
> 

I agree - patches are important, and yes, OpenSSL is scary: as a key
piece of internet infrastructure it's firmly in the firing line and in
the press often. To an extent that's why I'm advocating using -any- TLS
TCP proxy that suits your server (if you can get one working - I've only
really tried stunnel), so you can choose.
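
In case it helps anyone, a stunnel config for this might look roughly
like the below - the port, hostname and cert path are placeholders I've
made up, and it assumes the plain LMS CLI is on its usual 9090:

    ; /etc/stunnel/lms-cli.conf - sketch only, paths/ports are assumptions
    cert = /etc/stunnel/lms.pem       ; server certificate + private key

    [lms-cli]
    accept  = 0.0.0.0:9443            ; TLS port your router forwards in
    connect = 127.0.0.1:9090          ; plain LMS CLI, loopback only

stunnel terminates the TLS and hands plain CLI traffic to LMS over
loopback, so the CLI itself is never exposed to the internet directly.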


> 
> So if such an issue shows up again (or it's not yet fixed on your NAS)
> you can fall victim to port scan attacks with such a setup. Not too
> likely but possible, some such bugs have been around (and used!) for
> years.
> 

Sure. We can guarantee routers / anything in the DMZ are getting
port-scanned anyway - the key is keeping those ports closed or secured,
plus a certain (debatable!) level of obscurity, e.g. maybe don't expose
it on 443. The other key advice is to reduce the -impact- of a breach:
this is not an SSH login and shares no authentication mechanism with the
server OS, so the impact is limited to the LMS CLI (still bad, though).
For further security around that you can enable the built-in
username/password auth on the LMS server - as it's DIY, that's up to the
user to decide.
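
To illustrate the shape of that (hostname, port and credentials below
are made-up placeholders - not anyone's real setup), a client talks to
the proxied CLI by opening a TLS connection and sending
newline-terminated commands, logging in first if auth is enabled:

    import socket, ssl

    # Sketch only: host/port/credentials are placeholders, and it assumes a
    # TLS proxy (e.g. stunnel, as above) sits in front of the plain CLI on 9090.
    HOST, PORT = "myhome.example.net", 9443

    ctx = ssl.create_default_context()  # with a self-signed cert, load your own CA here
    with socket.create_connection((HOST, PORT)) as raw:
        with ctx.wrap_socket(raw, server_hostname=HOST) as s:
            s.sendall(b"login myuser mypass\n")  # only needed if LMS auth is enabled
            print(s.recv(4096).decode())
            s.sendall(b"serverstatus 0 1\n")     # harmless query to show it works
            print(s.recv(4096).decode())
            s.sendall(b"exit\n")

Even if someone got past the proxy, they'd be limited to exactly those
CLI commands - nothing at the OS level.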

> 
> I think the only thing being more secure is a solution where LMS only
> does outbound communication, that is: a plugin that connects to Alexa
> (or some intermediate service) and polls.
> 

For better or worse, Alexa's infrastructure doesn't accept incoming
connections (nor do AWS Lambdas), so you'd definitely need an
intermediate server. But traditional polling is probably not viable for
a voice-prompted service (where you want latency measured in
milliseconds): the polling interval would have to be far too tight. With
an intermediate service, a long-standing connection / websocket might
work and might be good to explore some time, but the load (and cost)
then grows aggressively with the number of registered users.
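
To make that concrete: the shape would be a small helper next to the
plugin holding one outbound websocket open to the intermediate service
and relaying each command to the local CLI - nothing listening, no
inbound ports. A rough sketch (the service, its URL and its message
format are entirely hypothetical; it also assumes the third-party
Python 'websockets' package):

    import asyncio
    import websockets  # third-party package

    # Sketch only: the intermediate service and its message format are made up;
    # this just shows the "outbound-only, no open ports" shape of the idea.
    SERVICE = "wss://intermediate.example.com/lms"
    LMS_CLI = ("127.0.0.1", 9090)

    async def relay():
        async with websockets.connect(SERVICE) as ws:
            while True:
                command = await ws.recv()               # a CLI line pushed by the service
                reader, writer = await asyncio.open_connection(*LMS_CLI)
                writer.write(command.encode() + b"\n")  # hand it to the local CLI
                await writer.drain()
                reply = await reader.readline()
                await ws.send(reply.decode())           # push the answer back out
                writer.close()
                await writer.wait_closed()

    asyncio.run(relay())

The catch, as above, is that every registered user's helper then holds a
connection open to that one service, which is where the load and cost
come from.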

But you still need to trust this intermediate service to control your
LMS... which is something I was uneasy about generally. Remember too
that -that- server is open to the internet on some port(s), probably /
hopefully uses TLS of some sort, and controls -everyone-'s LMS
instances.


------------------------------------------------------------------------
nickb's Profile: http://forums.slimdevices.com/member.php?userid=66261
View this thread: http://forums.slimdevices.com/showthread.php?t=107009
