> > On the second point, traditionally to prevent MITMs from modifying
> > traffic in transit, websites use SSL, which is the most basic of
> > security protocols. Tails.org and its distro mirrors do not even
> > bother with this; the argument supplied by the devs is "they can
> > just attack the mirrors anyways"... see point 1.
>
> I've no idea what specific attack, using the aforementioned
> nomenclature, you are referring to here. Your introduction suggests
> that you're talking about a MitM modifying target files (.iuk), which
> our current system protects users against, so it must be something
> else => please clarify.
Your system may protect incremental updates against MITM attacks, but
the ISOs that all users must start from, and periodically update with,
are protected by... nothing. You have a signing key that may prove
authenticity to those who already had it before (cue attack scenario)
a MITM modified the tails.boum.org site with compromised ISOs and fake
keys, but how is a new user supposed to know their key isn't authentic?
While protecting a website from attacks may be an ongoing challenge
that can be won, what is protecting users from MITMs? You don't even
bother to use HTTPS; why, may I ask? And what of users who keep having
to download 900 MB ISOs that turn out not to be genuine because they
are constantly being MITM'd by rogue exit nodes and non-Tor attackers?

> > And why does tails.org provide such lengthy guides on verifying the
> > ISOs using information that could just as easily be forged
> > (SHA256 hashes/signing keys)? If the devs would create a
> > distribution channel that ran over Tor as a .onion, then none of
> > these problems would exist.
>
> It seems to me that you're assuming a pool of 100% trusted mirrors
> that are all run in a very secure way, and a super-strong Tor Hidden
> Services system. Unfortunately, that's not what we have right now :(
>
> I'm also curious what method users would use to bootstrap trust into
> that .onion *address*, that would be safe in the threat model you're
> reasoning about in this thread.

So the entire Tails distribution security model hinges entirely on your
keys? At least with an onion address we would have an additional,
easily identifiable point of reference: the onion address itself, along
with the superior (to SSL) MITM protection it offers. The private key
to the onion would serve as a second key that you could secure and use
to authenticate your identities in the case of a breach.
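For readers following the self-authentication point being argued here: a (then-current) v2 onion address is derived directly from the service's public key, so the address itself commits to the key. A minimal sketch of that derivation, per the v2 rendezvous spec (base32 of the first 80 bits of SHA-1 over the DER-encoded RSA public key); the input bytes below are an arbitrary stand-in, not a real key:

```python
import base64
import hashlib

def onion_v2_address(der_pubkey: bytes) -> str:
    """Derive a v2 onion address: base32-encode the first 80 bits
    (10 bytes) of SHA-1 over the service's DER-encoded RSA public key."""
    digest = hashlib.sha1(der_pubkey).digest()
    return base64.b32encode(digest[:10]).decode("ascii").lower() + ".onion"

# Because the address is a hash of the key, a MITM cannot substitute
# a different key for the same address without finding a SHA-1 collision
# on the truncated digest.
```

This is what "the onion address serves as a second key" means in practice: any client that knows the address can verify it is talking to the holder of the matching private key, with no CA or out-of-band fingerprint needed.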
-------- Original Message --------
From: intrigeri <[email protected]>
Apparently from: [email protected]
To: User support for Tails <[email protected]>
Subject: Re: [Tails-support] Insecure updates, devs please address
Date: Tue, 20 Jan 2015 21:50:28 +0100

> Hi,
>
> [email protected] wrote (19 Jan 2015 20:44:29 GMT) :
>
>> > namely the mirrors serving the updates can be made to serve
>> > malicious ISOs with fake verification keys.
>>
>> Either I don't understand what you mean, or you didn't understand the
>> security discussion you're referring to. May you please clarify what
>> you mean with "fake verification keys", and what exact section of the
>> aforementioned security discussion you're referring to?
>
> > Hi, no specific section,
>
> OK, I misunderstood then.
>
> > It is the basic principle of insecure networks that you are pushing
> > updates through that I'm pointing out is inherently at fault.
> > Namely, there are two main attack vectors: that tails.org and its
> > mirrors can be made malicious by intruders, and that the connection
> > to the mirrors/tails.org can be modified in transit by a malicious
> > 3rd party.
>
> In what follows, I'll assume that when you write "tails.org", you
> really mean "https://tails.boum.org/". Note that this website has no
> mirror, and we've got no plans to change that currently.
>
> Also, it would be helpful if you were using the same well-defined
> nomenclature as the one we're using in our incremental upgrades
> security discussion (which is the same nomenclature that the current
> research on this topic uses, by the way). This would make the
> discussion much clearer and would save me a lot of time.
>
> > On the first point, when a site's IP is known, its server can be
> > located, rendering it vulnerable to physical seizure or attacks. I'm
> > assuming your servers aren't guarded 24/7 by 2 armed men, so you are
> > leaving everyone wide open to what is a fairly common technique
> > among governments: accessing and controlling a server locally. The
> > obvious solution is to create a tails.org .onion located on a new
> > server, separate from the clearnet one, to ensure its location
> > remains a secret, allowing its users to reference it as a secure,
> > verifiable source of info. As for the possibility of remote hacking,
> > I will assume that since the Tails devs are capable of securing an
> > OS, they are capable of securing a website.
>
> Indeed, anyone who takes control of our website, but not of our
> signing key, can perform "Indefinite freeze attacks" and "Endless data
> attacks". I agree that serving the upgrade-description files from
> a Tor Hidden Service hosted in a secret location would make physical
> attacks harder. It would not make remote attacks any harder, though
> (in passing, note that securing a client OS is not the same as
> securing a public web server).
>
> Both attacks are slightly mitigated by the fact that we are announcing
> new releases in other ways:
>
> * one that does not rely on our website at all (Twitter);
> * one that does not rely on our website to be safe at the time Tails
>   Upgrader checks for available upgrades, as long as it was safe at
>   the time the new release was published (the <[email protected]>
>   announce mailing-list).
>
> [Adding this information to the design documentation.]
>
> Given this, the current weaknesses of Tor hidden services, and our
> limited resources when it comes to maintaining infrastructure, I'm not
> convinced at all that it's worth it to create and maintain a HS with
> the upgrade information.
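The two attack names above come from the update-system literature (e.g. TUF): a freeze attack keeps replaying stale-but-validly-signed metadata, and an endless data attack streams bytes forever to exhaust the client. Both are conventionally countered client-side with an expiry timestamp in the signed metadata and a hard cap on download size. A minimal sketch under those conventions; the `expires` field and the 2 GiB cap are illustrative assumptions, not Tails' actual upgrade-description format or limits:

```python
import io
from datetime import datetime, timezone

# Illustrative cap; a real client would take the expected size
# from the signed metadata itself.
MAX_DOWNLOAD_BYTES = 2 * 1024 * 1024 * 1024  # 2 GiB

def metadata_is_fresh(expires_iso, now=None):
    """Freeze-attack defence: reject signed metadata past its expiry,
    so an attacker cannot serve old metadata indefinitely."""
    now = now or datetime.now(timezone.utc)
    return now < datetime.fromisoformat(expires_iso)

def read_capped(stream, limit=MAX_DOWNLOAD_BYTES):
    """Endless-data defence: never accept more than `limit` bytes."""
    data = stream.read(limit + 1)
    if len(data) > limit:
        raise ValueError("download exceeds declared size; aborting")
    return data
```

Note that both checks only help against an attacker who controls the web server but not the signing key, which is exactly the scenario described above.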
> It's a matter of priorities, and for now, I'm pretty sure that our
> focus (on the infra side) on making the Tails project more
> maintainable and sustainable is the way to go.
>
> Now, if we got substantially more help in this area [1], of course we
> could do more useful things in the same time scope :)
>
> [1] https://tails.boum.org/contribute/how/sysadmin/
>
> <Off-topic in this thread>
>
> > On this point it seems wholly irresponsible that Tails users, upon
> > connecting to Tor and loading the browser, are connected to
> > a clearnet site with scripts enabled. What sort of security model
> > opens users up to scripting attacks every single time they connect
> > to the internet??
>
> We've got a ticket open about blocking JavaScript coming from our
> website with NoScript => what we need isn't any more arguing and name
> calling, it needs good patches.
>
> </Off-topic in this thread>
>
> > On the second point, traditionally to prevent MITMs from modifying
> > traffic in transit, websites use SSL, which is the most basic of
> > security protocols. Tails.org and its distro mirrors do not even
> > bother with this; the argument supplied by the devs is "they can
> > just attack the mirrors anyways"... see point 1.
>
> I've no idea what specific attack, using the aforementioned
> nomenclature, you are referring to here. Your introduction suggests
> that you're talking about a MitM modifying target files (.iuk), which
> our current system protects users against, so it must be something
> else => please clarify.
>
> > And why does tails.org provide such lengthy guides on verifying the
> > ISOs using information that could just as easily be forged
> > (SHA256 hashes/signing keys)? If the devs would create a
> > distribution channel that ran over Tor as a .onion, then none of
> > these problems would exist.
>
> It seems to me that you're assuming a pool of 100% trusted mirrors
> that are all run in a very secure way, and a super-strong Tor Hidden
> Services system. Unfortunately, that's not what we have right now :(
>
> I'm also curious what method users would use to bootstrap trust into
> that .onion *address*, that would be safe in the threat model you're
> reasoning about in this thread.
>
> Cheers,
> --
> intrigeri

_______________________________________________
tails-support mailing list
[email protected]
https://mailman.boum.org/listinfo/tails-support
To unsubscribe from this list, send an empty email to
[email protected].
