Another aspect of this discussion that I'm a bit surprised no one has raised 
yet is the simple truth that no amount of testing and source code review can 
(or should) anoint a tool as secure.

Even with formally provably secure software, OS, hardware, etc., it is still a 
very hard problem to make sure that the code you fuzzed, reviewed, tested, and 
statically analyzed ends up being the code you actually run.
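
To make that concrete with a toy example (the file name and the expected 
digest below are made-up placeholders, not from any real project): the very 
narrowest slice of the problem is just pinning the artifact you run to the 
artifact that was reviewed, e.g. by checking it against a digest published 
alongside the audit. A minimal Python sketch, assuming such a digest exists:

    #!/usr/bin/env python3
    """Toy sketch: check that the artifact you are about to run matches the
    digest of the build that was actually reviewed. The file name and the
    expected digest are placeholders, not taken from any real project."""

    import hashlib
    import sys

    # Placeholder: replace with the digest published alongside the audit.
    EXPECTED_SHA256 = "0" * 64

    def sha256_of(path):
        """Return the hex SHA-256 of a file, read in 64 KiB chunks."""
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                h.update(chunk)
        return h.hexdigest()

    if __name__ == "__main__":
        path = sys.argv[1] if len(sys.argv) > 1 else "tool-release.bin"
        actual = sha256_of(path)
        if actual == EXPECTED_SHA256:
            print("digest matches the audited build")
        else:
            print("MISMATCH: got %s" % actual)
            sys.exit(1)

And even that only covers one link in the chain: it says nothing about the 
compiler, the firmware, or whoever computed the published digest, which is 
exactly why the broader problem stays so hard.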

We faced this in a few projects hacking US voting machines, where we had to 
struggle with the question of "how much does one get from open source?"... the 
answer was "not necessarily either necessary or sufficient" (but that was not 
in a human rights context).

best, Joe

--
Joseph Lorenzo Hall
Senior Staff Technologist
Center for Democracy & Technology
https://www.cdt.org/

On Feb 19, 2013, at 18:36, "Q. Parker" <ghostdan...@gmail.com> wrote:

> On Tue, Feb 19, 2013 at 11:21:11PM +0100, Julian Oliver wrote:
>> ..on Mon, Feb 18, 2013 at 08:00:24PM -0800, Adam Fisk wrote:
>>> 
>>> I think the principle of that is great, but in practice we just can't
>>> all review all the code all the time. In practice we often end up
>>> trusting open source code that is far worse reviewed than much of the
>>> closed source code we trust. I'm not trying to attack open source --
>>> I've been writing open source code full time for the past 13 years --
>>> it's what I do. But I don't think we should be delusional about it.
>> 
>> 
>> I find this an unproductive black-and-white argument. Proprietary software
>> does not grant its users even the /possibility/ of fully auditing the
>> service, much less encourage it, whereas open source software does.
>> 
>> It's a no-brainer, quite frankly.
>> 
>> We simply need to stop considering proprietary solutions at all (as it's
>> clearly ridiculous to build any case for trust atop them) and make our
>> starting point the wide variety of open source software, some of which is
>> poorly engineered and some of which is not.
>> 
>> The "what sucks the least" scale must begin with open source, not proprietary
>> offerings from for-profit companies with a centralised service.
>> 
>> Again, it's a no-brainer.
> 
> This is a pretty gross oversimplification that ignores a lot of realities
> about the nature of trust and how complicated things like large software
> systems are assembled.
> 
> First, it seems that "trust" in the context of this thread means "do the
> readers of this list trust this software", which has come to mean, from my
> reading, "do the members of this list have unfettered access to the source
> code". That's a rather narrow view of trust. There are all sorts of reasons
> a human rights activist might choose to trust a vendor. After all, for a
> non-technical user, what's to recommend the opinion of a volunteer over the
> opinion of a number of professionals working at a relatively small firm?
> The first is wholly dependent on the expertise and access of the volunteer.
> The second is wholly dependent on the expertise and access of the
> professionals. The latter, however, comes with the sense of trust that
> people tend to have for somebody whose livelihood depends upon maintaining
> a track record of fulfilling obligations to customers with competence and
> good faith. It's not so simple as "volunteer is better than vendor".
> 
> Second, I think it's hard to defend the claim that end users always know
> more about the inner workings of large open source projects than they do
> about closed ones at private firms. Does everybody who uses Debian observe
> key-signing parties among Debian developers? No, they don't. Do I use open
> firmware? Do I know with absolute certainty what every piece of hardware in
> my laptop is doing? No, not really. We make decisions about which systems
> we should trust, and in what way, based on a complicated series of risk
> assessments, each based on a lot of factors. I think the assertion that
> open source projects are always of higher quality by virtue of being open,
> and that the issue is just that simple, is hard to defend. For most users,
> the code being open doesn't make it any more possible for them to review
> it. They'd still have to trust another reviewer, right? It's not so simple
> as open versus closed source.
> 
> Third, I think responses on the list tend to be excessively hostile toward
> for-profit firms that hope to make a living by making and selling software.
> A good many such firms have contributed substantially to the Linux kernel
> and the Debian distribution. There are a lot of competing interests at
> play, as made obvious by the parallel thread about Ubuntu's Dash product
> search. But I'm sure there are a lot of list members who've thoroughly
> enjoyed the conveniences afforded them by the Ubuntu distro, for example,
> only to break into hysterics over the built-in product search (which should
> be opt-in but is disabled pretty easily) without offering up any
> alternative suggestions for paying Canonical developers. Does it make sense
> to expect all security work to happen on grant money? Do we really want to
> discourage attempts by small firms to make something interesting and useful
> at a reasonable profit? I don't think I do. And do I choose to ignore that
> people paid by for-profit software companies often contribute to important
> open source projects? No, I'd rather not. It's not so simple as
> unaffiliated versus corporate.
> 
> Should you feel comfortable vouching for software which you've not been
> able to adequately audit? Of course not, but that doesn't mean users can't
> establish trust in a vendor and its offerings in some other fashion. And
> does it always make sense for a for-profit firm with even the best of
> intentions to open up all of its source? I don't think it always does. I
> think these and plenty of other open questions surrounding how tools for
> protecting sensitive communications are made and used should indicate that
> this is far from being "black-and-white" or a "no-brainer".
> 
>> Cheers,
>> 
>> -- 
>> Julian Oliver
>> http://julianoliver.com
>> http://criticalengineering.org
> 
> --
> Q.

--
Unsubscribe, change to digest, or change password at: 
https://mailman.stanford.edu/mailman/listinfo/liberationtech
