On Sun, Oct 26, 2014 at 10:59 AM, Stephane Bortzmeyer <bortzme...@nic.fr>
wrote:

> On Sat, Oct 25, 2014 at 07:35:11PM -0700,
>  Watson Ladd <watsonbl...@gmail.com> wrote
>  a message of 54 lines which said:
>
> > Before DPRIV: anyone who owns the DNS box at an ISP can see all
> > dns-queries go through, and know who made them.
> >
> > After: exactly the same.
>
> You seem to consider that DPRIVE = encryption of the stub-to-resolver
> link and nothing else. But DPRIVE may work on other things that will
> improve the situation such as recommending local (local = on the
> user's machine or network) resolvers before, or instead of, the IAP's
> resolvers.
>

One of the main reasons we get little done in security is that people tend
to derail discussions of the possible with demands for the perfect.

The other main reason is reductionism: demanding security solutions that
fit in only one box.


Traditionally the IETF demanded end-to-end security on a take-it-or-leave-it
basis. And so today 99.99% of email is not encrypted with PGP or S/MIME, but
something like 50% is secured using STARTTLS.

The reason I originally designed OmniBroker and PRIVATE-DNS is that I have
spent over ten years trying to get browser providers to implement DNSSEC so
we could do security policy, and I know the constraints they raise. Their #1
concern is minimizing latency. Among their security concerns, preventing
state censorship probably ranks higher than privacy, but only Mozilla is
likely to give a candid answer on that.


So we have a hierarchy of security concerns.

0) [Solved] Authenticity of authoritative data

1) Authenticity, Integrity and Confidentiality of DNS stub-resolver traffic

2) Confidentiality of DNS resolver-authoritative traffic

3) Disclosure by the resolver


We can address (1) very simply and cheaply, and do so in a fashion that is
compatible with (3). What we can't do is provide (3) without a severe impact
on latency.
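
To make (1) concrete, here is a rough sketch of what encrypting the
stub-to-resolver hop can look like: the DNS message format is untouched, it
is simply carried over a TLS connection using the ordinary two-byte TCP
length prefix. The resolver address, the port (853) and the Python below are
illustrative assumptions on my part, not anything DPRIVE has agreed on, and
a real stub would of course authenticate the resolver rather than skip
certificate checks.

    import socket
    import ssl
    import struct

    RESOLVER = "192.0.2.1"   # placeholder resolver address (assumption)
    PORT = 853               # assumed DNS-over-TLS port (assumption)

    def build_query(name, qtype=1, qclass=1, qid=0x1234):
        """Build a bare-bones DNS query: RD bit set, one question (A/IN)."""
        header = struct.pack("!HHHHHH", qid, 0x0100, 1, 0, 0, 0)
        qname = b"".join(
            bytes([len(label)]) + label.encode("ascii")
            for label in name.rstrip(".").split(".")
        ) + b"\x00"
        return header + qname + struct.pack("!HH", qtype, qclass)

    def recv_exact(sock, n):
        """Read exactly n bytes; TLS records need not align with DNS framing."""
        buf = b""
        while len(buf) < n:
            chunk = sock.recv(n - len(buf))
            if not chunk:
                raise ConnectionError("connection closed early")
            buf += chunk
        return buf

    def query_over_tls(name):
        """Send a stub query over TLS, framed with the 2-byte DNS/TCP length."""
        ctx = ssl.create_default_context()
        ctx.check_hostname = False        # sketch only: a real stub would
        ctx.verify_mode = ssl.CERT_NONE   # authenticate the resolver
        with socket.create_connection((RESOLVER, PORT)) as sock:
            with ctx.wrap_socket(sock) as tls:
                msg = build_query(name)
                tls.sendall(struct.pack("!H", len(msg)) + msg)
                length = struct.unpack("!H", recv_exact(tls, 2))[0]
                return recv_exact(tls, length)   # raw DNS response bytes

    if __name__ == "__main__":
        print(query_over_tls("www.ietf.org").hex())

The point of the sketch is that nothing about the query itself changes, so
the latency cost is essentially one TLS handshake, and a persistent
connection amortizes even that.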

And if you don't solve the latency problem then you end up with the Harvard
Tor-dox issue. Use of Tor is so thin that any use of Tor becomes suspect. So
when someone called in a bomb threat to avoid taking a final that day, all
the Harvard police needed to do was look at the campus network logs, find
out who was using Tor when the threat was called in, and call them both in
for interrogation.

We can address (1) and (2) with encryption. But solving (3) properly
requires steganography.