* Ted Hardie wrote:
>If the pattern of download is consistent and the passive surveillance
>system is keyed to look for that pattern, you're right that this is a
>risk.  You can mitigate it as a user by downloading from multiple sites
>alongside the blog site--if you're looking at custom cars and support
>information and
>knitting, you're not going to trigger the pattern (or not as easily).  A
>site that knows its data may be sensitive can also vary the data to avoid
>easy pattern matching; this might get easier with the deployment of
>HTTP/2.0, since multiple flows are multiplexed over a single TLS
>connection;
>deliver different in-line ads for the same content and you get different
>patterns.

Right, thanks for the confirmation. Is this an area where the IETF can
and should do more? For example, should the HTTP/2.0 specification, or
an accompanying specification, discuss protocol options that could be
useful in mitigating such attacks? Perhaps some kind of meta document
that discusses at what point automatically generated noise, added to
mitigate such attacks, becomes abusive and harmful to the network? It
seems to me traffic analysis would be the next attack point if suddenly
"everything" goes encrypted over the wire, but I am not sure what level
of interest there is in doing anything about it.
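
To make the "vary the data" idea concrete, here is a toy sketch
(Python; pad_body is a made-up helper, not any real server API) that
rounds a response body up to a randomly chosen size bucket so identical
content produces different sizes on the wire:

  import random

  def pad_body(body: bytes, bucket: int = 4096) -> bytes:
      # Round the length up to a randomly chosen multiple of `bucket`
      # so the same resource has a different size on each request.
      # Trailing whitespace is harmless in HTML and easy to strip.
      target = (len(body) // bucket + random.randint(1, 3)) * bucket
      return body + b" " * (target - len(body))

Padding only masks sizes, not timing or request counts, so at best it
would be one ingredient of a mitigation.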

(Even with HTTP/1.1 there are many options for adding innocuous noise
and randomness: a browser can decide to forget a cached resource and
fetch it again, or decide not to prefetch a resource, or to do so
later than usual; with HTTP/2.0, servers can do similar things with
server push.)
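
As a toy sketch of that browser side (again Python; maybe_forget and
jittered_prefetch are made-up names standing in for real cache and
prefetch machinery):

  import random
  import time

  def maybe_forget(cache: dict, url: str) -> None:
      # Occasionally evict a cached resource so that revisiting a
      # page produces a fresh request the observer cannot predict.
      if url in cache and random.random() < 0.1:
          del cache[url]

  def jittered_prefetch(fetch, url: str) -> None:
      # Sometimes skip a prefetch entirely; otherwise delay it by a
      # random interval so request timing varies between page views.
      if random.random() < 0.3:
          return
      time.sleep(random.uniform(0.5, 5.0))
      fetch(url)

How much of this the network can tolerate before the noise itself
becomes harmful is exactly the question such a meta document would
have to answer.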
-- 
Björn Höhrmann · mailto:[email protected] · http://bjoern.hoehrmann.de
Am Badedeich 7 · Telefon: +49(0)160/4415681 · http://www.bjoernsworld.de
25899 Dagebüll · PGP Pub. KeyID: 0xA4357E78 · http://www.websitedev.de/ 