On Wed, 28 May 2003, ken wrote:
> John Kozubik wrote:
> > d) set up an automated script on the server that _constantly_ fetches
> > random web pages, thus creating a constant stream of http traffic in and
> > out of the server, again obscuring traffic patterns. Log the actual
> > proxy requests in some temporary fashion and randomly hit those web sites
> > in an automated fashion throughout the day, regardless of whether someone
> > is requesting them through the proxy or not...and then script a constant
> > stream of requests to the proxy as well.
> The fun, difficult part is setting up the fetching of random web pages
> so that it looks like real user activity.
Yes, this is a somewhat interesting problem, but probably not that difficult
considering that the goal here is to create plausible deniability in a
setting like a court of law. Generating traffic patterns that convince
other cryptographers (or even sysadmins) is much harder than generating
traffic patterns that simply create reasonable doubt.
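For what it's worth, the replay part of (d) might be sketched roughly like
this. This is only a sketch under my own assumptions: the function name, the
exponential gap model, and the default intervals are illustrative, not
anything specified above.

```python
import random

def make_replay_schedule(logged_urls, period_s=3600, mean_gap_s=60, rng=None):
    """Build (offset_seconds, url) pairs covering one period.

    Gaps are drawn from an exponential distribution so the replayed
    requests arrive Poisson-like rather than evenly spaced, which is
    closer to how real browsing clusters.  Defaults are illustrative.
    """
    rng = rng or random.Random()
    schedule, t = [], 0.0
    while True:
        t += rng.expovariate(1.0 / mean_gap_s)
        if t >= period_s:
            break
        schedule.append((t, rng.choice(logged_urls)))
    return schedule

# A fetcher loop would then sleep until each offset and request that URL
# through the proxy, whether or not a real user ever asks for it again.
```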
> Also, unless you have some very odd friends, user activity will
> vary in statistically likely ways over time, so the ideal system
> would randomly compensate for that.
Exactly. The ideal system would monitor inbound and outbound:
- web requests
- bytes transferred
- bytes per page
- pictures per page
- binary files transferred
- (all of those) / second
and generate pseudo-random browsing to smooth these variables over time.
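A minimal sketch of the smoothing part, assuming we aim the combined (real
plus cover) request rate at a fixed target and fill the shortfall with cover
requests. The class name, the moving-average weight, and the target figure
are all my own illustration.

```python
class TrafficSmoother:
    """Track an observed per-interval request rate and report how many
    cover requests to inject so the combined rate stays near a target."""

    def __init__(self, target_rate, alpha=0.3):
        self.target = target_rate
        self.alpha = alpha        # weight given to the newest interval
        self.ema = target_rate    # start by assuming we are on target

    def cover_needed(self, observed):
        # Exponential moving average, so one quiet minute does not
        # trigger a suspicious burst of padding all at once.
        self.ema = self.alpha * observed + (1 - self.alpha) * self.ema
        return max(0, int(round(self.target - self.ema)))
```

The same scheme would apply per variable (bytes, pictures per page, and so
on), each with its own target.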
Perhaps a script that chose random word pairs from the dictionary, googled
them, and browsed the pages that were returned would be a good platform.
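That script might start something like the following. The search URL format
is illustrative, and actually fetching the results page and browsing the
links it returns is left out of the sketch.

```python
import random
import urllib.parse

def random_query_url(wordlist, rng=None):
    """Pick two random dictionary words and build a search query URL.

    wordlist could be loaded from e.g. /usr/share/dict/words; the
    endpoint below is an illustrative stand-in for a real search form.
    """
    rng = rng or random.Random()
    pair = rng.sample(wordlist, 2)
    query = urllib.parse.quote_plus(" ".join(pair))
    return "http://www.google.com/search?q=" + query
```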
-
John Kozubik - [EMAIL PROTECTED] - http://www.kozubik.com