"the ideal would be to hit a high enough rate that it makes real-time
analysis of content (by a human) impossible. By the time the service hit
that rate of chats, it will be nigh-unusable by people.  "

Every client could broadcast a message on a timer. Sometimes the message
would be wheat and sometimes chaff.
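
To make that concrete, here is a minimal sketch of such a timer-driven
sender. TICK_SECONDS, MESSAGE_BYTES and send_to_server() are illustrative
placeholders of mine, not part of any existing client:

# Minimal sketch: every peer transmits exactly once per timer pulse,
# sending a queued real message (wheat) if one exists, otherwise a dummy
# (chaff) of the same size.
import os
import queue
import threading

TICK_SECONDS = 2.0    # timer period; one send per peer per tick
MESSAGE_BYTES = 256   # fixed size so wheat and chaff look identical on the wire


def send_to_server(payload: bytes) -> None:
    """Placeholder transport; a real client would encrypt and send here."""
    print(f"sent {len(payload)} bytes")


class ChaffingSender:
    def __init__(self) -> None:
        self.outbox: "queue.Queue[bytes]" = queue.Queue()

    def compose(self, text: str) -> None:
        """Queue a real (wheat) message; it goes out on the next timer pulse."""
        self.outbox.put(text.encode()[:MESSAGE_BYTES].ljust(MESSAGE_BYTES, b"\x00"))

    def tick(self) -> None:
        """Runs once per pulse: send wheat if any is queued, otherwise chaff."""
        try:
            payload = self.outbox.get_nowait()    # wheat
        except queue.Empty:
            payload = os.urandom(MESSAGE_BYTES)   # chaff: random bytes, same size
        send_to_server(payload)
        threading.Timer(TICK_SECONDS, self.tick).start()


if __name__ == "__main__":
    sender = ChaffingSender()
    sender.compose("hello")   # queued now, sent on the next pulse
    sender.tick()             # start the timer loop (runs until interrupted)

The fixed payload size matters: if wheat and chaff differ in length, the
dummy traffic gives itself away to an observer.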

Then the downsides would be:

1) Additional latency between composing a message and the next timer
pulse (on average, half the timer period). In UX terms, slower sends.

2) A bigger buffer on each peer, flushing more often.

Problem #2 could be ameliorated with something like sharding. If there were
S shards and M messages total, a peer would buffer M/S messages.
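
As a rough sketch of that sharding idea, assuming messages can be grouped
by some identifier (the conversation_id below is purely hypothetical), a
peer could hash each identifier to a shard and buffer only its own shard's
traffic:

# Rough sketch of the sharding idea; the point is only the M/S buffering
# arithmetic, not the exact assignment scheme.
import hashlib

NUM_SHARDS = 8   # S


def shard_of(conversation_id: str) -> int:
    """Map an identifier to a shard with a stable hash."""
    digest = hashlib.sha256(conversation_id.encode()).digest()
    return int.from_bytes(digest[:4], "big") % NUM_SHARDS


class Peer:
    """A peer responsible for one shard buffers only that shard's traffic."""

    def __init__(self, my_shard: int) -> None:
        self.my_shard = my_shard
        self.buffer: list = []

    def on_message(self, conversation_id: str, payload: bytes) -> None:
        if shard_of(conversation_id) == self.my_shard:
            self.buffer.append((conversation_id, payload))
        # Messages for other shards are never buffered here, so with
        # M messages total this peer holds roughly M / NUM_SHARDS of them.

With M = 10,000 messages and S = 8 shards, each peer would hold roughly
1,250 instead of all 10,000.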




On Tue, Jun 11, 2013 at 11:42 AM, Griffin Boyce <griffinbo...@gmail.com> wrote:

> Sean Cassidy <sean.a.cass...@gmail.com> wrote:
>
>> First is that if the load on the network is high enough, conversations
>> can hide in the noise. This is helped by dummy message generation
>> either by clients or servers (preferably clients to protect against
>> attackers that can monitor every node).
>
>
>   Unless I'm missing something (entirely possible): From your standpoint,
> the ideal would be to hit a high enough rate that it makes real-time
> analysis of content (by a human) impossible. By the time the service hit
> that rate of chats, it will be nigh-unusable by people.  This is more or
> less why chat channels (eg, IRC) were created in the first place.  And that
> doesn't preclude outside observers from storing and correlating the chats.
>
> ~Griffin
>
--
Too many emails? Unsubscribe, change to digest, or change password by emailing 
moderator at compa...@stanford.edu or changing your settings at 
https://mailman.stanford.edu/mailman/listinfo/liberationtech
