Well said Francesco.

In reply to the attack concerns: Twister is built on BitTorrent, which uses 
transactional algorithms to detect and block bad nodes, but I don't know 
whether these still work in Twister, given its different behaviour.

In reply about Tor: if I *could* magically tell which packets contained child 
porn or other outright-evil material, I would of course block those. I would 
still relay political nonsense, though not because failing to do so would be 
"censorship". My refusal to relay bullshit is not censorship; it's *my free 
speech* not to be forced to speak (or relay) on your behalf.

But because Tor is more important to society right now than my objections, I 
endorse it. The zero-knowledge nature of Tor routers means I cannot inspect 
the material I relay, and I believe the network does far more good than evil. 
If I felt it was significantly about CP, I'd stop helping/endorsing/relaying 
(see, for example, Freenet).

Twister is not anonymous by design like Tor or Freenet, so I can see the 
messages I relay. Therefore, I have an ethical imperative to act: I must 
either filter or stop relaying when I see harm to other humans being 
celebrated or fetishised.

If Twister were zero-knowledge, I would not be able to do this. Instead, as 
with Tor, I would need to guesstimate the goodness of the traffic I relay and 
decide whether to participate at all. If it were even 5% CP, I'd feel I could 
not ethically take part. Functionally, that's equivalent to 100% filtering, 
since I would no longer be relaying *anything*.

In this sense, filtering would help me relay *more* Twister traffic, because 
I'd go from 0% to 99%. :)

On 12 September 2015 08:38:44 IST, Francesco Ariis <[email protected]> wrote:
>On Sat, Sep 12, 2015 at 07:39:50AM +0100, Cathal (Phone) wrote:
>> If they could, then they wouldn't need help from the twister team.
>> 
>> The requested feature is a *refusal to relay*; it has no bearing on
>> other nodes. If this were useful for an *attack*, then it'd be trivial
>> to code a fake node to do just that: connect and not relay.
>> 
>> In P2P, unless you trust a node, you never trust a node. So anyone
>> trusting a set of nodes to tell them the whole story will have a bad
>> day. If you want child porn in a world where non-monsters can choose
>> not to relay it, you'd better keep hunting for nodes willing to share
>> it with you.
>> 
>> The tragedy of the commons is that, in the absence of mechanisms for a
>> community to protect it, a common resource usually ends up abused. This
>> is a democratic and anarchic way to protect the commons of free speech
>> and thought from abuse before it becomes known as just
>> yet-another-refuge-for-CP.
>> 
>
>Hello Cathal,
>    thanks for your reply.
>
>If I am reading your message correctly, you say: "It doesn't matter,
>because even if it mattered, a malicious user could implement this
>trivially by themselves today, without anyone being able to detect it".
>I appreciate the explanation and would personally add: "*if* it
>matters, we need to find a way to prevent this kind of attack".
>Since Twister is a µblog service, I would consider any attack
>successful where a message gets delayed by more than 15 minutes
>(imagine Twister being used to organise a political rally).
>But as I said I am quite ignorant on the inner mechanism of Twister,
>so I don't know if this is even possible (but you showed it is *not
>related* to the particular feature requested).
>
>I'll share with you another thought I had. I think everyone on this
>ML has used Tor at least once (I see you suggest it as 'the ultimate
>anti-censorship technology' on your blog).
>Well, if we are to run an onion-router, there is a risk (I would say
>certainty) that a non-zero number of packets will contain illegal,
>morally abject or even terrorism-related material. If every
>onion-router provider said "I don't want to risk retransmitting
>those", the whole network would be impaired in its primary goal
>(letting political dissenters retrieve/publish information without
>being identified, etc.).
>I am not saying that this relates 1:1 to the case at hand (it doesn't
>translate well, really: in one case the information is overt, in the
>Tor case we just 'peel' one layer of encryption), but it's a
>simple example where the _very_ understandable behaviour of many would
>influence the network as a whole.
>
>To Erkan and the 'censorship-resilience is not open for debate'
>folks: imagine a scenario where a malicious party floods
>twister with a ton of gibberish material, rendering the transmission
>ineffective or very very slow. Wouldn't it be reasonable to block
>those nodes then? And even share a block-list to make the process
>speedier?
>
>What I am trying to convey with those two examples is that it might
>not be easy to foresee the results of our intents (those intents
>can come in the form of code or behaviours).
>
>To state how I feel about the issue: I think letting every node
>decide what (and what not) to relay is a valid request, from an
>ethical point of view (don't force someone to do what they don't
>want to do) and from a technical point of view (you won't be able
>to control this anyway).
>Twister is a nice piece of software, I hope the community can come
>up with a solution everyone can stand behind.

-- 
Sent from my Android device with K-9 Mail. Please excuse my brevity.
