I like the comparison to URLs, address bars, and lock icons.  That
really seems like the direction we need to be thinking in right now: what are
the minimal concepts required for a usable and useful system?  Which
metaphors can we reuse?

I think this should be one discussion topic for the Wind Farm event.

.hc

Yaron Goland:
> For whatever it's worth, the second article [1] I ever published on Thali (the 
> first [2] defined the problem I was trying to tackle [3]) was about users' 
> rights, and the last right was "A right to successfully use these rights 
> without being a computer/security expert".
> 
> But I must admit that this is aspirational. The complexity you describe is 
> exactly the problem. Some problems are just plain hard. My guess is that we 
> will require some sort of user training. Just like people had to learn about 
> URLs and address bars and lock icons, they are going to have to learn some 
> concepts in order to manage their local discovery well.
> 
> All we can do is try and learn.
> 
>         Yaron
> 
> 
> [1] http://www.goland.org/ausersbillofrights/
> [2] http://www.goland.org/opendataweb/
> [3] If you look at these articles you will see the name Paeony. This was the 
> original name for Thali and is a name that I personally still own the domain 
> for. The Thali name is owned by my employer. I maintain the Paeony name just 
> in case something goes sideways. But keep in mind that everything we do in 
> Thali is released under MIT or Apache 2.0.
> ________________________________________
> From: Hans-Christoph Steiner <[email protected]>
> Sent: Wednesday, April 22, 2015 8:39 AM
> To: Yaron Goland; [email protected]
> Subject: Re: [guardian-dev] Discovery and user rights
> 
> It's great to see that you have incorporated ethics into your development
> process.  That really needs to be part not only of all technology development
> but also of tech education and research.
> 
> Reading through http://www.goland.org/localdiscoverybillofrights I agree with
> the laws that you have laid out, but my overriding thought is that making
> general systems based on discovery will always end up being really complicated
> for the user, since they have to be aware of all the permutations of access
> controls in order to have any say in how they are used.  On top of that, it is
> a pretty complicated UI question.
> 
> I think that complexity is inevitable since the goal here is to map the
> psychology of inter-personal relationships to a software system.  And the
> psychology of inter-personal relationships is vastly complicated.  So for
> systems like this to have any chance of working, they must be mapped as
> directly as possible to existing human behavior.
> 
> .hc
> 
> Yaron Goland:
>> For what it's worth, we did some experimenting with NFC and eventually gave 
>> up. The reason is that the range is so outrageously small that if we didn't 
>> line up the NFC transceivers exactly, we couldn't get data flowing. The 
>> result was a sort of rubbing dance with the phones that made for a poor user 
>> experience. None of this, btw, is unsolvable. I could imagine having an NFC 
>> logo that identifies where the NFC transceivers are on the phone, or maybe 
>> increasing their power a tiny bit so they don't have to be so outrageously 
>> close. But for our scenarios we actually want range, since the point is to 
>> allow people to collaborate over as long a distance as their radios will 
>> support.
>>
>> Nevertheless, I still ended up in a similar place to your mail below about 
>> range restrictions.
>>
>> I published an article on my blog, 
>> http://www.goland.org/localdiscoverybillofrights/, where I try to define 
>> what I believe are the rights that a user should have in terms of 
>> controlling presence. In one of the points I talk about how users should be 
>> able to have a policy that lets them be discovered only within a certain 
>> distance.
>>
>> In other words there might be someone who is harassing me at work. I don't 
>> want to walk around with an electronic leash on my neck constantly telling 
>> them where I am. But I still need to work with them. So I could imagine 
>> having logic that measures the strength of the signal I'm getting from the 
>> harasser. I could then say things like "This person is only allowed to 
>> discover me if they are within 5 feet." At that distance (barring walls :( ) 
>> they can just see me. Also, by allowing some connectivity, it will be 
>> slightly harder for them to know I've essentially "un-friended" them.
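>>
>> In rough Python-flavored pseudocode the policy check might look like this 
>> (just a sketch, not real Thali code; the RSSI threshold for "about 5 feet" 
>> is a guess and would need per-device calibration):
>>
>>     CLOSE_RANGE_RSSI_DBM = -50   # rough stand-in for "within ~5 feet"
>>
>>     # per-peer policy, keyed by the peer's identity (e.g. key fingerprint)
>>     policies = {"the-harasser": {"close_range_only": True}}
>>
>>     def should_answer_discovery(peer_id, rssi_dbm):
>>         policy = policies.get(peer_id, {})
>>         if policy.get("close_range_only"):
>>             # Weaker (more negative) RSSI generally means farther away.
>>             return rssi_dbm >= CLOSE_RANGE_RSSI_DBM
>>         return True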
>>
>> But I also realize that I'm just playing games. A determined attacker is 
>> going to be able to discover where people are and who has unfriended them. 
>> In my longer technical article that I'm still working on I talk about a 
>> bunch of these attack scenarios. But fundamentally if the person you are 
>> trying to avoid is part of your social group then there are just too many 
>> channels for them to use discovery to find you. I don't know if that problem 
>> is solvable. It's one thing to hide from strangers, but it's extremely 
>> difficult to hide from the friends of your friends.
>>
>>         Yaron
>>
>> ________________________________________
>> From: guardian-dev 
>> <[email protected]> on behalf of 
>> Hans-Christoph Steiner <[email protected]>
>> Sent: Tuesday, April 21, 2015 8:33 AM
>> To: [email protected]
>> Subject: Re: [guardian-dev] Discovery and user rights
>>
>> Sounds right on topic for us! We're working with the exact same issues and 
>> tech.
>>
>> I'd like to throw out one related idea, just to get the ideas flowing.  I've
>> been trying to think about how to keep the range small enough that we can
>> rely on inter-personal skills for handling some of the privacy issues.  NFC
>> is perfect for that; Bluetooth and local WiFi are also pretty good, but they
>> can be locally monitored.
>>
>> This is a core idea of the FDroid app swap process.  The authentication comes
>> from the actual physical exchange with the other people, and the session
>> lasts only as long as the people are physically present in the same room.
>> That maps concretely to our experience of agreeing to be in the same room
>> with people, and choosing who in that room to talk to.
>>
>> There are of course large limitations to working like this, so it won't work
>> for lots of use cases where local data transfer can be useful.
>>
>> .hc
>>
>> Yaron Goland:
>>> Guardian Folks,
>>>
>>>
>>> First off I sincerely apologize if this is the wrong mailing list to bring 
>>> this issue up on. I know y'all are working in this area and so I was hoping 
>>> we might share notes. But if this isn't the right place please tell me and 
>>> I promise to cease and desist.
>>>
>>>
>>> Right now I'm working on a design for how to use technologies like BLE to 
>>> enable discovery in ways that respect user rights and don't kill user 
>>> batteries.
>>>
>>>
>>> To this end I've written (and will soon publish) a Users' Bill of Rights for 
>>> discovery, based on Kim Cameron's Laws of Identity. One of the key rights is 
>>> the right to decide who can discover you.
>>>
>>>
>>> For example, imagine that you are part of a group of 100 or 200 people. 
>>> It's a big discussion list. The contents of the list are being passed 
>>> around via BLE/Wi-Fi. Specifically, BLE is used to discover people around 
>>> you, and if they aren't people you have already sent your post to, then you 
>>> contact them over Wi-Fi Direct (or Bluetooth or Multipeer Connectivity or 
>>> whatever) to pass the data.
>>>
>>>
>>> However there is someone in the group who has been harassing you and you 
>>> don't want them to be able to find you via your telephone. In other words, 
>>> you don't want your phone to tell them you are "in the area". So you go to 
>>> your app and say "Exclude person X from notifications".
>>>
>>>
>>> Now you post to that group of 100 or 200 people. In theory, if your phone 
>>> discovers a member of the group whom you haven't already sent a copy of the 
>>> post to, then your phone should transmit it automatically.
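>>>
>>>
>>> To make that rule concrete, in rough Python-flavored pseudocode (just a 
>>> sketch; the names here are made up for illustration):
>>>
>>>     def on_peer_discovered(peer_key, group_members, already_sent, post):
>>>         # Push the post only to group members we haven't updated yet.
>>>         if peer_key in group_members and peer_key not in already_sent:
>>>             send_over_high_bandwidth_link(peer_key, post)  # e.g. Wi-Fi Direct
>>>             already_sent.add(peer_key)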
>>>
>>>
>>> The easiest way to do all of this is to just announce your identity (in our 
>>> case a public key). That way, if you see the public key of someone you know 
>>> you haven't updated yet, your device can contact them. But of course the 
>>> inverse is a big problem. That is, if your phone is constantly advertising 
>>> your public key, then the person you want to exclude will see it and know 
>>> you are in the area.
>>>
>>>
>>> In theory the answer to this is to not advertise your key directly but 
>>> rather to take your key and then encrypt it with the keys of the people you 
>>> are trying to reach. In other words if you want to send out 200 updates 
>>> then you would advertise, over BLE, 200 values, each of which is your 
>>> identity encrypted with one recipient's key.
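>>>
>>>
>>> As a rough sketch of the sending side (using PyNaCl sealed boxes purely as 
>>> a stand-in for whatever encryption scheme actually gets used; people you 
>>> have excluded simply get no value addressed to them):
>>>
>>>     from nacl.public import PublicKey, SealedBox
>>>
>>>     def build_advertisement(my_identity, recipient_keys, excluded):
>>>         # recipient_keys: raw public keys of the other group members.
>>>         # excluded: keys of people who must not learn we are in the area.
>>>         values = []
>>>         for raw_key in recipient_keys:
>>>             if raw_key in excluded:
>>>                 continue  # no value for them, so nothing to discover
>>>             box = SealedBox(PublicKey(raw_key))
>>>             values.append(box.encrypt(my_identity))
>>>         return values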
>>>
>>>
>>> Which means that someone receiving your advertisement would see 200 values 
>>> and have no idea if any of those values are for them. So they would have to 
>>> try to decrypt each and every one using their private key to see if any 
>>> match. If any do, then they know who is looking for them.
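>>>
>>>
>>> The receiving side would be roughly the mirror image (again just a sketch 
>>> with PyNaCl; the real code would need to be far more careful):
>>>
>>>     from nacl.exceptions import CryptoError
>>>     from nacl.public import SealedBox
>>>
>>>     def who_is_looking_for_me(advertised_values, my_private_key):
>>>         # Trial-decrypt every value; most attempts are expected to fail.
>>>         box = SealedBox(my_private_key)
>>>         senders = []
>>>         for value in advertised_values:
>>>             try:
>>>                 senders.append(box.decrypt(value))  # the sender's identity
>>>             except CryptoError:
>>>                 continue  # not addressed to us
>>>         return senders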
>>>
>>>
>>> The benefit of this approach is that someone advertising can simply omit 
>>> any member of the group they have excluded. Similarly, if someone receives 
>>> an advertisement from a person they have excluded, they can just ignore it.
>>>
>>>
>>> The problem is that this approach will almost certainly end up being a DOS 
>>> attack on the low-bandwidth BLE channel and potentially turn each device 
>>> into a toaster oven as it tries to process hundreds of tokens from 10 or 20 
>>> different sources (depending on how many devices are in range).
>>>
>>>
>>> BLE is unbelievably slow. In practice I'm told that you aren't going to get 
>>> much more than about 35KB/s, and that's on a clear channel (which this isn't).
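>>>
>>>
>>> Back of the envelope, assuming (say) 512-byte values as you would get with 
>>> RSA-4096 ciphertexts:
>>>
>>>     values, size_bytes, rate = 200, 512, 35 * 1024   # bytes, bytes/s
>>>     print(values * size_bytes / rate)   # ~2.9 s of airtime for one
>>>                                         # sender's full advertisement,
>>>                                         # before multiplying by the
>>>                                         # 10-20 devices in range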
>>>
>>>
>>> Furthermore we are processing these trial decryptions on battery-powered 
>>> devices. The good news is that modern phones generally use processors that 
>>> have built-in support for encryption operations. But I'm not clear on how 
>>> much of modern crypto software for iOS/Android actually uses those 
>>> instructions. Even so, performing literally thousands of public-key 
>>> decryptions over a period of a few hours isn't going to make anyone happy.
>>>
>>>
>>> So I did think of an optimization: we could try to create pairwise keys 
>>> between all members of the group. In that case, what someone would 
>>> advertise for each recipient is a value encrypted with the key shared with 
>>> that person. We could then use something like AES-128/GCM to do the 
>>> encryption. This has a couple of benefits. The encrypted value, instead of 
>>> being measured in K as it would be with, say, RSA 4K keys, would be around 
>>> 40 or so bytes. Also, the expense of "testing" each value will go way down, 
>>> especially if we use code that takes advantage of the ARM processor's 
>>> support for AES. [1]
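>>>
>>>
>>> A sketch of what one such token might look like, using the Python 
>>> "cryptography" package's AESGCM purely for illustration (key agreement, 
>>> nonce policy, etc. would all still need real design work):
>>>
>>>     import os
>>>     from cryptography.exceptions import InvalidTag
>>>     from cryptography.hazmat.primitives.ciphers.aead import AESGCM
>>>
>>>     def make_token(pairwise_key, my_id):
>>>         # pairwise_key: 16 bytes (AES-128), shared with one group member.
>>>         # my_id: a short (16-byte) identifier for the sender.
>>>         # 12-byte nonce + 16-byte id + 16-byte GCM tag = 44 bytes total,
>>>         # i.e. the ~40-byte ballpark instead of an RSA-sized blob.
>>>         nonce = os.urandom(12)
>>>         return nonce + AESGCM(pairwise_key).encrypt(nonce, my_id, None)
>>>
>>>     def try_token(pairwise_key, token):
>>>         # Returns the sender id if this token was made with our shared
>>>         # key, otherwise None (the common case).
>>>         nonce, blob = token[:12], token[12:]
>>>         try:
>>>             return AESGCM(pairwise_key).decrypt(nonce, blob, None)
>>>         except InvalidTag:
>>>             return None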
>>>
>>>
>>> AES-128/GCM/SHA-256 is what higher-end TLS implementations use, and mobile 
>>> processors these days are largely optimized to handle it quickly for that 
>>> reason. But to be fair, the performance profile is likely to be very 
>>> different from TLS because we are talking about large numbers of short 
>>> encryptions/decryptions, most of which are going to fail. Unexpected 
>>> results do awful things to desktop processor pipelines; I don't know enough 
>>> about ARM's architecture to know what to expect. But I suspect it won't be 
>>> pretty.
>>>
>>>
>>> [1] Another option would be to use HMAC-SHA256, but I think this would 
>>> require an initialization vector to keep the resulting hash from being 
>>> invariant and thus acting as a beacon. I'm not clear if this would really 
>>> provide any meaningful savings.
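>>>
>>>
>>> For comparison, that HMAC variant would look something like this (sketch 
>>> only; whether it actually saves anything over AES-GCM is exactly the open 
>>> question):
>>>
>>>     import hashlib, hmac, os
>>>
>>>     def make_hmac_token(pairwise_key):
>>>         # Fresh nonce each time, so the value isn't a constant, trackable
>>>         # beacon.
>>>         nonce = os.urandom(12)
>>>         tag = hmac.new(pairwise_key, nonce, hashlib.sha256).digest()
>>>         return nonce + tag[:16]   # truncate to keep the token small
>>>
>>>     def token_matches(pairwise_key, token):
>>>         # The receiver tries each of its pairwise keys; a match tells it
>>>         # which relationship (and so which sender) the token is for.
>>>         nonce, tag = token[:12], token[12:]
>>>         mac = hmac.new(pairwise_key, nonce, hashlib.sha256)
>>>         return hmac.compare_digest(tag, mac.digest()[:16])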
>>>
>>>
>>
>> --
>> PGP fingerprint: 5E61 C878 0F86 295C E17D  8677 9F0F E587 374B BE81
>> https://pgp.mit.edu/pks/lookup?op=vindex&search=0x9F0FE587374BBE81
>>
> 
> --
> PGP fingerprint: 5E61 C878 0F86 295C E17D  8677 9F0F E587 374B BE81
> https://pgp.mit.edu/pks/lookup?op=vindex&search=0x9F0FE587374BBE81
> 

-- 
PGP fingerprint: 5E61 C878 0F86 295C E17D  8677 9F0F E587 374B BE81
https://pgp.mit.edu/pks/lookup?op=vindex&search=0x9F0FE587374BBE81
_______________________________________________
List info: https://lists.mayfirst.org/mailman/listinfo/guardian-dev
To unsubscribe, email:  [email protected]
