On Sat, Nov 02, 2013 at 01:01:36PM +0100, [email protected] wrote:
> Let me take my "standard example" regarding trust.  People always trust each 
> other *with respect to some topic*.  One can rarely rank peers by some level 
> of trust without such a reference.
> 
> For instance, concerning my income: there is little reason not to trust my 
> employer with that information; after all, it is the one sending the money.  
> The bank will also have an easy time figuring it out.  Health data is a 
> different beast: my doctor and the pharmacy know my personal health state 
> about as well as, or even better than, I do myself.  Yet it is neither my 
> employer's business nor my bank's.
> 
> Ultimately, that is how people organize their lives with respect to 
> information.  By sharing some info you always risk that it is no longer a 
> secret, should your trusted party defect.  Then you have to deal with and 
> heal the situation.  As long as this concerns a certain topic only, as long 
> as the leak stays within bounds, it might be embarrassing, but one can deal 
> with it.  If it were about _all_ info at once, it typically becomes a real 
> problem.

Ok

So organizing your social contacts into groups is inevitable.

> So what makes human rights special?  They are inalienable.
> 
> Now how would we build a rights management system which distinguishes 
> between those inalienable rights and those one can trade?  After all, it is 
> going to be binary encoded.  And it should be dead easy to verify and 
> audit.
> 
> Hence we build a system taking this inalienability as the basic axiom.  
> (Frankly, in the beginning not even I was confident that it would be 
> possible to successfully build a usable system on top of such a rigid 
> rights management scheme.  But it turned out to work well.)
> 
> The idea was to map the situation to set theory: you start with a set and all 
> you can trade away is a strict subset.  Problem solved.  The full set you can 
> never even accidentally trade away.  That's what we map to "inalienable" 
> rights.
> 
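The strict-subset rule you describe can be sketched in a few lines of Python. This is purely my own illustration (not Askemos code, and the names are made up): delegation is only valid for a *proper* subset, so the full set of rights can never be traded away, not even by accident.

```python
# Hypothetical sketch of the strict-subset rights rule: an owner holds a
# full set of rights and may only ever delegate a *proper* subset of it,
# so the full set (the "inalienable" part) is never transferable.

def delegate(owned: frozenset, granted: frozenset) -> frozenset:
    """Allow `granted` only if it is a proper subset of `owned`."""
    if not granted < owned:  # `<` is the proper-subset test on frozensets
        raise ValueError("may only delegate a strict subset of one's rights")
    return granted

full = frozenset({"read", "write", "share"})
delegate(full, frozenset({"read"}))   # fine: a strict subset
# delegate(full, full)                # would raise: the full set is inalienable
```

The nice property is that the invariant is enforced by a single set comparison, which keeps it easy to verify and audit, as you require.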
> (((Note here: so far this is all "math & theory"; in practice computers 
> are to be used.  We could ignore our great rights management, switch it 
> off, and manipulate the bits on the hard disk directly.  That is why 
> Askemos combines the principled solution with another one: have 
> independent witnesses for all your data and updates.  That way no single 
> broken machine can be used to break the system, even though single 
> machines may still be broken.)))
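If I understand the witness idea correctly, it amounts to only accepting an update once independent replicas agree on its result. A minimal sketch, assuming simple majority voting (again my own toy illustration, not how Askemos actually does it):

```python
# Toy illustration of the "independent witnesses" idea: an update counts
# only when a strict majority of replicas report the same result, so a
# single broken machine cannot change the outcome on its own.

from collections import Counter

def agreed_result(replica_results):
    """Return the majority result, or None if no strict majority exists."""
    counts = Counter(replica_results)
    result, votes = counts.most_common(1)[0]
    return result if votes > len(replica_results) // 2 else None

# three witnesses, one of them broken or compromised:
agreed_result(["balance=42", "balance=42", "balance=999"])  # -> "balance=42"
```

With three witnesses this tolerates one broken machine; the broken machine still exists, but it can no longer break the system's effective state.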

Have to think about this (including the bits I cut out)...
what "rights" are you actually modeling in Askemos?

The problem I see with forbidding the user from sending his financial data
to a music forum is that the software wouldn't know what it is doing.
If the user chooses to torture himself, even we can't help.

> OK, forget about this detail.  It comes up rather late in the whole 
> reasoning, when we start to look into how public knowledge and such 
> cultural heritage are to be treated.  After all, if we allow "ownership" 
> of information, we still want/need some knowledge to be public.  In fact 
> we need such knowledge to be able to communicate at all.

Oink II?

> >> >the trust levels serve the purpose to recreate a facebook-like user
> >> >experience. if you want to use psyc in a more high security fashion
> >> >you can use it differently.
> >> 
> >> That's precisely where I don't buy into the idea that this can
> >> be done using a single number.
> >
> >Facebook does it with a boolean.
> >Or so.
> >
> >No?
> 
> Probably yes.  I'll have to take your word for it.  ;-)
> 
> I assume facebook has a great solution there.  -  Could you explain to 
> stupid me which problem it happens to solve?  ;-)

I'm saying that our goal is to provide people with reasons to get their
assets off of Faceboogle. I don't believe sharing your party pictures with
your employer is the #1 usage problem we have to deal with - users are
slowly learning to be aware of that, and it is actually a technical flaw of
Facebook that it cannot separate the employers from the party people.
With the PSYC channel logic, separation in secushare is easy.

The problem with Faceboogle is how everything is accessible for Mallory -
and that is currently what we are trying to solve. We can look into
perfect trust modeling, too - but given the current threat it's an
academic research topic.


-- [email protected]
   https://lists.tgbit.net/mailman/listinfo.cgi/secu-share
