In the Personal Data Working Group of Japan's Ministry of Economy, Trade
and Industry, of which I was a member last year, we discussed this issue
at length. We saw problems with the explicit consent model: it just
trains users to click "yes", turning an internet dog into a Pavlov's dog.
This is especially problematic given the common attack of burying an
important or sensitive clause in 30 pages of benign text to get the user
to click "yes". In this respect, contextual consent is much better,
requiring explicit consent only for out-of-context data collection and
use. We even went on to discuss "banning" explicit consent when it is
"clear" from the context, i.e., when the data is minimized and the
purpose specification requirement is met.
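As a toy sketch of that rule (the field names and the test for "clear
from the context" are my own illustrative framing, not the working
group's text):

```python
from dataclasses import dataclass

@dataclass
class Collection:
    purpose_specified: bool  # purpose specification requirement met
    data_minimized: bool     # only data needed for that purpose is taken
    in_context: bool         # matches what the user would expect here

def needs_explicit_consent(c: Collection) -> bool:
    """Require an explicit consent prompt only for out-of-context
    collection/use; when the flow is clear from the context, no
    click-through prompt is shown at all."""
    clear_from_context = (c.in_context and c.purpose_specified
                          and c.data_minimized)
    return not clear_from_context

# A routine, minimized, purpose-specified flow needs no prompt:
print(needs_explicit_consent(Collection(True, True, True)))   # False
# An out-of-context use still does:
print(needs_explicit_consent(Collection(True, True, False)))  # True
```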

Cheers,

Nat


2013/12/19 Joseph Lorenzo Hall <[email protected]>

> On 12/18/13 10:15 AM, Nat Sakimura wrote:
> > Indeed the context matters especially as the consent is only given
> > in a context.
>
> There are a lot of contexts where consent is problematic to obtain,
> where people simply click right through informed consent prompts,
> and/or where obtaining consent is directly against the public interest
> (e.g., public health monitoring of disease would not work very well if
> folks could opt out of such data sharing).  I say this because this is
> a big difference between the US and EU views on privacy regulation,
> with the EU favoring explicit, informed consent pretty heavily. I
> think the US view is less coherent, but would probably be
> characterized as "consent or opt-in is required for especially
> sensitive contexts, demographics, and data types".
>
> Nissenbaum's book talks at length about how consent is increasingly
> outdated and ineffective and shifting the obligation to "respect
> context" on the data management side has many benefits, but is quite
> hard to police/enforce.
>
> best, Joe
>
> > Also actors are very important since even "identification" has to
> > do with the observer and the domain. (Otherwise, we would not have
> > a notion such as "partially anonymous, partially unlinkable".)
> > Also, issues around generated/inferred attributes are important.
> > Acquired attributes + auxiliary knowledge may generate additional
> > attributes. This is often captured as "use" or "acquisition" and
> > left implicit, but is worth noting.
> >
> > Nat
> >
> >
> >
> > 2013/12/19 Joseph Lorenzo Hall <[email protected]>
> >
> >
> >
> > On 12/18/13 8:17 AM, S Moonesamy wrote:
> >>>
> >>> I suppose, to avoid confusion, it probably is better to use
> >>> the definition portion of it instead of the defined word in
> >>> the usual conversation.
> >
> >> There has been some discussion on other IETF mailing lists about
> >> the definition of the word "privacy".  Warren and Brandeis are
> >> often cited in a U.S. context.  The "right of personal immunity"
> >> is broader than privacy.
> >
> >> Within an IETF context it might be a problem if the "right to be
> >> let alone" is used.  In my opinion a right is guaranteed by law
> >> and that doesn't fit in with what the IETF does.
> >
> > Many of us from academia (in my case, having recently jumped ship
> > for civil society) that study privacy are more persuaded by Helen
> > Nissenbaum's notion of privacy as "contextual integrity". Here's
> > the skinny in shorter-than-book-length form:
> >
> > "I give an account of privacy in terms of expected flows of
> > personal information, modeled with the construct of
> > context-relative informational norms. The key parameters of
> > informational norms are actors (subject, sender, recipient),
> > attributes (types of information), and transmission principles
> > (constraints under which information flows). Generally, when the
> > flow of information adheres to entrenched norms, all is well;
> > violations of these norms, however, often result in protest and
> > complaint. In a health care context, for example, patients expect
> > their physicians to keep personal medical information confidential,
> > yet they accept that it might be shared with specialists as needed.
> > Patients’ expectations would be breached and they would likely be
> > shocked and dismayed if they learned that their physicians had sold
> > the information to a marketing company. In this event, we would say
> > that informational norms for the health care context had been
> > violated." [1]
> >
> > Much of the scholarship these days in privacy thinking is
> > increasingly based on this kind of contextual definition of privacy
> > (and in the U.S., at least, the Obama administration embraced this
> > in a recasting of fair information principles in their Consumer
> > Privacy Bill of Rights).
> >
> > At CDT, we argue that "abuse" or "harm" is an anemic framing, and
> > that there are important privacy interests implicated after
> > information has been fixed and collected but before any use has
> > been made. See Brookman and Hans [2], if you're interested in
> > reading more.
> >
> > [1]:
> > http://www.amacad.org/publications/daedalus/11_fall_nissenbaum.pdf
> > [2]:
> >
> http://www.futureofprivacy.org/wp-content/uploads/Brookman-Why-Collection-Matters.pdf
> >
> >
> >
> >
>
> --
> Joseph Lorenzo Hall
> Chief Technologist
> Center for Democracy & Technology
> 1634 I ST NW STE 1100
> Washington DC 20006-4011
> (p) 202-407-8825
> (f) 202-637-0968
> [email protected]
> PGP: https://josephhall.org/gpg-key
> fingerprint: 3CA2 8D7B 9F6D DBD3 4B10  1607 5F86 6987 40A9 A871


-- 
Nat Sakimura (=nat)
Chairman, OpenID Foundation
http://nat.sakimura.org/
@_nat_en
_______________________________________________
ietf-privacy mailing list
[email protected]
https://www.ietf.org/mailman/listinfo/ietf-privacy
