Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-21 Thread Noein

Some thoughts, not aiming at anybody in particular.

The pressure from Fox News, the founders' childish jealousies, the empty
FBI threats, the patriarch complex of Mr. Wales: even if they're real,
they should not be given inflated importance. Our personal tastes about
which images we like and which we don't should carry little weight
compared to what is at stake.
Wikipedia and its sister projects are making history. They're forging a
[[Masterpiece of the Oral and Intangible Heritage of Humanity]] that
will last for centuries. For the first time in its existence, humanity
has a tool to break free from ignorance and manipulation.
These projects are giving freedom of choice to every human (though,
granted, it may take centuries).

We should remember the big picture from time to time.





Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-19 Thread wiki-list
Samuel Klein wrote:
 To Robert's point below,
 
 I would appreciate a serious discussion on Commons, grounded in this
 sort of precedent, about what a "special concern" and "stronger
 justification for inclusion" might look like.  An OTRS-based model
 release policy?  How does one prove that one really is the
 photographer / the person in a photograph?
 
 There was the start of a discussion about this here, but I haven't
 seen further discussion recently:
 http://commons.wikimedia.org/wiki/Commons_talk:Sexual_content#Consent_clarification
 

Do you not avoid the problem by simply not accepting photographs from 
unapproved sources? The fact that someone genuinely could upload a 
photograph of themselves and/or their partner engaged in some sexual 
activity is no reason to accept such images.

Flickr deletes accounts all the time for revenge postings, where private 
photos of ex-partners are uploaded to Flickr and posted into the adult 
groups, sometimes with contact details.

A problem with images on Wikimedia is that they have a free license, 
which gives them a life outside of Wikimedia. I'm reminded of the 
14-year-old who had a self-portrait used as the artwork for a porn DVD, 
the distributor saying that it was found on a PD image site.
http://www.ephotozine.com/article/Flickr-user-Lara-Jade-has-images-stolen-5442

Also, last week in the UK, the Press Complaints Commission said that a 
magazine did not intrude into a young woman's privacy when it published 
photos that she had uploaded to the social networking site Bebo when she 
was 15, because the images had already been widely circulated online.
http://www.theregister.co.uk/2010/05/13/bebo_loaded/

So one may find that once an image is widely circulated on the internet, 
the person featured loses any right to privacy over such images.

Surely one could source representative images from the porn industry.



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-19 Thread David Goodman
I recall personally deleting and asking for oversight of an
identifiable picture of a clearly underage person in a similar
context, where the images were the basis of an internet meme. The
picture was oversighted; the article on the meme itself was almost
unanimously deleted from WP.

The courts may be fools. We are not. (at least not as often).


David Goodman, Ph.D, M.L.S.
http://en.wikipedia.org/wiki/User_talk:DGG



On Wed, May 19, 2010 at 12:39 PM,  wiki-l...@phizz.demon.co.uk wrote:
 Samuel Klein wrote:
 To Robert's point below,

 I would appreciate a serious discussion on Commons, grounded in this
 sort of precedent, about what a "special concern" and "stronger
 justification for inclusion" might look like.  An OTRS-based model
 release policy?  How does one prove that one really is the
 photographer / the person in a photograph?

 There was the start of a discussion about this here, but I haven't
 seen further discussion recently:
 http://commons.wikimedia.org/wiki/Commons_talk:Sexual_content#Consent_clarification


 Do you not avoid the problem by simply not accepting photographs from
 unapproved sources? The fact that someone genuinely could upload a
 photograph of themselves and/or their partner engaged in some sexual
 activity is no reason to accept such images.

 Flickr deletes accounts all the time for revenge postings, where private
 photos of ex-partners are uploaded to Flickr and posted into the adult
 groups, sometimes with contact details.

 A problem with images on Wikimedia is that they have a free license,
 which gives them a life outside of Wikimedia. I'm reminded of the
 14-year-old who had a self-portrait used as the artwork for a porn DVD,
 the distributor saying that it was found on a PD image site.
 http://www.ephotozine.com/article/Flickr-user-Lara-Jade-has-images-stolen-5442

 Also, last week in the UK, the Press Complaints Commission said that a
 magazine did not intrude into a young woman's privacy when it published
 photos that she had uploaded to the social networking site Bebo when she
 was 15, because the images had already been widely circulated online.
 http://www.theregister.co.uk/2010/05/13/bebo_loaded/

 So one may find that once an image is widely circulated on the internet,
 the person featured loses any right to privacy over such images.

 Surely one could source representative images from the porn industry.





Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-18 Thread Samuel Klein
To Robert's point below,

I would appreciate a serious discussion on Commons, grounded in this
sort of precedent, about what a "special concern" and "stronger
justification for inclusion" might look like.  An OTRS-based model
release policy?  How does one prove that one really is the
photographer / the person in a photograph?

There was the start of a discussion about this here, but I haven't
seen further discussion recently:
http://commons.wikimedia.org/wiki/Commons_talk:Sexual_content#Consent_clarification

Sam.
--
user:sj
+1 617 529 4266



On Wed, May 12, 2010 at 2:47 PM, George Herbert
george.herb...@gmail.com wrote:
 On Tue, May 11, 2010 at 3:51 PM, Robert Rohde raro...@gmail.com wrote:
[...]
 However, I also see the issue from another frame that is not part of
 Tim's spectrum.  Sexual photographs, especially those of easily
 recognized people, have the potential to exploit or embarrass the
 people in them.  I place a high value on not doing harm to the models
 pictured.

 This is essentially a consent issue.  If the model is a well-known
 porn star and wants to be shown nude to the world, then there is no
 problem.  However, many of the sexual images we receive depict
 non-notable individuals who appear to be engaged in private conduct.
 If the uploader is being honest and responsible, then this may be fine
 too.  However, if the uploader is malicious, then the subject may have
 no idea how their image is being used.  Even if the person pictured
 consented to having the photographs made, they may still be horrified
 at the idea that their image would be used in an encyclopedia seen by
 millions.

 At present, our controls regarding the publication of a person's image
 are often very lax.  With regard to self-made images, we often take
 a lot of things on faith, and personally I see that as irresponsible.

 In a sense, this way of looking at things is very similar to the issue
 of biographies of living persons.  For a long time we treated those
 articles more or less the same as all other articles.  However,
 eventually we came to accept that the potential to do harm to living
 persons was a special concern which warranted special safeguards,
 especially in the case of negative or private information.

 I would say that publishing photos of living persons in potentially
 embarrassing or exploitative situations should be another area where
 we should show special concern for the potential harm, and require a
 stronger justification for inclusion and use than typical content.
 (Sexual images are an easy example of a place where harm might be
 done, but I'd say using identifiable photos of non-notable people
 should be done cautiously in any situation where there is potential
 for embarrassment or other harm.)

 Obviously, from this point of view, I consider recent photos of living
 people to be rather different from illustrations or artwork, which
 would require no special treatment.


 Much of the discussion has focused on the potential to harm (or at
 least offend) the viewer of an image, but I think we should not forget
 the potential to harm the people in the images.


 I would like to second this particular point, though I am largely
 inclusionist in the larger debate here.

 I handled an OTRS case in which exactly this happened; an ex-boyfriend
 stole a camera on which a female college student had taken private nude
 pictures and posted them to Flickr, and someone then copied them to
 Wikipedia to illustrate one of our sex-related articles (for which the
 specific picture was reasonably educational, on-topic, and appropriate).

 The student was extremely upset and angry about each of these abuses
 of her privacy and property.

 This is probably the exception rather than the rule, but it is worth
 keeping in mind.


 --
 -george william herbert
 george.herb...@gmail.com





Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-14 Thread Jussi-Ville Heiskanen
Elias Gabriel Amaral da Silva wrote:
 2010/5/13 Jussi-Ville Heiskanen cimonav...@gmail.com:
   
 Samuel Klein wrote:
 
  I agree strongly with this.  You are right to point out the connection
  to improving BLP policies -- we should be much more careful about
  confirming model rights for people in any potentially exploitative or
  embarrassing photos.

 Such ideas have been around for a long time.  What are the arguments
 against implementing stronger requirements for images of people?

   
 Not an argument as such, but I would imagine that
 with regard to amateur photography of all sorts, in
 the long term the main effect would be to educate
 them in the correct practices of model rights. After
 all I would expect that amateur photographers would
 not really have great difficulty in obtaining model
 rights, once they know that is a requirement.
 

 Why just amateur photos? Professionals should respect model rights as
 much as amateurs do.

   
I think my unwritten assumption was that amateurs
might not in all cases know what is required, but
professionals do, as a default; the professionals who
don't do not deserve the name.


   
 None of these is an argument against, as such, just
 pointing out some of the ramifications that might
 follow. My guess is that after a lot of existing images
 were removed, the ratio of new images uploaded would
 in fact be skewed *in* *favor* of amateur images,
 rather than *against*. I could be wrong of course.
 

 Interesting. Why do you think so?

   

I think deep down it comes down to the profit
motive. Enthusiasts are rarely governed by it,
so they give freely. I would imagine that if some
amateur photographer, in the process of donating
their images, learned more about model releases
and all that jazz, they would in fact be grateful,
and more motivated than ever to pay the learning
experience forward. It has certainly been my
personal experience writing on Wikipedia that the
more I learn, the more I feel an obligation to pay
forward what I have received.


Yours,

Jussi-Ville Heiskanen




Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-13 Thread Elias Gabriel Amaral da Silva
2010/5/13 Delirium delir...@hackish.org:
 On 05/11/2010 09:45 AM, Aryeh Gregor wrote:
 The obvious solution is not to display images by default that a large
 number of viewers would prefer not to view.  Instead, provide links,
 or maybe have them blurred out and allow a click to unblur them.  You
 don't hide any information from people who actually want it (you
 require an extra click at most), and you don't force people to view
 images that they don't want to view.  This allows as many people as
 possible to get what they want: people who want to see the images can
 see them, and those who don't can choose not to.  The status quo
 forces people to view the images whether or not they want to.  And a
 lot of people don't want to look at naked people without warning, for
 whatever reason.


 I don't actually mind this proposal, and would like it myself for a lot
 of pages. But I'm not sure naked people are actually at the top of the
 list (perhaps someone should try to determine it empirically via some
 sort of research on our readers?). If I personally were to list two
 kinds of images I would want hidden by default, it'd be: 1. spiders; and
 2. gory medical conditions. Do I get that option? Do I get that option
 if more than x% of readers agree?

(At first read, I took your response as a statement that you found
this proposal amusing.)

I might risk restating some points I already mentioned in other emails
(and, worse, being a complete stranger, not 'fitting' into the
discussions here), but I'm not disturbed by sexual content at all. I do
find torture images disturbing, however. In fact, I remember not
sleeping well after reading some random Wikipedia articles on the
subject. To paraphrase Sharon Stone, blocking/blurring porn by default
and failing to do this with torture images is an indecency.

[ Actually, I would find it OK if a parent didn't want his or her small
children viewing this (not just images, but text included - the
richness of some Wikipedians' prose is frightening). But I find
misguided any attempt by WMF sites to cooperate with any kind of
parental control. That is, I think that building a place where
everyone has access to the sum of human knowledge does not include or
permit provisions for parental control. ]

Other than that, I find that browsing the web at random can be
embarrassing if there are strangers in the room - as at a university
lab or cybercafe (it's not a Wikipedia-specific problem, actually).
It's only a matter of chance whether the next clicked link will have
this kind of stuff or not. (As random examples, many sites have porn
ads, many forum links contain embedded pictures, etc. Worse are IM
friend links, with a lot of 4chan-like stuff: those memes will often
spread porn to blogs and forums that wouldn't otherwise have it.) So I
might disable image loading with a browser feature - I wouldn't trust
the site to do this filtering. This makes loading faster, and if the
content I want to see requires image loading, I can just click a
button. Some browsers support this by default (such as Opera), and
others through an extension. (I don't usually do this, by the way,
because I'm generally OK with it.)

Anyway, I am not proposing here to just include torture depictions in
the (maybe long) list of images censored for casual, unregistered
readers. I disagree with any kind of opt-out censoring, for any purpose
or pretext, even if evading the censor only requires a sign-up.
(Also, I think any poll for selecting the targets of the block would
be biased towards supposing the majority of the voters supports some
kind of censoring - even if 'nothing at all' is an option.)

But, to think about this a bit: it's a lot harder to block torture
images than explicit porn, because of the political component here.
For example, blocking recent explicit images of American torture in
Iraq while failing to block detailed depictions of historical torture
devices is unreasonable, since the shocking component - for me, at
least - is often centered on it happening to some human being. The
issue here would be where to draw the line, and the exact place might
be interpreted as a form of political pushing. Wouldn't it be awesome
if some pro-America editor managed to "clean" some articles for the
casual reader? :)

[ You might replace "pro-America" with "anti-pornography" as well -
that's how I would see it ]

-- 
Elias Gabriel Amaral da Silva tolkiend...@gmail.com



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-13 Thread Jussi-Ville Heiskanen
Delirium wrote:

 I don't actually mind this proposal, and would like it myself for a lot 
 of pages. But I'm not sure naked people are actually at the top of the 
 list (perhaps someone should try to determine it empirically via some 
 sort of research on our readers?). If I personally were to list two 
 kinds of images I would want hidden by default, it'd be: 1. spiders; and 
 2. gory medical conditions. Do I get that option? Do I get that option 
 if more than x% of readers agree?


   
Ooh! Can I play? Besides those two: 3. people with perfect
teeth smiling, 4. charred bodies, 5. decomposing bodies,
6. women with Double-D breasts -- or above, whether fully
clothed or not, 7. any women with silicone implants,
8. tele-evangelists, 9. big hair, 10. Disco, including
people dressed Disco-style, 11. Ku Klux Klansmen,
12. images of any foods made from liver, 13. sauerkraut,
14. suet, 15. lard, 16. skinned animals that haven't been
fully butchered yet (including humans), ...


Yours,

Jussi-Ville Heiskanen





Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-13 Thread Jussi-Ville Heiskanen
Samuel Klein wrote:

 I agree strongly with this.  You are right to point out the connection
 to improving BLP policies -- we should be much more careful about
 confirming model rights for people in any potentially exploitative or
 embarrassing photos.

 Such ideas have been around for a long time.  What are the arguments
 against implementing stronger requirements for images of people?
   

Not an argument as such, but I would imagine that
with regard to amateur photography of all sorts, in
the long term the main effect would be to educate
them in the correct practices of model rights. After
all I would expect that amateur photographers would
not really have great difficulty in obtaining model
rights, once they know that is a requirement.

It might however have a chilling effect on those people
sharing images that don't really require model releases,
such as photographs shot in public spaces, especially
where the person in the frame can't even be recognized.

Another question is, if such a stance on model rights
were taken, would it be reasonable to just retroactively
apply it to images already on the site, or should there
be a reasonable attempt to inform the uploaders, to
let them secure a model release and add it to the
media information? I think something like that was
done when we went to town on images that didn't
have a specified rationale of use.

None of these is an argument against, as such, just
pointing out some of the ramifications that might
follow. My guess is that after a lot of existing images
were removed, the ratio of new images uploaded would
in fact be skewed *in* *favor* of amateur images,
rather than *against*. I could be wrong of course.

It still might be worth doing just for its own sake.


Yours,

Jussi-Ville Heiskanen




Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-13 Thread Elias Gabriel Amaral da Silva
2010/5/13 Jussi-Ville Heiskanen cimonav...@gmail.com:
 Samuel Klein wrote:

 I agree strongly with this.  You are right to point out the connection
 to improving BLP policies -- we should be much more careful about
 confirming model rights for people in any potentially exploitative or
 embarrassing photos.

 Such ideas have been around for a long time.  What are the arguments
 against implementing stronger requirements for images of people?


 Not an argument as such, but I would imagine that
 with regard to amateur photography of all sorts, in
 the long term the main effect would be to educate
 them in the correct practices of model rights. After
 all I would expect that amateur photographers would
 not really have great difficulty in obtaining model
 rights, once they know that is a requirement.

Why just amateur photos? Professionals should respect model rights as
much as amateurs do.

 None of these is an argument against, as such, just
 pointing out some of the ramifications that might
 follow. My guess is that after a lot of existing images
 were removed, the ratio of new images uploaded would
 in fact be skewed *in* *favor* of amateur images,
 rather than *against*. I could be wrong of course.

Interesting. Why do you think so?

-- 
Elias Gabriel Amaral da Silva tolkiend...@gmail.com



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-13 Thread Aryeh Gregor
On Tue, May 11, 2010 at 7:53 PM, Anthony wikim...@inbox.org wrote:
 On Tue, May 11, 2010 at 12:45 PM, Aryeh Gregor 
 simetrical+wikil...@gmail.com wrote:
 [[Daniel Pearl]] does not contain an image
 of him being beheaded (although it's what he's famous for), and
 [[Goatse.cx]] does not contain an image of its subject matter.  Why?


 Primarily because of copyright issues, at least with regard to the latter, I
 believe.  I used to joke about "what's next, a photo on Goatse.cx?" when
 arguing with the "when you look up X you expect a photo of X" crowd, but
 then, for a while, it actually came true.

This is a standard fair-use case: it's a notable image, and our
informative article about it is not complete without a copy of the
image.  It's the same reason we can put an image on, I don't know,
http://en.wikipedia.org/wiki/Dora_Maar_au_Chat.  Except we don't,
because it's revolting.

On Wed, May 12, 2010 at 3:53 AM, Ray Saintonge sainto...@telus.net wrote:
 Each such issue will have its own spectrum of supporters and
 detractors.  It should not be our role to decide for them; we can only
 make it easier for them to make decisions consistent with their own beliefs.

Okay, fine.  Currently, people have no ability to decide whether they
want to view a particular image.  They get to see it with no warning
whether they like it or not.  I propose that if we think it's
reasonably likely they wouldn't actually want to view it, they should
be asked first, so they can decide not to see it if they prefer not to
see it.  Do you disagree?

 Not necessarily. Supermarket tabloids still sell well. Sometimes it's
 the advertisers, and not the readers who determine this.

Because tabloids aim to provide entertainment more than information.
If your goal is entertainment, then yes, more titillating content will
scare away some readers; but it will attract others, because that sort
of content does entertain people.  So if you're a tabloid, the balance
tilts more heavily toward prurient content (as well as exaggerated
content, content based on shoddy evidence or rumor, . . .).  If your
goal is to provide information, on the other hand, you don't care so
much if people are more entertained, and the balance leans more
heavily in favor of the socially conservative.  We should take our
cues from reputable publications, not tabloids.

(That said, I'm pretty sure tabloids in America don't routinely
contain even topless pictures, let alone full-frontal nudity as at
[[Human]].)

 Each project will be left to determine its own standards.  When dealing
 with Commons, "relevant language" is a meaningless term.

I'm not talking about Commons.  I'm talking about the Wikipedias,
mainly, particularly the English Wikipedia.

On Thu, May 13, 2010 at 12:17 AM, Delirium delir...@hackish.org wrote:
 I don't actually mind this proposal, and would like it myself for a lot
 of pages. But I'm not sure naked people are actually at the top of the
 list (perhaps someone should try to determine it empirically via some
 sort of research on our readers?). If I personally were to list two
 kinds of images I would want hidden by default, it'd be: 1. spiders; and
 2. gory medical conditions. Do I get that option? Do I get that option
 if more than x% of readers agree?

We have a perfectly good categorization system already, so any
implementation would likely permit you to blacklist any category you
like.  Of course, this depends on someone maintaining an accurate
[[Category:Spiders]] . . . this scheme doesn't work well if people
remove redundant categories from images.  It's also pretty ugly if
every image has fifty categories, on the other hand.  I'd hope that if
people widely blacklist particular categories (especially if some
projects do so by default), that would create the right incentives for
people to categorize things in a way that's useful for the system.
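
As a rough sketch of what such a reader-side blacklist could look like -
a hypothetical user script, not an existing Wikimedia feature; the
prop=categories query is a standard MediaWiki API module, but the exact
response handling below is an assumption:

// Hide blacklisted images by default; one extra click reveals them.
const blacklist = new Set(["Spiders", "Gory medical conditions"]);

// Look up an image's Commons categories via the MediaWiki API
// (action=query&prop=categories); the response shape is handled loosely.
async function getCategories(fileTitle: string): Promise<string[]> {
  const url =
    "https://commons.wikimedia.org/w/api.php?action=query&prop=categories" +
    `&format=json&origin=*&titles=${encodeURIComponent(fileTitle)}`;
  const data = await (await fetch(url)).json();
  const pages = data?.query?.pages ?? {};
  return Object.values<any>(pages).flatMap((p: any) =>
    (p.categories ?? []).map((c: any) =>
      String(c.title).replace(/^Category:/, "")
    )
  );
}

async function filterImages(): Promise<void> {
  const images = Array.from(
    document.querySelectorAll<HTMLImageElement>("img")
  );
  for (const img of images) {
    // Thumbnails link to their file page; extract the File: title.
    const title = img.closest("a")?.getAttribute("href")?.match(/File:.+$/)?.[0];
    if (!title) continue;
    const categories = await getCategories(decodeURIComponent(title));
    if (categories.some((c) => blacklist.has(c))) {
      img.style.filter = "blur(20px)";         // hidden by default
      img.title = "Click to show this image";  // revealing costs one click
      img.addEventListener("click", () => (img.style.filter = ""), { once: true });
    }
  }
}

filterImages();

The point of the sketch is only that the blacklist lives with the reader
(or the project's defaults), not with the image.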



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-13 Thread Pedro Sanchez
On Thu, May 13, 2010 at 2:50 PM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:

 On Tue, May 11, 2010 at 7:53 PM, Anthony wikim...@inbox.org wrote:
  On Tue, May 11, 2010 at 12:45 PM, Aryeh Gregor
  simetrical+wikil...@gmail.com wrote:
  [[Daniel Pearl]] does not contain an image
  of him being beheaded (although it's what he's famous for), and
  [[Goatse.cx]] does not contain an image of its subject matter.  Why?
 
 
  Primarily because of copyright issues, at least with regard to the
  latter, I believe.  I used to joke about "what's next, a photo on
  Goatse.cx?" when arguing with the "when you look up X you expect a
  photo of X" crowd, but then, for a while, it actually came true.

 This is a standard fair-use case: it's a notable image, and our
 informative article about it is not complete without a copy of the
 image.  It's the same reason we can put an image on, I don't know,
 http://en.wikipedia.org/wiki/Dora_Maar_au_Chat.  Except we don't,
 because it's revolting.


It's at moments like this that I'm really thankful that the Spanish
Wikipedia has committed to actually being a free-content encyclopedia and
only using Commons material (which, as we know, doesn't allow fair use).

I can only hope all this shaking won't change that.


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-13 Thread Andreas Kolbe
--- On Tue, 11/5/10, Aryeh Gregor simetrical+wikil...@gmail.com wrote:

 The obvious solution is not to display images by default that a large
 number of viewers would prefer not to view.  Instead, provide links,
 or maybe have them blurred out and allow a click to unblur them.  You
 don't hide any information from people who actually want it (you
 require an extra click at most), and you don't force people to view
 images that they don't want to view.  This allows as many people as
 possible to get what they want: people who want to see the images can
 see them, and those who don't can choose not to.  The status quo
 forces people to view the images whether or not they want to.  And a
 lot of people don't want to look at naked people without warning, for
 whatever reason.


A similar method is used by the Chinese Wikipedia article on masturbation.

It hides its gallery of images in an expandable box:

http://zh.wikipedia.org/wiki/%E5%B0%84%E7%B2%BE
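
A minimal sketch of that expandable-box behaviour (the Chinese Wikipedia
does this with wikitext collapsible markup; the class name below is
invented for illustration):

// Collapse a gallery by default and toggle it with a single click.
function makeCollapsible(gallery: HTMLElement, label = "Show images"): void {
  const button = document.createElement("button");
  button.textContent = label;
  gallery.style.display = "none"; // hidden until the reader asks
  button.addEventListener("click", () => {
    const hidden = gallery.style.display === "none";
    gallery.style.display = hidden ? "" : "none";
    button.textContent = hidden ? "Hide images" : label;
  });
  gallery.parentElement?.insertBefore(button, gallery);
}

// Apply it to every element marked with a (hypothetical) class.
document
  .querySelectorAll<HTMLElement>(".collapsible-gallery")
  .forEach((gallery) => makeCollapsible(gallery));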

Andreas


  



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-12 Thread Ray Saintonge
Tim Starling wrote:
 Solution 1: Exercise editorial control to remove particularly
 offensive images from the site.

 Standard answer 1: Some people may wish to see that content, it would
 be wrong for us to stop them.

 Solution 2: Tag images with an audience-specific rating system, like
 movie classifications. Then enable client-side filtering.

 Standard answer 2: This could potentially enable censorship which is
 wrong as per answer 1. Also, we cannot determine what set of content
 is right for a given audience. By encouraging people to filter, say,
 R-rated content, we risk inadvertently withholding information that
 they would have consented to see, had they been fully informed about
 its nature.

 Solution 3: Tag images with objective descriptors, chosen to be useful
 for the purposes of determining offensive character by the reader. The
 reader may then choose a policy for what kinds of images they wish to
 filter.

 Standard answer 3: This also enables censorship, which is wrong as per
 answer 1. Also, tagging images with morally-relevant descriptors
 involves a value judgement by us, when we determine which descriptors
 to use. It is wrong for us to impose our moral values on readers in
 this way.

 The fundamental principle of libertarianism is that the individual
 should have freedom of thought and action, and that it is wrong for
 some other party to infringe that freedom. I've attempted to structure
 the standard answers above in a way that shows how they are connected
 to this principle.

   
Those who rely on standard answers don't really exercise freedom of 
thought, only an absence of thought.

Ec



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-12 Thread Ray Saintonge
Aryeh Gregor wrote:
 On Tue, May 11, 2010 at 1:13 AM, Tim Starling tstarl...@wikimedia.org wrote:
   
 On foundation-l we are divided between moderates and libertarians. The
 libertarians are more strident in their views, so the debate can seem
 one-sided at times, but there is a substantial moderate contingent,
 and I count myself among them. Conservatives have no direct voice
 here, but they are conceptually represented by Fox News and its audience.
 
 There are some people here who exactly fit your description of
 conservatives, such as me.  But we're forced by the overwhelming
 libertarian majority to play the part of moderates as a compromise.
 Regardless, more people than just religious conservatives would prefer
 not to see naked people without warning.  At the very least, few
 people would be happy in unexpected nudity showing up while they're
 browsing at work, with children watching them, etc. -- it's
 embarrassing.  You're probably correct that this is *historically* due
 to religious conservatism, but the preference remains even for
 completely irreligious people.
   

This is an important point, and I say this as one who considers himself 
to be somewhere on the irreverently liberal (not libertarian) end of the 
spectrum: one who considers some measure of these illustrations 
acceptable, but who regards an excess of them as tiresome, especially 
when they start to appear in unexpected circumstances.  Perhaps a 
parallel might be drawn with a deeply religious conservative 
beset by proselytizers intent on converting him to the beliefs he 
already has, with arguments far below the quality of his own theological 
experience.

 The standard objection here is But then we have to hide Muhammad
 images too!  This is, of course, a non sequitur.  A large percentage
 of English speakers prefer not to see nude images without warning, but
 only a tiny percentage prefer not to see pictures of Muhammad, so the
 English Wikipedia should cater to the former group but not the latter.
  The Arabic Wikipedia might also cater to the latter group -- indeed,
 I see no pictures of Muhammad at http://ar.wikipedia.org/wiki/محمد.
 But we only need to look at large groups of viewers, not small
 minorities.  If the minority is small enough, their benefit from not
 having to see the images is outweighed by the majority's benefit in
 the aesthetic appeal of the images.
   

Each such issue will have its own spectrum of supporters and 
detractors.  It should not be our role to decide for them; we can only 
make it easier for them to make decisions consistent with their own beliefs.

 It's really very easy to determine where to draw the line.  There are
 a multitude of English-language informative publications
 (encyclopedias, newspapers, news shows, etc.) published by many
 independent companies, and the major ones all follow quite similar
 standards on what sorts of images they publish.  Since news reporting,
 for instance, is very competitive, we can surmise that they avoid
 showing images only because their viewers don't want to see them.  Or
 if it's because of regulations, those are instituted democratically,
 so a similar conclusion follows.
   

Not necessarily. Supermarket tabloids still sell well. Sometimes it's 
the advertisers, and not the readers who determine this.

 The solution is very simple.  Keep all the images if you like.
 Determine, by policy, what sorts of images should not be shown by
 default, based on the policies of major publications in the relevant
 language.  If an image is informative but falls afoul of the policy,
 then include it as a link, or a blurred-out version, or something like
 that.  This way people can see the images only if they actually want
 to see them, and not be forced to see them regardless.  

Each project will be left to determine its own standards.  When dealing 
with Commons, "relevant language" is a meaningless term.

 It would
 hardly be any great burden when compared to the innumerable byzantine
 policies that already encumber everything on Wikipedia.
   

That speaks to keeping things simple and avoiding the compulsion to 
overexplain everything.  Excessive explanation tends to make laws and 
policies more obscure.

 The reason that this isn't the status quo has nothing to do with
 libertarianism.  As I argue above, the properly libertarian solution
 would be to give people a choice of which images they view if there's
 doubt whether they'd like to view them.  Rather, quite simply, we have
 sexual images in articles without warning because Wikipedia editors
 tend to be sexually liberal as a matter of demographics, and have a
 lot more tolerance for nudity than the average person.  With no
 effective means of gathering input from non-editors, they decide on an
 image policy that's much more liberal than what their viewers would
 actually like.  This is a gratuitous disservice to Wikipedia's
 viewers, and should be rectified.
I have no problem with a liberal [...]

Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-12 Thread Andreas Kolbe
--- On Tue, 11/5/10, wjhon...@aol.com wrote:

 If there is enough of a perceived need for content filtering, someone
 will fill that void.  That someone does not need to be us.  Google does
 this job with their image browser already without the need for any
 providers to actively tag any images.  How do they do that?  I have no
 idea, but they do it.  I would suggest a child-safe approach to Commons
 is simply to use the Google image browser with a moderate filter
 setting.  Try it, it works.

It doesn't work if you enter Commons through the main page, or an image
page, and then search through its categories. The best-thumbed pages of
library books are usually the ones that have nude images; it's human
nature. Commons is no different if you look at the top-1000.

With respect to minors, the libertarian position that anyone should be
able to see whatever they want to see is simply a fringe position. Every
country legally defines some things as harmful to minors* and expects
providers to behave in a way that prevents that harm. Arguing about
whether the harm is real is an idle debate that's of no interest to
teachers, say, who are legally bound by these standards and can
experience professional repercussions if they fail in their duty of care.

 I would suggest that any parent who is allowing their young children,
 as one message put it, to browse without any filtering mechanism, is
 deciding to trust that child, or else does not care if the child
 encounters objectionable material.  The child's browsing activity is
 already open to five million porn site hits as it stands; Commons isn't
 creating that issue.  And Commons cannot solve that issue.  It's the
 parents' responsibility to have the appropriate self-selected
 mechanisms in place.  And I propose that all parents who care already
 *do*.  So this issue is a non-issue.  It doesn't actually exist in any
 concrete example, just in the minds of a few people with spare time.

As I see it, a working filter system for adult content would relieve
teachers and librarians of the headache involved in making Commons or WP
available to minors. Do we have figures on how many schools or libraries
in various countries block access to Wikimedia sites over concerns
related to content harmful to minors? Is this a frequently voiced
concern, or are we making more of it than it is?

The most sensible access control system would be one that can be set up
on a physical computer used by minors. (Linking it to user account data
would not work, as IP users should have normal access.) And if the same
child is allowed to surf the net freely by their parents at home, then
that is perfect. It is the parents' choice, and every parent handles
this differently.

If an outside developer were to create such a filter product, that would
be great too. I just wonder how they would cope with categories and
images being renamed, new categories being created, etc. And does anyone
actually know how Google manages to filter out images in safe search?

Andreas

* See the Miller test for minors reproduced at
http://commons.wikimedia.org/wiki/Commons_talk:Sexual_content#Pornography



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-12 Thread Gregory Maxwell
On Tue, May 11, 2010 at 6:51 PM, Robert Rohde raro...@gmail.com wrote:
[snip]
 However, I also see the issue from another frame that is not part of
 Tim's spectrum.  Sexual photographs, especially those of easily
 recognized people, have the potential to exploit or embarrass the
 people in them.  I place a high value on not doing harm to the models
 pictured.

Sexually explicit photographs are only one of many classes of
photograph which pose the risk of embarrassment.

http://commons.wikimedia.org/wiki/File:Childhood_Obesity.JPG  (the
original was not anonymized, and this image was subject to a lengthy
argument as the photographer was strongly opposed to concealing the
identity of the involuntary model)

Or people who might show up here without their knowledge:
http://commons.wikimedia.org/wiki/Category:People_associated_with_HIV/AIDS

(Not even getting into all the photographs of people performing
activities which are illegal someplace or another; simply being
gay will get you executed in some places, no explicit photographs
required, and using some drugs can get you long sentences in many
others...)

So please don't make it out like there is a unique risk there.

Commons has a policy related to identifiable images:

http://commons.wikimedia.org/wiki/Commons:Photographs_of_identifiable_people

(and why do people on foundation-l keep insisting on discussing these
things without even bothering to link to the existing policy pages?)


I'm all for strengthening it further, but I hope a hysterical
reaction to sexual images isn't abused to make a mess of the policy
and convert it into something less practically enforceable than the
current policy.



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-12 Thread Andreas Kolbe
--- On Tue, 11/5/10, wjhon...@aol.com wrote:
 I would suggest a child-safe approach to
 Commons, is simply to use the Google image browser with a
 moderate filter setting.  Try it, it works.


Actually, it doesn't. For example, if you search for 

masturbation site:commons.wikimedia.org

you get explicit photographs, both with Strict and Moderate safe search. 

Andreas


  



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-12 Thread George Herbert
On Tue, May 11, 2010 at 3:51 PM, Robert Rohde raro...@gmail.com wrote:
[...]
 However, I also see the issue from another frame that is not part of
 Tim's spectrum.  Sexual photographs, especially those of easily
 recognized people, have the potential to exploit or embarrass the
 people in them.  I place a high value on not doing harm to the models
 pictured.

 This is essentially a consent issue.  If the model is a well-known
 porn star and wants to be shown nude to the world, then there is no
 problem.  However, many of the sexual images we receive depict
 non-notable individuals who appear to be engaged in private conduct.
 If the uploader is being honest and responsible, then this may be fine
 too.  However, if the uploader is malicious, then the subject may have
 no idea how their image is being used.  Even if the person pictured
 consented to having the photographs made, they may still be horrified
 at the idea that their image would be used in an encyclopedia seen by
 millions.

 At present, our controls regarding the publication of a person's image
 are often very lax.  With regard to self-made images, we often take
 a lot of things on faith, and personally I see that as irresponsible.

 In a sense, this way of looking at things is very similar to the issue
 of biographies of living persons.  For a long time we treated those
 articles more or less the same as all other articles.  However,
 eventually we came to accept that the potential to do harm to living
 persons was a special concern which warranted special safeguards,
 especially in the case of negative or private information.

 I would say that publishing photos of living persons in potentially
 embarrassing or exploitative situations should be another area where
 we should show special concern for the potential harm, and require a
 stronger justification for inclusion and use than typical content.
 (Sexual images are an easy example of a place where harm might be
 done, but I'd say using identifiable photos of non-notable people
 should be done cautiously in any situation where there is potential
 for embarrassment or other harm.)

 Obviously, from this point of view, I consider recent photos of living
 people to be rather different from illustrations or artwork, which
 would require no special treatment.


 Much of the discussion has focused on the potential to harm (or at
 least offend) the viewer of an image, but I think we should not forget
 the potential to harm the people in the images.


I would like to second this particular point, though I am largely
inclusionist in the larger debate here.

I handled an OTRS case in which exactly this happened; an ex-boyfriend
stole a camera on which a female college student had taken private nude
pictures and posted them to Flickr, and someone then copied them to
Wikipedia to illustrate one of our sex-related articles (for which the
specific picture was reasonably educational, on-topic, and appropriate).

The student was extremely upset and angry about each of these abuses
of her privacy and property.

This is probably the exception rather than the rule, but it is worth
keeping in mind.


-- 
-george william herbert
george.herb...@gmail.com



Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-12 Thread Delirium
On 05/11/2010 09:45 AM, Aryeh Gregor wrote:
 The obvious solution is not to display images by default that a large
 number of viewers would prefer not to view.  Instead, provide links,
 or maybe have them blurred out and allow a click to unblur them.  You
 don't hide any information from people who actually want it (you
 require an extra click at most), and you don't force people to view
 images that they don't want to view.  This allows as many people as
 possible to get what they want: people who want to see the images can
 see them, and those who don't can choose not to.  The status quo
 forces people to view the images whether or not they want to.  And a
 lot of people don't want to look at naked people without warning, for
 whatever reason.


I don't actually mind this proposal, and would like it myself for a lot 
of pages. But I'm not sure naked people are actually at the top of the 
list (perhaps someone should try to determine it empirically via some 
sort of research on our readers?). If I personally were to list two 
kinds of images I would want hidden by default, it'd be: 1. spiders; and 
2. gory medical conditions. Do I get that option? Do I get that option 
if more than x% of readers agree?

-Mark




Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Jussi-Ville Heiskanen
Tim Starling wrote:

 Libertarians want all information to be available to everyone. Some
 say all adults, some say children too should be included. Their
 principles allow for individuals to choose for themselves to avoid
 seeing that which offends them, which leaves the problem of how the
 reader is meant to tell in advance whether a given picture might
 offend them, before they have actually seen it.

 Their ideology does not allow them to consider any solution which
 involves one person making a decision on behalf of another, and all
 the reasonable solutions seem to involve some element of this. So they
 are left with no option but to downplay the impact of seeing offensive
 content.


   

Plainly put, my view is this. An individual should be able
to make this choice. An individual's proxy should be able
to make this choice for them. I may have different ideas
about when, in a natural-justice sense, such proxying of
choice should happen, but in law parents often hold that
role, in some places schools do (in loco parentis), and
there are other special cases.

What I do object to is any solution that facilitates one
person or group of persons making that choice
for another group, without being their proxies.

That is to say, parents or schools making those choices
I find defensible, though not ideal (I do think those
choices often have unintended consequences, including
the retardation of their pupils' education and development
as human beings). However, whatever gloss you put on
it, I am not comfortable with the state or a religious
community doing it, or helping the state or religious
community do it.


Yours,

Jussi-Ville Heiskanen




Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Tim Starling
On 11/05/10 23:06, Anthony wrote:
 I assume here you're talking about choosing what images to allow on the
 websites.  I wouldn't call that making a decision on behalf of another,
 but I assume that's what you're referring to.  If I'm wrong, please correct
 me.

I'm including:

Solution 1: Exercise editorial control to remove particularly
offensive images from the site.

Standard answer 1: Some people may wish to see that content, it would
be wrong for us to stop them.

Solution 2: Tag images with an audience-specific rating system, like
movie classifications. Then enable client-side filtering.

Standard answer 2: This could potentially enable censorship which is
wrong as per answer 1. Also, we cannot determine what set of content
is right for a given audience. By encouraging people to filter, say,
R-rated content, we risk inadvertently withholding information that
they would have consented to see, had they been fully informed about
its nature.

Solution 3: Tag images with objective descriptors, chosen to be useful
for the purposes of determining offensive character by the reader. The
reader may then choose a policy for what kinds of images they wish to
filter.

Standard answer 3: This also enables censorship, which is wrong as per
answer 1. Also, tagging images with morally-relevant descriptors
involves a value judgement by us, when we determine which descriptors
to use. It is wrong for us to impose our moral values on readers in
this way.
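
To make Solution 3 concrete, here is a minimal sketch under stated
assumptions: the descriptor vocabulary and data shapes are invented for
illustration, and no such tagging schema exists on the projects. Note
that the site never decides for the reader; it only matches an image's
tags against the policy the reader configured for themselves.

// Objective descriptor tags chosen by editors; filter policy chosen by readers.
type Descriptor =
  | "nudity"
  | "sexual-act"
  | "graphic-violence"
  | "medical-gore"
  | "spider";

interface TaggedImage {
  url: string;
  descriptors: Descriptor[];
}

// The reader's own policy: the descriptors they prefer not to see.
type FilterPolicy = Set<Descriptor>;

function shouldCollapse(image: TaggedImage, policy: FilterPolicy): boolean {
  // Collapse only when the image carries a tag the reader opted out of.
  return image.descriptors.some((d) => policy.has(d));
}

// Example: a reader who filters gore but not nudity.
const myPolicy: FilterPolicy = new Set<Descriptor>([
  "graphic-violence",
  "medical-gore",
]);
const example: TaggedImage = { url: "...", descriptors: ["nudity"] };
console.log(shouldCollapse(example, myPolicy)); // false: shown normally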

The fundamental principle of libertarianism is that the individual
should have freedom of thought and action, and that it is wrong for
some other party to infringe that freedom. I've attempted to structure
the standard answers above in a way that shows how they are connected
to this principle.

 Religious conservatives think that seeing certain images, or reading
 certain text, is morally dangerous. Seeing these images, they believe,
 may lead the person into sin, and thus jeopardise their eternal soul.

 
 I think you've overstated that position.  Would you include Larry Sanger in
 this category?  He doesn't seem to be in either of the other two.

I think it's difficult to distinguish Larry's own views from the show
he puts on for the media.

But more generally, yes I suppose I may be overstating. Studying
religious views on sex and pornography is interesting, because those
views align closely with the laws and norms of wider society. Unlike
wider society, religious conservatives can give a detailed, consistent
and complete justification for their views.

In terms of history and influence, the religious connection is plain.
But I'll admit that not every anti-pornography campaigner today is a
religious conservative.

[...]
 I wouldn't call them moderates.  They are most certainly not moral
 relativists, and they have no desire to find compromises between the other
 two/three terrible positions.  Let's add a fourth faction: the educators.

Suit yourself. But I think it's more worthwhile to classify the
ideologues than it is to classify the pragmatists.

-- Tim Starling




Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Tim Starling
On 11/05/10 23:56, Mike Godwin wrote:
 That's a feature, not a bug. If there is a compromise that pleases some
 factions but not others, it's not exactly a compromise, is it?

The trick is to find a compromise which pleases both factions, or at
least upsets both equally.

In particular, I think there is potential for some very shaky and
tentative common ground, in the area of parental control over young
children. Libertarians might be convinced to make an exception to
their principles for that case, which would open up room for small but
valuable concessions to conservatives.

-- Tim Starling


___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Gregory Maxwell
On Tue, May 11, 2010 at 10:48 AM, Tim Starling tstarl...@wikimedia.org wrote:
[snip]
 But more generally, yes I suppose I may be overstating. Studying
 religious views on sex and pornography is interesting, because those
 views align closely with the laws and norms of wider society. Unlike
 wider society, religious conservatives can give a detailed, consistent
 and complete justification for their views.

And one which inevitably has apparently unresolvable conflicts with
one of our core organizing principles, NPOV.

Hundreds of millions of viewers a month visit Wikipedia, many of them
religious conservatives.  But our structure is not one that produces
articles on sexuality or religion (and sometimes on politics, and
science...) that are found to be acceptable by, at least, the most
hard-line among them.

Even the most widely acceptable initiative must eventually accept that
it can't please everyone. This is probably just one of the limits of
our model, and it's OKAY to have limits, everything does.

Consider, for a moment, the ALA list of most frequently challenged books:
http://www.ala.org/ala/issuesadvocacy/banned/frequentlychallenged/21stcenturychallenged/2009/index.cfm

I would propose that the reason we are subject to such a _small_
amount of complaint about our content is that much of the world
understands that what Wikipedia does is —in a sense— deeply subversive
and not at all compatible with "ideas which must be suppressed".  This
fact gets a lot of names; some call it a "liberal bias" though I don't
think that is quite accurate.  But there very much is a bias— a
pro-flow-of-information bias.  We don't always realize we have it, but
I don't think we deny it when we do.

Jimmy brought this up in his keynote at Wikiconference New York in
2009: 
http://www.archive.org/download/NYwikiconf_wales_keynote_25july2009/NYwikiconf_wales_keynote_25july2009.ogv


There are other resources which address these subject areas in a
manner which religious conservatives may find more acceptable, such as
conservapedia.  It is beneficial that there are alternative
information sources; no one wants a world where all reference works
are Wikipedia, so to the extent that our inability to cover some areas
to some people's satisfaction creates more room for alternatives, it
is a good thing.

I'd like to address an idea that underlies a lot of this discussion
which I think is patently ridiculous:   That our inability to please
_everyone_ on _all_ articles is actually something to worry about.
It's not something that can actually be done; all we can hope to do
is decide whom we'll please, and by our core principles it appears
that we've chosen to err towards the libertarians.  In terms of
overall popularity we would have been better off not to, but then
again I doubt we could have built something so useful another way.
There is no existence proof yet, at least.


The internet is chock full of things that hard-line religious
conservatives would believe imperil the soul of anyone who views it.
Even the most aggressive government censorship short of a total
internet ban only suppresses a relatively small amount of this
material. ... and yet people with these concerns continue to use the
internet happily and productively.  The impossibility of total
censorship means that "don't look if you don't like it" is a reality
for everyone and not just libertarians.

(English) Wikipedia stopped being an encyclopedia about 3 million
articles ago. Today it is a collection of specialist encyclopaedias,
or really— a federation of 3.2 million separate articles sharing a
common set of principles and other infrastructure.  It is expected and
acceptable that some people may strongly approve some parts and
strongly oppose others.

Wikipedia, in the aggregate, is an excellent resource even for the
staunchest religious conservative.  But due to our core principles,
some parts of Wikipedia will _never_ be acceptable to that audience.
In at least a few cases, no amount of careful handling can satisfy a
hard-line faction's position that some information must be suppressed
to protect your soul at the same time as fulfilling the effective
direction from NPOV to factually express all major viewpoints.

As with any of our other limitations— I would recommend that people
find other resources that meet their needs when Wikipedia doesn't,
just as they do for millions of other webpages.

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread David Gerard
On 11 May 2010 16:44, Gregory Maxwell gmaxw...@gmail.com wrote:

 There are other resources which address these subject areas in a
 manner which religious conservatives may find more acceptable, such as
 conservapedia.


Actually, Conservapedia has almost no readers or editors. (Its
activity rate is marginally higher than Citizendium's.) Even the
American Christian right-wing conservatives have no use for it.


 I'd like to address an idea that underlies a lot of this discussion
 which I think is patently ridiculous:   That our inability to please
 _everyone_ on _all_ articles is actually something to worry about.
 It's not something that can actually be done; all we can hope to do
 is decide whom we'll please, and by our core principles it appears
 that we've chosen to err towards the libertarians.  In terms of
 overall popularity we would have been better off not to, but then
 again I doubt we could have built something so useful another way.
 There is no existence proof yet, at least.


By the way, there appears to be an assumption - on the part of board
members, the WMF and some contributors to this thread - that Commons
has been somehow indiscriminate in what it accepts. This is *entirely
false*. Else this wonderful template would not exist:

http://commons.wikimedia.org/wiki/Template:Nopenis

See every other template starting with "no" in
http://commons.wikimedia.org/wiki/Category:Message_templates .

(And yes, someone already did a version of the icon for [[m:DICK]].)


- d.

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Excirial
*The trick is to find a compromise which pleases both factions, or at
least upsets both equally.
*
If we generalize the situation we could state the following:
The *Libertarian* point of view could be worded as: Allow everyone to view
all content.
The *Conservative* point of view could be worded as: Disallow everyone from
viewing objectionable content.

The difference here is that the *Libertarian* side would allow everyone a
choice in the matter (Don't want to see? Don't search), while the
*Conservative* side denies everyone access (Want to see? You cannot). As a
result I would say that the conservative opinion is the weaker one, as it
would intend to force its point of view on another group, while the
libertarian view allows the conservative group to choose whether or not they
want to view content. The only middle ground I can see would be allowing the
conservative user to block images they do not wish to see, as that only
influences their own experience, and not the experience of others.

*In particular, I think there is potential for some very shaky and
tentative common ground, in the area of parental control over young
children. Libertarians might be convinced to make an exception to
their principles for that case, which would open up room for small but
valuable concessions to conservatives.
*
How would you wish to create or control such a system? How can we *know*
which users are young children? Any IP might represent someone under the age
of 18, and any IP could actually be a child that qualifies as very young.
The only indication we might have is school IPs. I would point out that I
have heard of no school that allows unsupervised and unfiltered Internet
access to very young children. Furthermore I would (again) point out that I
would have to search for an explicit term in order to find such an image.
Even if we block Wiki access to such content there are a million other sites
providing the exact same thing.

What other options do we have? We could of course create a censoring system
that allows network administrators to choose what content they refuse to
display, but such a system would only encourage true censorship, as it might
be employed by ISPs and governments as well, for very different reasons than
protecting children. And what difference would it make in the first place?
It would still fail to work for non-school IPs such as home connections,
and not every school has an outbound address it can easily control. Sure,
the US and Europe generally connect schools through dedicated IPs, but not
every country or school does so. The effect of such a system would be
trivial at best.

The last option we have is a per-user setting that will hide certain
content. While I see some advantages with such a system (e.g. it would
affect no one besides a particular user), it would not solve the issue at
hand. Children are unlikely to create accounts merely to view Wikipedia,
which would mean that parents would have to control settings on an IP basis.
And surprise: IPs are often dynamic, which renders those changes void.

Don't get me wrong. I am not entirely averse to your statement that parents
should be able to control what young children see, but I see no reasonable
way to implement that.

*To summarize:*
- We cannot reliably identify young users, as every IP might be a child.
Therefore we cannot possibly create a catch-all system.
- There are other technical and social factors that make such a system
undesirable in the first place.
- Don't search, don't find. And even if we hide Wikipedia's content, there
are plenty of other results that don't care who looks at them.
- Removing content altogether would prevent anyone from seeing it, while
keeping content would allow people who wish to see it to look. A side that
enforces its opinion over another is by definition the weaker side in a
debate.

~Excirial

On Tue, May 11, 2010 at 5:08 PM, Tim Starling tstarl...@wikimedia.org wrote:

 On 11/05/10 23:56, Mike Godwin wrote:
  That's a feature, not a bug. If there is a compromise that pleases some
  factions but not others, it's not exactly a compromise, is it?

 The trick is to find a compromise which pleases both factions, or at
 least upsets both equally.

 In particular, I think there is potential for some very shaky and
 tentative common ground, in the area of parental control over young
 children. Libertarians might be convinced to make an exception to
 their principles for that case, which would open up room for small but
 valuable concessions to conservatives.

 -- Tim Starling


 ___
 foundation-l mailing list
 foundation-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread David Gerard
On 11 May 2010 17:45, Aryeh Gregor simetrical+wikil...@gmail.com wrote:

 Sure, and that's inevitable.  You aren't going to please people who
 have ideological problems with Wikipedia's entire premise.  But
 leaving aside people who think nudity is morally wrong on principle,
 we are still left with a very large number of people who would simply
 prefer not to see it.  Or would at least *sometimes* prefer not to see
 it (at work, when kids are around, etc.).  If these people want to
 look at even totally innocuous articles like [[Human]], they will be
 forced to look at images they don't want to see, with no warning.


You're a developer. Write something for logged-in users to block
images in local or Commons categories they don't want to see. You're
the target market, after all.

(If that isn't enough and you insist it has to be something for
default, then I fear you are unlikely to gain consensus on this.)
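
For what it's worth, the client side of that is not much code. A rough
sketch in TypeScript (error handling omitted, the category name is just
an example), using the standard MediaWiki web API:

// Hide thumbnails whose source file sits in a Commons category on the
// user's personal blocklist. Entirely opt-in and client-side.

const blockedCategories = new Set(["Category:Nudity"]); // example entry

async function fileIsBlocked(fileTitle: string): Promise<boolean> {
  const url =
    "https://commons.wikimedia.org/w/api.php" +
    "?action=query&format=json&origin=*&prop=categories&titles=" +
    encodeURIComponent(fileTitle);
  const data = await (await fetch(url)).json();
  const pages = data?.query?.pages ?? {};
  return Object.values(pages).some((page: any) =>
    (page.categories ?? []).some((c: any) => blockedCategories.has(c.title))
  );
}

// Check every thumbnail that links to a file description page.
document
  .querySelectorAll<HTMLImageElement>("a[href*='/wiki/File:'] img")
  .forEach(async (img) => {
    const link = img.closest("a") as HTMLAnchorElement;
    const title = decodeURIComponent(link.href.split("/wiki/")[1]);
    if (await fileIsBlocked(title)) {
      img.style.visibility = "hidden"; // the reader chose not to see it
    }
  });

A real gadget would batch the API queries and store the blocklist in
user preferences, but the shape of the thing is roughly this.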


- d.

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread David Goodman
I agree with David Gerard's suggestion above: this is a solution that
will meet a variety of needs, and is therefore value-neutral. It can
be applied to more than categories--someone with a moderately slow
connection might wish to disable images in articles above a certain
size, or in articles containing many images. Personally, I sometimes
disable image loading in my browser selectively when looking at certain
sites where the images interfere with use of the material I actually
want.

David Goodman, Ph.D, M.L.S.
http://en.wikipedia.org/wiki/User_talk:DGG



On Tue, May 11, 2010 at 12:48 PM, David Gerard dger...@gmail.com wrote:
 On 11 May 2010 17:45, Aryeh Gregor simetrical+wikil...@gmail.com wrote:

 Sure, and that's inevitable.  You aren't going to please people who
 have ideological problems with Wikipedia's entire premise.  But
 leaving aside people who think nudity is morally wrong on principle,
 we are still left with a very large number of people who would simply
 prefer not to see it.  Or would at least *sometimes* prefer not to see
 it (at work, when kids are around, etc.).  If these people want to
 look at even totally innocuous articles like [[Human]], they will be
 forced to look at images they don't want to see, with no warning.


 You're a developer. Write something for logged-in users to block
 images in local or Commons categories they don't want to see. You're
 the target market, after all.

 (If that isn't enough and you insist it has to be something for
 default, then I fear you are unlikely to gain consensus on this.)


 - d.

 ___
 foundation-l mailing list
 foundation-l@lists.wikimedia.org
 Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Noein
-BEGIN PGP SIGNED MESSAGE-
Hash: SHA1

On 11/05/2010 12:44, Gregory Maxwell wrote:
 I would propose that the reason we are subject to such a _small_
 amount of complaint about our content is that much of the world
 understands that what Wikipedia does is —in a sense— deeply subversive
 and not at all compatible with "ideas which must be suppressed".  This
 fact gets a lot of names; some call it a "liberal bias" though I don't
 think that is quite accurate.  But there very much is a bias— a
 pro-flow-of-information bias.  We don't always realize we have it, but
 I don't think we deny it when we do.

And there is a general consensus here about those libertarian views?
I'm impressed. Sorry to repetitively check the ethical temperature of
the community, but I come from social horizons where it's not only not
natural, but generates hatred. I never could talk about libertarian
ideas outside of one or two family members and two or three friends.
Here, it seems the norm, and I simply can't believe it.
As I said before, Wikipedia acted like a magnet on me. I'm wondering if
it's uniting all the (internet-connected) libertarians of the world. In
that case I'm surprised that it hasn't received more serious attacks from
the establishment.
-BEGIN PGP SIGNATURE-
Version: GnuPG v1.4.9 (MingW32)
Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org/

iQEcBAEBAgAGBQJL6ajnAAoJEHCAuDvx9Z6LxQEIAOkmwi+7o3PyBbxOXDHrfTyT
5+OemY+gw4AEejtDq/ZV6jI3ngD/APehLY/slGWsiwxbOlhH3YPy1ELPdwOQdX75
YKyjprccvuLPgY+vkUTD4osn12hZWH0g83kPRjjvx4CeBiGL1kvxKDpU1qXDyhNX
sboxxSwXqMI9gVH787Wd03TWP7EXxdwPkt7TEc6M1oMXug4RhpUB9jdUr1ikO5Ni
09ws/S0zIHiVCd88BTfYxaG0JJYbt/vmSG0232Sz5w+CjXtVfigch6KHYVKrYxAV
XdXTPwvd0D63tXNBJ/lsZ9AjGk28Ktdyum9T7RROFXlQckBk3Fi7m9o57F0Fomk=
=IP4R
-END PGP SIGNATURE-

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Yann Forget
Hi,

2010/5/11 Noein prono...@gmail.com:

 On 11/05/2010 12:44, Gregory Maxwell wrote:
 I would propose that the reason we are subject to such a _small_
 amount of complaint about our content is that much of the world
 understands that what Wikipedia does is —in a sense— deeply subversive
 and not at all compatible with "ideas which must be suppressed".  This
 fact gets a lot of names; some call it a "liberal bias" though I don't
 think that is quite accurate.  But there very much is a bias— a
 pro-flow-of-information bias.  We don't always realize we have it, but
 I don't think we deny it when we do.

 And there is a general consensus here about those libertarian views?
 I'm impressed. Sorry to repetitively check the ethical temperature of
 the community, but I come from social horizons where it's not only not
 natural, but generates hatred. I never could talk about libertarian
 ideas outside of one or two family members and two or three friends.
 Here, it seems the norm, and I simply can't believe it.
 As I said before, Wikipedia acted like a magnet on me. I'm wondering if
 it's uniting all the (internet-connected) libertarians of the world. In
 that case I'm surprised that it hasn't received more serious attacks from
 the establishment.

I think that, worldwide, people from conservative or traditional
cultures don't have as much Internet access as people with libertarian
views. This is especially true outside of the Western world. That may
explain why there is not so much opposition to the libertarian
position of Wikimedia. Internet access helps to get a larger view, a
better understanding of other cultures, and a more open opinion on
sensitive subjects.

Yann

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Aryeh Gregor
On Tue, May 11, 2010 at 12:48 PM, David Gerard dger...@gmail.com wrote:
 You're a developer. Write something for logged-in users to block
 images in local or Commons categories they don't want to see. You're
 the target market, after all.

I'd be happy to do any software development if that were helpful.
I've been thinking about how best to do it, on and off, for some time.

However, I don't think it's reasonable to require opt-out for images
that a large percentage of viewers don't want to see without warning.
If the people who want to see it can see it with just a click anyway,
they aren't losing anything if it's hidden by default.  Especially if
it's just blurred out.
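
The blur-by-default variant is equally small. A sketch in TypeScript;
the "filtered" class is hypothetical, standing in for whatever tagging
scheme marks an image as hideable:

// Blur a flagged image until the reader clicks it. Nothing is
// withheld: one click restores the image exactly as served.

function blurUntilClicked(img: HTMLImageElement): void {
  img.style.filter = "blur(20px)";
  img.style.cursor = "pointer";
  img.title = "Click to show this image";
  img.addEventListener(
    "click",
    () => { img.style.filter = "none"; },
    { once: true } // reveal once, then behave like any other image
  );
}

// Apply to every image the (hypothetical) filter policy has flagged:
document
  .querySelectorAll<HTMLImageElement>("img.filtered")
  .forEach(blurUntilClicked);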

 (If that isn't enough and you insist it has to be something for
 default, then I fear you are unlikely to gain consensus on this.)

Does that mean you disagree with me but aren't saying why, or that you
agree but aren't bothering to say so because you're sure it won't
happen?  The latter is where self-fulfilling prophecies come from.

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread wjhonson
If there is enough of a perceived need for content filtering, someone will fill 
that void.  That someone does not need to be us.  Google does this job with 
their image browser already without the need for any providers to actively 
tag any images.  How do they do that?  I have no idea, but they do it.  I
would suggest that a child-safe approach to Commons is simply to use the
Google image browser with a moderate filter setting.  Try it; it works.
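
For anyone who wants to try it, the filter is just a URL parameter. A
sketch in TypeScript; the parameter names are Google's, not ours, so
treat the exact values as an assumption on my part:

// Build a filtered Google image search restricted to Commons.
// "tbm=isch" selects the image tab and "safe=active" forces
// filtering on; the moderate tier is normally chosen through
// Google's own preferences page.

function safeCommonsImageSearch(term: string): string {
  const params = new URLSearchParams({
    q: `${term} site:commons.wikimedia.org`,
    tbm: "isch",
    safe: "active",
  });
  return `https://www.google.com/search?${params.toString()}`;
}

console.log(safeCommonsImageSearch("human anatomy"));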

I would suggest that any parent who is allowing their young children, as one
message put it, to "browse without any filtering mechanism", is deciding to
trust that child, or else does not care if the child encounters objectionable
material.  The child's browsing activity is already open to five million porn
site hits as it stands; Commons isn't creating that issue.  And Commons cannot
solve that issue.  It's the parents' responsibility to have the appropriate
self-selected mechanisms in place.  And I propose that all parents who care
already *do*.  So this issue is a non-issue.  It doesn't actually exist in any
concrete example, just in the minds of a few people with spare time.

W.J.


___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread David Gerard
On 11 May 2010 21:42, Aryeh Gregor simetrical+wikil...@gmail.com wrote:
 On Tue, May 11, 2010 at 12:48 PM, David Gerard dger...@gmail.com wrote:

 You're a developer. Write something for logged-in users to block
 images in local or Commons categories they don't want to see. You're
 the target market, after all.

 I'd be happy to do any software development if that were helpful.
 I've been thinking about how best to do it, on and off, for some time.
 However, I don't think it's reasonable to require opt-out for images
 that a large percentage of viewers don't want to see without warning.
 If the people who want to see it can see it with just a click anyway,
 they aren't losing anything if it's hidden by default.  Especially if
 it's just blurred out.


You're making an assumption there on no evidence: that a large
percentage want to be opted out by default.

If you write it, then logged-in users could give you numbers. (e.g. a
Western World worksafe filter set will undoubtedly be popular.)

Commons admins are in fact *painstaking* in accurate categorisation;
the filter sets should be stable.


 (If that isn't enough and you insist it has to be something for
 default, then I fear you are unlikely to gain consensus on this.)

 Does that mean you disagree with me but aren't saying why, or that you
 agree but aren't bothering to say so because you're sure it won't
 happen?  The latter is where self-fulfilling prophecies come from.


I think it's a bad idea *and* you are unlikely to obtain consensus.
Because filtering for people who *haven't* asked is quite a different
proposition from filtering for people who *have* asked, in what I'd
hope are fairly obvious ways.


wjohn...@aol.com wrote:

I would suggest that any parent who is allowing their young children, as one
message put it, to "browse without any filtering mechanism", is deciding to
trust that child, or else does not care if the child encounters objectionable
material.  The child's browsing activity is already open to five million porn
site hits as it stands; Commons isn't creating that issue.  And Commons cannot
solve that issue.  It's the parents' responsibility to have the appropriate
self-selected mechanisms in place.  And I propose that all parents who care
already *do*.  So this issue is a non-issue.  It doesn't actually exist in any
concrete example, just in the minds of a few people with spare time.


Indeed.


- d.

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Delirium
On 05/11/2010 11:58 AM, Noein wrote:
 And there is a general consensus here about those libertarian views?
 I'm impressed. Sorry to repetitively check the ethical temperature of
 the community, but I come from social horizons where it's not only not
 natural, but generates hatred. I never could talk about libertarian
 ideas outside of one or two family members and two or three friends.
 Here, it seems the norm, and I simply can't believe it.
 As I said before, Wikipedia acted like a magnet on me. I'm wondering if
 it's uniting all the (internet-connected) libertarians of the world. In
 that case I'm surprised that it hasn't received more serious attacks from
 the establishment.

I'm not sure how strong the consensus is, but I'd be careful about 
reading anything too wide into the fairly narrow set of issues we 
discuss here. The libertarians on this issue are taking a narrow 
position on the availability of materials deemed offensive, and aren't
necessarily political libertarians in the wider sense: the low-tax,
free-market, small-government sort. (Some are, but I'd
guess that plenty of people with libertarian views as to Wikipedia 
content policies are politically on the left.)

-Mark

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Robert Rohde
Tim's post is excellent.  However, there is a viewpoint on this issue
that is important to me personally that I feel is not well represented
by his spectrum.

To the extent that Tim's spectrum does represent me, I am probably
moderate.  I recognize that some people (e.g. the conservatives) find
certain content undesirable and I would gladly give them the tools to
self-filter that content if they wished.  Since conservatives are a
major segment of the population, I also think it is pragmatic for
public relations reasons to avoid collecting large numbers of
redundant and/or unused images that have the potential to offend such
people.


However, I also see the issue from another frame that is not part of
Tim's spectrum.  Sexual photographs, especially those of easily
recognized people, have the potential to exploit or embarrass the
people in them.  I place a high value on not doing harm to the models
pictured.

This is essentially a consent issue.  If the model is a well-known
porn star and wants to be shown nude to the world, then there is no
problem.  However, many of the sexual images we receive depict
non-notable individuals who appear to be engaged in private conduct.
If the uploader is being honest and responsible, then this may be fine
too.  However, if the uploader is malicious, then the subject may have
no idea how their image is being used.  Even if the person pictured
consented to having the photographs made, they may still be horrified
at the idea that their image would be used in an encyclopedia seen by
millions.

At present, our controls regarding the publication of a person's image
are often very lax.  With regard to "self-made" images, we often take
a lot of things on faith, and personally I see that as irresponsible.

In a sense, this way of looking at things is very similar to the issue
of biographies of living persons.  For a long time we treated those
articles more or less the same as all other articles.  However,
eventually we came to accept that the potential to do harm to living
persons was a special concern which warranted special safeguards,
especially in the case of negative or private information.

I would say that publishing photos of living persons in potentially
embarrassing or exploitative situations should be another area where
we should show special concern for the potential harm, and require a
stronger justification for inclusion and use than typical content.
(Sexual images are an easy example of a place where harm might be
done, but I'd say using identifiable photos of non-notable people
should be done cautiously in any situation where there is potential
for embarrassment or other harm.)

Obviously, from this point of view, I consider recent photos of living
people to be rather different from illustrations or artwork, which
would require no special treatment.


Much of the discussion has focused on the potential to harm (or at
least offend) the viewer of an image, but I think we should not forget
the potential to harm the people in the images.

-Robert Rohde

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Anthony
On Tue, May 11, 2010 at 10:48 AM, Tim Starling tstarl...@wikimedia.org wrote:

 On 11/05/10 23:06, Anthony wrote:
  I assume here you're talking about choosing what images to allow on the
  websites.  I wouldn't call that making a decision on behalf of another,
  but I assume that's what you're referring to.  If I'm wrong, please
 correct
  me.

 I'm including:

 Solution 1: Exercise editorial control to remove particularly
 offensive images from the site.

 Solution 2: Tag images with an audience-specific rating system, like
 movie classifications. Then enable client-side filtering.

 Solution 3: Tag images with objective descriptors, chosen to be useful
 for the purposes of determining offensive character by the reader. The
 reader may then choose a policy for what kinds of images they wish to
 filter.

 The fundamental principle of libertarianism is that the individual
 should have freedom of thought and action, and that it is wrong for
 some other party to infringe that freedom.


Well, I fully agree with that fundamental principle, and yet I wouldn't
categorize any of those solutions as violating that principle, nor would I
characterize them as making decisions on behalf of another.

And really, I don't see how you could possibly characterize these actions
that way.  You say "Some people may wish to see that content, it would
be wrong for us to stop them."  I would think the libertarian response to
that is that a desire to have something does not constitute a right to force
others to provide it to you.

Now, granted, I don't think the line should be drawn at "particularly
offensive" images.  I'd say the primary, if not sole, criterion (besides the
obvious legality and free license), would be the bona fide educational value
of the image.  I'd say the image of Saint Terese, while obviously highly
offensive to some, would be acceptable if (and, in my opinion, only if) some
text was included right on the image page which explained such things as who
created it, what he was trying to say with it, what historical impact it
had, etc.

I think such a task, applied to any and all questionable images, would also
be something that could be pointed to should the media come by and cry
pornography.  The explanation should be right there on the image page
justifying the educational nature of the image.  Yes, there will be some
people who will remain opposed to such images regardless of the explanation,
but at that point (and not before it) I think we can properly treat such
people as obviously irrational.

Of course, the explanations would need to be good ones.  I can think of a
lot of bad explanations that would likely be offered for the inclusion of
certain images, and I don't at all trust the community to recognize them as
such.

 I wouldn't call them moderates.  They are most certainly not moral
  relativists, and they have no desire to find compromises between the other
  two/three terrible positions.  Let's add a fourth faction, the
 educators.

 Suit yourself. But I think it's more worthwhile to classify the
 ideologues than it is to classify the pragmatists.


I think you should consider that some of the educators aren't pragmatists
at all.  I look at your three categories, and I like some of each, and
dislike some of each.  I fully agree with the principle you ascribe to the
libertarians that the individual should have freedom of thought and action,
and that it is wrong for some other party to infringe that freedom.  But a
freedom is just that, a freedom, not an entitlement.  People who want to
view porn should (and do) have the freedom to view porn.  But that doesn't
mean the WMF has a duty to host it for them.  The religious conservatives
are probably right that seeing certain images, at least for people at a
certain age, and especially without the proper context, is dangerous.  I'm
not quite sure what you mean by "morally dangerous", but hard-core
pornography, especially certain forms of it, is probably
not very healthy.  I certainly don't want my son or daughter viewing porn
videos as they're learning about sex.  Finally, there's what you call the
moderates.  Actually I don't see much value in that position at all, at
least as you describe it.  But to the extent the moderates advocate taking
some ideas from both of the other two positions, I guess I can agree with
that.

As you describe them, the moderates are the pragmatists, and that's a big
part of what they've gotten wrong.  I would advocate that the WMF adopt a
principled position with respect to pornography, and then let people decide
whether or not they wish to use an encyclopedia built on those principles.

On Tue, May 11, 2010 at 12:45 PM, Aryeh Gregor 
simetrical+wikil...@gmail.com wrote:

 [[Daniel Pearl]] does not contain an image
 of him being beheaded (although it's what he's famous for), and
 [[Goatse.cx]] does not contain an image of its subject matter.  Why?


Primarily because of copyright 

Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Samuel Klein
On Tue, May 11, 2010 at 12:06 PM, David Gerard dger...@gmail.com wrote:

 By the way, there appears to be an assumption - on the part of board
 members, the WMF and some contributors to this thread - that Commons
 has been somehow indiscriminate in what it accepts.

I don't read that.  What I see is a debate about whether Commons
discriminates in the right ways, not a belief that it is
indiscriminate.

The most frequent censorship on the Projects is done in the name of
notability -- and it is done all the time.

As an eventualist myself, I think that we are generally *too
discriminate* in what we accept -- Commons and English Wikipedia are
quite ready to delete media and articles simply because one group of
established editors disagrees with the notability standards or writing
style of a younger, less wiki-savvy group.

For instance, professional artist, animator, and free culture activist
Nina Paley had quite a difficult time contributing artwork to Commons
without having it (and her userpage!) deleted.
http://commons.wikimedia.org/wiki/User:Nina_Paley

As to whether we have the right policies for discriminating between
good and bad images of human sexuality, I think the worst issue is
simply the lack of a standard for model releases.  When it comes to
assessing quality and appropriateness, I don't think the Commons
standards in that category are much worse than in other categories.
(Of course it may be true that the impact of standards in that
category is much greater than in other cats, because of its
disproportionate popularity).

SJ

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l


Re: [Foundation-l] Spectrum of views (was Re: Sexual Imagery on Commons: where the discussion is happening)

2010-05-11 Thread Samuel Klein
Tim, thank you for this excellent post.   A few comments:


Tim Starling writes:
 it's only the libertarians who value educational value above
 moral hazard

I don't really agree with this.  Contributors from across your
spectrum consider whether potentially-harmful information about a
person is educational enough to be in their biography.  The tension
between educational value and various hazards is something regularly
assessed by editors.

It's just that in areas of contentious sexual, religious, racist, or
violent content, there is a wide variation in assessment of moral
hazard.

In areas of an admittedly-embarrassing verifiable fact about a person,
there is (often) much less variation, and communities often come to an
agreement about whether the value outweighs the hazard.


Yann writes:
 I think that, world wide, people from conservative or
 traditional cultures don't have as much Internet access as
 people with libertarian views.

This is true, and worth considering.  Wikipedia is sometimes pointed
to as a reason for such communities to use the Internet.


Aryeh Gregor writes:
 First of all, the images rarely add much value.

Different audiences disagree widely on this point - for any
controversial image, groups opposed to it will find it valueless.


 Keep all the images if you like. Determine, by policy, [how to show them]

It's true that policies for what is kept in Commons are different
from policies about what Projects show.

My view is, the latter should be determined by individual Projects -
though they may need help in implementing what they want.  On the
other hand, Commons is a project unto itself, and has its own content
policies (such as the prohibition on fair use).


 there's been edit-warring about [[Daniel Pearl]], but
 probably out of respect for his family

Largely true.  And we need to improve our standards of respect for
images of people.


Robert Rohde writes:

Sexual photographs, especially those of easily recognized
 people, have the potential to exploit or embarrass the
 people in them.  I place a high value on not doing harm to
 the models pictured.

 At present, our controls regarding the publication of a
 person's image are often very lax.  With regard to
 "self-made" images, we often take a lot of things on faith,
 and personally I see that as irresponsible.

I agree strongly with this.  You are right to point out the connection
to improving BLP policies -- we should be much more careful about
confirming model rights for people in any potentially exploitative or
embarrassing photos.

Such ideas have been around for a long time.  What are the arguments
against implementing stronger requirements for images of people?

SJ

___
foundation-l mailing list
foundation-l@lists.wikimedia.org
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l