If there is enough of a perceived need for content filtering, someone will fill 
that void.  That someone does not need to be us.  Google already does this job 
with its image search, without requiring any providers to actively "tag" any 
images.  How do they do that?  I have no idea, but they do it.  I would suggest 
that a "child-safe" approach to Commons is simply to use Google's image search 
with the "moderate filter" setting.  Try it; it works.

I would suggest that any parent who allows their "young children," as one 
message put it, to browse without any filtering mechanism is either deciding to 
trust that child, or else does not care whether the child encounters 
objectionable material.  The child's browsing activity is already open to five 
million porn site hits as it stands; Commons isn't creating that issue, and 
Commons cannot solve it.  It's the parent's responsibility to have appropriate 
self-selected mechanisms in place, and I propose that all parents who care 
already *do*.  So this issue is a non-issue.  It doesn't actually exist in any 
concrete example, just in the minds of a few people with spare time.

W.J.


_______________________________________________
foundation-l mailing list
[email protected]
Unsubscribe: https://lists.wikimedia.org/mailman/listinfo/foundation-l