Hi,
recently the report of the KnowPrivacy [1] study - a research project
by the School of Information from University of California in Berkeley
- hit the German media [2].
It came to the conclusion that all of the top 50 websites contained
at least one web bug at some point in a one-month time
Tim 'avatar' Bartel wrote:
Hi,
recently the report of the KnowPrivacy [1] study - a research project
by the School of Information from University of California in Berkeley
- hit the German media [2].
It came to the conclusion that all of the top 50 websites contained
at least one web bug
Domas Mituzas wrote:
Do note, hu.wikipedia.org has an external stats aggregator,
'stats.wikipedia.hu', which is hosted on vhost102.sx6.tolna.net - and
all our traffic is sent there (
http://hu.wikipedia.org/w/index.php?title=MediaWiki:Lastmodifiedat&oldid=4493139
- as well as a few other
We need tools to track user behavior inside Wikipedia. As it is now we
know nearly nothing at all about user behavior, and nearly all people
saying anything about users at Wikipedia make gross estimates and wild
guesses.
User privacy on Wikipedia is close to a public hoax, pages are
transferred
Forgot a link to an article which describes very well privacy on
Wikipedia! ;)
http://en.wikipedia.org/wiki/The_Emperor%27s_New_Clothes
John at Darkstar skrev:
We need tools to track user behavior inside Wikipedia. As it is now we
know nearly nothing at all about user behavior and nearly all
John at Darkstar wrote:
We need tools to track user behavior inside Wikipedia. As it is now we
know nearly nothing at all about user behavior, and nearly all people
saying anything about users at Wikipedia make gross estimates and wild
guesses.
User privacy on Wikipedia is close to a
On Thu, Jun 4, 2009 at 1:18 AM, Tim 'avatar' Bartel
wikipe...@computerkultur.org wrote:
Hi,
recently the report of the KnowPrivacy [1] study - a research project
by the School of Information from University of California in Berkeley
- hit the German media [2].
The case of vlswiki is
2009/6/4 Pedro Sanchez pdsanc...@gmail.com:
What I propose is that re-adding this would cause removal of the sysop bit due
to misuse of powers.
Don't we have a committee that checks privacy violations?
The Foundation would surely have this power.
- d.
The interesting thing is who has an interest in which user's identity.
Let's take an example: some organization sets up a site with a honeypot
and logs all visitors. Then they correlate that with RC logs from
Wikipedia and check who adds external links back to
themselves. They do not need
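The correlation attack sketched above is simple enough to demonstrate. This is a hypothetical illustration: the honeypot domain, field names, and time window are all invented for the example, not taken from any real incident.

```python
# Hypothetical sketch of the honeypot correlation attack described above:
# match the attacker's own access-log entries against public RC-log edits
# that added a link back to the honeypot's domain.
from datetime import datetime, timedelta

honeypot_hits = [
    # (ip, timestamp) pairs from the attacker's own web server log
    ("198.51.100.7", datetime(2009, 6, 4, 12, 0, 5)),
]
rc_entries = [
    # (username, timestamp, added_url) from Wikipedia's public RC feed
    ("ExampleUser", datetime(2009, 6, 4, 12, 1, 30), "http://honeypot.example/page"),
]

def correlate(hits, edits, window=timedelta(minutes=5)):
    """Pair an IP with a username when an edit linking to the honeypot
    lands shortly after a visit from that IP."""
    matches = []
    for ip, hit_time in hits:
        for user, edit_time, url in edits:
            if "honeypot.example" in url and abs(edit_time - hit_time) <= window:
                matches.append((ip, user))
    return matches

print(correlate(honeypot_hits, rc_entries))  # → [('198.51.100.7', 'ExampleUser')]
```

The point of the example is that neither data source is private on its own; it is the join between them that deanonymizes the editor.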
The Ombudsman Commission would likely be that group. Although their
focus has traditionally been CheckUser, their purview actually covers
any and all violations of the privacy policy. Here is one such case. At
this moment, I agree: this sysop shouldn't be one.
-Mike
On Thu, 2009-06-04 at 06:21
Domas Mituzas midom.li...@... writes:
Do note, hu.wikipedia.org has an external stats aggregator,
'stats.wikipedia.hu', which is hosted on vhost102.sx6.tolna.net - and
all our traffic is sent there (
http://hu.wikipedia.org/w/index.php?title=MediaWiki:Lastmodifiedat&oldid=4493139
The
Hi,
2009/6/4 Tisza Gergő gti...@gmail.com:
As for Doubleclick, that was probably a mistake on KnowPrivacy's part - maybe
they misidentified the aggregator (we use awstats) because Doubleclick uses a
similar method? If not, I would appreciate it if they could provide more
detailed information.
John at Darkstar wrote:
The interesting thing is who has an interest in which user's identity.
Let's take an example: some organization sets up a site with a honeypot
and logs all visitors. Then they correlate that with RC logs from
Wikipedia and check who adds external links back to
On Wed, Jun 3, 2009 at 7:22 PM, Ray Saintonge sainto...@telus.net wrote:
Milos Rancic wrote:
BTW, I am really skeptical about the idea that one large Internet
company would sue Wikimedia or MediaWiki developers over their patents. It
would be really bad PR for them.
The history of lawsuits often
Web bugs for statistical data are a legitimate want but potentially a
horrible privacy violation.
So I asked on wikitech-l, and the obvious answer appears to be to do
it internally. Something like http://stats.grok.se/ only more so.
So - if you want web bug data in a way that fits the privacy
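An internal, privacy-respecting counter along the lines of stats.grok.se boils down to keeping only aggregates. This is a minimal sketch of the principle, not any actual WMF component; the function and variable names are illustrative.

```python
# Minimal sketch of an *internal* web-bug counter that fits the privacy
# policy: count page views per title, storing no IPs, cookies, or user IDs.
from collections import Counter

pageview_counts = Counter()

def record_hit(title):
    """Called once per pageview; only the aggregate per-title count is kept,
    so nothing about an individual visitor can be recovered afterwards."""
    pageview_counts[title] += 1

record_hit("Main_Page")
record_hit("Main_Page")
record_hit("Wikipedia")
print(pageview_counts.most_common(1))  # → [('Main_Page', 2)]
```

The design choice is that discarding the visitor identifier at ingestion time, rather than anonymizing stored logs later, is what keeps the data useless to anyone who compromises the stats host.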
David Gerard wrote:
External web bug trackers should be removed without
exception. People who add them innocently, out of an understandable
interest in collecting aggregated information that would not violate the
privacy policy, should be directed to request and help with internal
solutions,
Installing Google Analytics, even for our own purposes, is a bad idea.
For one, it creates a link to google that is not necessarily what we
want; it would be a big target for people to try and hack, and it
presents tempting security risks on Google's end. Not to mention, as
far as I know
On Thu, Jun 4, 2009 at 10:39 AM, Milos Rancic mill...@gmail.com wrote:
I really don't think
that Google, Facebook or Amazon are so stupid as to sue WMF or anything
strongly connected with WMF, because their business is strongly
connected to the perception of their behavior (by Internet users).
Dan Rosenthal wrote:
Installing Google Analytics, even for our own purposes, is a bad idea.
For one, it creates a link to google that is not necessarily what we
want; it would be a big target for people to try and hack, and it
presents tempting security risks on Google's end. Not to
On Thu, Jun 4, 2009 at 8:35 AM, Dan Rosenthal swatjes...@gmail.com wrote:
Installing Google Analytics, even for our own purposes, is a bad idea.
For one, it creates a link to google that is not necessarily what we
want; it would be a big target for people to try and hack, and it
presents
[repost with proper subscribed mail address]
Alex wrote:
The plain pageview stats are already available.
Erik Zachte has been doing some work on other stats.
http://stats.wikimedia.org/EN/VisitorsSampledLogRequests.htm
If I were to compile a wishlist of stats things:
1. stats.grok.se data
On Thu, Jun 4, 2009 at 6:01 AM, Neil Harris use...@tonal.clara.co.uk wrote:
Surely this is something which it should be possible to block at the
MediaWiki level, by suppressing the generation of any HTML that loads
any indirect resources (scripts, iframes, images, etc.) whatsoever other
than from
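The suppression idea above can be sketched as an output filter that drops any resource-loading tag whose source is off a whitelist of our own domains. This is a regex-based illustration only; the whitelist contents and helper names are assumptions, and a real implementation would hook into the MediaWiki output pipeline rather than post-process HTML with regexes.

```python
# Hedged sketch: post-process generated HTML and drop any script/img/iframe
# whose src points outside a whitelist of our own domains.
import re
from urllib.parse import urlparse

ALLOWED_HOSTS = {"wikipedia.org", "wikimedia.org"}  # illustrative whitelist

TAG_RE = re.compile(
    r'<(script|img|iframe)\b[^>]*\bsrc=["\']([^"\']+)["\'][^>]*>(?:</\1>)?',
    re.IGNORECASE)

def allowed(src):
    """True if the URL's host is one of our domains or a subdomain of one."""
    host = urlparse(src).hostname or ""
    return any(host == h or host.endswith("." + h) for h in ALLOWED_HOSTS)

def strip_external(html):
    """Keep whitelisted resource tags, delete the rest."""
    return TAG_RE.sub(lambda m: m.group(0) if allowed(m.group(2)) else "", html)

html = ('<img src="http://upload.wikimedia.org/logo.png">'
        '<script src="http://www.google-analytics.com/ga.js"></script>')
print(strip_external(html))  # keeps the Wikimedia image, drops the analytics script
```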
Aryeh Gregor wrote:
On Thu, Jun 4, 2009 at 6:01 AM, Neil Harris use...@tonal.clara.co.uk
wrote:
Surely this is something which it should be possible to block at the
MediaWiki level, by suppressing the generation of any HTML that loads
any indirect
2009/6/4 Jon scr...@nonvocalscream.com:
Has apache/proxy level filtering been considered?
Filtering for what? Javascript is executed client-side, ie. after the
page has gone through the apache servers/proxies.
On Thu, Jun 4, 2009 at 10:44 AM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
On Thu, Jun 4, 2009 at 12:53 PM, Robert Rohde raro...@gmail.com wrote:
One idea is the proposal to install the AbuseFilter in a global mode,
i.e. rules loaded at Meta that apply everywhere. If that were done
2009/6/4 Robert Rohde raro...@gmail.com:
On Thu, Jun 4, 2009 at 10:44 AM, Aryeh Gregor
simetrical+wikil...@gmail.com wrote:
However, perhaps a default AbuseFilter could be installed telling
admins that installing Analytics is a violation of Foundation policy
and that they'll get desysopped
2009/6/4 Unionhawk unionhawk.site...@gmail.com:
So how do you propose we enforce this? I'm thinking we need to prevent this
from happening in the first place. Analytics like this could pretty much
give checkuser powers to anybody!
There's not that many places where this sort of thing could be
2009/6/4 Erik Zachte erikzac...@infodisiac.com:
Considering web bugs: comScore also proposed such a scheme to us.
Apart from the question of how much it would bring us that we don't or can't
figure out ourselves, an overriding concern is privacy.
So if we ran our own internal web bug mechanism,
Not to mention, as
far as I know the program is proprietary.
This is an example of what the real problem is here; it's not the security
issues but the users' political issues.
I'm not convinced that
we need to be tracking user behavior at this point in time, or that
the tradeoffs for
One idea is the proposal to install the AbuseFilter in a global mode,
i.e. rules loaded at Meta that apply everywhere. If that were done
(and there are some arguments about whether it is a good idea), then
it could be used to block these types of URLs from being installed,
even by admins.
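What such a global filter would check can be made concrete. AbuseFilter uses its own rule language, so the following Python version only mirrors the logic; the tracker list and function name are illustrative assumptions, not an actual deployed filter.

```python
# Sketch of the check a global AbuseFilter rule could perform: refuse
# edits to site-wide JS pages that add a known external tracker URL.
import re

TRACKER_RE = re.compile(
    r"google-analytics\.com|doubleclick\.net",
    re.IGNORECASE)

def violates_filter(page_title, added_text):
    """True if an edit to a MediaWiki-namespace .js page adds a tracker URL."""
    is_site_js = page_title.startswith("MediaWiki:") and page_title.endswith(".js")
    return is_site_js and bool(TRACKER_RE.search(added_text))

print(violates_filter(
    "MediaWiki:Common.js",
    "document.write('<script src=\"http://www.google-analytics.com/ga.js\">')"))
# → True
```

Because it keys on the page title as well as the added text, the rule stops the admin-editable site-wide JS pages without touching ordinary articles that merely mention a tracker domain.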
On Jun 4, 2009, at 11:27 PM, John at Darkstar wrote:
Not to mention, as
far as I know the program is proprietary.
This is an example of what the real problem is here; it's not the security
issues but the users' political issues.
I fail to see what that has to do with anything. I'm just
John at Darkstar wrote:
One idea is the proposal to install the AbuseFilter in a global mode,
i.e. rules loaded at Meta that apply everywhere. If that were done
(and there are some arguments about whether it is a good idea), then
it could be used to block these types of URLs from being
Is this enough? Of course not, there is so much more to learn.
Erik Zachte
There are a few very important missing items at the moment:
* Number of unique visitors
* Number of page visits per visitor
All should be analyzed by user role, possibly also over different time
spans (hour,
Hmm? There's no reason to do anything like that. The AbuseFilter would
just prevent sitewide JS pages from being saved with the particular URLs
or a particular code block in them. It'll stop the well-meaning but
misguided admins. Short of restricting site JS to the point of
uselessness,
John at Darkstar wrote:
Hmm? There's no reason to do anything like that. The AbuseFilter would
just prevent sitewide JS pages from being saved with the particular URLs
or a particular code block in them. It'll stop the well-meaning but
misguided admins. Short of restricting site JS to the