Ziko,
it does not jeopardize the Wikidata goal -- the current language link
system won't be switched off, but can continue to be used. Everything that
is working currently will still be possible afterwards. Wikidata can
still be used to represent the 99.2% of language links that are simple
-- this
Amir,
thank you for the thoughtful reply!
Indeed our current plan is a kind of staged deployment, in the sense
that we will not automatically transfer the links but let the editor
community do it. On our test systems we already see bots being tried
out and rewritten, so we expect that as soon
This number, 99.2%, was also mentioned at the Berlin Hackathon. It
sounds much higher than what my (very scientifically relevant,
obviously) gut feeling tells me. Could you indicate where this number
comes from?
On Tue, Jun 26, 2012 at 2:45 PM, Denny Vrandečić
denny.vrande...@wikimedia.de
I got the number from Brent Hecht, a researcher at Northwestern, who
has a number of great papers published on Wikipedia-related topics.
CC-ing him, so he knows I am blam.., er, referencing him :)
Cheers,
Denny
2012/6/26 Martijn Hoekstra martijnhoeks...@gmail.com:
This number, 99.2% was also
One major problem with double language links I've encountered before was
that they confuse interwiki bots and therefore break things. Several
articles on the Cantonese Wikipedia (zh-yue.wp) pertaining to local
political and cultural issues in the Cantonese-speaking world have
__NOBOT__ on them
Hi All,
Brent Hecht here :-) This has been a really interesting discussion, and I
wanted to chime in with a few notes.
The 99.2% is based on a quick script I wrote that looked at reciprocity among a
sample of interlanguage links (ILLs) in 25 languages to address some questions
that Denny
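The kind of reciprocity check Brent describes can be sketched roughly as follows. This is an illustrative assumption about the approach, not his actual script: the data layout (a mapping from each (language, title) pair to the set of pairs its interlanguage links point to) and the function name are both hypothetical.

```python
# Hypothetical sketch of an ILL reciprocity check: a link A -> B counts
# as reciprocated only if B also links back to A.

def reciprocity_rate(ills):
    """Return the fraction of interlanguage links that are mutual."""
    total = 0
    mutual = 0
    for source, targets in ills.items():
        for target in targets:
            total += 1
            # Reciprocated if the target article links back to the source.
            if source in ills.get(target, set()):
                mutual += 1
    return mutual / total if total else 0.0

# Toy example: en <-> de is mutual, en -> fr is not reciprocated.
sample = {
    ("en", "Berlin"): {("de", "Berlin"), ("fr", "Berlin")},
    ("de", "Berlin"): {("en", "Berlin")},
    ("fr", "Berlin"): set(),
}
print(reciprocity_rate(sample))  # 2 of 3 links are mutual
```

A high reciprocity rate in such a sample would support the claim that the large majority of language links are "simple" one-to-one correspondences.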
Hi Denny,
This is a really interesting list.
Looking at the Hungarian list, I find that in many instances the duplicate
interwiki link is actually commented out (in the form of <!-- Source:
[[en: something]] --> or <!-- wrong interwikis: [[en: ..]] [[fr: ..]] -->),
and not real duplicate links. (In
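The pitfall described here can be sketched in a few lines: an interwiki link that appears only inside an HTML comment (`<!-- ... -->`) is not a live link, so a naive scan of the raw wikitext will report false duplicates. The helper below is a hypothetical illustration, not the parser used to build the list:

```python
import re

# Matches wikitext interwiki links like [[en:Budapest]].
INTERWIKI = re.compile(r"\[\[([a-z-]+):([^\]]+)\]\]")

def live_interwiki_links(wikitext):
    """Return (prefix, title) pairs, ignoring links inside HTML comments."""
    # Strip <!-- ... --> comments first, so commented-out links don't count.
    uncommented = re.sub(r"<!--.*?-->", "", wikitext, flags=re.DOTALL)
    return INTERWIKI.findall(uncommented)

text = "[[en:Budapest]]\n<!-- wrong interwikis: [[en:Buda]] [[fr:Buda]] -->"
print(live_interwiki_links(text))  # only the uncommented link remains
```

Re-running the duplicate detection on comment-stripped wikitext would avoid flagging these cases.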
Hi Denny,
TL;DR: It's a very important question, but don't worry about it too
much. Just do Wikidata well as it is currently planned.
Now, the full reply.
I wrote a bit of an essay about it in 2008:
https://meta.wikimedia.org/wiki/Tips_for_resolving_interwiki_conflicts
I also started a page to
Thanks for this list. For the languages I know, I've started going
through and fixing ones that are clearly wrong. If a number of people do
that, that should improve the general quality/consistency of interwiki
links. I second the other comment that it'd be nice if the parsing could
be re-run
Hello,
So may I guess that double links are usually the result of a
Wikipedian who was not sure which language link to set, so in doubt,
he simply put in the language links for two different articles?
And in general, is it imaginable that different languages divide the
knowledge in different