On Aug 10, 2012, at 6:50 PM, [email protected] wrote:

> Date: Fri, 10 Aug 2012 16:39:40 -0700
> From: Rob Lanphier <[email protected]>
> To: Wikimedia developers <[email protected]>
> Subject: [Wikitech-l] Use cases for Sites handling change (Re: Wikidata blockers weekly update)
> 
> Hi everyone,
> 
> I'm starting a separate thread, because this is an important topic and
> I don't think it's well served as a subtopic of a "Wikidata blockers"
> thread.
> 
> To recap, Jeroen submitted changeset 14295 in Gerrit
> <https://gerrit.wikimedia.org/r/#/c/14295/> with the following
> summary:
>> This commit introduces a new table to hold site data and configuration,
>> objects to represent the table, site objects and lists of sites and
>> associated tests.
> 
>> The sites code is a more generalized and less contrived version of the
>> interwiki code we currently have and is meant to replace it eventually.
>> This commit does not do away with the existing interwiki code in any way yet.
> 
>> The reasons for this change were outlined and discussed on wikitech here:
>> http://lists.wikimedia.org/pipermail/wikitech-l/2012-June/060992.html
> 
> Thanks Brian for summarizing an important point:
> 
> On Fri, Aug 10, 2012 at 6:33 AM, bawolff <[email protected]> wrote:
> 
> First and foremost, I'm a little confused as to what the actual use
> cases here are. Could we get a short summary for those who aren't
> entirely following how wikidata will work, why the current interwiki
> situation is insufficient? I've read the I0a96e585 and
> http://lists.wikimedia.org/pipermail/wikitech-l/2012-June/060992.html,
> but everything seems very vague "It doesn't work for our situation",
> without any detailed explanation of what that situation is. At most
> the messages kind of hint at wanting to be able to access the list of
> interwiki types of the wikidata "server" from a wikidata "client" (and
> keep them in sync, or at least have them replicated from
> server->client). But there's no explanation given to why one needs to
> do that (are we doing some form of interwiki transclusion and need to
> render foreign interwiki links correctly?)

Now I can't speak for what Wikidata is doing, but there is a use case for 
interwikis in transclusion. I had never considered before that this might not 
render the foreign interwiki links correctly. I doubt the situation has 
actually occurred yet, but there are two expected cases I can think of off the 
top of my head which might touch on this area.

Number one, people on Wikisource are expecting interwiki transclusion to be a 
solution for texts that exist in multiple languages. It is inevitable that, 
with enough such texts created, links made to a Wikipedia article inside one 
subdomain will end up being transcluded into another subdomain.

Number two is a bit different but probably problematic all the same. Wikisource 
developers are searching for a solution that will allow the OCR text layer of a 
DjVu file to be replaced with the human-validated text, without manual 
intervention. When a good solution is found, the DjVu files at Commons can be 
expected to be processed this way. There will be interwiki links, originally 
made in various Wikisource subdomains as part of the validated text, that will 
be processed back into the Commons DjVu file. Obviously the process cannot 
leave them as wikitext, but interwikis originating from any possible language 
subdomain of Wikisource will have to be handled by one piece of software.
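To make the problem concrete, here is a minimal sketch of the kind of rewriting such software would have to do: the same interwiki prefix (say "w") points to a different Wikipedia depending on which Wikisource subdomain the text came from, so any tool that moves text between wikis must resolve prefixes against the *source* wiki's interwiki map. The domain names, prefix maps, and function name below are all illustrative assumptions, not MediaWiki's actual API:

```python
import re

# Hypothetical per-wiki interwiki maps. In MediaWiki each wiki has its own
# interwiki table, which is exactly why "w:" means different things on
# different subdomains.
INTERWIKI_MAPS = {
    "fr.wikisource.org": {
        "w": "https://fr.wikipedia.org/wiki/",
        "commons": "https://commons.wikimedia.org/wiki/",
    },
    "en.wikisource.org": {
        "w": "https://en.wikipedia.org/wiki/",
        "commons": "https://commons.wikimedia.org/wiki/",
    },
}

# Matches [[prefix:Target]] or [[prefix:Target|label]]
LINK_RE = re.compile(r"\[\[([a-z]+):([^]|]+)(?:\|([^]]+))?\]\]")

def absolutize_interwikis(wikitext, source_wiki):
    """Rewrite interwiki links relative to the wiki the text came from,
    so the text keeps its original link targets when stored elsewhere
    (e.g. in a DjVu text layer on Commons)."""
    prefixes = INTERWIKI_MAPS[source_wiki]

    def repl(m):
        prefix, target, label = m.group(1), m.group(2), m.group(3)
        if prefix not in prefixes:
            return m.group(0)  # unknown prefix: leave the wikitext as-is
        url = prefixes[prefix] + target.replace(" ", "_")
        return f"[{url} {label or target}]"

    return LINK_RE.sub(repl, wikitext)

print(absolutize_interwikis("See [[w:Victor Hugo]].", "fr.wikisource.org"))
# See [https://fr.wikipedia.org/wiki/Victor_Hugo Victor Hugo].
```

The point of the sketch is only that the resolution step needs the source wiki as an input; centralizing the interwiki data (as the proposed sites table would) is one way to make that input available to a single piece of software.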

I don't understand more than half of what is said on this list, although I do 
really try to follow along. And that is a whole lot more than I understand 
about how the use cases I outlined will actually work their computer magic. But 
if the talk of duplicate entries for different originating wikis means what I 
think it might mean, it strikes me as problematic.

BirgitteSB
_______________________________________________
Wikitech-l mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikitech-l
