[Wikidata-l] features needed by langlink bots (was Re: update from the Hebrew Wikipedia)

2012-10-24 Thread Merlissimo

Just to answer these questions for my _java_ interwiki bot MerlIwBot:

On 12.10.2012 16:45, Amir E. Aharoni wrote:


Will the bots be smart enough not to do anything to articles that are
already listed in the repository and have the correct links displayed?



Will the bots be smart enough not to do anything with articles that
have interwiki conflicts (multiple links, non-1-to-1 linking etc.)?



Yes, because this is Wikidata-independent for my Java bot and already 
worked before, so running my bot is always safe on these wikis. 
Non-1-to-1 langlinks can be partly moved to Wikidata because mixing of 
Wikidata and local langlinks is possible. But please note that since 
change https://gerrit.wikimedia.org/r/#/c/25232/ , which is live with 
1.21wmf2, multiple langlinks are no longer displayed.
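
As a rough illustration of that conflict check (a Python sketch, not my 
actual Java code; the endpoint pattern is only an example), a bot can 
fetch the langlinks of an article and of every linked article and only 
touch the article when the links are symmetric:

import requests

API = "https://{lang}.wikipedia.org/w/api.php"  # illustrative endpoint pattern

def langlinks(lang, title):
    # Return the displayed langlinks of a page as {language: title}.
    r = requests.get(API.format(lang=lang), params={
        "action": "query", "prop": "langlinks", "titles": title,
        "lllimit": "max", "format": "json"}).json()
    page = next(iter(r["query"]["pages"].values()))
    return {ll["lang"]: ll["*"] for ll in page.get("langlinks", [])}

def has_interwiki_conflict(lang, title):
    # An article is only safe to touch when every linked article links
    # straight back (1-to-1 linking); anything else is a conflict.
    for other_lang, other_title in langlinks(lang, title).items():
        if langlinks(other_lang, other_title).get(lang) != title:
            return True
    return False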



Will the bots be smart enough to update the repo in the transition
period, when some Wikipedias have Wikidata and some don't?


My bot can update the repository, but doing so will cause a lot of load 
on local wikis. That's because there is currently no way to tell whether 
a langlink is stored locally or at Wikidata. Local langlinks can be 
stored on the main page or on any transcluded page, so the whole source 
text of every transcluded page must be checked first.
I created a feature request which could solve this problem: 
https://bugzilla.wikimedia.org/show_bug.cgi?id=41345 . I hope this will 
be added before Wikidata goes live.
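
A rough Python sketch of why this is expensive (my bot is written in 
Java; the endpoint pattern and the simplified langlink regex below are 
only illustrative): the wikitext of the article and of every transcluded 
page has to be scanned for [[xx:Title]] links and compared with what is 
actually displayed.

import re
import requests

API = "https://{lang}.wikipedia.org/w/api.php"                 # illustrative
LANGLINK_RE = re.compile(r"\[\[([a-z-]{2,12}):([^\]|]+)\]\]")  # simplified

def wikitext(lang, title):
    # Fetch the raw wikitext of a page.
    r = requests.get(API.format(lang=lang), params={
        "action": "query", "prop": "revisions", "rvprop": "content",
        "titles": title, "format": "json"}).json()
    page = next(iter(r["query"]["pages"].values()))
    return page["revisions"][0]["*"] if "revisions" in page else ""

def local_langlinks(lang, title):
    # Collect the [[xx:Title]] links written in the page itself and in
    # every transcluded page -- the expensive scan described above.
    found = set(LANGLINK_RE.findall(wikitext(lang, title)))
    r = requests.get(API.format(lang=lang), params={
        "action": "query", "prop": "templates", "titles": title,
        "tllimit": "max", "format": "json"}).json()
    page = next(iter(r["query"]["pages"].values()))
    for tpl in page.get("templates", []):
        found |= set(LANGLINK_RE.findall(wikitext(lang, tpl["title"])))
    return found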


To update the repository, my bot needs to know the corresponding 
repository script URL for a local wiki. Currently 
http://wikidata-test-repo.wikimedia.de/w/api.php is hard-coded in my bot 
framework, but this repository URL will change for hewiki. There is 
currently no way to request this information via the API. I also created 
a feature request for this: 
https://bugzilla.wikimedia.org/show_bug.cgi?id=41347
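
Until that is available, the only option I see is a hand-maintained 
mapping from client wiki to repository endpoint instead of one 
hard-coded URL; a minimal sketch (the hewiki entry is an assumed future 
value, not an announced one):

# Interim client-wiki -> repository mapping, kept in the bot configuration
# until the API exposes this information (bug 41347).
REPO_API = {
    "default": "http://wikidata-test-repo.wikimedia.de/w/api.php",
    "hewiki":  "https://www.wikidata.org/w/api.php",  # assumed future value
}

def repo_api_for(dbname):
    # Return the repository api.php URL that a client wiki's sitelinks live on.
    return REPO_API.get(dbname, REPO_API["default"])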


Merlissimo



Re: [Wikidata-l] Some questions

2012-10-24 Thread Luca Martinelli
2012/10/23 Lydia Pintscher lydia.pintsc...@wikimedia.de:
 On Mon, Oct 22, 2012 at 6:38 PM, Luca Martinelli
 martinellil...@gmail.com wrote:
 Hi there, I got a number of questions. Hope not to bother.

 1) Has the testing of phase 1 already started on hu.wp?

 No.

:(

When will it, then?

 2) Will phase 1 also be expanded to ns ≠ 0 (e.g. templates,
 categories, help pages...) and to other projects (e.g. Wikisource,
 Wikiquote...)?

 The first already works on the demo system.
 As for other projects: Let's get this working on Wikipedia first.
 We're not even there yet ;-) Technically more is possible.

Ok, perfect. :) This is actually a question coming from
it.wikisource's admins; they're really into the project too, but they
were a bit disappointed by the fact that this is still
Wikipedia-only at the moment.

 3) When will it be possible to test phase 2 on Wikidata-test-repo?

 The first parts of phase 2 are already on the test system. More will
 come over the next weeks.

Good, can't wait to test them. :)))

 4) Is there already a script for testing the automatic upload of
 interlinks? I'd like to run some tests with my bot.

 What do you want to test exactly?

I would like to test the PyWikidata thing, but we'll discuss it later
on IRC. I already saw Amir's (Aharoni, not Amir1) explanation, but at
the moment I have no time to reply. I will (hopefully) in the next
hour.

Thanks again for your kindness. :)

-- 
Luca Sannita Martinelli
http://it.wikipedia.org/wiki/Utente:Sannita



Re: [Wikidata-l] features needed by langlink bots (was Re: update from the Hebrew Wikipedia)

2012-10-24 Thread Amir Ladsgroup
Is it helpful?
http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata

On 10/24/12, Merlissimo m...@toolserver.org wrote:
 Just to answer these questions for my _java_ interwiki bot MerlIwBot:

 On 12.10.2012 16:45, Amir E. Aharoni wrote:

 Will the bots be smart enough not to do anything to articles that are
 already listed in the repository and have the correct links displayed?

 Will the bots be smart enough not to do anything with articles that
 have interwiki conflicts (multiple links, non-1-to-1 linking etc.)?


 Yes, because this is Wikidata-independent for my Java bot and already
 worked before, so running my bot is always safe on these wikis.
 Non-1-to-1 langlinks can be partly moved to Wikidata because mixing of
 Wikidata and local langlinks is possible. But please note that since
 change https://gerrit.wikimedia.org/r/#/c/25232/ , which is live with
 1.21wmf2, multiple langlinks are no longer displayed.

 Will the bots be smart enough to update the repo in the transition
 period, when some Wikipedias have Wikidata and some don't?

 My bot can update the repository, but doing so will cause a lot of load
 on local wikis. That's because there is currently no way to tell whether
 a langlink is stored locally or at Wikidata. Local langlinks can be
 stored on the main page or on any transcluded page, so the whole source
 text of every transcluded page must be checked first.
 I created a feature request which could solve this problem:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=41345 . I hope this will
 be added before Wikidata goes live.

 To update the repository, my bot needs to know the corresponding
 repository script URL for a local wiki. Currently
 http://wikidata-test-repo.wikimedia.de/w/api.php is hard-coded in my bot
 framework, but this repository URL will change for hewiki. There is
 currently no way to request this information via the API. I also created
 a feature request for this:
 https://bugzilla.wikimedia.org/show_bug.cgi?id=41347

 Merlissimo




-- 
Amir



Re: [Wikidata-l] Some questions

2012-10-24 Thread Lydia Pintscher
On Wed, Oct 24, 2012 at 2:04 PM, Luca Martinelli
martinellil...@gmail.com wrote:
 1) Has the testing of phase 1 already started on hu.wp?

 No.

 :(

 When will it, then?

I'd love to tell you more than "soon" but unfortunately I can't :/
I'll send an email here as soon as it's happening.


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Community Communications for Wikidata

Wikimedia Deutschland e.V.
Obentrautstr. 72
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
(Society for the Promotion of Free Knowledge)

Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/681/51985.



Re: [Wikidata-l] Some questions

2012-10-24 Thread Luca Martinelli
Ok, just one more thing I just remembered: what if I find some data
for a potentially relevant article, but there is no article on any
project at all?

I mean, is it possible to create the Wikidata entry first and then the
Wikipedia article?

L.

2012/10/24 Lydia Pintscher lydia.pintsc...@wikimedia.de:
 On Wed, Oct 24, 2012 at 2:04 PM, Luca Martinelli
 martinellil...@gmail.com wrote:
 1) Has the testing of phase 1 already started on hu.wp?

 No.

 :(

 When will it, then?

 I'd love to tell you more than "soon" but unfortunately I can't :/
 I'll send an email here as soon as it's happening.


 Cheers
 Lydia



[Wikidata-l] New release of Pywikidata: feature requests?

2012-10-24 Thread Joan Creus
Hi all,

I've released a new version of Pywikidata @
https://github.com/jcreus/pywikidata (not really a release, since there is
no version numbering; I should begin, maybe...).

New features include:

   - Only the properties which have changed are pushed to the server. This
   makes it much faster, and it is the way it should be (according to the
   spec); a rough sketch of this diff-based approach is below.
   - 'uselang' is no longer accepted, per spec.
   - The bot flag is now supported.
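
A rough sketch of the diff-based push mentioned in the first point
(illustrative only, not Pywikidata's real internals; the flat entity
layout here is made up):

def changed_properties(original, current):
    # Return only the entries that differ from the copy originally
    # fetched from the repository.
    return {key: value for key, value in current.items()
            if original.get(key) != value}

# Illustrative snapshots: what was fetched vs. what the bot edited.
fetched = {"label_en": "Berlin", "sitelink_dewiki": "Berlin"}
edited  = {"label_en": "Berlin", "sitelink_dewiki": "Berlin",
           "sitelink_itwiki": "Berlino"}

# Only {'sitelink_itwiki': 'Berlino'} needs to be sent, not the whole entity.
print(changed_properties(fetched, edited))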

Right now there's some more stuff left to do (including adapting to changes
in the API, and adding the claim system [still beta]), but are there any
feature requests for something which would be useful for bots?

Regarding Pywikipediabot integration, I think someone is working on it
(thanks!), but I think it would be better to wait. Pywikipedia is a mature
project, while Pywikidata is still constantly evolving (mostly due to API
changes, which break it). So I'd wait until Pywikidata has matured too and
Wikidata is deployed to WMF wikis.

By the way, pull requests will be gladly accepted :).

Joan Creus (joancreus@freenode, Joancreus@wikipedia)