Sumana suggested I forward this e-mail to wikidata-l, so I did :)
-- Forwarded message --
From: Amir Ladsgroup ladsgr...@gmail.com
Date: Fri, 19 Oct 2012 03:18:57 +0330
Subject: Wikidata bug (or semi-bug ;)
To: wikitec...@lists.wikimedia.org
Hello, I'm working on running PWB.
Is there anybody who can help me?
On 10/19/12, Amir Ladsgroup ladsgr...@gmail.com wrote:
Is it helpful?
http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata
On 10/24/12, Merlissimo m...@toolserver.org wrote:
Just to answer these questions for my _java_ interwiki bot MerlIwBot:
On 12.10.2012 at 16:45, Amir E. Aharoni wrote:
Will the bots be smart enough not to do anything
wikidata.api.getItemById or
wikidata.api.getItemByInterwiki, or whatever.
Just some thoughts.
Joan Creus.
2012/10/25 Amir Ladsgroup ladsgr...@gmail.com
Lydia is right. I changed PWB to make it run on Wikidata.
Now I can write code that integrates data from Wikipedia and adds it to
Wikidata.
The old code needed an ID, but the new one doesn't. I tried this and it worked:
site = wikipedia.getSite('wikidata', fam='wikidata')
page = wikipedia.Page(site, 'Helium')
page.put(u'', u'BOT: TESTING BOO', wikidata={'type': u'sitelink',
'site': 'de', 'title': 'BAR'})
Are you sure you are using the updated code?
On
Why does wbgetentity not exist in wikidata.org's API? Bots have problems now.
http://wikidata.org/w/api.php
On Tue, Oct 30, 2012 at 9:45 PM, Gregor Hagedorn g.m.haged...@gmail.com wrote:
Great work, my congratulations!
---
Some first impressions:
Changing the
On Thu, Nov 22, 2012 at 10:43 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Thu, Nov 22, 2012 at 8:11 PM, Sven Manguard svenmangu...@gmail.com
wrote:
I'm getting contradictory messages from Wikidata staff then. I mean we
already knew that we could, the issue is whether or not
Finally the bug is solved. You can get and set items on Wikidata via PWB.
At first you must define a wikidataPage, and these are the methods you can
use. I'll try to expand it and make more methods, but I don't know many, so
your help will be very useful:
Supports the same interface as Page, with the
Thanks Amir and Lydia. The RTL bugs were really annoying.
On 12/11/12, Lydia Pintscher lydia.pintsc...@wikimedia.de wrote:
On Mon, Dec 10, 2012 at 10:37 PM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
Yay, all my right-to-left fixes are live :)
Thank you!
Thank you for writing them!
There is no need to use q### anymore.
For other languages it's not possible for now, but I'm going to add it soon.
On 12/11/12, Luca Martinelli martinellil...@gmail.com wrote:
2012/12/10 Amir Ladsgroup ladsgr...@gmail.com:
Finally the bug is solved. You can get and set items on Wikidata via PWB.
What is the exact time of the next deployment (it and he)?
And when do you think is the best time to disable the interwiki bots?
On Mon, Jan 28, 2013 at 2:12 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Mon, Jan 28, 2013 at 10:16 AM, Jan Kučera kozuc...@gmail.com wrote:
How is the Hungarian
Congrats. Are the iw bots working correctly? I made r11016 in pwb to disable
them, but I'm not sure it's working.
On Thursday, January 31, 2013, Marco Fleckinger
marco.fleckin...@wikipedia.at wrote:
Hey,
congratulations on this important step.
With this I may also use it though Italian is the only
I did this:
https://www.mediawiki.org/wiki/Special:Code/pywikipedia/11073
so updated bots are not a concern anymore :)
BTW:congrats.
On Thu, Feb 14, 2013 at 1:11 AM, Katie Chan k...@ktchan.info wrote:
On 13/02/2013 21:31, Denny Vrandečić wrote:
Do you have examples of that? It did not happen to my
When?
On Thu, Feb 14, 2013 at 2:08 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Thu, Feb 14, 2013 at 11:34 AM, Yuri Astrakhan
yuriastrak...@gmail.com wrote:
I guess it only makes sense if the deployment will be a few languages at a
time. If at some point you will start
I want to add that it's now possible to import another wiki's articles via
pywikipediabot:
https://www.mediawiki.org/wiki/Special:Code/pywikipedia/11103
More info and examples:
http://www.wikidata.org/wiki/Wikidata:Requests_for_permissions/Bot/Dexbot
The code is very simple:
# -*- coding: utf-8 -*-
Hello,
I made commits r11209-11212 to make it easier to change claims via the API.
You can see the manual here:
http://www.mediawiki.org/wiki/Manual:Pywikipediabot/Wikidata#Changing_or_creating_claims.2Fstatements
You can simply call and change properties and give them values.
I tested it on two
Guys!
You can continue this conversation in a more public place like WD:PC.
It's bothersome for people like me to receive an e-mail every five minutes
on a topic I'm not interested in.
So please continue this somewhere else.
On 5/7/13, Jane Darnell jane...@gmail.com wrote:
What is interesting
In Persian Wikipedia, we solved almost all of the iw conflicts, out of two or
three thousand conflicts.
Best regards
On Jun 18, 2013 12:42 AM, יונה בנדלאק bende...@walla.com wrote:
I was looking for a list with old interwikis. Now I have one, I've started working on it, and
I hope to get some help from Hebrew
Hello,
Yes, that WAS a parsing error. I noticed it (someone reported a similar
error on my bot's talk page a while ago), I fixed it, and I'm writing the
correcting bot. Because I have put some quality control units(!)
in my code, the death place has to be a geographical feature
(P107=618123) or doesn't
I fixed all of the mistakes (they were just 7 edits):
https://www.wikidata.org/w/index.php?title=User:Dexbot/Possible_mistakes&action=history
Best
Two questions:
*Why don't you add Wikiquote? It's pretty similar to Wikipedia in
concept (you don't have to make so many new items when you add
Wikiquote as a client of Wikidata).
*Wikisource is very different in interwiki mapping: you can map an
item in Wikisource to several items in another
On Sun, Nov 3, 2013 at 11:32 PM, Daniel Kinzler daniel.kinz...@wikimedia.de
wrote:
On 03.11.2013 at 19:59, Federico Leva (Nemo) wrote:
*Wikisource is very different in interwiki mapping: you can map an
item in Wikisource to several items in another language (we have an
open bug in
Hi, you can use this:
https://www.wikidata.org/w/index.php?title=Special%3AAllPages&from=&to=&namespace=120
On Fri, Feb 14, 2014 at 10:46 AM, Jeff Thompson j...@thefirst.org wrote:
Hello.
Is there a way to see an automatically-generated list of properties with
their name and data type?
(I
represent a very tiny percentage of the total...
Cheers,
Micru
On Sun, Apr 27, 2014 at 12:28 PM, Amir Ladsgroup ladsgr...@gmail.com wrote:
There are some problems in using the bio template; for example, they used it
for a group of people:
https://it.wikipedia.org/wiki/Fratelli_Wright
On Sun, Apr
I started my bot https://www.wikidata.org/wiki/Special:Contributions/Dexbot on:
P31 (instance of), P21 (gender), P19 (place of birth), and P20 (place
of death)
I also wrote the code to import dates of birth and death but I'm not
running it yet because there is one important question: What is the
Do you know why this edit isn't shown correctly?
https://www.wikidata.org/w/index.php?title=Q4119465&diff=123932128&oldid=123931985
Best
On Tue, Apr 29, 2014 at 9:34 PM, Daniel Kinzler daniel.kinz...@wikimedia.de
wrote:
On 29.04.2014 at 17:25, David Cuenca wrote:
Is it possible to have just
this problem is being tracked in
https://bugzilla.wikimedia.org/show_bug.cgi?id=60999
Best
On Wed, Apr 30, 2014 at 8:20 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Wed, Apr 30, 2014 at 11:45 AM, Amir Ladsgroup ladsgr...@gmail.com
wrote:
Do you know why this edit isn't
Thank you, let me check
On 7/23/14, Tom Morris tfmor...@gmail.com wrote:
On Wed, Jul 23, 2014 at 7:03 AM, Amir Ladsgroup ladsgr...@gmail.com
wrote:
but since the file is huge, it causes me lots of problems when I want to
check it myself and answer these questions :(
curl http
I always get the list and run my labeler bot on it; it adds labels
where a sitelink exists. I can work on it and make it better to fix
more labels.
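The heuristic described here (fill a missing label from the item's sitelink title) can be sketched roughly as follows. The function name, the `dbname` layout, and the disambiguator-stripping rule are my own illustrative assumptions, not the bot's actual code:

```python
def labels_from_sitelinks(labels, sitelinks):
    """Return a copy of `labels` with missing languages filled in from
    Wikipedia sitelink titles (hypothetical helper, mirroring the
    heuristic described above)."""
    new_labels = dict(labels)
    for dbname, title in sitelinks.items():
        if not dbname.endswith('wiki'):
            continue  # only consider Wikipedia sitelinks, e.g. 'dewiki'
        lang = dbname[:-len('wiki')].replace('_', '-')
        if lang in new_labels:
            continue  # never overwrite an existing label
        # strip a trailing disambiguator like "Foo (writer)"
        clean = title.split(' (')[0].strip()
        if clean:
            new_labels[lang] = clean
    return new_labels
```

A real run would of course go through the Wikidata API; this only shows the label-filling logic.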
Best
On 8/14/14, Lukas Benedix lukas.bene...@fu-berlin.de wrote:
Hi,
I found ~16.000 Items without any label. I have no idea how it's
I started this report, you can find it here
https://www.wikidata.org/wiki/User:Ladsgroup/Birth_date_report2.
Best
On Tue, Aug 19, 2014 at 3:27 PM, Amir Ladsgroup ladsgr...@gmail.com wrote:
It's possible and rather easy to add them: just several regexes and a list
of months in that language
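The "regexes plus a month list per language" idea can be sketched like this; the month tables and the date format handled are illustrative assumptions, not the report's actual parser:

```python
import re

# Hypothetical month tables; only the pattern matters, not completeness.
MONTHS = {
    'en': ['January', 'February', 'March', 'April', 'May', 'June', 'July',
           'August', 'September', 'October', 'November', 'December'],
    'de': ['Januar', 'Februar', 'März', 'April', 'Mai', 'Juni', 'Juli',
           'August', 'September', 'Oktober', 'November', 'Dezember'],
}

def parse_birth_date(text, lang):
    """Return (year, month, day) from a '12 May 1982'-style string,
    or None if no date is found."""
    months = MONTHS[lang]
    pattern = r'(\d{1,2})\.?\s+(%s)\s+(\d{3,4})' % '|'.join(months)
    m = re.search(pattern, text)
    if not m:
        return None
    return int(m.group(3)), months.index(m.group(2)) + 1, int(m.group(1))
```

Adding a new language then really is just another month list (plus any extra date formats that language uses).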
).
Wikidata is less precise than Wikipedia here, but not actually wrong.
Maybe these cases should be treated separately from the potential errors.
Cheers,
Magnus
On Wed, Aug 20, 2014 at 3:54 PM, Amir Ladsgroup ladsgr...@gmail.com
wrote:
I started this report, you can find it here
https
/User:Ladsgroup/Birth_date_report2.3 Yes No
No
Best
On Wed, Aug 20, 2014 at 8:00 PM, Amir Ladsgroup ladsgr...@gmail.com wrote:
Yes, I have two options: I can just skip them, or I can mark them in
another color like pink or something else.
What do you think? And what color, if you recommend the latter?
Hey,
It's pywikibot: https://www.mediawiki.org/wiki/PWB
Both branches support Wikidata
Best
On 8/29/14, Benjamin Good ben.mcgee.g...@gmail.com wrote:
Which python framework should a new developer use to make a wikidata
editing bot?
thanks
-Ben
--
Amir
Yes, Core has better support
Best
On 8/29/14, Maarten Dammers maar...@mdammers.nl wrote:
Don't use compat, use core.
/before/
starting the removal.
On 02/09/2014 at 09:09, Amir Ladsgroup wrote:
Hey,
My bot finished the initial part of removing all Link GA and Link FA
templates in the nl, sv, and pl wikis (and some other wikis). I just removed badges that
are already in Wikidata. My bot cleaned up more than 12/13 of the articles
I can connect all of them by bot but I'm not sure it should be done
automatically.
Happy birthday Wikidata :)
On 10/29/14, James Forrester jdforres...@gmail.com wrote:
On Wed Oct 29 2014 at 10:56:42 Denny Vrandečić vrande...@google.com
wrote:
There’s a small tool on WMF labs that you can use
congrats Lydia :)
On Wed, Nov 5, 2014 at 1:23 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
Hey everyone,
Wow! We won! \o/ This is incredible. I am so happy to see this
recognition of all the work we've put into Wikidata together. Magnus
and I had a blast at the award
at 14:18, Amir Ladsgroup ladsgr...@gmail.com wrote:
I'm writing a parser so I can feed gender classification to Kian. It'll be
done soon and you can use it :)
On Sat, Mar 14, 2015 at 12:53 PM Sjoerd de Bruin sjoerddebr...@me.com
wrote:
Hm, the Wikidata Game is really
I just published the code https://github.com/Ladsgroup/Kian
I really appreciate any comments or changes on the code.
On Sun, Mar 15, 2015 at 2:30 PM Amir Ladsgroup ladsgr...@gmail.com wrote:
I'm cleaning up and pep8ifying the code to publish it.
On Sat, Mar 14, 2015 at 8:42 PM Cristian
One mistake I just found via the report: https://www.wikidata.org/wiki/Q2963097.
The article in French Wikipedia is about a French type of cheese but is
connected to an article in Russian Wikipedia about a French playwright.
Best
On Fri, Mar 20, 2015 at 3:59 AM Amir Ladsgroup ladsgr...@gmail.com wrote
Try to download it, or change the character encoding to utf-8 or unicode.
And yes it's based on dumps. :)
On Fri, Mar 20, 2015 at 3:51 AM Ricordisamoa ricordisa...@openmailbox.org
wrote:
On 20/03/2015 at 01:11, Amir Ladsgroup wrote:
OK, I have some news:
1- Today I rewrote some parts
Hey,
On Mon, Mar 9, 2015 at 5:50 AM, Tom Morris tfmor...@gmail.com wrote:
On Sun, Mar 8, 2015 at 7:34 AM, Amir Ladsgroup ladsgr...@gmail.com
wrote:
This is the result for German Wikipedia:
... so I got a list of articles in German Wikipedia that don't have an item
in Wikidata. There were 16K
Result for English Wikipedia (6366 articles classified as human)
https://tools.wmflabs.org/dexbot/kian_res_en.txt
...@mdammers.nl wrote:
Hi Amir,
Amir Ladsgroup wrote on 9-3-2015 at 22:40:
Result for English Wikipedia (6366 articles classified as human)
https://tools.wmflabs.org/dexbot/kian_res_en.txt
Sounds like fun! Can you run it on the Dutch Wikipedia too? On
https://tools.wmflabs.org
/wikidata-game/
Some may require text parsing. Not sure how to get that working; haven't
spent much time with (artificial) neural nets in a while.
On Sat, Mar 7, 2015 at 12:36 PM Amir Ladsgroup ladsgr...@gmail.com wrote:
Some useful tasks that I'm looking
do
that? It's because of the huge set of data (training set) we already have and
neural network algorithms.
Best
Best,
Eric
https://www.wikidata.org/wiki/User:Emw
1. Prometheus. https://www.wikidata.org/wiki/Q590010
On Sat, Mar 7, 2015 at 1:44 PM, Amir Ladsgroup ladsgr...@gmail.com
wrote
are suspicious to merge?
Maybe it would be easy to start merging them.
Best
On Sat, Mar 7, 2015 at 12:36 PM Amir Ladsgroup ladsgr...@gmail.com
wrote:
Some useful tasks that I'm looking for a way to do are:
*Anti-vandal bot (or how we can quantify an edit).
*Auto labeling for humans (That's
In technical terms, a machine which uses forward and backward
propagation to make an approximate prediction [1] is called a neural
network, whether I agree or not.
BTW: I use BFGS, not gradient descent.
[1]: https://en.wikipedia.org/wiki/Artificial_neural_network
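For intuition about the distinction being drawn: plain gradient descent steps along the negative gradient with a fixed learning rate, while BFGS additionally builds an approximation of the curvature. A toy gradient-descent sketch (not Kian's code; the function and values are illustrative):

```python
def gradient_descent(grad, x0, lr=0.1, steps=200):
    """Minimize a 1-D function by repeatedly stepping against its
    gradient. This is the simple method BFGS improves on."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)
    return x

# minimize f(x) = (x - 3)^2, whose gradient is 2*(x - 3)
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
```

BFGS would typically be used through a library (e.g. an optimizer that accepts the same loss and gradient) rather than written by hand.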
On Sat,
Hey,
I spent the last few weeks working on this quietly [1] and now it's ready to
work!
Kian is a three-layered neural network with a flexible number of inputs and
outputs. So if we can parametrize a job, we can teach him easily and get
the job done.
For example, as the first job, we want to add
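A three-layered network of the kind described amounts to one hidden layer between inputs and outputs. A minimal forward-pass sketch (layout, weight shapes, and names are my own assumptions; Kian's actual code is what matters):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(inputs, w_hidden, w_out):
    """One forward pass of a three-layer net: input -> hidden -> output.
    w_hidden[j] holds the weights from every input to hidden unit j
    (last entry is the bias); w_out is laid out the same way."""
    hidden = [sigmoid(sum(w * x for w, x in zip(row, inputs + [1.0])))
              for row in w_hidden]
    return [sigmoid(sum(w * h for w, h in zip(row, hidden + [1.0])))
            for row in w_out]
```

The "flexible number of inputs and outputs" simply means the weight matrices can have any shape, so any job that can be encoded as feature vectors fits.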
Some useful tasks that I'm looking for a way to do are:
*Anti-vandal bot (or how we can quantify an edit).
*Auto labeling for humans (That's the next task).
*Add more :)
On Sat, Mar 7, 2015 at 3:54 PM, Amir Ladsgroup ladsgr...@gmail.com wrote:
Hey,
I spent last few weeks working
with categories of humans in
them.
Best
On Sun, Mar 8, 2015 at 3:07 AM, Amir Ladsgroup ladsgr...@gmail.com wrote:
On Sat, Mar 7, 2015 at 9:19 PM, Jeroen De Dauw jeroended...@gmail.com
wrote:
Hey,
Yay, neural nets are definitely fun! Am I right in understanding this is
a software you created
, 2015 at 2:58 PM Amir Ladsgroup ladsgr...@gmail.com wrote:
Sure, tonight it will be done.
Best
On Thu, Mar 12, 2015 at 2:08 AM, Sjoerd de Bruin sjoerddebr...@me.com
wrote:
I'm ready for it! All existing humans on nlwiki have a gender now, so
it's easy to review this batch. Bring it on.
Op
lists (Lijst van voorzitters van de SER and Lijst van
voorzitters van de WRR) and a music group (Viper (Belgische danceact)).
Will play the gender game the next few days to check them.
Greetings,
Sjoerd de Bruin
sjoerddebr...@me.com
On 14 Mar 2015, at 00:51, Amir Ladsgroup ladsgr
Hello,
I started a bot for auto-transliterating names of humans, initially with
Persian and English (as a pair) since I know both and can debug. After
some modifications, in the last check of more than several hundred
edits, I couldn't find any errors. I want to expand this bot to
Firstly, note the name of the thread: it's about transliterating names of
humans, not transliterating in general, so translation doesn't make sense
at all in this case.
Secondly, I can transliterate names of Chinese people to Dutch or other
Latin-script languages too; it will work well and it will have
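The core of such a bot is a per-language-pair character mapping. The table below is a toy assumption, far cruder than the real bot (Persian transliteration needs context-dependent rules, especially for short vowels that aren't written):

```python
# Toy Persian -> Latin character map; illustrative only.
FA_TO_LATIN = {
    'آ': 'a', 'ا': 'a', 'ب': 'b', 'پ': 'p', 'ت': 't', 'ر': 'r',
    'س': 's', 'ک': 'k', 'م': 'm', 'ی': 'i', 'ن': 'n', 'د': 'd',
    ' ': ' ',
}

def transliterate(name, table=FA_TO_LATIN):
    """Character-by-character transliteration; unknown characters
    are kept as-is so nothing is silently lost."""
    return ''.join(table.get(ch, ch) for ch in name)
```

Expanding to a new pair would mean supplying another table (and pair-specific rules) rather than changing the bot's logic.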
Unconnected pages should be rewritten to use the QueryPage class so that 1) it can
use better caching, and 2) it can send results to the API (which it currently
can't, and that's why we don't have Special:Unconnected support in Pywikibot).
I started to do this a while ago, but I got busy, so I deferred this CS