Bináris wikipo...@gmail.com wrote:
2013/1/28 Amir Ladsgroup ladsgr...@gmail.com
What is the exact time of the next deployment (it and he)?
If you want to catch it, join #wikimedia-wikidata on IRC. It was great to follow it on D-day!
And what time do you think is best to disable interwiki
Hor Meng Yoong yoon...@gmail.com wrote:
On Mon, Jan 28, 2013 at 10:52 PM, Katie Filbert
katie.filb...@wikimedia.de wrote:
PostgreSQL is really not supported at all yet for Wikibase. MediaWiki core support for Postgres is also currently broken, while patches for sites schema updates,
Why not just block the bots on wikis that use wikidata?
On Tue, Jan 29, 2013 at 6:51 AM, Bináris wikipo...@gmail.com wrote:
2013/1/28 Amir Ladsgroup ladsgr...@gmail.com
What is the exact time of the next deployment (it and he)?
If you want to catch it, join #wikimedia-wikidata on IRC. It was
So are the same bots doing different things? I seem to remember there was
one giant toolserver pybot instance doing only interwiki.
On Tue, Jan 29, 2013 at 9:17 AM, Nikola Smolenski smole...@eunet.rs wrote:
On 29/01/13 10:02, Magnus Manske wrote:
Why not just block the bots on wikis that use
AFAIK there are many iw-bots and several others.
I don't think there are iw-bots doing anything else as well; it would not be very useful now. But actually I don't know. If there are, blocking those specific ones might be a good idea.
Marco
Magnus Manske magnusman...@googlemail.com
On 29/01/13 10:28, Magnus Manske wrote:
So are the same bots doing different things? I seem to remember there
was one giant toolserver pybot instance doing only interwiki.
OTOH, yes, I believe there are bots doing only interwikis that could
probably be blocked. But isn't anyone from Hungarian
Spin off from the Phase 1 thread.
2013/1/29 Magnus Manske magnusman...@googlemail.com:
Why not just block the bots on wikis that use wikidata?
This looks like the right thing to me, but I don't want to be too rude
to the bot operators and I do want the bots to keep doing useful
things.
Imagine
On Tue, Jan 29, 2013 at 10:53 AM, Amir E. Aharoni
amir.ahar...@mail.huji.ac.il wrote:
Spin off from the Phase 1 thread.
2013/1/29 Magnus Manske magnusman...@googlemail.com:
Why not just block the bots on wikis that use wikidata?
This looks like the right thing to me, but I don't want to be
Ahh, I see that there was no response to Denny's question about wikidata stats.
I took a look in one of the hourly stats files with this:
curl http://dumps.wikimedia.org/other/pagecounts-raw/2013/2013-01/pagecounts-20130128-150001.gz | zcat - | grep wikidata
It does appear that wikidata is
On 29.01.2013 11:35, Ed Summers wrote:
It does appear that wikidata is showing up in there, but it's just one line:
undefined//www.wikidata.org/w/api.php 8 50103
It would be nice to correct the 'undefined' so that it was something
like 'wd'. Also, it's too bad that we don't actually
2013-01/pagecounts-20130129-11.gz | zcat - | egrep ' Q\d+ '
de Q10 3 17060
de Q7 1 20607
en Q1 4 26849
en Q10 2 16419
en Q100 2 15580
en Q106 1 8122
en Q17 1 9697
en Q2 9 45346
en Q22 1 7835
en Q3 1 8520
en Q35 1 377
en Q374 3 21466
en Q4 9 57882
en Q400 1 34656
en Q6700 1 29519
en Q711 1 7148
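For anyone who wants to slice these hourly files themselves, here is a minimal Python sketch, assuming only the four-field format visible above (project, title, request count, bytes transferred); the file name is just an example of the hourly dumps:

import gzip
import re

# Scan a downloaded pagecounts-raw hourly file for Wikidata-style item
# titles (Q followed by digits). Each line has four whitespace-separated
# fields: project, title, request count, bytes transferred.
def wikidata_item_counts(path):
    item = re.compile(r"^Q\d+$")
    with gzip.open(path, "rt", encoding="utf-8", errors="replace") as f:
        for line in f:
            fields = line.split()
            if len(fields) == 4 and item.match(fields[1]):
                project, title, requests, size = fields
                yield project, title, int(requests), int(size)

# Example file name; substitute whichever hourly dump you fetched.
for row in wikidata_item_counts("pagecounts-20130129-110000.gz"):
    print(*row)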
On 29.01.2013 13:09, Ed Summers wrote:
Ok. From looking very quickly at pollForChanges I guess the polling
doesn't use the API either? Does that mean that users of Wikidata who
want to keep up to date with changes need to be hosted in the
Wikimedia datacenter and granted read access to the
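To make the question concrete: the standard MediaWiki API does expose recent changes to any client, so API-based polling is at least possible in principle. A rough Python sketch (this is not the actual pollForChanges script, and the interval is an arbitrary assumption):

import time
import requests

API = "https://www.wikidata.org/w/api.php"

def poll_recent_changes(interval=60):
    seen = set()  # rcids already printed; grows unboundedly in this sketch
    while True:
        data = requests.get(API, params={
            "action": "query",
            "list": "recentchanges",
            "rcprop": "title|ids|timestamp",
            "rclimit": "100",
            "format": "json",
        }).json()
        for change in data["query"]["recentchanges"]:
            if change["rcid"] not in seen:
                seen.add(change["rcid"])
                print(change["timestamp"], change["title"])
        time.sleep(interval)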
) but I don't see any wikidata pages being accessed, for example:
On Tue, Jan 29, 2013 at 7:14 AM, Daniel Kinzler
daniel.kinz...@wikimedia.de wrote:
3rd party clients which want to embed data from Wikidata, but cannot access the database directly, are not yet supported. We have designed the architecture in a way that should allow us to support them easily
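In the meantime, purely read-only embedding already works over the web API. A small sketch, assuming the wbgetentities module (Q64 is just an example item id):

import requests

resp = requests.get("https://www.wikidata.org/w/api.php", params={
    "action": "wbgetentities",
    "ids": "Q64",
    "props": "labels|sitelinks",
    "languages": "en|de",
    "format": "json",
})
entity = resp.json()["entities"]["Q64"]
print(entity["labels"]["en"]["value"])  # the English label of the item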
On Tue, Jan 29, 2013 at 7:29 AM, Denny Vrandečić
denny.vrande...@wikimedia.de wrote:
Are the numbers absolute, or a sample out of a thousand?
I believe they are absolute. I'll see if I can figure out what's going
on by asking over on the analytics list.
//Ed
(was Database used by wikidata)
We would LOVE for more developers or systems administrators to help
support MediaWiki core on other RDBMSes; one of the most helpful things
you can do is help look at the changes other developers are making that
might affect your preferred RDBMS, and provide
Hi!
At the moment, wikidata-test-client.wikimedia.de/wiki is configured to
act as Hebrew client to wikidata-test-repo.wikimedia.de/wiki. wmf8 can
be tested there until tomorrow's deployment. (Sorry for the remaining
auto-imported content.)
Best,
--
Silke Meyer
System administrator and
As owner of an interwiki bot I now see:
Disabling interwiki bots is about one row in the bot's source code, depending on how often owners update. So in 1-2 days many bots can be disabled; the others should be blocked for a while.
But there is a problem now - other wikis still use classic interwiki links and on
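The "one row" Jan mentions could look something like the following sketch; the wiki set is illustrative (the clients discussed in this thread), not an official list, and the function name is made up:

# Wikis whose language links are already served by Wikidata.
WIKIS_WITH_WIKIDATA = {"hu", "he", "it"}

def should_update_interwiki(site_code):
    # One-line guard: skip interwiki edits on Wikidata client wikis.
    return site_code not in WIKIS_WITH_WIKIDATA

print(should_update_interwiki("hu"))  # False: leave the links to Wikidata
print(should_update_interwiki("cs"))  # True: classic interwiki still needed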
On Tue, Jan 29, 2013 at 4:27 PM, Jan Dudík jan.du...@gmail.com wrote:
And on wikidata the data are outdated, because many new articles are created (moved and deleted) daily, but the most used platform - pywikipedia - is not ready yet for wikidata.
Depending on what you mean by 'ready', it is:
On Tue, Jan 29, 2013 at 7:51 PM, Samat sama...@gmail.com wrote:
I agree with you.
I am also waiting for somebody who can make pywiki compatible with wikidata. I have neither the time nor the knowledge for it, but I have a bot (at least on huwiki, not on wikidata) and I have access to the Hungarian
2013/1/29 Samat sama...@gmail.com:
On Tue, Jan 29, 2013 at 7:54 PM, Lydia Pintscher
lydia.pintsc...@wikimedia.de wrote:
On Tue, Jan 29, 2013 at 7:51 PM, Samat sama...@gmail.com wrote:
I agree with you.
I am also waiting for somebody who can make pywiki compatible with wikidata. I have
2013/1/29 Lydia Pintscher lydia.pintsc...@wikimedia.de
Have you looked at the link I posted? What exactly is missing for you
to do what you want to do?
As far as I can see, these are just code fragments, Lego elements to build something from, but they are not yet integrated into interwiki.py.
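One building block such an integration would need is a check whether a client page already has an item on the repo. A sketch using the wbgetentities module with sites/titles (huwiki/Budapest are example values):

import requests

def has_wikidata_item(site, title):
    data = requests.get("https://www.wikidata.org/w/api.php", params={
        "action": "wbgetentities",
        "sites": site,
        "titles": title,
        "props": "sitelinks",
        "format": "json",
    }).json()
    # Pages without an item come back flagged as "missing".
    return all("missing" not in e for e in data["entities"].values())

print(has_wikidata_item("huwiki", "Budapest"))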
2013/1/25 Daniel Kinzler daniel.kinz...@wikimedia.de
Hi!
I thought about the RDF export a bit, and I think we should break this up into several steps for better tracking. Here is what I think needs to be done:
Daniel,
I am replying on Wikidata-l, and adding Tpt (since he started working
Some of our insights into the SMW RDF export (which we found to be
difficult to configure and use):
1. Probably most relevant: total lack of support for xml:lang, which would have been essential for our purposes.
Wikidata should be planned with support for language in mind.
2. We also found that
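To illustrate the xml:lang point above, a small rdflib sketch of what language-tagged labels look like in an export; the entity URI scheme and item id are just examples:

from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDFS

WD = Namespace("http://www.wikidata.org/entity/")
g = Graph()
g.add((WD.Q64, RDFS.label, Literal("Berlin", lang="de")))
g.add((WD.Q64, RDFS.label, Literal("Berlin", lang="en")))
# In the RDF/XML serialization each label carries an xml:lang attribute.
print(g.serialize(format="xml"))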