On 26 April 2013 17:15, Daniel Kinzler daniel.kinz...@wikimedia.de wrote:
On 26.04.2013 16:56, Denny Vrandečić wrote:
The third party propagation is not very high on our priority list. Not because it is not important, but because there are things that are even more important - like getting …
On 26.04.2013 21:13, Sebastian Hellmann wrote:
Hi Daniel,
On 26.04.2013 18:01, Daniel Kinzler wrote:
You guys are the only reason the interface still exists :) DBpedia is the only (regular) external user (LuceneSearch is the only internal user). Note that there's nobody really …
On 04.05.2013 12:05, Jona Christopher Sahnwaldt wrote:
On 26 April 2013 17:15, Daniel Kinzler daniel.kinz...@wikimedia.de wrote:
*internal* JSON representation, which is different from what the API returns, and may change at any time without notice.
Somewhat off-topic: I didn't know you have …
On 4 May 2013 17:12, Daniel Kinzler daniel.kinz...@wikimedia.de wrote:
On 04.05.2013 12:05, Jona Christopher Sahnwaldt wrote:
On 26 April 2013 17:15, Daniel Kinzler daniel.kinz...@wikimedia.de wrote:
*internal* JSON representation, which is different from what the API returns, and may change …
On 04.05.2013 19:13, Jona Christopher Sahnwaldt wrote:
We will produce a DBpedia release pretty soon; I don't think we can wait for the real dumps. The inter-language links are an important part of DBpedia, so we have to extract data from almost all Wikidata items. I don't think it's sensible …
Dear Jeremy,
please read the email from Daniel Kinzler on this list from 26.03.2013 18:26:
* A dispatcher needs about 3 seconds to dispatch 1000 changes to a client wiki.
* Considering we have ~300 client wikis, this means one dispatcher can handle about 4000 changes per hour.
* We currently have …
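The throughput figures in the bullets above can be sanity-checked with a little arithmetic. This is only a back-of-envelope sketch using the numbers as quoted (3 s per batch of 1000 changes per wiki, ~300 client wikis), not the actual dispatcher code:

```python
# Back-of-envelope check of the dispatcher throughput quoted above.
# Assumed inputs (from Daniel Kinzler's 26.03.2013 mail):
SECONDS_PER_BATCH = 3      # time to push one batch of 1000 changes to ONE wiki
CHANGES_PER_BATCH = 1000
CLIENT_WIKIS = 300

# One batch of 1000 changes must be delivered to all 300 wikis sequentially:
seconds_per_batch_all_wikis = SECONDS_PER_BATCH * CLIENT_WIKIS  # 900 s

# How many changes one dispatcher clears per hour:
changes_per_hour = CHANGES_PER_BATCH * 3600 // seconds_per_batch_all_wikis
print(changes_per_hour)  # → 4000
```

This matches the "about 4000 changes per hour" figure in the mail.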
Hi Daniel,
On Fri, Apr 26, 2013 at 6:15 PM, Daniel Kinzler daniel.kinz...@wikimedia.de
wrote:
On 26.04.2013 16:56, Denny Vrandečić wrote:
The third party propagation is not very high on our priority list. Not because it is not important, but because there are things that are even more …
On 26.04.2013 17:31, Dimitris Kontokostas wrote:
What we do right now in DBpedia Live is that we have a local clone of Wikipedia that gets in sync using the OAIRepository extension. This is done to abuse our local copy as we please.
It would be awesome if this Just Worked (tm) for …
Hello All,
I'm planning to write a proposal for the Wikidata to DBpedia project in GSoC 2013.
I've found on the change propagation page
http://meta.wikimedia.org/wiki/Wikidata/Notes/Change_propagation :
Support for 3rd party clients, that is, client wikis and other consumers outside of Wikimedia, is …
Hello Dimitris,
what do you think of that?
Shall I write this part as an abstract part in the proposal and wait for more details,
or could we have a plan similar to the one already implemented in DBpedia:
http://wiki.dbpedia.org/DBpediaLive#h156-3
thanks
regards
On Fri, Apr 26, 2013 at …
Well, PubSubHubbub is a nice idea. However, it clearly depends on two factors:
1. whether Wikidata sets up such an infrastructure (I need to check whether we have the capacity; I am not sure atm)
2. whether performance is good enough to handle high-volume publishers
Basically, polling to recent …
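The polling alternative alluded to above (a consumer repeatedly querying recent changes instead of receiving a push feed) can be sketched against the standard MediaWiki `recentchanges` API. The endpoint and query parameters below are the real MediaWiki API; the polling interval and helper names are illustrative assumptions, not a Wikidata recommendation:

```python
# Minimal sketch of polling Wikidata's recent changes via the MediaWiki API,
# as a consumer might do in the absence of a push (PubSubHubbub) feed.
import json
import time
import urllib.parse
import urllib.request

API = "https://www.wikidata.org/w/api.php"  # assumed target wiki

def recent_changes_url(since=None, limit=500):
    """Build a recentchanges query URL; `since` is an ISO 8601 timestamp."""
    params = {
        "action": "query",
        "list": "recentchanges",
        "rcprop": "title|ids|timestamp",
        "rclimit": str(limit),
        "format": "json",
    }
    if since:
        params["rcstart"] = since   # start listing from this timestamp...
        params["rcdir"] = "newer"   # ...going oldest-first, so we only see new changes
    return API + "?" + urllib.parse.urlencode(params)

def poll_forever(interval=60):
    """Poll loop: fetch changes, remember the newest timestamp, sleep, repeat."""
    since = None
    while True:
        with urllib.request.urlopen(recent_changes_url(since)) as resp:
            data = json.load(resp)
        changes = data["query"]["recentchanges"]
        if changes:
            since = changes[-1]["timestamp"]
            # ... hand the batch to the consumer (e.g. a DBpedia Live-style updater)
        time.sleep(interval)
```

The obvious drawback, and the performance concern raised above, is that every consumer polls independently, so the publisher pays the query cost per consumer rather than fanning out a single push.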