On 07/08/13 15:40, Mingli Yuan wrote:
Also, something similar to Magnus' Wiri: here is a bot we developed
on Sina Weibo (a Twitter-like microblogging provider in China):
* http://weibo.com/n/%E6%9E%9C%E5%A3%B3%E5%A8%98
We use a dataset from Wikidata with some dirty hacks. It is only a
Dear Adam,
thanks for the pointer. The paper gives an overview of how to design a
wiki-based data curation platform for a specific target community. Some
of the insights could also apply to Wikidata, while others won't transfer
(e.g., you cannot invite the Wikidata community for a
Hi Mingli,
thanks, this is very interesting, but I think I need a bit more context to
understand what you are doing and why.
Is your goal to create a library for accessing Wikidata from Clojure
(like a Clojure API for Wikidata)? Or is your goal to use logical
inference over Wikidata and you
to a
certain dialect.
See also: http://meta.wikimedia.org/wiki/Special_language_codes
Greetings -- Purodha
*Sent:* Sunday, 04 August 2013 at 19:01
*From:* Markus Krötzsch mar...@semantic-mediawiki.org
*To:* Federico Leva (Nemo) nemow...@gmail.com
*Cc:* Discussion list for the Wikidata project
On 04/08/13 13:17, Federico Leva (Nemo) wrote:
Markus Krötzsch, 04/08/2013 12:32:
* Wikidata uses be-x-old as a code, but MediaWiki messages for this
language seem to use be-tarask as a language code. So there must be a
mapping somewhere. Where?
Where I linked it.
Are you sure? The file you
or
traditional?]).
I invite any language experts to look at the file and add
comments/improvements. Some of the issues should possibly also be
considered on the implementation side: we don't want two distinct codes
for the same thing.
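For illustration only, here is a minimal sketch of the kind of mapping I have in mind (the names and the set of exceptions are hypothetical, not what the implementation actually contains):

    # Hypothetical mapping from the codes used in Wikidata to the codes
    # used by MediaWiki message files; be-x-old is the case discussed
    # above, the rest would have to come from the real exceptions file.
    WIKIDATA_TO_MEDIAWIKI = {
        "be-x-old": "be-tarask",
    }

    def mediawiki_code(wikidata_code):
        # Fall back to the identity mapping when no exception is known.
        return WIKIDATA_TO_MEDIAWIKI.get(wikidata_code, wikidata_code)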
Cheers,
Markus
On 04/08/13 16:35, Markus Krötzsch wrote:
Hi,
I am happy to report that an initial, yet fully functional RDF export
for Wikidata is now available. The exports can be created using the
wda-export-data.py script of the wda toolkit [1]. This script downloads
recent Wikidata database dumps and processes them to create RDF/Turtle
files.
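For anyone who just wants to see roughly what such an export looks like, here is a small, self-contained sketch (illustrative only; it is not the wda code, and the entity URI prefix is an assumption) that turns the labels of one entity from a JSON dump into Turtle using rdflib:

    # Illustrative sketch: export the labels of one Wikidata entity as
    # RDF/Turtle. The real wda-export-data.py handles the full data model
    # (statements, sitelinks, references, ...), not just labels.
    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import RDFS

    WD = Namespace("http://www.wikidata.org/entity/")  # assumed prefix

    def labels_to_turtle(entity_id, labels):
        g = Graph()
        g.bind("rdfs", RDFS)
        subject = WD[entity_id]
        for lang, text in labels.items():
            g.add((subject, RDFS.label, Literal(text, lang=lang)))
        return g.serialize(format="turtle")

    # Input shaped like the "labels" field of a dump entry:
    print(labels_to_turtle("Q42", {"en": "Douglas Adams", "de": "Douglas Adams"}))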
On 14/04/12 15:38, Gerard Meijssen wrote:
Hoi,
The Wikidata project is probably the software used by OmegaWiki, the
original Wikidata.
Ah, great, this completes the confusion :-D
Cheers,
Markus
On 14 April 2012 16:12, Jeroen De Dauw jeroended...@gmail.com
On 12/04/12 21:10, Daniel Kinzler wrote:
This is an interesting criticism, and there's an excellent retort by Denny in
the comments. Just fyi.
Thanks, very good discussion and very good answer by Denny. I should
have a chat with Mark at some point to check out what he thinks about it
(it is
Hi Andreas,
thanks for the input. I have drafted the current text about geo-related
datatypes, but I am far from being an expert in this area. Our mapping
expert in Wikidata is Katie (Aude), who has also been working with
OpenStreetMap, but further expert input on this topic would be quite
Martynas,
what you are proposing below is not W3C recommended RDF but an extension
of triples to quads. As far as I know, this extension is not yet compatible
with existing standards such as SPARQL and OWL. Named graphs work
with SPARQL, but are mostly used in a different way than you suggest.
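To make the difference concrete, here is a small sketch (illustrative only; the statement and URIs are made up) of a plain triple next to the same statement stored in a named graph, which is essentially the quad-style extension you describe:

    # Illustrative only: a plain RDF triple vs. the same statement in a
    # named graph (a quad). URIs and values are made up for the example.
    from rdflib import Dataset, Literal, URIRef

    s = URIRef("http://example.org/item/Q1")
    p = URIRef("http://example.org/prop/population")
    o = Literal(12345)

    ds = Dataset()
    # Plain W3C RDF: the triple lives in the default graph.
    ds.default_context.add((s, p, o))
    # Quad-style: the same triple inside a named graph, which SPARQL can
    # address via GRAPH, but which OWL reasoners do not interpret.
    named = ds.graph(URIRef("http://example.org/graph/statement-1"))
    named.add((s, p, o))

    print(ds.serialize(format="nquads"))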
On 04/04/12 23:23, Gregor Hagedorn wrote:
Wikidata can (and probably will) store information about each moon of
Uranus, e.g., its mass. It probably does not make sense to store the mass of
Moons of Uranus if there is such an article. It does not help to know that
the article Moons of Uranus also
In general, policies for notability in Wikidata will be governed by the
community of (all) Wikidata editors. On the technical side, we aim to
achieve two things:
* The system should be able to handle a lot of data.
* The interfaces and data access features should minimize the negative
impact