On 20.03.2015 19:08, Markus Kroetzsch wrote:
...
Feedback (and other interesting queries) are welcome :-)
Here's another nice query:
"Find all astronomical bodies on which Wikidata has some coordinates,
ordered by the number of coordinates that refer to them":
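(The query itself is not reproduced in this fragment. As a rough offline analogue, assuming simplified toy data rather than the real SPARQL, the grouping and ordering step looks like this; the QIDs below are placeholders.)

```python
from collections import Counter

# Toy records: for each coordinate statement in the dump, the item the
# coordinate refers to. In the real query this grouping is done in SPARQL
# with COUNT(...) GROUP BY ... ORDER BY DESC.
coordinate_targets = ["Q111", "Q2", "Q111", "Q405", "Q111", "Q2"]  # hypothetical QIDs

counts = Counter(coordinate_targets)
ranking = counts.most_common()  # bodies ordered by number of coordinates
print(ranking)  # -> [('Q111', 3), ('Q2', 2), ('Q405', 1)]
```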
--
Markus Kroetzsch
Faculty of Computer Science
Technische Universität Dresden
+49 351 463 38486
http://korrekt.org/
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l
currently non-existing countries?
Because I have such code in my Python bot:
http://paste.debian.net/162319/
And even with so many filters, there is a somewhat strange "Kingdom of the
Netherlands" item, which duplicates "Netherlands" but has only a few cities.
Dmitriy
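(The pasted bot code is not reproduced here. As a minimal sketch of one way such a filter could work, assuming a check for an end-time statement marks a country as no longer existing, one might skip items with any P582 ("end time") claim; the entity dicts below are toy stand-ins for parsed Wikidata JSON.)

```python
# Skip "countries" that carry an end-time (P582) statement, i.e. states
# that no longer exist. Toy data, not real dump records.

def has_end_date(entity):
    """Return True if any P582 (end time) statement is present."""
    return bool(entity.get("claims", {}).get("P582"))

def current_countries(entities):
    """Keep only entities without an end date."""
    return [e for e in entities if not has_end_date(e)]

entities = [
    {"id": "Q55", "labels": {"en": "Netherlands"}, "claims": {}},
    {"id": "Q33946", "labels": {"en": "Czechoslovakia"},
     "claims": {"P582": [{"mainsnak": {}}]}},
]

print([e["id"] for e in current_countries(entities)])  # -> ['Q55']
```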
On Fri, Mar 20, 20
FILTER NOT EXISTS { ?statement2 :P582q ?endDate2 }
}
} ORDER BY DESC(?neighbours)
Just to give an example of a slightly more complex query ;-) Note how we
use the expression (:P47s/:P47v) rather than :P47c to access the value
of potentially
before the official WMF service goes online.
Cheers,
Markus
Hi all,
I am happy to announce the release of Wikidata Toolkit 0.4.0 [1], the
Java library for programming with Wikidata and Wikibase. The main new
features are:
* Full support of a variety of new Wikidata features (including
statements on properties and new datatypes)
* More robust JSON pa
the empty reference problem (a statement has a reference which
contains no data whatsoever; look for: "snaks":[]). Maybe somebody wants
to look into this (this should not really happen, even if the JSON were
serialized correctly).
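A rough sketch of how one could scan entity JSON for such empty references (the toy entity below is simplified; in real dumps "snaks" is normally an object keyed by property ID, and the buggy case shows up as an empty array):

```python
import json

def empty_references(entity):
    """Yield (property, statement index) for every statement that has a
    reference whose snaks are completely empty ("snaks": [] or {})."""
    for prop, statements in entity.get("claims", {}).items():
        for i, st in enumerate(statements):
            for ref in st.get("references", []):
                if not ref.get("snaks"):
                    yield prop, i

# Toy entity with one empty and one non-empty reference.
entity = json.loads("""
{
  "id": "Q1",
  "claims": {
    "P31": [
      {"mainsnak": {}, "references": [{"snaks": []}]},
      {"mainsnak": {}, "references": [{"snaks": {"P248": [{}]}}]}
    ]
  }
}
""")

print(list(empty_references(entity)))  # -> [('P31', 0)]
```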
Cheers,
Markus
9:27 PM, Lukas Benedix
wrote:
I second this!
btw: what is the status of the problem with the missing dumps with
history? (latest available from November 2014)
Lukas
Am Do 26.02.2015 um 14:52 schrieb Markus Kroetzsch:
Hi,
It's that time of the year again when I am sending a remind
Hi Stas,
Since the JSON dumps and EntityData exports are (largely) free of
errors, there is already code for fixing this problem. Maybe we could
just use this.
Cheers,
Markus
On 27.02.2015 01:06, Stas Malyshev wrote:
Hi!
It's that time of the year again when I am sending a reminder that
ial:EntityData/Q3261.json
The JSON in the JSON dumps is the same.
Cheers,
Markus
[1] https://github.com/wmde/WikibaseDataModelSerialization/issues/77
[2]
http://dumps.wikimedia.org/wikidatawiki/20150207/wikidatawiki-20150207-pages-meta-current.xml.bz2
hat you would
have made some technical choices differently, but you should not be
frustrated because of that. We, too, have had many heated tech
discussions about Wikidata, and each of us has had to give up some
positions in the process.
Cheers,
Markus
at
https://meta.wikimedia.org/wiki/Wikidata/Archive/Wikidata/historical).
Anyway, our discussion was about the role of RDF, not about a
comprehensive history of Wikidata. Nevertheless, be assured that if
anybody would contest the contributions of OmegaWiki, I will react in a
similar fashion.
Markus
Dear Gerard:
...
This is the essence of Wikidata. After that we can all complain about
the fallacies of Wikidata... I have my pet peeves, and it is not your RDF,
SPARQL, and stuff. That is mostly stuff for academics, and its use is
largely academic and not useful on the level where I want progress
ta this would actually be. Which also
depends on whether we would omit descriptions in languages that can easily be
covered by language fallback (e.g. no separate descriptions in de-ch and de-at).
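(A minimal sketch of such a fallback lookup; the chains below are assumptions for illustration, mirroring the de-ch → de → en style of MediaWiki's language fallback, not the actual configured chains.)

```python
# Hypothetical fallback chains; MediaWiki defines the real ones.
FALLBACKS = {
    "de-ch": ["de-ch", "de", "en"],
    "de-at": ["de-at", "de", "en"],
    "de":    ["de", "en"],
}

def resolve_description(descriptions, lang):
    """Return the first description found along the fallback chain."""
    for code in FALLBACKS.get(lang, [lang, "en"]):
        if code in descriptions:
            return descriptions[code]
    return None

# An item with no de-ch description is still covered via "de".
descs = {"de": "deutsche Stadt", "en": "German city"}
print(resolve_description(descs, "de-ch"))  # -> 'deutsche Stadt'
```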
import that is generated automatically --
should we in all of these cases switch to offering a web service that
gives you the data if you really need it?
So, +1 for auto-generated descriptions, but -1 for not having them in
the data anymore.
Cheers,
Markus
org/wikidata-exports/miga/?props
ed"
in "Linked Open Data".
Suggestions for improvements and contributions on github are welcome.
Cheers,
Markus
[1] http://korrekt.org/page/Introducing_Wikidata_to_the_Linked_Data_Web
[2] https://www.mediawiki.org/wiki/Wikidata_Toolkit
Hi,
I got interested in subclass of (P279) and instance of (P31) statements
recently. I was surprised by two things:
(1) There are quite a lot of subclass of statements: tens of thousands.
(2) Many of them make a lot of sense, and (in particular) are not
(obvious) copies of Wikipedia categor
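(Counting such statements over a dump can be sketched roughly as follows; the entity dicts are toy stand-ins for parsed Wikidata JSON, and a real analysis would stream the full dump, e.g. with Wikidata Toolkit.)

```python
def count_property(entities, prop="P279"):
    """Count how many statements use the given property across entities."""
    return sum(len(e.get("claims", {}).get(prop, [])) for e in entities)

# Toy parsed-dump records.
entities = [
    {"id": "Q5",   "claims": {"P31": [{}], "P279": [{}, {}]}},
    {"id": "Q146", "claims": {"P279": [{}]}},
    {"id": "Q42",  "claims": {"P31": [{}]}},
]

print(count_property(entities, "P279"))  # -> 3
print(count_property(entities, "P31"))   # -> 2
```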
capabilities
that allow you to analyse the data. Again, your requirements are welcome.
Cheers,
Markus