On 08.01.2015 18:38, Denny Vrandečić wrote:
Yes, CC-BY is great.
Good. I have officially released the article text under this license now:
https://korrekt.org/page/Wikidata:_A_Free_Collaborative_Knowledgebase
Cheers,
Markus
On Thu Jan 08 2015 at 7:01:12 AM Markus Krötzsch
mar...@semantic
On 08.01.2015 15:10, ja...@j1w.xyz wrote:
Prior to viewing Markus Krötzsch's Wikidata page, I was unaware of the
Wikidata: A Free Collaborative Knowledgebase article [1] written by
Denny Vrandečić and Markus Krötzsch. This is a very helpful article
that in my opinion should be featured
On 08.01.2015 21:29, Thad Guidry wrote:
Hi Markus!
Yes, you and I are on the same page.
I do indeed get this impression ;-)
Yes, I know about the
Property-first view of Wikidata. No quibbles. But there is still an
issue with Assumptions for Country P17 being used for an instance of
Dear Thad,
The second part of your email has good points in it, too. As you say,
one must allow for adjustments in the intended meaning of a property in
real life, and adjusting too much could be dangerous. The method you
suggest (creating a new property and deprecating the old one, rather
On 08.01.2015 20:37, Thad Guidry wrote:
...
Right, Freebase would not stick a Property called Country right on an
instance of a Music Band. We would put Country under the Musical Group
type, and give it a better definition like "The nation or territory that
this item originated from."
Back to Denny's original question:
Does anybody see a specific danger of abuse if living people get to edit
their own data right now? Entering wrong claims deliberately would maybe
not be the biggest issue here (since it is already in conflict with
other general policies -- we do not want
on should point to Wikidata or to Wikimedia or something else. But
besides this minor point this seems to be a nice way to have COI
declarations in the data (would also be interesting to know which living
people have official Wikimedia accounts).
Cheers,
Markus
On 07.01.2015 15:25, Markus Krötzsch
P.S. I also should declare a COI on this discussion: I am Q18618630. --
Markus
On 07.01.2015 15:25, Markus Krötzsch wrote:
Back to Denny's original question:
Does anybody see a specific danger of abuse if living people get to edit
their own data right now? Entering wrong claims deliberately
On 04.11.2014 18:18, Cristian Consonni wrote:
Hi Markus,
2014-11-01 0:29 GMT+01:00 Markus Krötzsch mar...@semantic-mediawiki.org:
Nice. We are running the RDF generation on a shared cloud environment and I
am not sure we can really use a lot of RAM there. Do you have any guess how
much RAM you
On 31.10.2014 14:51, Cristian Consonni wrote:
2014-10-30 22:40 GMT+01:00 Cristian Consonni kikkocrist...@gmail.com:
Ok, now I have managed to add the Wikidata statements dump too.
And I have added a wikidata.hdt combined dump of all of the above.
Nice. We are running the RDF generation on
curious if there is any formal collaboration
(in-place|proposed|possible) between dbpedia and wikidata?
Phil
This message optimized for indexing by NSA PRISM
On Wed, Oct 29, 2014 at 2:34 PM, Markus Krötzsch
mar...@semantic-mediawiki.org wrote:
Martynas,
Denny is right. You could set up
On 30.10.2014 11:49, Cristian Consonni wrote:
2014-10-29 22:59 GMT+01:00 Lydia Pintscher lydia.pintsc...@wikimedia.de:
Help with this would be awesome and totally welcome. The tracking bug
is at https://bugzilla.wikimedia.org/show_bug.cgi?id=48143
Speaking of totally awesome (aehm :D):
* see:
Hi Christian,
Awesome :-) Small note: I just got a Bad Gateway when trying
http://data.wikidataldf.com/ but it now seems to work.
It also seems that some of your post answers the question from my
previous email. That sounds as if it is pretty hard to create HDT
exports (not much surprise
Martynas,
Denny is right. You could set up a Virtuoso endpoint based on our RDF
exports. This would be quite nice to have. That's one important reason
why we created the exports, and I really hope we will soon see this
happening. We are dealing here with a very large project, and the
Hi Cristian,
As Daniel said, the live export is currently somewhat limited. However,
we provide RDF dumps that contain all the data:
http://tools.wmflabs.org/wikidata-exports/rdf/
This shows how the final live exports should also look (more or less),
and it could be a blueprint for somebody
Dear all:
Those of you active in research may be interested in submitting to a
recently announced special issue of the Journal of Web Semantics that
explicitly refers to Wikidata in its call:
JWS Special Issue on Knowledge Graphs
Dear all,
I am happy to announce the third release of Wikidata Toolkit [1], the
Java library for programming with Wikidata and Wikibase. The main new
features are:
* Full support for the (now) standard JSON format used by Wikidata
* Huge performance improvements (decompressing and parsing
Hi,
I fully agree with Thomas and the other replies given here. Let me give
some other views on these topics (partly overlapping with what was said
before). It's important to understand these things to get the subclass
of/instance of thing right -- and it would be extremely useful if we
On 13.09.2014 21:25, Jeremy Baron wrote:
On Sat, Sep 13, 2014 at 7:23 PM, Denny Vrandečić vrande...@gmail.com wrote:
I am not a lawyer, but if I remember correctly, copyright covers expression,
not content. Since the Wikidata data model and its representation in JSON are
rather unique, an ISBN
On 09.09.2014 11:33, Daniel Kinzler wrote:
Am 09.09.2014 01:40, schrieb Denny Vrandečić:
Create a third item in Wikidata, and use that for the language links. Any
Wikipedia that has two separate articles can link to the separate items, any
Wikipedia that has only one article can link to the
On 09.09.2014 11:47, Thomas Douillard wrote:
The composite item seems to be a sort of composite geographical/human
system, like an ecosystem (community of living organisms together with
the nonliving components of their environment)
https://www.wikidata.org/wiki/Q37813 a special kind of
Hi,
I just updated the data for the Wikidata classes and properties browser
[1] -- was about time -- and added some improvements on the way:
(1) Classes and properties are now always ordered by usage (most used
first), which was not possible to do before. Examples:
** properties related
On 08.09.2014 14:27, Jeroen De Dauw wrote:
Hey,
\o/
Where are the source code and issue tracker for this? Probably good if
those where linked from the tool.
True, but it's not quite in our master branch yet: the code is part of
the extended WDTK examples module, see
On 08.09.2014 14:53, Markus Krötzsch wrote:
...
http://tools.wmflabs.org/wikidata-exports/miga/#_item=1204
That first shows population. When then clicking on the link, you see
the data type is quantity, not string.
Yes, I think this is a bug in how we use IRIs and labels for datatypes
function in the code
that I tweaked to adjust this until it seemed right, but there is no
deeper principle behind this.
Very cool...
Thanks :-)
Markus
-Ben
On Mon, Sep 8, 2014 at 9:24 AM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote
Hi all,
I'd like to share a little tool with you that has been created at a
recent hackathon. Not my work but a nice idea that might inspire others:
AnnoT is a manual text annotation tool where you can, while typing
text into an HTML form, select Wikidata items for some of the words. You
On 22.08.2014 09:10, Lydia Pintscher wrote:
On Fri, Aug 22, 2014 at 3:35 AM, Legoktm legoktm.wikipe...@gmail.com wrote:
* I feel uncomfortable linking to Facebook/Twitter/etc. on the main page.
Fair enough. How about we make
https://www.wikidata.org/wiki/Wikidata:Social_media prettier and
On 19.08.2014 22:23, David Cuenca wrote:
...
Actually I have one last question :) At the moment Gerard is using a
list of :value on category item pages which has the effect of being
the inverse of instance of. And then he adds further conditions as
qualifiers, see:
On 19.08.2014 16:13, Lydia Pintscher wrote:
On Tue, Aug 19, 2014 at 11:19 AM, David Cuenca dacu...@gmail.com wrote:
Thanks for the stats, Gerard. Two thoughts:
- With so many items without description I wonder why we don't have the
automatic descriptions gadget enabled by default.
I am a bit
Hi all,
We have a lot of statements saying that something is an instance of a
Wikipedia disambiguation page (Q4167410). Unfortunately, this kind of
information says something about a particular Wikipedia article in a
particular language, and often is not true for other languages.
Moreover,
On 19.08.2014 12:20, Gerard Meijssen wrote:
Hoi,
I cannot parse this ..
What Thomas is saying is that classification (putting things into
categories) and querying (finding things based on certain properties)
can be combined in a natural way. In ontology languages like OWL, you
can make
I guess (many categories could be expressed by
queries to improve results; a gentle, community-led transition will be
possible and preferred; categories won't be switched off just because
Wikidata is switched on).
Cheers,
Markus
Cheers,
Micru
On Tue, Aug 19, 2014 at 12:30 PM, Markus
On 12.08.2014 23:49, Andre Engels wrote:
...
In my opinion, manual descriptions should be kept rather than deleted
completely. Instead, automatically generated descriptions should be
provided when there is no manual description, and people should be asked to
override automatically
On 25/07/14 15:28, David Cuenca wrote:
Worried about the harshness that has lately been developing, I have
started a new initiative to counter it by promoting more dialogue,
civility, and friendliness
https://www.wikidata.org/wiki/Wikidata:Shelter
When you are with friends you don't need to
On 04/07/14 14:49, Magnus Manske wrote:
On Fri, Jul 4, 2014 at 1:40 PM, Scott MacLeod
worlduniversityandsch...@gmail.com
mailto:worlduniversityandsch...@gmail.com wrote:
Jane, Lydia and WikiDatans,
These are great and helpful developments, which seem to be quite far
along now.
big thumb sizes,
but when the file is a JPG, don't try to generate a thumb bigger than the
original file or you will get a beautiful error.
Regards
2014-07-02 22:33 GMT+02:00 Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org:
Dear
On 02/07/14 16:29, David Cuenca wrote:
On Tue, Jul 1, 2014 at 11:07 PM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote:
My hope is that with my other suggestion (using P31 values as
features to correlate with), the property suggester
On 01/07/14 21:47, Lydia Pintscher wrote:
On Tue, Jul 1, 2014 at 9:44 PM, Andy Mabbett a...@pigsonthewing.org.uk wrote:
On 1 July 2014 20:20, Lydia Pintscher lydia.pintsc...@wikimedia.de wrote:
We have just deployed the entity suggester. This helps you with
suggesting properties. So when you
On 01/07/14 22:14, Markus Krötzsch wrote:
...
(2) Grade I listed building
http://tools.wmflabs.org/wikidata-exports/miga/?classes#_cat=Classes/Id=Q15700818
Related properties: English Heritage list number, masts, Minor Planet
Center observatory code, home port, coordinate location, OS grid
On 01/07/14 22:43, Bene* wrote:
Am 01.07.2014 22:23, schrieb Markus Krötzsch:
P.S. One weakness of my algorithm you can already see: it has trouble
estimating the relevance of very rare properties, such as Minor
Planet Center observatory code above. A single wrong annotation may
then lead
I've seen three formats proposed so far:
(1) Map + order fields (current format)
(2) Arrays
(3) Map + sort-index inside each map item
The last was proposed by Fredo; I think it got lost a bit. The idea
there would be to store something like index: 1 in the objects that
are inside the map to
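The three proposed layouts can be sketched as follows; this is a hypothetical illustration (the field names and claim ids are made up, not the actual Wikibase schema), showing how an application could recover the ordering from format (3):

```python
# Hypothetical sketches of the three proposed JSON layouts for ordered
# claims; field and id names are illustrative, not the real schema.

# (1) Map + separate order fields (current format)
fmt1 = {
    "claims": {"P31": [{"id": "c1"}], "P17": [{"id": "c2"}]},
    "claims-order": ["P31", "P17"],
}

# (2) Plain arrays: the order is implicit in the list itself
fmt2 = {
    "claims": [{"property": "P31", "id": "c1"},
               {"property": "P17", "id": "c2"}],
}

# (3) Map + sort-index inside each map item (Fredo's proposal)
fmt3 = {
    "claims": {"P17": [{"id": "c2", "index": 1}],
               "P31": [{"id": "c1", "index": 0}]},
}

# Recovering the property order from format (3) by sorting on the index:
ordered = sorted(fmt3["claims"].items(),
                 key=lambda kv: kv[1][0]["index"])
print([prop for prop, _ in ordered])  # ['P31', 'P17']
```

Format (3) keeps the map structure (easy property lookup) while making each entry carry its own position, at the cost of a redundant field per object.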
FYI: this project claims to use Wikidata (among other resources) for
multilingual word-sense disambiguation. One of the first third-party
uses of Wikidata that I am aware of (but other pointers are welcome if
you have them). Wiktionary and OmegaWiki are also mentioned here.
Cheers,
Markus
Eric,
Two general remarks first:
(1) Protege is for small and medium ontologies, but not really for such
large datasets. To get SPARQL support for the whole data, you could
install Virtuoso. It also comes with a simple Web query UI. Virtuoso
does not do much reasoning, but you can use
Hi Gerard,
On 13/06/14 11:08, Gerard Meijssen wrote:
Hoi,
When you leave out qualifiers, you will find that Ronald Reagan was
never president of the United States and only an actor. Yes, omitting
the statements with qualifiers is wrong but as a consequence the total
of the information is wrong
11:41, Markus Krötzsch mar...@semantic-mediawiki.org
mailto:mar...@semantic-mediawiki.org wrote:
Hi Gerard,
On 13/06/14 11:08, Gerard Meijssen wrote:
Hoi,
When you leave out qualifiers, you will find that Ronald Reagan was
never president of the United States
On 13/06/14 15:52, Bene* wrote:
...
Did I understand you right, Markus, that you leave out all statements
which have at least one qualifier? Wouldn't it make more sense to leave
out the qualifiers only but add the statements without qualifiers
anyway? Because this would solve e.g. Gerard's
Gerard,
You sometimes sound as if everything is lost just because somebody put
an RDF file on the Web ;-)
If you don't like the simplified export, why don't you just use our main
export which contains all the data? Can't we all be happy -- the people
who want simple and the people who want
/06/14 14:36, Markus Krötzsch wrote:
Hi all,
We have prepared a new browser for Wikidata Properties:
http://tools.wmflabs.org/wikidata-exports/miga/
It is based on the Miga data browser [1]. This means it only works in Google
Chrome/Chromium, Opera, Safari, and the Android Browser
[Including Yaron, the Miga developer, who is not on this list yet]
On 12/06/14 17:21, Thomas Douillard wrote:
Hi Markus, first thanks a lot for these tools.
It would be cool to include a link to the property browser into some
template (Template:P for example, as Template:Q generates a
On 10/06/14 22:50, Gerard Meijssen wrote:
Hoi,
It is stated that there are no qualifiers included. In one of the
articles you write that it is to be understood that the validity of the
information is dependent on the existing qualifiers.
What is the value of these RDF exports with the
On 11/06/14 17:13, Derric Atzrott wrote:
You might also find the new property browser helpful:
http://tools.wmflabs.org/wikidata-exports/miga/
(as mentioned before, requires one of Google Chrome, Safari, Opera, or
Android Browser to work).
While an excellent list and a neat tool, it sadly
]
http://opendata.stackexchange.com/questions/107/when-will-the-wikidata-database-be-available-for-download/
Max Klein
‽ http://notconfusing.com/
On Tue, Jun 10, 2014 at 1:35 AM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote:
Dear all,
I am happy
On 11/06/14 20:49, Bene* wrote:
Am 11.06.2014 17:27, schrieb Markus Krötzsch:
Yes, I know what you mean. I'd love to integrate property group
information into our view as well, but I don't know where to get this
information from (other than by scraping it from the wiki page, which
does not seem
On 07/06/14 00:40, Joe Filceolaire wrote:
Well they can ask.
As there is no real definition of what a city is and what the limits of
each city are, I'm not sure they will get a useful answer. The population
of the City of London (Q23311), for instance, is only 7,375! Should we
change it from
Dear all,
I am happy to announce the second release of Wikidata Toolkit [1], the
Java library for programming with Wikidata and Wikibase. This release
fixes bugs and improves features of the first release (download, parse,
process Wikidata exports) and it adds new components for serializing
status in
the UK so one would need to have helper items there as well. If we need
new items in either case, the class-based modelling seems nicer since it
fits into the existing class hierarchy as you suggest.
Markus
L.
Il 10/giu/2014 10:21 Markus Krötzsch mar...@semantic-mediawiki.org
On 29/05/14 21:04, Andrew Gray wrote:
One other issue to bear in mind: it's *simple* to have properties as a
separate thing. I have been following this discussion with some
interest but... well, I don't think I'm particularly stupid, but most
of it is completely above my head.
Saying here are
...@gmail.com
mailto:dacu...@gmail.com:
Markus,
On Thu, May 29, 2014 at 12:53 AM, Markus Krötzsch
mar...@semantic-mediawiki.org
mailto:mar...@semantic-mediawiki.org wrote:
This is an easy question once you have been clear about what
human behaviour is. According
-- these are not the things
we normally have in Wikidata).
Markus
2014-05-29 13:43 GMT+02:00 Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org:
On 29/05/14 12:41, Thomas Douillard wrote:
@David:
I think you should have a look at fuzzy logic
David,
I need to answer to your first assertion separately:
On 29/05/14 01:48, David Cuenca wrote:
Well, our goal is to gather the whole human knowledge, not to use it.
No, that is really not the case. Our goal is to gather carefully
selected parts of the human knowledge. Our community
The other answers, under the original subject:
On 29/05/14 01:48, David Cuenca wrote:
Settled :) Let's leave it at defined as a trait of
I don't think it is very clear what the intention of this property is.
What are the limits of its use? What is it meant to do? Can behaviour
really be a
Hi David,
Interesting remark. Let's explore this idea a bit. I will give you two
main reasons why we have properties separate, one practical and one
conceptual.
First the practical point. Certainly, everything that is used as a
property needs to have a datatype, since otherwise the wiki
On 28/05/14 10:37, Daniel Kinzler wrote:
Key differences between Properties and Items:
* Properties have a data type, items don't.
* Items have sitelinks, Properties don't.
* Items have Statements, Properties will support Claims (without sources).
The software needs these
there is no separation).
many thanks for your detailed answer, and sorry if I'm bringing up
already discussed topics. It is just that when you stare long into
wikidata, wikidata stares back into you ;)
Cheers,
Micru
On Wed, May 28, 2014 at 11:39 AM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar
On 28/05/14 15:56, Daniel Kinzler wrote:
Am 28.05.2014 15:05, schrieb Jean-Baptiste Pressac:
Hello,
I am reading the documentation of Wikidata where I learned that new properties
can be suggested for discussion. But this means adding new properties to
Wikidata. However, is it possible to use
On Wed, May 28, 2014 at 2:48 PM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote:
David,
Regarding the question of how to classify properties and how to
relate them to items:
* same as (in the sense of owl:sameAs) is not the right concept
David,
One of the uses is: what is the relationship between a
human and his behavior?
This is an easy question once you have been clear about what human
behaviour is. According to enwiki, it is a range of behaviours
*exhibited by* humans. The bigger question for me is, whether it is
useful
without the added maintenance cost on the data management level.
Cheers,
Markus
On Wed, May 14, 2014 at 2:33 PM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote:
I guess there is already a group of people who deal w
Hi Eric
Hi Eric,
Thanks for all the information. This was very helpful. I only get to
answer now since we have been quite busy building RDF exports for
Wikidata (and writing a paper about it). I will soon announce this here
(we still need to fix a few details).
You were asking about using these
Hi all,
On 20/04/14 01:20, Jeroen De Dauw wrote:
Hey all,
I am happy to announce the 0.7.3 release of Wikibase DataModel.
\o/
On that note, I can also add that I am about to update the documentation
of the data model, so that we also have a written account of these
things. Hopefully this
Hi Gerard.
On 09/04/14 10:54, Gerard Meijssen wrote:
Hoi,
What is the relevance of these tools when you have to have specialised
environments to use them?
Not sure what you mean. Wikidata Toolkit doesn't have any requirements
other than plain old Java to run.
Nevertheless, we'd also like
Dear all,
There are quite a few Wikidata-related submissions to Wikimania [0]. The
selection of the program committee seems to be based on user votes to
some extent, so don't forget to add your name to the submission pages
you care about :-).
I just added another two:
* How to use Wikidata:
Hi,
Since a few weeks now, no daily dumps have been published for Wikidata.
Only empty directories are created every day. I could not find a related
email on any list I scanned. Can anybody clarify what the situation is now?
Cheers,
Markus
On 13/03/14 17:14, Katie Filbert wrote:
On Thu, Mar 13, 2014 at 5:06 PM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote:
Hi,
Since a few weeks now, no daily dumps have been published for
Wikidata. Only empty directories are created every
Hi ValterVB,
On 04/03/14 20:17, ValterVB wrote:
Hi Markus, it’s an error of my bot (ValterVBot). Thanks for noting them. I
can fix it probably on Friday or Saturday; the source should be Q11920, not
Q11329. Sorry for this problem.
Great, that should be fine.
ValterVB
PS I’m not sure if I reply to
On 03/03/14 15:50, addshorewiki wrote:
This should probably be directed at
https://github.com/Wikidata/Wikidata-Toolkit/issues ?
Yes, details related to the ongoing development of Wikidata Toolkit
should be discussed elsewhere. In this particular case, part of the
problem can probably be
Hi,
On 26/02/14 22:40, Michael Smethurst wrote:
Hello
*Really* not meaning to jump down any http-range-14 rabbit holes but
wasn't there a plan for wikidata to have uris representing things and
pages about those things?
From conversations on this list I sketched a picture a while back of all
On 26/02/14 18:41, Jeroen De Dauw wrote:
Hey,
you can create claims with wbsetclaim. But you would need to create
a valid
GUID [1] yourself. The claim-GUID you send with your request needs to be
entityId$GUID (e.g. Q2$5627445f-43cb-ed6d-3adb-760e66bd17ee).
Uh, didn't we fix this a long
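The GUID shape described above (entityId$UUID) can be produced with any standard UUID generator; this is a hedged sketch, not the canonical Wikibase implementation, and the helper name is made up:

```python
import uuid

def make_claim_guid(entity_id: str) -> str:
    """Build a claim GUID of the form entityId$UUID for use with
    wbsetclaim, as described above. The UUID part here is random;
    only the overall entityId$UUID shape is taken from the email."""
    return f"{entity_id}${uuid.uuid4()}"

guid = make_claim_guid("Q2")
print(guid)  # e.g. Q2$5627445f-43cb-ed6d-3adb-760e66bd17ee
```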
Hi Zoltán,
We also plan to support writing API access in Wikidata Toolkit soon [1].
Wikidata Toolkit already has a Java implementation of all Wikidata data
objects, so one can represent statements and claims. We also will soon
start working on JSON serialization of these objects (which you
Hi Fredo,
On 20/02/14 19:59, Fredo Erxleben wrote:
Hello everybody,
Since I am working on the conversion from the dump files to the wdtk
data model, I will have to take apart the refs section of the JSON
representing the stored items.
Now a refs-section most likely looks like this:
(Tried to
This call is a scam. The conference is not a legit academic event but
aims at making money. It is a sad truth that there is an increasingly
large amount of (more or less) academic conference spam these days. IEEE
has been criticized for sponsoring events without sufficient quality
control [1],
On 10/01/14 03:21, emw wrote:
What about monthly/dump-based aggregated property usage statistics?
Property usage statistics would be very valuable, Dimitris. It would
help inform community decisions about how to steer changes in property
usage with less disruption. It would have other
Hi,
On a related note, there is also an upcoming project, Wikidata Toolkit
[1], that will look into implementing query functionality over Wikidata
content, not to replace the Wikidata query features but to provide
functionality that is not a top priority for the core development. The
first
On 12/11/13 16:26, Sven Manguard wrote:
Google would not have sent over a large chunk of cash to help get
Wikidata started if it didn't think it could use Wikidata. That Russian
search engine company would not have sent over a large chunk of cash to
keep Wikidata going if it didn't think it could
Hi Antoine,
The main answer to your questions is that the data model of Wikidata
defines a *data structure* not the *informal meaning* that this data
structure has in an application context (that is: what we, humans, want
to say when we enter it). I try to explain this a bit better below.
as a suggestion, you can turn these kinds of numbers into a probability
distribution using the beta distribution. If you use (1,1) as a prior, you
get something like beta(251,1) as the distribution of the probability
that somebody named Aaron is male.
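The suggestion above can be sketched in a few lines; this is a minimal illustration assuming 250 "male" and 0 "female" observations for the name (the counts are inferred from the beta(251,1) example, with a uniform Beta(1,1) prior):

```python
# Beta-Bernoulli posterior: prior Beta(a, b) plus s successes and
# f failures gives posterior Beta(a + s, b + f), whose mean is
# (a + s) / (a + b + s + f).

def beta_posterior_mean(successes: int, failures: int,
                        prior_a: float = 1.0, prior_b: float = 1.0) -> float:
    a = prior_a + successes
    b = prior_b + failures
    return a / (a + b)

# Probability that somebody named "Aaron" is male, given 250 male / 0
# female observations: posterior is Beta(251, 1), mean 251/252.
p = beta_posterior_mean(250, 0)
print(round(p, 4))  # 0.996
```

Using the posterior mean rather than the raw ratio avoids reporting a certainty of 1.0 for names that happen to have no counter-examples in the data.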
-Original Message-
From: Markus Krötzsch
Sent
nicely how to take the effect of time into account.
Markus
On Sun, Oct 13, 2013 at 6:16 PM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote:
Hi all,
I'd like to share a little Wikidata application: I just used
Wikidata to guess the sex of people
longer than I
need. My main problem is sexing Asian authors. Not sure if name-based
approaches are promising there at all.
Markus
On Sun, Oct 13, 2013 at 11:16 PM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote:
Hi all,
I'd like to share
Hi -- or better: Heya! -- Lydia:
Congratulations to your new role! This is great news for the project,
which allows Wikidata to proceed on its important mission in perfect
continuity. Denny has made huge contributions to the project in the past
1.5 years -- a task that often involved
Dear Wikidatanions (*),
I have just drafted a little proposal for creating more tools for
external people to work with Wikidata, especially to build services on
top of its data [1]. Your feedback and support is needed.
Idea: Currently, this is quite hard for people, since we only have WDA
, schrieb Markus Krötzsch:
If we have an IRI DV, considering that URLs are special IRIs, it seems
clear
that IRI would be the best way of storing them.
The best way of storing them really depends on the storage platform. It
may be a string or something else.
I think the real issue here is that we
in this particular case. Maybe we can fix this
somehow in the future when URIs are supported as a value datatype.
Markus
On Thu, Aug 22, 2013 at 11:33 AM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote:
Hi all,
I think one source of confusion here
Hi all,
I think one source of confusion here is the overlapping names of
property datatypes and datavalue types. Basically, the mapping is as
follows right now:
[Format: property type = datavalue type occurring in current dumps]
'wikibase-item' = 'wikibase-entityid'
'string' = 'string'
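The mapping above is easy to keep as a small lookup table; this sketch contains only the two pairs quoted in the email (any further entries would need to be checked against the actual dumps):

```python
# Property datatype -> datavalue type, as listed in the email above.
# Only these two pairs come from the source; the table is deliberately
# incomplete.
DATATYPE_TO_VALUETYPE = {
    "wikibase-item": "wikibase-entityid",
    "string": "string",
}

def datavalue_type(property_datatype: str) -> str:
    """Return the datavalue type that appears in the dumps for a
    property of the given datatype."""
    return DATATYPE_TO_VALUETYPE[property_datatype]

print(datavalue_type("wikibase-item"))  # wikibase-entityid
```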
On 15/08/13 21:38, Dan Brickley wrote:
...
FWIW there's also RDF/XML if you use a *.rdf suffix. This btw is of
great interest to us over in the schema.org http://schema.org project;
earlier today I was showing
http://www.wikidata.org/wiki/Special:EntityData/Q199154.rdf
On 15/08/13 19:33, Jona Christopher Sahnwaldt wrote:
http://www.wikidata.org/entity/Q215607.nt which redirects to
http://www.wikidata.org/wiki/Special:EntityData/Q215607.nt
The RDF stuff at Wikidata is in flux. The RDF you get probably won't
contain all the data that the HTML page shows, and
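The URL pattern behind those redirects can be built mechanically; this is a sketch based only on the two URLs quoted above (.rdf for RDF/XML, .nt for N-Triples), without assuming any further suffixes exist:

```python
# Build Special:EntityData URLs for the serialization suffixes that
# appear in the emails above. The entity ids are the ones quoted there.

def entity_data_url(entity_id: str, suffix: str) -> str:
    return ("http://www.wikidata.org/wiki/Special:EntityData/"
            f"{entity_id}.{suffix}")

print(entity_data_url("Q199154", "rdf"))
print(entity_data_url("Q215607", "nt"))
```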
, Markus Krötzsch mar...@semantic-mediawiki.org
wrote:
On 11/08/13 22:29, Tom Morris wrote:
On Sat, Aug 10, 2013 at 2:30 PM, Markus Krötzsch
mar...@semantic-mediawiki.org mailto:mar...@semantic-mediawiki.org
wrote:
Anyway, if you restrict yourself to tools that are installed by
default
On 10/08/13 10:29, Byrial Jensen wrote:
...
(BTW, the time values seem to be OK again, after many syntax errors in
the beginning. But the coordinate values have some strange (probably
erroneous?) variations: Values where the precision and/or globe is given
as null, and values where the globe
Good morning. I just found a bug that was caused by a bug in the
Wikidata dumps (a value that should be a URI was not). This led to a few
dozen lines with illegal qnames of the form w: . The updated script
fixes this.
Cheers,
Markus
On 09/08/13 18:15, Markus Krötzsch wrote:
Hi Sebastian
) or what? That's what puzzles me. I know that a Wikipedia
can allow multiple languages (or dialects) to coexist, but in the
Wikidata language selector I thought you can only select real
languages, not language groups.
Markus
On 8/6/13, Markus Krötzsch mar...@semantic-mediawiki.org wrote:
Hi
full triples instead. This would give you a line-by-line export
in (almost) no time (some uses of [...] blocks in object positions would
remain, but maybe you could live with that).
Best wishes,
Markus
All the best,
Sebastian
Am 03.08.2013 23:22, schrieb Markus Krötzsch:
Update: the first