I would find this discussion easier to follow if the Wikidata identifiers
for the various classes and properties were mentioned, and there were
pointers to relevant documentation.
The only Wikidata class or property that I could easily find is Q205892.
Its discussion page,
https://www.mediawiki.org/wiki/Wikibase/DataModel#Dates_and_times
Cheers,
Denny
On Wed, Jul 1, 2015 at 9:48 AM Peter F. Patel-Schneider
<pfpschnei...@gmail.com> wrote:
Thanks.
This helps in finding out how to reproduce the numbers.
However, I'm still
of TimeValue that are only
suitable for the Gregorian and Julian calendars.
On 07/01/2015 09:24 AM, Markus Krötzsch wrote:
On 01.07.2015 18:03, Peter F. Patel-Schneider wrote: ...
Even the very nice email from Markus that gives numbers does not
provide any information on where the numbers come from
On 10/28/2015 12:08 PM, Tom Morris wrote:
[...]
> Going back to Ben's original problem, one tool that Freebase used to help
> manage the problem of incompatible type merges was a set of curated sets of
> incompatible types [5] which was used by the merge tools to warn users that
> the merge they
ge.
>
> If that is acceptable, it would be easy for me to filter all items with P1889,
> from the merge game at least.
>
> On Wed, Oct 28, 2015 at 8:50 PM Peter F. Patel-Schneider
> <pfpschnei...@gmail.com> wrote:
>
> On 10
On 11/13/2015 01:21 AM, Markus Krötzsch wrote:
> On 12.11.2015 22:09, Peter F. Patel-Schneider wrote:
>> On 11/12/2015 09:10 AM, Markus Krötzsch wrote:
>> [...]
>>> On the other hand, it is entirely possible to implement correct OWL QL
>>> (note:
>>>
On 11/12/2015 09:10 AM, Markus Krötzsch wrote:
[...]
> On the other hand, it is entirely possible to implement correct OWL QL (note:
> *QL* not *RL*) reasoning in SPARQL without even using "rules" that need any
> recursive evaluation [3]. This covers all of RDFS, and indeed some of the
> patterns
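The technique the quoted message describes, RDFS-style reasoning in SPARQL without recursive rule evaluation, can be sketched with a property path. This is my own illustrative example, not taken from reference [3] in the message; the wd:/wdt: prefixes are the ones predefined by the Wikidata query service, and the target class is a placeholder:

```sparql
# Sketch: subclass reasoning via a SPARQL 1.1 property path instead of
# recursive rules. P31 = instance of, P279 = subclass of; the * closure
# over P279 plays the role that rule recursion would otherwise play.
SELECT ?instance WHERE {
  ?instance wdt:P31/wdt:P279* wd:Q205892 .
}
```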
There
also should be tool support, for example to ensure that all
instances of the food-type food are subclasses of the non-food-type
food (and maybe vice-versa).
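A check of the kind suggested above could be sketched as a violation-finding query. This is a hypothetical illustration: the message does not name the two "food" items, so wd:Q2095 (food) stands in for both roles here:

```sparql
# Sketch: report items that are instances of the food type but are not
# (transitive) subclasses of the non-food-type food item. Both classes are
# represented by the placeholder wd:Q2095 in this illustration.
SELECT ?item WHERE {
  ?item wdt:P31 wd:Q2095 .                           # instance of food (as a type)
  FILTER NOT EXISTS { ?item wdt:P279* wd:Q2095 . }   # but not a subclass of food
}
```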
But what else can be done? Every other approach that I have seen
has what I consider to be worse problems.
> Stas Malyshev
> smalys...@wikimedia.org
Peter F. Patel-Schneider
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
It's very pleasant to hear from someone else who thinks of Wikidata as a
knowledge base (or at least hopes that Wikidata can be considered as a
knowledge base). Did you get any pushback on this or on your stated Wikidata
goal of structuring the sum of all human knowledge?
Did you get any
I am a relative [sic] outsider to Wikidata and I just tried to answer this
question by looking at Wikidata.
It turns out that there is information in Wikidata that indicates that
https://www.wikidata.org/wiki/Property:P22 (father) is only to be used on
people. Look at
On 08/26/2015 06:16 AM, Svavar Kjarrval wrote:
On mið 26.ágú 2015 11:45, Ole Palnatoke Andersen wrote:
I've just completed #100wikidays, and my 100th article was about a
horse: https://www.wikidata.org/wiki/Q12003911 That horse is the
grandfather of https://www.wikidata.org/wiki/Q20872428,
described above, but without such work it is
very hard to figure out just what is supposed to be done in any except the
simple cases.
- Svavar Kjarrval
Peter F. Patel-Schneider
Nuance Communications
On 09/28/2015 11:24 PM, Federico Leva (Nemo) wrote:
> Peter F. Patel-Schneider, 28/09/2015 22:27:
>>> >I'm arguing against making such inference part of wikibase/wikidata core
>>> >functionality, and hiding its working ("magic").
>>> >
>>>
t with a bot system.
I would argue that inference-making bots should be considered only as a
stop-gap measure, and that a different mechanism should be considered for
making inferences in Wikidata. I am not arguing for Inference done Just Right
(tm). It is not necessary to get inference perfect the
On 09/28/2015 08:12 AM, Daniel Kinzler wrote:
> Am 28.09.2015 um 16:43 schrieb Thomas Douillard:
>> Daniel Wrote:
>>> (*) This follows the principle of "magic is bad, let people edit". Allowing
>>> inconsistencies means we can detect errors by finding such inconsistencies.
>>> Automatically
On 09/28/2015 07:25 AM, Daniel Kinzler wrote:
> Am 28.09.2015 um 16:14 schrieb Peter F. Patel-Schneider:
>> I worry about this way of specializing properties. How are people, and
>> particularly programs, going to be able to find out that a qualifier is
>> needed, which qua
On 09/28/2015 08:30 AM, Daniel Kinzler wrote:
> Am 28.09.2015 um 17:27 schrieb Peter F. Patel-Schneider:
>> Are you arguing against any tool that makes inferences combining multiple
>> pieces of data in Wikidata? Would you also argue against this if the
>> inferred
>
On 09/24/2015 10:59 AM, Lydia Pintscher wrote:
> On Thu, Sep 24, 2015 at 7:54 PM, Tom Morris wrote:
>> Thanks! Is there any more information on the issue with MusicBrainz?
>>
>> 17:26:27 sjoerddebruin: yes, we went for MusicBrainz first,
>> but it turned out to be
On 09/24/2015 11:31 AM, Tom Morris wrote:
> On Thu, Sep 24, 2015 at 2:18 PM, Peter F. Patel-Schneider
> <pfpschnei...@gmail.com> wrote:
>
> On 09/24/2015 10:59 AM, Lydia Pintscher wrote:
> > On Thu, Sep 24, 2015 at 7:54 PM,
On 08/05/2016 08:57 AM, Daniel Kinzler wrote:
> Am 05.08.2016 um 17:34 schrieb Peter F. Patel-Schneider:
>> So some additions are breaking changes then. What is a system that consumes
>> this information supposed to do? If the system doesn't monitor announcements
>>
only signalling, e.g., an announcement on some web page, is not
adequate because there is no guarantee that consuming tools will be changed
in response.
Peter F. Patel-Schneider
Nuance Communications
On 08/05/2016 11:56 AM, Stas Malyshev wrote:
> Hi!
>
>> Consumers of data generally cannot te
e.
>
> As I said, format versioning. Maybe even semver or some suitable
> modification of it. RDF exports BTW already carry version. Maybe JSON
> exports should too.
Right. I'm all for version information being added to the Wikidata JSON dump
format. It woul
that it is not expecting as deficient and would counsel against using
such software.
Peter F. Patel-Schneider
Nuance Communications
PS: JSON is a particularly problematic encoding for data because many aspects
of the data that a particular JSON text is meant to encode are left
unspecified by the JSON standards
On 08/16/2016 07:57 AM, Daniel Kinzler wrote:
> Am 11.08.2016 um 23:12 schrieb Peter F. Patel-Schneider:
>> Until suitable versioning is part of the Wikidata JSON dump format and
>> contract, however, I don't think that consumers of the dumps should just
>> ignore new fields.
is certainly no universal semantic consideration, even in any strict notion of
semantics, that would require that there be two separate items here.
As far as I can tell, the Wikidata formalism is not one that would disallow
offices being classes. As far as I can te
for a modeller to easily determine how a
class is supposed to be used. This is not currently possible for color and I
think is the main source of the problems with color.
Peter F. Patel-Schneider
Nuance Communications
On 01/09/2017 10:28 AM, Denny Vrandečić wrote:
> I agree with Peter h
ve that data or not.
>
> On Mon, Aug 21, 2017 at 7:18 AM Peter F. Patel-Schneider
> <pfpschnei...@gmail.com> wrote:
>
> One problem with BabelNet is that its licence is restrictive, being
> the Attribution-NonCommercia
this last connection, which would be easy for those categories
that are linked to a Wikipedia page.
Peter F. Patel-Schneider
Nuance Communications
PS: Strangely the Yago logo has a non-commercial license. I don't know why
this was done.
On 08/15/2017 10:32 AM, Finn Aarup Nielsen wrote:
>
>
The GPS unit on my boat regularly claims an estimated position error of 4
feet after it has acquired its full complement of satellites. This is a
fairly new mid-price GPS unit using up to nine satellites and WAAS. So my
recreational GPS supposedly obtains fifth-decimal-place accuracy. It was
It's not easy to get to a true paradox with this collection. Not only do you
have to be able to express it but you have to require that it exists.
Peter F. Patel-Schneider
On 12/02/2017 11:09 AM, mathieu stumpf guntz wrote:
>
> Hi all,
>
> You should in any case be sure to av
Yeah, that would be nice.
You can zoom in on the image, and search for the labels in it. Unfortunately
many of the labels are truncated, e.g., WordNe
Clicking on a node gets the raw data backing up the image, but I don't see how
to get the processed data. The data for some of the nodes
Yes, it would be nice to have Wikidata there, provided that Wikidata satisfies
the requirements. There are already several mentions of Wikidata in the data
behind the diagram.
I don't think that Freebase satisfies the stated requirement because its URIs
no longer "resolve, with or without
that will redirect you to the wiki page, JSON dump, or RDF data
> (in XML or Turtle formats). Since the LOD Cloud criteria explicitly
> mentions content negotiation, I think we’re good :)
>
> Cheers,
> Lucas
>
> On 30.04.2018 23:08, Peter F. Patel-Schneider wrote:
>
dered as Linked Open Data but maybe
some improvements can be made.
peter
On 05/01/2018 01:03 AM, Antoine Zimmermann wrote:
> On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:
>> As far as I can tell real IRIs for Wikidata are https URIs. The http IRIs
>> redirect to https IR
ntextual and additive?
>
>
>
> regards
>
> Michal Pavlovic
>
> BaaN/Infor-Administrator
>
>
> --
> Date: Sat, 14 Jul 2018 08:32:43 -0700
> From: "Peter F. Patel-Schneider"
; look for 1 statement instance, 4.9k times:
>
> select * {
> ?p wikibase:qualifier ?pq
> filter exists {?x ?pq ?y}
> } limit 100
>
> What query did you try?
>
> On Sat, Jul 14, 2018 at 2:40 AM, Peter F. Patel-Schneider
> wrote:
>> I'm trying
> Date: Wed, 18 Jul 2018 08:30:49 -0700
> From: "Peter F. Patel-Schneider"
> To: Discussion list for the Wikidata project
> , "Pavlovic, Michal"
>
> Subject: Re: [Wikidata] frequency of qualifier predicates
> Message-ID: <5d35b0ac-006f-a4
, five are contextual, three are
additive, and one does not carry world information. The last can be
considered to be contextual, but also might be considered to not carry world
information.
peter
On 07/14/2018 01:19 AM, Lydia Pintscher wrote:
> On Sat, Jul 14, 2018 at 1:42 AM Peter F. Pa
wrote:
> 2018-07-14 1:40 GMT+02:00 Peter F. Patel-Schneider <pfpschnei...@gmail.com>:
>
> I'm trying to get a good estimate of how often which qualifier predicate
> is used.
>
>
> The obvious query times out, as expected, so I was trying to find
I'm trying to get a good estimate of how often which qualifier predicate is
used.
The obvious query times out, as expected, so I was trying to find a list of
predicates that are used as qualifiers so that I can craft a query for each of
them. There is
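The two-step approach described here can be sketched as follows. This is my own illustration of the strategy, not a query from the thread; wikibase: and pq: are the standard Wikidata query-service prefixes, and pq:P585 is only an example choice of qualifier predicate:

```sparql
# Step 1: list the properties declared as usable in qualifier position.
SELECT ?p ?pq WHERE {
  ?p wikibase:qualifier ?pq .
}
# Step 2 (run once per predicate from step 1, instead of one global query
# that times out): count the uses of a single qualifier predicate, e.g.
# pq:P585 ("point in time"):
# SELECT (COUNT(*) AS ?uses) WHERE { ?statement pq:P585 ?value . }
```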
a? It is the poor organization of
the Wikidata ontology. To fix the ontology, beyond doing point fixes, is
going to require some commitment from the Wikidata community.
Peter F. Patel-Schneider
Nuance Communications
On 10/20/18 11:57 AM, Ettore RIZZA wrote:
> From Peter F. Patel-Schneider
> Hi,
>
> I see no reason that this [adding subclass relationships sanctioned by
> corresponding Wikipedia pages]
> should not be done for other groups of living
> organisms where s
On 10/20/18 6:29 AM, Ettore RIZZA wrote:
> For most people, ants are insects, not instances of taxon.
Sure, but Wikidata doesn't have ants being instances of taxon. Instead,
Formicidae (aka ant) is an instance of taxon, which seems right to me.
Here are some extracts from Wikidata as of a few
Hi:
Why did you use exact match (P2888) instead of equivalent class (P1709) and
equivalent property (P1628)?
peter
On 9/22/18 5:07 AM, Andra Waagmeester wrote:
> Hi Maarten,
>
> We are actively mapping to other ontologies using the exact match P2888
> property. The disease ontology is one
that already exist. Get someone or some group to commit to keeping
the mapping up to date. Announce the results and show how they are useful.
Peter F. Patel-Schneider
Nuance Communications
On 9/22/18 4:28 AM, Maarten Dammers wrote:
> Hi everyone,
>
> Last week I presented Wikidata at the
right. Are all direct instances of human? That
seems too limiting. Are all indirect instances of human? This seems the most
natural, but where is this behaviour given?
Peter F. Patel-Schneider
Samsung Research America
On 5/28/19 12:04 PM, Léa Lacroix wrote:
> Hello all,
>
> As p
re all indirect instances of human? This seems the
> most
> natural, but where is this behaviour given?
>
> Peter F. Patel-Schneider
> Samsung Research America
>
>
>
> On 5/28/19 12:04 PM, Léa Lacroix wrote:
> > Hello all,
> >
>
The history of ShEx is quite complex.
I don't think that one can say that there were complete and conforming
implementations of ShEx in 2017 because the main ShEx specification,
http://shex.io/shex-semantics-20170713/ was ill-founded. I pointed this out
in
e:language "en". }
}
SELECT ?item ?itemLabel WHERE {
wd:Q82794 wdt:P279* ?item .
SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
}
SELECT ?item ?itemLabel WHERE {
?item wdt:P31/wdt:P279* wd:Q467.
SERVICE wikibase:label { bd:serviceParam wikibase:langu
Why should this not be done? It seems reasonable to me. Is there some
official statement that this should not be done?
More generally, where is any notion of inference in Wikidata defined?
There appear to be more problems with sports season. For example,
Brickley wrote:
On Fri, 23 Dec 2022 at 21:32, Peter F. Patel-Schneider
wrote:
Why should this not be done? It seems reasonable to me. Is there
some official statement that this should not be done?
The Blazegraph Wikidata SPARQL endpoint (as it is a sadly abandoned
codebase) already
It may be that Wikidata has a lot of general classes, but I think this is
unavoidable if Wikidata is going to store a lot of different kinds of
information. (This is not to say that there are no problems in the Wikidata
class hierarchy.)
For example one of the objects that the mayor of
On 2/23/23 12:19, James Heald wrote:
On Wed, 22 Feb 2023 at 00:03, Kingsley Idehen via Wikidata wrote:
On 2/21/23 4:05 PM, Guillaume Lederrey wrote:
The exposed SPARQL endpoint is at the moment a direct exposition of the
Blazegraph endpoint, so it does expose all the Blazegraph specific
What then is P17 supposed to be used for?
Could I, for example, use P17 on the address of the Swiss embassy in Germany
and have Switzerland as the value?
associated is generally too weak a word to use in describing properties.
peter
On 01/08/2015 01:46 PM, Thad Guidry wrote:
Markus,
subpropertyof child .
John son Bill .
John daughter Mary .
does a query for John's children return Bill and Mary?
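Without RDFS inference in the store, a query only returns both children if it follows the subproperty declarations explicitly. A sketch of what that looks like, using hypothetical properties :son, :daughter, and :child in an example namespace (none of these are Wikidata properties):

```sparql
PREFIX :     <http://example.org/>
PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>

# Follow the subproperty links at query time: ?p binds to :child itself
# (zero-length path) and to any declared subproperty such as :son or :daughter.
SELECT ?child WHERE {
  ?p rdfs:subPropertyOf* :child .
  :John ?p ?child .
}
```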
peter
On 01/12/2015 10:46 AM, Nikolas Everett wrote:
On Mon, Jan 12, 2015 at 1:35 PM, Peter F. Patel-Schneider
<pfpschnei...@gmail.com> wrote
making any other
kind of changes to this information.
Peter F. Patel-Schneider
On 01/07/2015 06:25 AM, Markus Krötzsch wrote:
Back to Denny's original question:
Does anybody see a specific danger of abuse if living people get to edit their
own data right now? Entering wrong claims deliberately
, which is like CC0.
Peter F. Patel-Schneider, speaking as an individual
On 04/23/2015 07:20 AM, Serge Wroclawski wrote:
I am not sure how I missed this discussion, but adding information from
OSM into Wikidata en masse like this is a violation of the OSM license.
- Serge
On Tue, Mar 10
I'm trying to figure out whether the Wikidata label service (the stuff that is
invoked as
SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
in queries to the Wikidata query service) is something that can be done in
SPARQL or whether it is an extension that can't be done in
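Part of what the label service does can be approximated in plain SPARQL. A sketch, with wdt:P31/wd:Q146 (instances of house cat) as an arbitrary illustrative pattern; note that unlike the real service this version does not fall back to other languages or to the entity ID when no English label exists:

```sparql
# Plain-SPARQL approximation of SERVICE wikibase:label for one language:
# fetch the English rdfs:label, if any, alongside each result.
SELECT ?item ?itemLabel WHERE {
  ?item wdt:P31 wd:Q146 .
  OPTIONAL {
    ?item rdfs:label ?itemLabel .
    FILTER(LANG(?itemLabel) = "en")
  }
}
```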
Is this the recommended way to set up a local copy of Wikidata? (If not, what
is the recommended way?)
peter
On 11/19/19 10:37 AM, Addshore wrote:
> Hi all
>
> We resolved this on the Wikibase telegram chat.
>
> For anyone finding this email thread, here is a rough log of the chat
>
>
that
exposes a retained identity for blank nodes?
Peter F. Patel-Schneider
On 4/16/20 8:34 AM, David Causse wrote:
> Hi,
>
> This message is relevant for people writing SPARQL queries and using the
> Wikidata Query Service:
>
> As part of the work of redesigning the