Re: [Wikidata] vandalism

2018-05-07 Thread Джон Д .

Hello
 
Someone took a page I have been working on for months and removed most of
my work.

Is it wrong to include more than one Q item in P131 (located in the
administrative territorial entity), where you have an instance of a
settlement, village, etc. within the city limits, and that city is itself
within a district, oblast, etc.?

And within that village, as P31 (instance of) values, there are Q131596,
Q11451, and Q10480682, which makes it a Q570116 as well. At that location
there is also a Q27603105.

Is it wrong to include all of this?
 
 
 
 
 
 ___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] [Wikitech-l] GSoC 2018 Introduction: Prssanna Desai

2018-05-07 Thread Stas Malyshev
Hi!

> Greetings,
> I'm Prssanna Desai, an undergraduate student from NMIMS University, Mumbai,
> India and I've been selected for GSoC '18.
> 
> My Project: Improve Data Explorer for query.wikidata.org

Welcome! Thanks for participating and helping to make the Query Service
better!
-- 
Stas Malyshev
smalys...@wikimedia.org

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Wikidata and the LOD cloud

2018-05-07 Thread Denny Vrandečić
Thanks!

On Mon, May 7, 2018 at 1:36 PM Lucas Werkmeister <
lucas.werkmeis...@wikimedia.de> wrote:

> Folks, I’m already in contact with John, there’s no need to contact him
> again :)
>
> Cheers, Lucas
>
> On Mon, 7 May 2018 at 19:32, Denny Vrandečić <vrande...@gmail.com> wrote:
>
>> Well, then, we have tried several times to get into that diagram, and it
>> never worked out.
>>
>> So, given the page you linked, it says:
>>
>> Contributing to the Diagram
>>
>> First, make sure that you publish data according to the Linked Data
>> principles . We
>> interpret this as:
>>
>>- There must be *resolvable http:// (or https://) URIs*.
>>- They must resolve, with or without content negotiation, to *RDF
>>data* in one of the popular RDF formats (RDFa, RDF/XML, Turtle,
>>N-Triples).
>>- The dataset must contain *at least 1000 triples*. (Hence, your FOAF
>>file most likely does not qualify.)
>>- The dataset must be connected via *RDF links* to a dataset that is
>>already in the diagram. This means, either your dataset must use URIs from
>>the other dataset, or vice versa. We arbitrarily require at least 50 
>> links.
>>- Access of the *entire* dataset must be possible via *RDF crawling*,
>>via an *RDF dump*, or via a *SPARQL endpoint*.
>>
>> The process for adding datasets is still under development, please
>> contact John P. McCrae  to add a new dataset
>>
>> Wikidata fulfills all the conditions easily. So, here we go, I am adding
>> John to this thread - although I know he already knows about this request -
>> and I am asking officially to enter Wikidata into the LOD diagram.
>>
>> Let's keep it all open, and see where it goes from here.
>>
>> Cheers,
>> Denny
>>
>>
>> On Mon, May 7, 2018 at 4:15 AM Sebastian Hellmann <
>> hellm...@informatik.uni-leipzig.de> wrote:
>>
>>> Hi Denny, Maarten,
>>>
>>> you should read your own emails. In fact it is quite easy to join the
>>> LOD cloud diagram.
>>>
>>> The most important step is to follow the instructions on the page:
>>> http://lod-cloud.net under how to contribute and then add the metadata.
>>>
>>> Some years ago I made a WordPress site with Linked Data enabled:
>>> http://www.klappstuhlclub.de/wp/ Even this is included as I simply
>>> added the metadata entry.
>>>
>>> Do you really think John McCrae added a line in the code that says "if
>>> (dataset==wikidata) skip; " ?
>>>
>>> You just need to add it like everybody else in LOD, DBpedia also created
>>> its entry and updates it now and then. The same applies to
>>> http://lov.okfn.org  Somebody from Wikidata needs to upload the
>>> Wikidata properties as OWL.  If nobody does it, it will not be in there.
>>>
>>> All the best,
>>>
>>> Sebastian
>>>
>>> On 04.05.2018 18:33, Maarten Dammers wrote:
>>>
>>> It almost feels like someone doesn’t want Wikidata in there? Maybe that
>>> website is maintained by DBpedia fans? Just thinking out loud here because
>>> DBpedia is very popular in the academic world and Wikidata a huge threat
>>> for that popularity.
>>>
>>> Maarten
>>>
>>> On 4 May 2018 at 17:20, Denny Vrandečić wrote:
>>>
>>> I'm pretty sure that Wikidata is doing better than 90% of the current
>>> bubbles in the diagram.
>>>
>>> If they wanted to have Wikidata in the diagram it would have been there
>>> before it was too small to read it. :)
>>>
>>> On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider <
>>> pfpschnei...@gmail.com> wrote:
>>>
 Thanks for the corrections.

 So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for
 Douglas
 Adams.  Retrieving from this IRI results in a 303 See Other to
 https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess)
 is the
 main IRI for representations of Douglas Adams and other pages with
 information about him.

 From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
 negotiation can be used to get the JSON representation (the default),
 other
 representations including Turtle, and human-readable information.  (Well
 actually I'm not sure that this is really correct.  It appears that
 instead
 of directly using content negotiation, another 303 See Other is used to
 provide an IRI for a document in the requested format.)

 https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
 https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
 machine-readable documents containing the Wikidata information about
 Douglas
 Adams.  Content negotiation is not possible on these pages.

 https://www.wikidata.org/wiki/Q42 is the IRI that produces a
 human-readable
 version of the information about Douglas Adams.  Content negotiation is
 not
 possible on this page, but it does have link rel="alternate" to the
 machine-readable pages.

 Strangely this page has a link rel="canonica

Re: [Wikidata] Wikidata and the LOD cloud

2018-05-07 Thread Lucas Werkmeister
Folks, I’m already in contact with John, there’s no need to contact him
again :)

Cheers, Lucas

On Mon, 7 May 2018 at 19:32, Denny Vrandečić <vrande...@gmail.com> wrote:

> Well, then, we have tried several times to get into that diagram, and it
> never worked out.
>
> So, given the page you linked, it says:
>
> Contributing to the Diagram
>
> First, make sure that you publish data according to the Linked Data
> principles . We interpret
> this as:
>
>- There must be *resolvable http:// (or https://) URIs*.
>- They must resolve, with or without content negotiation, to *RDF data* in
>one of the popular RDF formats (RDFa, RDF/XML, Turtle, N-Triples).
>- The dataset must contain *at least 1000 triples*. (Hence, your FOAF
>file most likely does not qualify.)
>- The dataset must be connected via *RDF links* to a dataset that is
>already in the diagram. This means, either your dataset must use URIs from
>the other dataset, or vice versa. We arbitrarily require at least 50 links.
>- Access of the *entire* dataset must be possible via *RDF crawling*,
>via an *RDF dump*, or via a *SPARQL endpoint*.
>
> The process for adding datasets is still under development, please contact 
> John
> P. McCrae  to add a new dataset
>
> Wikidata fulfills all the conditions easily. So, here we go, I am adding
> John to this thread - although I know he already knows about this request -
> and I am asking officially to enter Wikidata into the LOD diagram.
>
> Let's keep it all open, and see where it goes from here.
>
> Cheers,
> Denny
>
>
> On Mon, May 7, 2018 at 4:15 AM Sebastian Hellmann <
> hellm...@informatik.uni-leipzig.de> wrote:
>
>> Hi Denny, Maarten,
>>
>> you should read your own emails. In fact it is quite easy to join the LOD
>> cloud diagram.
>>
>> The most important step is to follow the instructions on the page:
>> http://lod-cloud.net under how to contribute and then add the metadata.
>>
>> Some years ago I made a WordPress site with Linked Data enabled:
>> http://www.klappstuhlclub.de/wp/ Even this is included as I simply added
>> the metadata entry.
>>
>> Do you really think John McCrae added a line in the code that says "if
>> (dataset==wikidata) skip; " ?
>>
>> You just need to add it like everybody else in LOD, DBpedia also created
>> its entry and updates it now and then. The same applies to
>> http://lov.okfn.org  Somebody from Wikidata needs to upload the Wikidata
>> properties as OWL.  If nobody does it, it will not be in there.
>>
>> All the best,
>>
>> Sebastian
>>
>> On 04.05.2018 18:33, Maarten Dammers wrote:
>>
>> It almost feels like someone doesn’t want Wikidata in there? Maybe that
>> website is maintained by DBpedia fans? Just thinking out loud here because
>> DBpedia is very popular in the academic world and Wikidata a huge threat
>> for that popularity.
>>
>> Maarten
>>
>> On 4 May 2018 at 17:20, Denny Vrandečić wrote:
>>
>> I'm pretty sure that Wikidata is doing better than 90% of the current
>> bubbles in the diagram.
>>
>> If they wanted to have Wikidata in the diagram it would have been there
>> before it was too small to read it. :)
>>
>> On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider <
>> pfpschnei...@gmail.com> wrote:
>>
>>> Thanks for the corrections.
>>>
>>> So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for Douglas
>>> Adams.  Retrieving from this IRI results in a 303 See Other to
>>> https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess)
>>> is the
>>> main IRI for representations of Douglas Adams and other pages with
>>> information about him.
>>>
>>> From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
>>> negotiation can be used to get the JSON representation (the default),
>>> other
>>> representations including Turtle, and human-readable information.  (Well
>>> actually I'm not sure that this is really correct.  It appears that
>>> instead
>>> of directly using content negotiation, another 303 See Other is used to
>>> provide an IRI for a document in the requested format.)
>>>
>>> https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
>>> https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
>>> machine-readable documents containing the Wikidata information about
>>> Douglas
>>> Adams.  Content negotiation is not possible on these pages.
>>>
>>> https://www.wikidata.org/wiki/Q42 is the IRI that produces a
>>> human-readable
>>> version of the information about Douglas Adams.  Content negotiation is
>>> not
>>> possible on this page, but it does have link rel="alternate" to the
>>> machine-readable pages.
>>>
>>> Strangely this page has a link rel="canonical" to itself.  Shouldn't that
>>> link be to https://www.wikidata.org/entity/Q42?  There is a
>>> human-visible
>>> link to this IRI, but there doesn't appear to be any machine-readable
>>> link.
>>>
>>> RDF links to other IRIs for Douglas 

Re: [Wikidata] Wikidata and the LOD cloud

2018-05-07 Thread Denny Vrandečić
Well, then, we have tried several times to get into that diagram, and it
never worked out.

So, given the page you linked, it says:

Contributing to the Diagram

First, make sure that you publish data according to the Linked Data
principles . We interpret
this as:

   - There must be *resolvable http:// (or https://) URIs*.
   - They must resolve, with or without content negotiation, to *RDF data* in
   one of the popular RDF formats (RDFa, RDF/XML, Turtle, N-Triples).
   - The dataset must contain *at least 1000 triples*. (Hence, your FOAF
   file most likely does not qualify.)
   - The dataset must be connected via *RDF links* to a dataset that is
   already in the diagram. This means, either your dataset must use URIs from
   the other dataset, or vice versa. We arbitrarily require at least 50 links.
   - Access of the *entire* dataset must be possible via *RDF crawling*,
   via an *RDF dump*, or via a *SPARQL endpoint*.

The process for adding datasets is still under development, please contact John
P. McCrae  to add a new dataset

Wikidata fulfills all the conditions easily. So, here we go, I am adding
John to this thread - although I know he already knows about this request -
and I am asking officially to enter Wikidata into the LOD diagram.
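
(For illustration, a rough Python sketch of checking the "at least 50 RDF
links" condition above against the public SPARQL endpoint; normalized VIAF
identifiers (wdtn:P214) are just one arbitrary example of outgoing links,
and none of this is part of any official submission process.)

    import requests

    # Sketch: count a capped sample of outgoing RDF links exposed through the
    # normalized external-identifier namespace (wdtn:), here VIAF (P214).
    QUERY = """
    PREFIX wdtn: <http://www.wikidata.org/prop/direct-normalized/>
    SELECT (COUNT(*) AS ?links) WHERE {
      { SELECT ?item ?viaf WHERE { ?item wdtn:P214 ?viaf } LIMIT 1000 }
    }
    """

    response = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": QUERY},
        headers={
            "Accept": "application/sparql-results+json",
            "User-Agent": "lod-cloud-criteria-sketch/0.1 (example)",
        },
        timeout=60,
    )
    response.raise_for_status()
    count = int(response.json()["results"]["bindings"][0]["links"]["value"])
    print(f"Sampled outgoing RDF links via wdtn:P214 (VIAF): {count} (threshold: 50)")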

Let's keep it all open, and see where it goes from here.

Cheers,
Denny


On Mon, May 7, 2018 at 4:15 AM Sebastian Hellmann <
hellm...@informatik.uni-leipzig.de> wrote:

> Hi Denny, Maarten,
>
> you should read your own emails. In fact it is quite easy to join the LOD
> cloud diagram.
>
> The most important step is to follow the instructions on the page:
> http://lod-cloud.net under how to contribute and then add the metadata.
>
> Some years ago I made a WordPress site with Linked Data enabled:
> http://www.klappstuhlclub.de/wp/ Even this is included as I simply added
> the metadata entry.
>
> Do you really think John McCrae added a line in the code that says "if
> (dataset==wikidata) skip; " ?
>
> You just need to add it like everybody else in LOD, DBpedia also created
> its entry and updates it now and then. The same applies to
> http://lov.okfn.org  Somebody from Wikidata needs to upload the Wikidata
> properties as OWL.  If nobody does it, it will not be in there.
>
> All the best,
>
> Sebastian
>
> On 04.05.2018 18:33, Maarten Dammers wrote:
>
> It almost feels like someone doesn’t want Wikidata in there? Maybe that
> website is maintained by DBpedia fans? Just thinking out loud here because
> DBpedia is very popular in the academic world and Wikidata a huge threat
> for that popularity.
>
> Maarten
>
> On 4 May 2018 at 17:20, Denny Vrandečić wrote:
>
> I'm pretty sure that Wikidata is doing better than 90% of the current
> bubbles in the diagram.
>
> If they wanted to have Wikidata in the diagram it would have been there
> before it was too small to read it. :)
>
> On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider <
> pfpschnei...@gmail.com> wrote:
>
>> Thanks for the corrections.
>>
>> So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for Douglas
>> Adams.  Retrieving from this IRI results in a 303 See Other to
>> https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess) is
>> the
>> main IRI for representations of Douglas Adams and other pages with
>> information about him.
>>
>> From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
>> negotiation can be used to get the JSON representation (the default),
>> other
>> representations including Turtle, and human-readable information.  (Well
>> actually I'm not sure that this is really correct.  It appears that
>> instead
>> of directly using content negotiation, another 303 See Other is used to
>> provide an IRI for a document in the requested format.)
>>
>> https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
>> https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
>> machine-readable documents containing the Wikidata information about
>> Douglas
>> Adams.  Content negotiation is not possible on these pages.
>>
>> https://www.wikidata.org/wiki/Q42 is the IRI that produces a
>> human-readable
>> version of the information about Douglas Adams.  Content negotiation is
>> not
>> possible on this page, but it does have link rel="alternate" to the
>> machine-readable pages.
>>
>> Strangely this page has a link rel="canonical" to itself.  Shouldn't that
>> link be to https://www.wikidata.org/entity/Q42?  There is a human-visible
>> link to this IRI, but there doesn't appear to be any machine-readable
>> link.
>>
>> RDF links to other IRIs for Douglas Adams are given in RDF pages by
>> properties in the wdtn namespace.  Many, but not all, identifiers are
>> handled this way.  (Strangely ISNI (P213) isn't even though it is linked
>> on
>> the human-readable page.)
>>
>> So it looks as if Wikidata can be considered as Linked Open Data but maybe
>> some improvements can be made.
>>
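
(For illustration, a minimal Python sketch of the redirect chain Peter
describes above: request Turtle at the entity IRI and watch the 303 hops to
the concrete .ttl document. Nothing here goes beyond what the discussion
already states.)

    import requests

    # Sketch: ask for Turtle at the entity IRI and follow the 303 redirects
    # (entity IRI -> Special:EntityData/Q42 -> Q42.ttl).
    response = requests.get(
        "https://www.wikidata.org/entity/Q42",
        headers={"Accept": "text/turtle"},
        timeout=30,
    )
    response.raise_for_status()

    for hop in response.history:                      # intermediate 3xx hops
        print(hop.status_code, hop.url)
    print(response.status_code, response.url)         # final document URL
    print(response.headers.get("Content-Type"))       # should be a Turtle type
    print("\n".join(response.text.splitlines()[:5]))  # first prefix lines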

Re: [Wikidata] Wikidata and the LOD cloud

2018-05-07 Thread Stas Malyshev
Hi!

> you should read your own emails. In fact it is quite easy to join the
> LOD cloud diagram.
> 
> The most important step is to follow the instructions on the page:
> http://lod-cloud.net under how to contribute and then add the metadata.

I may not be reading it right or may be misunderstanding something, but I
have tried a few times to locate up-to-date, working instructions for
doing this, and it always ended up going nowhere - the instructions turned
out to be out of date, the new process was not working yet, or something
else. It would be very helpful if you could point out specifically where
on that page the step-by-step instructions are that we could follow to
resolve this issue.

> Do you really think John McCrae added a line in the code that says "if
> (dataset==wikidata) skip; " ?

I don't think anybody thinks that. And I think most people here agree that
it would be nice to have Wikidata added to the LOD cloud. It sounds like
you know how to do it - could you please share more specific information
about it?

> You just need to add it like everybody else in LOD, DBpedia also created
> its entry and updates it now and then. The same applies to
> http://lov.okfn.org  Somebody from Wikidata needs to upload the Wikidata
> properties as OWL.  If nobody does it, it will not be in there.

Could you share more information about lov.okfn.org? Going there produces
a 502, and it's not mentioned anywhere on lod-cloud.net. Where is it
documented, what exactly is the process, and what do you mean by "upload
the Wikidata properties as OWL"? More detailed information would be hugely
helpful.

Thanks in advance,
-- 
Stas Malyshev
smalys...@wikimedia.org

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] Weekly Summary #311

2018-05-07 Thread Léa Lacroix
Here's your quick overview of what has been happening around Wikidata over
the last week.

Discussions

   - New request for comments:
  - Sort identifier statements on items that are instances of human
  - Make "developer" and "programmer" properties clearer

Events 

   - Wikidata and GLAM workshop day, in the context of the EuropeanaTech
   Conference, Rotterdam, Monday 14 May 2018
   - Wikidata access methods, slides by Dan Scott

Other Noteworthy Stuff

   - More than 850 articles about living people on the English Wikipedia
   have a date of death or place of death on their Wikidata item: manual
   checks needed. You can also check Category:P570 missing in Wikipedia
   - Florian will be improving Wikidata support in the Wikipedia plugin for
   OpenStreetMap's JOSM editor, for the Google Summer of Code 2018
   - Prssanna Desai will work on improvements for the Query Service during
   Google Summer of Code

Did you know?

   - Newest properties:
  - General datatypes: Deutsche Bahn station category, has grammatical
  gender, has grammatical person, Wikimedia outline, assistant director,
  island of location, possible medical findings, suggests the existence
  of, has evaluation, evaluation of, greater than, less than
  - External identifiers: Dictionary of Algorithms and Data Structures ID,
  Behind The Voice Actors character ID, HanCinema film ID, Italian School
  ID, Directory of Open Access Journals ID, LGDB game ID, LGDB emulator
  ID, LGDB tool ID, LGDB engine ID, TFRRS athlete ID, All About Jazz
  musician ID, Ontario public library ID, Swedish Literature Bank book ID,
  WikiCFP event ID, WikiCFP conference series ID, ICAA film catalogue ID,
  Stepwell Atlas ID
   - New property proposals to review:
  - General datatypes: IMDA rating, is program committee member of,
  officialized by, KAVI rating, topographic map, child monotypic taxon,
  Köppens klimaklassifisering (Köppen climate classification)


Re: [Wikidata] Wikidata and the LOD cloud

2018-05-07 Thread Fariz Darari
To add to Andy's reply, on Wikidata the combination of Ranking (
https://m.wikidata.org/wiki/Help:Ranking) , Qualifier (
https://m.wikidata.org/wiki/Special:MyLanguage/Help:Qualifiers) and
References (https://m.wikidata.org/wiki/Special:MyLanguage/Help:Sources)
would enable storing disputed property values. So, it does make sense.
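
(To make that concrete, a rough Python/SPARQL sketch that lists the
population (P1082) statements of one item together with their rank and
"point in time" (P585) qualifier; Germany (Q183) is only a stand-in
example, but the same statement/rank/qualifier/reference structure is what
would carry genuinely disputed values and their sources.)

    import requests

    # Sketch: several values of one property coexist on an item, each with
    # its own rank and qualifiers.
    QUERY = """
    PREFIX wd: <http://www.wikidata.org/entity/>
    PREFIX p: <http://www.wikidata.org/prop/>
    PREFIX ps: <http://www.wikidata.org/prop/statement/>
    PREFIX pq: <http://www.wikidata.org/prop/qualifier/>
    PREFIX wikibase: <http://wikiba.se/ontology#>
    SELECT ?population ?rank ?pointInTime WHERE {
      wd:Q183 p:P1082 ?statement .
      ?statement ps:P1082 ?population ;
                 wikibase:rank ?rank .
      OPTIONAL { ?statement pq:P585 ?pointInTime . }
    }
    ORDER BY DESC(?pointInTime)
    """

    rows = requests.get(
        "https://query.wikidata.org/sparql",
        params={"query": QUERY},
        headers={"Accept": "application/sparql-results+json"},
        timeout=60,
    ).json()["results"]["bindings"]

    for row in rows[:10]:
        print(row["population"]["value"],
              row["rank"]["value"].rsplit("#", 1)[-1],      # e.g. NormalRank
              row.get("pointInTime", {}).get("value", "-"))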

-fariz

On Mon, May 7, 2018, 19:27 Andy Mabbett  wrote:

> On 7 May 2018 at 00:15, Sylvain Boissel  wrote:
>
> > On Sat, 5 May 2018 at 16:35, Andy Mabbett wrote:
>
> >> On 5 May 2018 at 14:39, David Abián  wrote:
> >>
> >> > Both Wikidata and DBpedia surely can, and should, coexist because
> we'll
> >> > never be able to host in Wikidata the entirety of the Wikipedias.
> >>
> >> Can you give an example of something that can be represented in
> >> DBpedia, but not Wikidata?
>
> > Sure : DBpedia knows the specific values different versions of Wikipedia
> > choose to display in the infobox. For example, the size or population of
> > countries with disputed borders. This data is useful for researchers
> working
> > on cultural bias in Wikipedia, but it makes little sense to store it in
> > Wikidata.
>
> Except that it does; and Wikidata is more than capable of holding values
> from conflicting sources. So again, this does not substantiate the
> "Both Wikidata and DBpedia surely can, and should, coexist because
> we'll never be able to host in Wikidata the entirety of the
> Wikipedias" claim.
>
> --
> Andy Mabbett
> @pigsonthewing
> http://pigsonthewing.org.uk
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Wikidata and the LOD cloud

2018-05-07 Thread Sebastian Hellmann
Wikidata should hold the data of all Wikipedias; that is its main
purpose. However, it doesn't yet, and there are many problems, e.g.
missing references, the population count moved to Commons, and an open
discussion about even throwing Wikidata out of the infoboxes:
https://en.wikipedia.org/wiki/Wikipedia:Wikidata/2018_Infobox_RfC


DBpedia is more about technology than data, so we are trying to help out
and push Wikidata, so that it has all the values of all Wikipedias plus
their references:
https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/GlobalFactSync


All the best,

Sebastian


On 07.05.2018 14:26, Andy Mabbett wrote:

On 7 May 2018 at 00:15, Sylvain Boissel  wrote:


On Sat, 5 May 2018 at 16:35, Andy Mabbett wrote:

On 5 May 2018 at 14:39, David Abián  wrote:


Both Wikidata and DBpedia surely can, and should, coexist because we'll
never be able to host in Wikidata the entirety of the Wikipedias.

Can you give an example of something that can be represented in
DBpedia, but not Wikidata?

Sure : DBpedia knows the specific values different versions of Wikipedia
choose to display in the infobox. For example, the size or population of
countries with disputed borders. This data is useful for researchers working
on cultural bias in Wikipedia, but it makes little sense to store it in
Wikidata.

Except that it does; and Wikidata is more than capable of holding values
from conflicting sources. So again, this does not substantiate the
"Both Wikidata and DBpedia surely can, and should, coexist because
we'll never be able to host in Wikidata the entirety of the
Wikipedias" claim.



--
All the best,
Sebastian Hellmann

Director of Knowledge Integration and Linked Data Technologies (KILT) 
Competence Center

at the Institute for Applied Informatics (InfAI) at Leipzig University
Executive Director of the DBpedia Association
Projects: http://dbpedia.org, http://nlp2rdf.org, 
http://linguistics.okfn.org, https://www.w3.org/community/ld4lt 


Homepage: http://aksw.org/SebastianHellmann
Research Group: http://aksw.org
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Wikidata and the LOD cloud

2018-05-07 Thread Andy Mabbett
On 7 May 2018 at 00:15, Sylvain Boissel  wrote:

> On Sat, 5 May 2018 at 16:35, Andy Mabbett wrote:

>> On 5 May 2018 at 14:39, David Abián  wrote:
>>
>> > Both Wikidata and DBpedia surely can, and should, coexist because we'll
>> > never be able to host in Wikidata the entirety of the Wikipedias.
>>
>> Can you give an example of something that can be represented in
>> DBpedia, but not Wikidata?

> Sure : DBpedia knows the specific values different versions of Wikipedia
> choose to display in the infobox. For example, the size or population of
> countries with disputed borders. This data is useful for researchers working
> on cultural bias in Wikipedia, but it makes little sense to store it in
> Wikidata.

Except that it does; and Wikidata is more than capable of holding values
from conflicting sources. So again, this does not substantiate the
"Both Wikidata and DBpedia surely can, and should, coexist because
we'll never be able to host in Wikidata the entirety of the
Wikipedias" claim.

-- 
Andy Mabbett
@pigsonthewing
http://pigsonthewing.org.uk

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] GlobalFactSync new prototype Re: Wikidata and the LOD cloud

2018-05-07 Thread Sebastian Hellmann

Hi all,

the discussion about Wikidata and LOD got into this specific detail and 
I was just hoping that we could pick up on a few topics.


We are still hoping to get some support for our GlobalFactSync proposal: 
https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/GlobalFactSync



We created a new prototype here (just the Eiffel Tower for now):

http://88.99.242.78:9000/?s=http%3A%2F%2Fid.dbpedia.org%2Fglobal%2F12HpzV&p=http%3A%2F%2Fdbpedia.org%2Fontology%2FfloorCount&src=general

You can see there that the floor count property is different in the
French Wikipedia (properties can be switched with the dropdown at the top).


The English Wikipedia has the same value as Wikidata plus a reference. 
One of the goals of GlobalFactSync is to extract these references and 
import them into Wikidata.


We will also build a redirection service around it, so you can use 
Wikidata Q's and P's as arguments for ?s= and ?p= and get resolved to 
the right entry for quick comparison between WD and WP.


All the best,

Sebastian





On 06.05.2018 10:54, Ettore RIZZA wrote:
@Antonin : You're right, I now remember Magnus Knuth's message on this 
list about GlobalFactSync 
, 
a lite version of CrossWikiFact, if I understood correctly. I also 
remember that his message did not trigger many reactions...


2018-05-06 10:46 GMT+02:00 Antonin Delpeuch (lists)
<li...@antonin.delpeuch.eu>:


On 06/05/2018 10:37, Ettore RIZZA wrote:
>     More simply, there's still a long way to go until Wikidata imports
>     all the data contained in Wikipedia infoboxes (or equivalent data
>     from other sources), let alone the rest.
>
>
> This surprises me. Are there any statistics somewhere on the share of
> Wikipedia's infoboxes that are fully parsed?


That was more or less the goal of the CrossWikiFact project, which was
unfortunately not very widely supported:
https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/CrossWikiFact


It's still not clear to me why this got so little support - it looked
like a good opportunity to collaborate with DBpedia.

Antonin

___
Wikidata mailing list
Wikidata@lists.wikimedia.org 
https://lists.wikimedia.org/mailman/listinfo/wikidata





___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


--
All the best,
Sebastian Hellmann

Director of Knowledge Integration and Linked Data Technologies (KILT) 
Competence Center

at the Institute for Applied Informatics (InfAI) at Leipzig University
Executive Director of the DBpedia Association
Projects: http://dbpedia.org, http://nlp2rdf.org, 
http://linguistics.okfn.org, https://www.w3.org/community/ld4lt 


Homepage: http://aksw.org/SebastianHellmann
Research Group: http://aksw.org
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Wikidata and the LOD cloud

2018-05-07 Thread Sebastian Hellmann

Hi Denny, Maarten,

you should read your own emails. In fact it is quite easy to join the 
LOD cloud diagram.


The most important step is to follow the instructions on the page: 
http://lod-cloud.net under how to contribute and then add the metadata.


Some years ago I made a WordPress site with Linked Data enabled:
http://www.klappstuhlclub.de/wp/ Even this is included as I simply added 
the metadata entry.


Do you really think John McCrae added a line in the code that says "if 
(dataset==wikidata) skip; " ?


You just need to add it like everybody else in LOD, DBpedia also created 
its entry and updates it now and then. The same applies to
http://lov.okfn.org  Somebody from Wikidata needs to upload the Wikidata 
properties as OWL.  If nobody does it, it will not be in there.


All the best,

Sebastian


On 04.05.2018 18:33, Maarten Dammers wrote:
It almost feels like someone doesn’t want Wikidata in there? Maybe 
that website is maintained by DBpedia fans? Just thinking out loud 
here because DBpedia is very popular in the academic world and 
Wikidata a huge threat for that popularity.


Maarten

On 4 May 2018 at 17:20, Denny Vrandečić wrote:


I'm pretty sure that Wikidata is doing better than 90% of the current 
bubbles in the diagram.


If they wanted to have Wikidata in the diagram it would have been 
there before it was too small to read it. :)


On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider
<pfpschnei...@gmail.com> wrote:


Thanks for the corrections.

So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for Douglas
Adams.  Retrieving from this IRI results in a 303 See Other to
https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess) is
the main IRI for representations of Douglas Adams and other pages with
information about him.

From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
negotiation can be used to get the JSON representation (the default),
other representations including Turtle, and human-readable information.
(Well actually I'm not sure that this is really correct.  It appears that
instead of directly using content negotiation, another 303 See Other is
used to provide an IRI for a document in the requested format.)

https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
machine-readable documents containing the Wikidata information about
Douglas Adams.  Content negotiation is not possible on these pages.

https://www.wikidata.org/wiki/Q42 is the IRI that produces a
human-readable version of the information about Douglas Adams.  Content
negotiation is not possible on this page, but it does have link
rel="alternate" to the machine-readable pages.

Strangely this page has a link rel="canonical" to itself.  Shouldn't that
link be to https://www.wikidata.org/entity/Q42?  There is a human-visible
link to this IRI, but there doesn't appear to be any machine-readable
link.

RDF links to other IRIs for Douglas Adams are given in RDF pages by
properties in the wdtn namespace.  Many, but not all, identifiers are
handled this way.  (Strangely ISNI (P213) isn't even though it is linked
on the human-readable page.)

So it looks as if Wikidata can be considered as Linked Open Data but
maybe some improvements can be made.


peter



On 05/01/2018 01:03 AM, Antoine Zimmermann wrote:
> On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:
>> As far as I can tell real IRIs for Wikidata are https URIs.  The http
>> IRIs redirect to https IRIs.
>
> That's right.
>
>>   As far as I can tell no content negotiation is
>> done.
>
> No, you're mistaken. You tried the URL of a wikipage in your curl
> command. Those are for human consumption, thus not available in turtle.
>
> The "real IRIs" of Wikidata entities are like this:
> https://www.wikidata.org/entity/Q{NUMBER}
>
> However, they 303 redirect to
> https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}
>
> which is the identifier of a schema:Dataset. Then, if you HTTP GET
> these URIs, you can content negotiate them to JSON
> (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to
> turtle (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).
>
>
> Surprisingly, there is no connection between the entity IRIs and the
> wikipage URLs. If one was given the IRI of an entity fr
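
(A small Python sketch of what Antoine describes above: the concrete JSON
and Turtle documents behind Special:EntityData, where the filename suffix
selects the format. The JSON layout (entities -> Q42 -> labels/claims) is
the standard Wikibase entity format.)

    import requests

    # Sketch: fetch the format-specific documents for Q42 directly.
    base = "https://www.wikidata.org/wiki/Special:EntityData/Q42"

    data = requests.get(base + ".json", timeout=30).json()
    entity = data["entities"]["Q42"]
    print(entity["labels"]["en"]["value"])      # label, e.g. "Douglas Adams"
    print(len(entity["claims"]), "properties with statements")

    turtle = requests.get(base + ".ttl", timeout=30).text
    print(turtle.splitlines()[0])               # first line of the Turtle dump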

[Wikidata] ISWC 2018 - Student Travel Grants

2018-05-07 Thread Maribel Acosta
17th International Semantic Web Conference (ISWC 2018)
Monterey, California (USA), October 8-12, 2018

ISWC 2018 STUDENT TRAVEL GRANTS
===
http://iswc2018.semanticweb.org/student-travel-grants/

If you are a student interested in attending ISWC 2018, you may be eligible
to apply for travel grants to support the costs of travel and lodging. This
year, travel grants are funded by the Semantic Web Science Association
(SWSA) and the US National Science Foundation (NSF).

APPLYING FOR A STUDENT TRAVEL AWARD

The deadline for applying for an ISWC 2018 travel award is August 27th,
2018. Please make sure that you have submitted your application and that
your supervisor has sent us a confirmation email by this date.

To apply for a grant please follow these steps:

   - Make sure that you are eligible to apply for an ISWC 2018 travel
   award: you must currently be a student at a higher education institution,
   and have an ISWC 2018 submission that has been accepted to either the main
   conference, the doctoral consortium, an ISWC 2018 workshop, the
   poster/demo session, or the Semantic Web challenge (you may have
   submissions to more than one of these categories). If there are
   additional funds available, we will also consider supporting students
   who do not have papers at the conference.
   - Fill out and submit the ISWC 2018 Student Travel Award Application Form.
   - Ask your supervisor to email iswc.travel.awa...@gmail.com, confirming
   that you are a current student under their supervision and that you will
   be attending ISWC 2018. The subject of the email should be "ISWC 2018
   Student Travel Award Application Verification for {YOUR NAME}". The text
   should read "I confirm that {YOUR NAME} is a student that is currently
   under my supervision at {INSTITUTION NAME} and that they will be attending
   ISWC 2018 to present work that they completed under my supervision.
   {SUPERVISOR NAME}"
   - Register for ISWC 2018.

You will receive a notification by September 10th, 2018 and be allowed to
register at an early registration rate.

AWARD DETAILS

The exact amount awarded to each student will depend on the needs stated in
the application and the kind of submission that they have in ISWC 2018.
Preference will be given to students that are first authors on papers
accepted to the main conference or the doctoral consortium, followed by
those who are first authors on papers accepted to ISWC workshops and the
Poster & Demo session. If there are additional funds available, we will
also consider supporting students who do not have papers at the conference.

Please note that students attending higher education institutions in the
U.S. will be eligible for funding provided by the NSF. If your application
is approved, to ensure reimbursement per NSF guidelines, you are required
to travel with US-flag carriers or on flights ticketed through a US
carrier.

This year, we will consider supporting travel companions for attendees with
special needs and aim to support a diverse group of students (e.g., gender,
nationality, ethnicity, type of institution, geographic location). If you
have any questions or would like to discuss your specific requirements
please do not hesitate to contact iswc.travel.awa...@gmail.com.

STUDENT TRAVEL AWARD SPONSORS

National Science Foundation (NSF)
Semantic Web Science Association (SWSA)
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata