Re: [Wikidata] Wikiata and the LOD cloud

2018-05-08 Thread Sebastian Hellmann

Hi Lucas, Denny,

all you need to do is update your entry on old.datahub.io:

https://old.datahub.io/dataset/wikidata

It was edited by Lucie-Aimée Kaffee two years ago. You need to contact 
her, as she created the Wikimedia org in Datahub. I might be able to 
have someone switch ownership of the org to a new account.


But a lot of essential metadata is missing:

Compare with the DBpedia entry: https://old.datahub.io/dataset/dbpedia

Especially the links and the triple count at the bottom. So you need to 
keep this entry updated in order to appear in the LOD cloud.


Please tell me if you can't edit it. I know a former admin from the time 
datahub.io was first created 10 years ago in the LOD2 and LATC EU projects; 
he might be able to do something in case nobody answers because 
datahub.io has switched to a new style.


All the best,

Sebastian


On 07.05.2018 22:35, Lucas Werkmeister wrote:
Folks, I’m already in contact with John, there’s no need to contact 
him again :)


Cheers, Lucas

On Mon, 7 May 2018 at 19:32, Denny Vrandečić wrote:


Well, then, we have tried several times to get into that diagram,
and it never worked out.

So, given the page you linked, it says:


  Contributing to the Diagram

First, make sure that you publish data according to the Linked
Data principles .
We interpret this as:

  * There must be /resolvable http:// (or https://) URIs/.
  * They must resolve, with or without content negotiation, to
/RDF data/ in one of the popular RDF formats (RDFa, RDF/XML,
Turtle, N-Triples).
  * The dataset must contain /at least 1000 triples/. (Hence, your
FOAF file most likely does not qualify.)
  * The dataset must be connected via /RDF links/ to a dataset
that is already in the diagram. This means, either your
dataset must use URIs from the other dataset, or vice versa.
We arbitrarily require at least 50 links.
  * Access of the /entire/ dataset must be possible via /RDF
crawling/, via an /RDF dump/, or via a /SPARQL endpoint/.

The process for adding datasets is still under development, please
contact John P. McCrae  to add a new dataset


Wikidata fulfills all the conditions easily. So, here we go, I am
adding John to this thread - although I know he already knows
about this request - and I am asking officially to enter Wikidata
into the LOD diagram.

Let's keep it all open, and see where it goes from here.

Cheers,
Denny


On Mon, May 7, 2018 at 4:15 AM Sebastian Hellmann wrote:

Hi Denny, Maarten,

you should read your own emails. In fact it is quite easy to
join the LOD cloud diagram.

The most important step is to follow the instructions on the
page: http://lod-cloud.net under how to contribute and then
add the metadata.

Some years ago I set up a WordPress site with Linked Data enabled:
http://www.klappstuhlclub.de/wp/ Even this is included, as I
simply added the metadata entry.

Do you really think John McCrae added a line in the code that
says "if (dataset==wikidata) skip; " ?

You just need to add it like everybody else in LOD; DBpedia
also created its entry and updates it now and then. The same
applies to http://lov.okfn.org Somebody from Wikidata needs
to upload the Wikidata properties as OWL.  If nobody does it,
it will not be in there.

All the best,

Sebastian


On 04.05.2018 18:33, Maarten Dammers wrote:

It almost feels like someone doesn’t want Wikidata in there?
Maybe that website is maintained by DBpedia fans? Just
thinking out loud here because DBpedia is very popular in the
academic world and Wikidata a huge threat for that popularity.

Maarten

On 4 May 2018 at 17:20, Denny Vrandečić wrote:


I'm pretty sure that Wikidata is doing better than 90% of
the current bubbles in the diagram.

If they wanted to have Wikidata in the diagram it would have
been there before it was too small to read it. :)

On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider wrote:

Thanks for the corrections.

So https://www.wikidata.org/entity/Q42 is *the* Wikidata
IRI for Douglas
Adams.  Retrieving from this IRI results in a 303 See
Other to
https://www.wikidata.org/wiki/Special:EntityData/Q42,
which (I guess) is the
main IRI for representations of 

Re: [Wikidata] Wikiata and the LOD cloud

2018-05-07 Thread Denny Vrandečić
Thanks!

On Mon, May 7, 2018 at 1:36 PM Lucas Werkmeister <
lucas.werkmeis...@wikimedia.de> wrote:

> Folks, I’m already in contact with John, there’s no need to contact him
> again :)
>
> Cheers, Lucas
>
> On Mon, 7 May 2018 at 19:32, Denny Vrandečić <vrande...@gmail.com> wrote:
>
>> Well, then, we have tried several times to get into that diagram, and it
>> never worked out.
>>
>> So, given the page you linked, it says:
>>
>> Contributing to the Diagram
>>
>> First, make sure that you publish data according to the Linked Data
>> principles . We
>> interpret this as:
>>
>>- There must be *resolvable http:// (or https://) URIs*.
>>- They must resolve, with or without content negotiation, to *RDF
>>data* in one of the popular RDF formats (RDFa, RDF/XML, Turtle,
>>N-Triples).
>>- The dataset must contain *at least 1000 triples*. (Hence, your FOAF
>>file most likely does not qualify.)
>>- The dataset must be connected via *RDF links* to a dataset that is
>>already in the diagram. This means, either your dataset must use URIs from
>>the other dataset, or vice versa. We arbitrarily require at least 50 
>> links.
>>- Access of the *entire* dataset must be possible via *RDF crawling*,
>>via an *RDF dump*, or via a *SPARQL endpoint*.
>>
>> The process for adding datasets is still under development, please
>> contact John P. McCrae  to add a new dataset
>>
>> Wikidata fulfills all the conditions easily. So, here we go, I am adding
>> John to this thread - although I know he already knows about this request -
>> and I am asking officially to enter Wikidata into the LOD diagram.
>>
>> Let's keep it all open, and see where it goes from here.
>>
>> Cheers,
>> Denny
>>
>>
>> On Mon, May 7, 2018 at 4:15 AM Sebastian Hellmann <
>> hellm...@informatik.uni-leipzig.de> wrote:
>>
>>> Hi Denny, Maarten,
>>>
>>> you should read your own emails. In fact it is quite easy to join the
>>> LOD cloud diagram.
>>>
>>> The most important step is to follow the instructions on the page:
>>> http://lod-cloud.net under how to contribute and then add the metadata.
>>>
>>> Some years ago I set up a WordPress site with Linked Data enabled:
>>> http://www.klappstuhlclub.de/wp/ Even this is included, as I simply
>>> added the metadata entry.
>>>
>>> Do you really think John McCrae added a line in the code that says "if
>>> (dataset==wikidata) skip; " ?
>>>
>>> You just need to add it like everybody else in LOD; DBpedia also created
>>> its entry and updates it now and then. The same applies to
>>> http://lov.okfn.org  Somebody from Wikidata needs to upload the
>>> Wikidata properties as OWL.  If nobody does it, it will not be in there.
>>>
>>> All the best,
>>>
>>> Sebastian
>>>
>>> On 04.05.2018 18:33, Maarten Dammers wrote:
>>>
>>> It almost feels like someone doesn’t want Wikidata in there? Maybe that
>>> website is maintained by DBpedia fans? Just thinking out loud here because
>>> DBpedia is very popular in the academic world and Wikidata a huge threat
>>> for that popularity.
>>>
>>> Maarten
>>>
>>> On 4 May 2018 at 17:20, Denny Vrandečić wrote:
>>>
>>> I'm pretty sure that Wikidata is doing better than 90% of the current
>>> bubbles in the diagram.
>>>
>>> If they wanted to have Wikidata in the diagram it would have been there
>>> before it was too small to read it. :)
>>>
>>> On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider <
>>> pfpschnei...@gmail.com> wrote:
>>>
 Thanks for the corrections.

 So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for
 Douglas
 Adams.  Retrieving from this IRI results in a 303 See Other to
 https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess)
 is the
 main IRI for representations of Douglas Adams and other pages with
 information about him.

 From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
 negotiation can be used to get the JSON representation (the default),
 other
 representations including Turtle, and human-readable information.  (Well
 actually I'm not sure that this is really correct.  It appears that
 instead
 of directly using content negotiation, another 303 See Other is used to
 provide an IRI for a document in the requested format.)

 https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
 https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
 machine-readable documents containing the Wikidata information about
 Douglas
 Adams.  Content negotiation is not possible on these pages.

 https://www.wikidata.org/wiki/Q42 is the IRI that produces a
 human-readable
 version of the information about Douglas Adams.  Content negotiation is
 not
 possible on this page, but it does have link rel="alternate" to the
 machine-readable pages.

 Strangely 

Re: [Wikidata] Wikiata and the LOD cloud

2018-05-07 Thread Lucas Werkmeister
Folks, I’m already in contact with John, there’s no need to contact him
again :)

Cheers, Lucas

On Mon, 7 May 2018 at 19:32, Denny Vrandečić <vrande...@gmail.com> wrote:

> Well, then, we have tried several times to get into that diagram, and it
> never worked out.
>
> So, given the page you linked, it says:
>
> Contributing to the Diagram
>
> First, make sure that you publish data according to the Linked Data
> principles . We interpret
> this as:
>
>- There must be *resolvable http:// (or https://) URIs*.
>- They must resolve, with or without content negotiation, to *RDF data* in
>one of the popular RDF formats (RDFa, RDF/XML, Turtle, N-Triples).
>- The dataset must contain *at least 1000 triples*. (Hence, your FOAF
>file most likely does not qualify.)
>- The dataset must be connected via *RDF links* to a dataset that is
>already in the diagram. This means, either your dataset must use URIs from
>the other dataset, or vice versa. We arbitrarily require at least 50 links.
>- Access of the *entire* dataset must be possible via *RDF crawling*,
>via an *RDF dump*, or via a *SPARQL endpoint*.
>
> The process for adding datasets is still under development, please contact 
> John
> P. McCrae  to add a new dataset
>
> Wikidata fulfills all the conditions easily. So, here we go, I am adding
> John to this thread - although I know he already knows about this request -
> and I am asking officially to enter Wikidata into the LOD diagram.
>
> Let's keep it all open, and see where it goes from here.
>
> Cheers,
> Denny
>
>
> On Mon, May 7, 2018 at 4:15 AM Sebastian Hellmann <
> hellm...@informatik.uni-leipzig.de> wrote:
>
>> Hi Denny, Maarten,
>>
>> you should read your own emails. In fact it is quite easy to join the LOD
>> cloud diagram.
>>
>> The most important step is to follow the instructions on the page:
>> http://lod-cloud.net under how to contribute and then add the metadata.
>>
>> Some years ago I set up a WordPress site with Linked Data enabled:
>> http://www.klappstuhlclub.de/wp/ Even this is included, as I simply added
>> the metadata entry.
>>
>> Do you really think John McCrae added a line in the code that says "if
>> (dataset==wikidata) skip; " ?
>>
>> You just need to add it like everybody else in LOD; DBpedia also created
>> its entry and updates it now and then. The same applies to
>> http://lov.okfn.org  Somebody from Wikidata needs to upload the Wikidata
>> properties as OWL.  If nobody does it, it will not be in there.
>>
>> All the best,
>>
>> Sebastian
>>
>> On 04.05.2018 18:33, Maarten Dammers wrote:
>>
>> It almost feels like someone doesn’t want Wikidata in there? Maybe that
>> website is maintained by DBpedia fans? Just thinking out loud here because
>> DBpedia is very popular in the academic world and Wikidata a huge threat
>> for that popularity.
>>
>> Maarten
>>
>> On 4 May 2018 at 17:20, Denny Vrandečić wrote:
>>
>> I'm pretty sure that Wikidata is doing better than 90% of the current
>> bubbles in the diagram.
>>
>> If they wanted to have Wikidata in the diagram it would have been there
>> before it was too small to read it. :)
>>
>> On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider <
>> pfpschnei...@gmail.com> wrote:
>>
>>> Thanks for the corrections.
>>>
>>> So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for Douglas
>>> Adams.  Retrieving from this IRI results in a 303 See Other to
>>> https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess)
>>> is the
>>> main IRI for representations of Douglas Adams and other pages with
>>> information about him.
>>>
>>> From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
>>> negotiation can be used to get the JSON representation (the default),
>>> other
>>> representations including Turtle, and human-readable information.  (Well
>>> actually I'm not sure that this is really correct.  It appears that
>>> instead
>>> of directly using content negotiation, another 303 See Other is used to
>>> provide an IRI for a document in the requested format.)
>>>
>>> https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
>>> https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
>>> machine-readable documents containing the Wikidata information about
>>> Douglas
>>> Adams.  Content negotiation is not possible on these pages.
>>>
>>> https://www.wikidata.org/wiki/Q42 is the IRI that produces a
>>> human-readable
>>> version of the information about Douglas Adams.  Content negotiation is
>>> not
>>> possible on this page, but it does have link rel="alternate" to the
>>> machine-readable pages.
>>>
>>> Strangely this page has a link rel="canonical" to itself.  Shouldn't that
>>> link be to https://www.wikidata.org/entity/Q42?  There is a
>>> human-visible
>>> link to this IRI, but there doesn't appear to be any machine-readable
>>> link.
>>>
>>> 

Re: [Wikidata] Wikiata and the LOD cloud

2018-05-07 Thread Denny Vrandečić
Well, then, we have tried several times to get into that diagram, and it
never worked out.

So, given the page you linked, it says:

Contributing to the Diagram

First, make sure that you publish data according to the Linked Data
principles. We interpret
this as:

   - There must be *resolvable http:// (or https://) URIs*.
   - They must resolve, with or without content negotiation, to *RDF data* in
   one of the popular RDF formats (RDFa, RDF/XML, Turtle, N-Triples).
   - The dataset must contain *at least 1000 triples*. (Hence, your FOAF
   file most likely does not qualify.)
   - The dataset must be connected via *RDF links* to a dataset that is
   already in the diagram. This means, either your dataset must use URIs from
   the other dataset, or vice versa. We arbitrarily require at least 50 links.
   - Access of the *entire* dataset must be possible via *RDF crawling*,
   via an *RDF dump*, or via a *SPARQL endpoint*.

The process for adding datasets is still under development, please contact John
P. McCrae to add a new dataset

Wikidata fulfills all the conditions easily. So, here we go, I am adding
John to this thread - although I know he already knows about this request -
and I am asking officially to enter Wikidata into the LOD diagram.
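
As a rough illustration, two of these conditions can be spot-checked against
the live services. The short Python sketch below is only a sketch: it assumes
the public endpoints at www.wikidata.org and query.wikidata.org, and it picks
Q42 and the VIAF property P214 (exposed through the wdtn: namespace mentioned
elsewhere in this thread) purely as examples, not as anything the
lod-cloud.net instructions prescribe.

    import requests

    # Condition: URIs must resolve, with or without content negotiation, to RDF.
    r = requests.get("https://www.wikidata.org/entity/Q42",
                     headers={"Accept": "text/turtle"})
    print(r.status_code, r.headers.get("Content-Type"))

    # Condition: at least 50 RDF links to a dataset already in the diagram.
    # wdtn:P214 holds normalized VIAF URIs (wdtn: is one of the prefixes
    # predefined by the query service); asking for 51 bindings is enough to
    # show the threshold is cleared without counting the whole graph.
    query = "SELECT ?item ?viaf WHERE { ?item wdtn:P214 ?viaf } LIMIT 51"
    r = requests.get("https://query.wikidata.org/sparql",
                     params={"query": query, "format": "json"},
                     headers={"User-Agent": "lod-cloud-check-sketch/0.1 (example)"})
    print(len(r.json()["results"]["bindings"]), "sample VIAF links found")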

Let's keep it all open, and see where it goes from here.

Cheers,
Denny


On Mon, May 7, 2018 at 4:15 AM Sebastian Hellmann <
hellm...@informatik.uni-leipzig.de> wrote:

> Hi Denny, Maarten,
>
> you should read your own emails. In fact it is quite easy to join the LOD
> cloud diagram.
>
> The most important step is to follow the instructions on the page:
> http://lod-cloud.net under how to contribute and then add the metadata.
>
> Some years ago I set up a WordPress site with Linked Data enabled:
> http://www.klappstuhlclub.de/wp/ Even this is included, as I simply added
> the metadata entry.
>
> Do you really think John McCrae added a line in the code that says "if
> (dataset==wikidata) skip; " ?
>
> You just need to add it like everybody else in LOD; DBpedia also created
> its entry and updates it now and then. The same applies to
> http://lov.okfn.org  Somebody from Wikidata needs to upload the Wikidata
> properties as OWL.  If nobody does it, it will not be in there.
>
> All the best,
>
> Sebastian
>
> On 04.05.2018 18:33, Maarten Dammers wrote:
>
> It almost feels like someone doesn’t want Wikidata in there? Maybe that
> website is maintained by DBpedia fans? Just thinking out loud here because
> DBpedia is very popular in the academic world and Wikidata a huge threat
> for that popularity.
>
> Maarten
>
> On 4 May 2018 at 17:20, Denny Vrandečić wrote:
>
> I'm pretty sure that Wikidata is doing better than 90% of the current
> bubbles in the diagram.
>
> If they wanted to have Wikidata in the diagram it would have been there
> before it was too small to read it. :)
>
> On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider <
> pfpschnei...@gmail.com> wrote:
>
>> Thanks for the corrections.
>>
>> So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for Douglas
>> Adams.  Retrieving from this IRI results in a 303 See Other to
>> https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess) is
>> the
>> main IRI for representations of Douglas Adams and other pages with
>> information about him.
>>
>> From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
>> negotiation can be used to get the JSON representation (the default),
>> other
>> representations including Turtle, and human-readable information.  (Well
>> actually I'm not sure that this is really correct.  It appears that
>> instead
>> of directly using content negotiation, another 303 See Other is used to
>> provide an IRI for a document in the requested format.)
>>
>> https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
>> https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
>> machine-readable documents containing the Wikidata information about
>> Douglas
>> Adams.  Content negotiation is not possible on these pages.
>>
>> https://www.wikidata.org/wiki/Q42 is the IRI that produces a
>> human-readable
>> version of the information about Douglas Adams.  Content negotiation is
>> not
>> possible on this page, but it does have link rel="alternate" to the
>> machine-readable pages.
>>
>> Strangely this page has a link rel="canonical" to itself.  Shouldn't that
>> link be to https://www.wikidata.org/entity/Q42?  There is a human-visible
>> link to this IRI, but there doesn't appear to be any machine-readable
>> link.
>>
>> RDF links to other IRIs for Douglas Adams are given in RDF pages by
>> properties in the wdtn namespace.  Many, but not all, identifiers are
>> handled this way.  (Strangely ISNI (P213) isn't even though it is linked
>> on
>> the human-readable page.)
>>
>> So it looks as if Wikidata can be considered as Linked Open Data but maybe

Re: [Wikidata] Wikiata and the LOD cloud

2018-05-07 Thread Stas Malyshev
Hi!

> you should read your own emails. In fact it is quite easy to join the
> LOD cloud diagram.
> 
> The most important step is to follow the instructions on the page:
> http://lod-cloud.net under how to contribute and then add the metadata.

I may not be reading it right or may be misunderstanding something, but I have
tried to locate up-to-date working instructions for doing this a few times, and
it always ended up going nowhere - the instructions turned out to be out
of date, or the new process was not working yet, or something else. It would be
very nice and very helpful if you could point out specifically where on
that page there are step-by-step instructions that could be followed and
would result in resolving this issue.

> Do you really think John McCrae added a line in the code that says "if
> (dataset==wikidata) skip; " ?

I don't think anybody thinks that. And I think most of the people there
think it would be nice to have Wikidata added to the LOD cloud. It sounds like
you know how to do it, so could you please share more specific information
about it?

> You just need to add it like everybody else in LOD; DBpedia also created
> its entry and updates it now and then. The same applies to
> http://lov.okfn.org  Somebody from Wikidata needs to upload the Wikidata
> properties as OWL.  If nobody does it, it will not be in there.

Could you share more information about lov.okfn.org? Going there
produces a 502, and it's not mentioned anywhere on lod-cloud.net. Where is it
documented, what exactly is the process, and what do you mean by
"upload the Wikidata properties as OWL"? More detailed information would
be hugely helpful.

Thanks in advance,
-- 
Stas Malyshev
smalys...@wikimedia.org



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-07 Thread Fariz Darari
To add to Andy's reply, on Wikidata the combination of Ranking
(https://m.wikidata.org/wiki/Help:Ranking), Qualifiers
(https://m.wikidata.org/wiki/Special:MyLanguage/Help:Qualifiers) and
References (https://m.wikidata.org/wiki/Special:MyLanguage/Help:Sources)
would enable storing disputed property values. So, it does make sense.

-fariz
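
As a rough illustration of how ranks, qualifiers and references combine on a
single property, here is a small Python sketch against the public
query.wikidata.org endpoint. The item (Q183) and the properties (P1082
population, P585 point in time) are only example choices; the same pattern
applies to genuinely disputed figures.

    import requests

    # Sketch: every population (P1082) statement of one item, with its rank,
    # its point-in-time qualifier (P585) and whether any reference is attached.
    # The standard Wikidata prefixes (wd:, p:, ps:, pq:, wikibase:, prov:) are
    # predefined on query.wikidata.org.
    query = """
    SELECT DISTINCT ?value ?rank ?pointInTime ?hasReference WHERE {
      wd:Q183 p:P1082 ?st .
      ?st ps:P1082 ?value ;
          wikibase:rank ?rank .
      OPTIONAL { ?st pq:P585 ?pointInTime . }
      OPTIONAL { ?st prov:wasDerivedFrom ?ref . }
      BIND(BOUND(?ref) AS ?hasReference)
    }
    ORDER BY DESC(?pointInTime)
    """
    r = requests.get("https://query.wikidata.org/sparql",
                     params={"query": query, "format": "json"},
                     headers={"User-Agent": "rank-qualifier-sketch/0.1 (example)"})
    for b in r.json()["results"]["bindings"]:
        print(b["value"]["value"], b["rank"]["value"],
              b.get("pointInTime", {}).get("value"), b["hasReference"]["value"])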

On Mon, May 7, 2018, 19:27 Andy Mabbett  wrote:

> On 7 May 2018 at 00:15, Sylvain Boissel  wrote:
>
> > On Sat, 5 May 2018 at 16:35, Andy Mabbett wrote:
>
> >> On 5 May 2018 at 14:39, David Abián  wrote:
> >>
> >> > Both Wikidata and DBpedia surely can, and should, coexist because
> we'll
> >> > never be able to host in Wikidata the entirety of the Wikipedias.
> >>
> >> Can you give an example of something that can be represented in
> >> DBpedia, but not Wikidata?
>
> > Sure : DBpedia knows the specific values different versions of Wikipedia
> > choose to display in the infobox. For example, the size or population of
> > countries with disputed borders. This data is useful for researchers
> working
> > on cultural bias in Wikipedia, but it makes little sense to store it in
> > Wikidata.
>
> Except that it does; and Wikidata is more than capable of holding values
> from conflicting sources. So again, this does not substantiate the
> "Both Wikidata and DBpedia surely can, and should, coexist because
> we'll never be able to host in Wikidata the entirety of the
> Wikipedias" claim.
>
> --
> Andy Mabbett
> @pigsonthewing
> http://pigsonthewing.org.uk
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-07 Thread Sebastian Hellmann
Wikidata should hold the data of all Wikipedias; that is its main 
purpose. However, it doesn't yet, and there are many problems, e.g. 
missing references, the population count being moved to Commons, and an open 
discussion about even throwing Wikidata out of the infoboxes: 
https://en.wikipedia.org/wiki/Wikipedia:Wikidata/2018_Infobox_RfC


DBpedia is more about technology than data, so we are trying to help out 
and push Wikidata, so that it has all the values of all Wikipedias plus their 
references: 
https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/GlobalFactSync


All the best,

Sebastian


On 07.05.2018 14:26, Andy Mabbett wrote:

On 7 May 2018 at 00:15, Sylvain Boissel  wrote:


On Sat, 5 May 2018 at 16:35, Andy Mabbett wrote:

On 5 May 2018 at 14:39, David Abián  wrote:


Both Wikidata and DBpedia surely can, and should, coexist because we'll
never be able to host in Wikidata the entirety of the Wikipedias.

Can you give an example of something that can be represented in
DBpedia, but not Wikidata?

Sure : DBpedia knows the specific values different versions of Wikipedia
choose to display in the infobox. For example, the size or population of
countries with disputed borders. This data is useful for researchers working
on cultural bias in Wikipedia, but it makes little sense to store it in
Wikidata.

Except that it does; and Wikidata is more than capable of holding values
from conflicting sources. So again, this does not substantiate the
"Both Wikidata and DBpedia surely can, and should, coexist because
we'll never be able to host in Wikidata the entirety of the
Wikipedias" claim.



--
All the best,
Sebastian Hellmann

Director of Knowledge Integration and Linked Data Technologies (KILT) 
Competence Center

at the Institute for Applied Informatics (InfAI) at Leipzig University
Executive Director of the DBpedia Association
Projects: http://dbpedia.org, http://nlp2rdf.org, 
http://linguistics.okfn.org, https://www.w3.org/community/ld4lt 


Homepage: http://aksw.org/SebastianHellmann
Research Group: http://aksw.org


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-07 Thread Andy Mabbett
On 7 May 2018 at 00:15, Sylvain Boissel  wrote:

> On Sat, 5 May 2018 at 16:35, Andy Mabbett wrote:

>> On 5 May 2018 at 14:39, David Abián  wrote:
>>
>> > Both Wikidata and DBpedia surely can, and should, coexist because we'll
>> > never be able to host in Wikidata the entirety of the Wikipedias.
>>
>> Can you give an example of something that can be represented in
>> DBpedia, but not Wikidata?

> Sure : DBpedia knows the specific values different versions of Wikipedia
> choose to display in the infobox. For example, the size or population of
> countries with disputed borders. This data is useful for researchers working
> on cultural bias in Wikipedia, but it makes little sense to store it in
> Wikidata.

Except that it does; and Wikidata is more than capable of holding values
from conflicting sources. So again, this does not substantiate the
"Both Wikidata and DBpedia surely can, and should, coexist because
we'll never be able to host in Wikidata the entirety of the
Wikipedias" claim.

-- 
Andy Mabbett
@pigsonthewing
http://pigsonthewing.org.uk



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-07 Thread Sebastian Hellmann

Hi Denny, Maarten,

you should read your own emails. In fact it is quite easy to join the 
LOD cloud diagram.


The most important step is to follow the instructions on the page: 
http://lod-cloud.net under how to contribute and then add the metadata.


Some years ago I set up a WordPress site with Linked Data enabled: 
http://www.klappstuhlclub.de/wp/ Even this is included, as I simply added 
the metadata entry.


Do you really think John McCrae added a line in the code that says "if 
(dataset==wikidata) skip; " ?


You just need to add it like everybody else in LOD; DBpedia also created 
its entry and updates it now and then. The same applies to 
http://lov.okfn.org  Somebody from Wikidata needs to upload the Wikidata 
properties as OWL.  If nobody does it, it will not be in there.


All the best,

Sebastian


On 04.05.2018 18:33, Maarten Dammers wrote:
It almost feels like someone doesn’t want Wikidata in there? Maybe 
that website is maintained by DBpedia fans? Just thinking out loud 
here because DBpedia is very popular in the academic world and 
Wikidata a huge threat for that popularity.


Maarten

On 4 May 2018 at 17:20, Denny Vrandečić wrote:


I'm pretty sure that Wikidata is doing better than 90% of the current 
bubbles in the diagram.


If they wanted to have Wikidata in the diagram it would have been 
there before it was too small to read it. :)


On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider wrote:


Thanks for the corrections.

So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for
Douglas
Adams.  Retrieving from this IRI results in a 303 See Other to
https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I
guess) is the
main IRI for representations of Douglas Adams and other pages with
information about him.

From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
negotiation can be used to get the JSON representation (the
default), other
representations including Turtle, and human-readable
information.  (Well
actually I'm not sure that this is really correct.  It appears
that instead
of directly using content negotiation, another 303 See Other is
used to
provide an IRI for a document in the requested format.)

https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the
useful
machine-readable documents containing the Wikidata information
about Douglas
Adams.  Content negotiation is not possible on these pages.

https://www.wikidata.org/wiki/Q42 is the IRI that produces a
human-readable
version of the information about Douglas Adams.  Content
negotiation is not
possible on this page, but it does have link rel="alternate" to the
machine-readable pages.

Strangely this page has a link rel="canonical" to itself.
Shouldn't that
link be to https://www.wikidata.org/entity/Q42? There is a
human-visible
link to this IRI, but there doesn't appear to be any
machine-readable link.

RDF links to other IRIs for Douglas Adams are given in RDF pages by
properties in the wdtn namespace.  Many, but not all, identifiers are
handled this way.  (Strangely ISNI (P213) isn't even though it is
linked on
the human-readable page.)

So it looks as if Wikidata can be considered as Linked Open Data
but maybe
some improvements can be made.


peter



On 05/01/2018 01:03 AM, Antoine Zimmermann wrote:
> On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:
>> As far as I can tell real IRIs for Wikidata are https URIs. 
The http IRIs
>> redirect to https IRIs.
>
> That's right.
>
>>   As far as I can tell no content negotiation is
>> done.
>
> No, you're mistaken. You tried the URL of a wikipage in your curl command.
> Those are for human consumption, thus not available in Turtle.
>
> The "real IRIs" of Wikidata entities are like this:
> https://www.wikidata.org/entity/Q{NUMBER}
>
> However, they 303 redirect to
> https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}
>
> which is the identifier of a schema:Dataset. Then, if you HTTP GET these
> URIs, you can content negotiate them to JSON
> (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to
> turtle (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).
>
> Surprisingly, there is no connection between the entity IRIs and the wikipage
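
The redirect-and-negotiation behaviour described above is easy to observe
directly. Below is a small Python sketch; it assumes the live www.wikidata.org
service, uses Q42 and a text/turtle Accept header purely as examples, and
follows redirects by hand so that each 303 hop stays visible.

    import requests

    url = "https://www.wikidata.org/entity/Q42"
    headers = {"Accept": "text/turtle"}
    while True:
        r = requests.get(url, headers=headers, allow_redirects=False)
        print(r.status_code, url)
        if r.status_code in (301, 302, 303, 307, 308):
            # follow the redirect manually; Location may be relative
            url = requests.compat.urljoin(url, r.headers["Location"])
        else:
            break
    print("final Content-Type:", r.headers.get("Content-Type"))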

Re: [Wikidata] Wikiata and the LOD cloud

2018-05-06 Thread Sylvain Boissel
On Sat, 5 May 2018 at 16:35, Andy Mabbett wrote:

> On 5 May 2018 at 14:39, David Abián  wrote:
>
> > Both Wikidata and DBpedia surely can, and should, coexist because we'll
> > never be able to host in Wikidata the entirety of the Wikipedias.
>
> Can you give an example of something that can be represented in
> DBpedia, but not Wikidata?
>

Sure: DBpedia knows the specific values different versions of Wikipedia
choose to display in the infobox. For example, the size or population of
countries with disputed borders. This data is useful for researchers
working on cultural bias in Wikipedia, but it makes little sense to store
it in Wikidata.

>
> --
> Andy Mabbett
> @pigsonthewing
> http://pigsonthewing.org.uk
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-06 Thread Ettore RIZZA
@Antonin: You're right, I now remember Magnus Knuth's message on this list
about GlobalFactSync, a
lite version of CrossWikiFact, if I understood correctly. I also remember
that his message did not trigger many reactions...

2018-05-06 10:46 GMT+02:00 Antonin Delpeuch (lists) <
li...@antonin.delpeuch.eu>:

> On 06/05/2018 10:37, Ettore RIZZA wrote:
> > More simply, there's still a long way to go until Wikidata imports
> > all the data contained in Wikipedia infoboxes (or equivalent data
> > from other sources), let alone the rest.
> >
> >
> > This surprises me. Are there any statistics somewhere on the rate of
> > Wikipedia's infoboxes fully parsed ?
>
>
> That was more or less the goal of the CrossWikiFact project, which was
> unfortunately not very widely supported:
> https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/CrossWikiFact
>
> It's still not clear to me why this got so little support - it looked
> like a good opportunity to collaborate with DBpedia.
>
> Antonin
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-06 Thread Antonin Delpeuch (lists)
On 06/05/2018 10:37, Ettore RIZZA wrote:
> More simply, there's still a long way to go until Wikidata imports
> all the data contained in Wikipedia infoboxes (or equivalent data
> from other sources), let alone the rest.
> 
> 
> This surprises me. Are there any statistics somewhere on the rate of
> Wikipedia's infoboxes fully parsed ?


That was more or less the goal of the CrossWikiFact project, which was
unfortunately not very widely supported:
https://meta.wikimedia.org/wiki/Grants:Project/DBpedia/CrossWikiFact

It's still not clear to me why this got so little support - it looked
like a good opportunity to collaborate with DBpedia.

Antonin



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-06 Thread Ettore RIZZA
>
> More simply, there's still a long way to go until Wikidata imports all the
> data contained in Wikipedia infoboxes (or equivalent data from other
> sources), let alone the rest.


This surprises me. Are there any statistics somewhere on the rate of
Wikipedia's infoboxes fully parsed?

2018-05-05 19:04 GMT+02:00 Federico Leva (Nemo) :

> Andy Mabbett, 05/05/2018 17:33:
>
>> Both Wikidata and DBpedia surely can, and should, coexist because we'll
>>> never be able to host in Wikidata the entirety of the Wikipedias.
>>>
>> Can you give an example of something that can be represented in
>> DBpedia, but not Wikidata?
>>
>
> More simply, there's still a long way to go until Wikidata imports all the
> data contained in Wikipedia infoboxes (or equivalent data from other
> sources), let alone the rest.
>
> So, as Gerard mentions, DBpedia has something more/different to offer.
> (The same is true for the various extractions of structured data from
> Wiktionary vs. Wiktionary's own unstructured data.)
>
> That said, the LOD cloud is about links, as far as I understand. Wikidata
> should be very interesting in it.
>
> Federico
>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Gerard Meijssen
Hoi,
Yes, but as it is, the data in Wikipedia, particularly in lists, is off by 6 to
8%. In order for Wikidata to host the data, the data first has to be
curated. Errors can be found on either side. So we have to be able to
compare in order to know what to curate. That takes an effort for us to be
effective in the curation.

I have moved a lot of data from Wikipedia to Wikidata, and I refuse to edit
Wikipedia because my experience is one of more hostility at that end. What
I am looking for is collaboration, not to be by myself on a lonely track
fighting windmills.
Thanks,
  GerardM

On 5 May 2018 at 21:15, Andra Waagmeester  wrote:

> More simply, there's still a long way to go until Wikidata imports all the
>> data contained in Wikipedia infoboxes (or equivalent data from other
>> sources), let alone the rest.
>>
>>
> Why would you want to import all the data contained in Wikipedia info
> boxes? I would rather aim for the opposite, i.e. infoboxes being built from
> data in Wikidata. Simply because with Wikidata it is easier to capture
> where the data is coming from (through the references and qualifiers) than
> with Wikipedia info boxes.
>
>
> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>
>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Andra Waagmeester
>
> More simply, there's still a long way to go until Wikidata imports all the
> data contained in Wikipedia infoboxes (or equivalent data from other
> sources), let alone the rest.
>
>
Why would you want to import all the data contained in Wikipedia info
boxes? I would rather aim for the opposite, i.e. infoboxes being built from
data in Wikidata. Simply because with Wikidata it is easier to capture
where the data is coming from (through the references and qualifiers) than
with Wikipedia info boxes.


___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Federico Leva (Nemo)

Andy Mabbett, 05/05/2018 20:50:

The statement I questioned was "never able"; that's not a matter of "a
long way to go".


I see. I'm not sure about the long run. On the other hand, in the long 
run we're all dead.


Federico



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Gerard Meijssen
Hoi,
And given that DBpedia picks up its changes from the RSS feed for an
increasing number of Wikipedias, and given that we at Wikidata do not even
harvest data based on "category contains" on a regular basis
(just as an example), there is no justifiable room for any sense of
superiority at Wikidata.
Thanks,
  GerardM

On 5 May 2018 at 20:35, David Abián  wrote:

> Wikipedia isn't a read-only interface but an editable project, so there
> will always be contents in Wikipedia that aren't in Wikidata, so DBpedia
> will always have the opportunity to offer contents from Wikipedia that
> aren't in Wikidata. That's all.
>
> On 05/05/18 at 19:52, Andy Mabbett wrote:
> >> I don't mean a technical lack of expressiveness, but the impossibility,
> >> and lack of intention, for Wikipedia to become a read-only interface of
> >> Wikidata someday.
> > Well, neither is DBpedia, so I don't see how that substantiates your
> claim.
>
> --
> David Abián
> Wikimedia España
> https://wikimedia.es/
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread David Abián
Wikipedia isn't a read-only interface but an editable project, so there
will always be contents in Wikipedia that aren't in Wikidata, so DBpedia
will always have the opportunity to offer contents from Wikipedia that
aren't in Wikidata. That's all.

On 05/05/18 at 19:52, Andy Mabbett wrote:
>> I don't mean a technical lack of expressiveness, but the impossibility,
>> and lack of intention, for Wikipedia to become a read-only interface of
>> Wikidata someday.
> Well, neither is DBpedia, so I don't see how that substantiates your claim.

-- 
David Abián
Wikimedia España
https://wikimedia.es/



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Andy Mabbett
On 5 May 2018 at 15:52, David Abián  wrote:

> On 05/05/18 at 16:33, Andy Mabbett wrote:
>> On 5 May 2018 at 14:39, David Abián  wrote:
>>
>>> Both Wikidata and DBpedia surely can, and should, coexist because we'll
>>> never be able to host in Wikidata the entirety of the Wikipedias.
>>
>> Can you give an example of something that can be represented in
>> DBpedia, but not Wikidata?

> I don't mean a technical lack of expressiveness, but the impossibility,
> and lack of intention, for Wikipedia to become a read-only interface of
> Wikidata someday.

Well, neither is DBpedia, so I don't see how that substantiates your claim.

-- 
Andy Mabbett
@pigsonthewing
http://pigsonthewing.org.uk



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Andy Mabbett
On 5 May 2018 at 18:04, Federico Leva (Nemo)  wrote:
> Andy Mabbett, 05/05/2018 17:33:
>>>
>>> Both Wikidata and DBpedia surely can, and should, coexist because we'll
>>> never be able to host in Wikidata the entirety of the Wikipedias.
>>
>> Can you give an example of something that can be represented in
>> DBpedia, but not Wikidata?
>
>
> More simply, there's still a long way to go until Wikidata imports all the
> data contained in Wikipedia infoboxes (or equivalent data from other
> sources), let alone the rest.

The statement I questioned was "never able"; that's not a matter of "a
long way to go".

-- 
Andy Mabbett
@pigsonthewing
http://pigsonthewing.org.uk



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Federico Leva (Nemo)

Andy Mabbett, 05/05/2018 17:33:

Both Wikidata and DBpedia surely can, and should, coexist because we'll
never be able to host in Wikidata the entirety of the Wikipedias.

Can you give an example of something that can be represented in
DBpedia, but not Wikidata?


More simply, there's still a long way to go until Wikidata imports all 
the data contained in Wikipedia infoboxes (or equivalent data from other 
sources), let alone the rest.


So, as Gerard mentions, DBpedia has something more/different to offer. 
(The same is true for the various extractions of structured data from 
Wiktionary vs. Wiktionary's own unstructured data.)


That said, the LOD cloud is about links, as far as I understand. 
Wikidata should be very interesting in it.


Federico



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Ettore RIZZA
>
> The semantics of Wikidata qualifiers have not been defined and won't
> be enforced. It's left up to users to invent their own meanings. (In this
> way, Wikidata is still a lot like the prose in Wikipedia.)
> We need more "curated" projects like DBpedia



Mmh, I would rather have thought that the system of qualifiers, even if
imperfect, was a great enhancement compared to the DBpedia model - which is
a bit of a mess.

Let's take the Winston Churchill item:
Wikidata tells us, for example, that he served as British Prime Minister
from 1951 to 1955, replacing Clement Attlee, and that he was replaced in this
position by Anthony Eden. In DBpedia, which does not use
reification, we have just a list of offices, a list of successors, a list
of predecessors, a list of dates, and no way to figure out who replaced
whom, to what, and when.
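
As a concrete illustration, that reified structure can be read back with a
query against the public query.wikidata.org endpoint. The Python sketch below
assumes Q8016 for Churchill and the usual qualifier properties (P580/P582 for
start and end time, P1365/P1366 for replaces/replaced by); these are example
choices, not the only possible modelling.

    import requests

    # Sketch: Churchill's "position held" (P39) statements with start/end time
    # and replaces/replaced-by qualifiers, i.e. the "who replaced whom, to what,
    # and when" structure that flat lists of offices and successors cannot express.
    query = """
    SELECT ?positionLabel ?start ?end ?replacesLabel ?replacedByLabel WHERE {
      wd:Q8016 p:P39 ?st .
      ?st ps:P39 ?position .
      OPTIONAL { ?st pq:P580 ?start . }
      OPTIONAL { ?st pq:P582 ?end . }
      OPTIONAL { ?st pq:P1365 ?replaces . }
      OPTIONAL { ?st pq:P1366 ?replacedBy . }
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    ORDER BY ?start
    """
    r = requests.get("https://query.wikidata.org/sparql",
                     params={"query": query, "format": "json"},
                     headers={"User-Agent": "churchill-qualifier-sketch/0.1 (example)"})
    for b in r.json()["results"]["bindings"]:
        print({k: v["value"] for k, v in b.items()})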

The handcrafted ontology of DBpedia is certainly more consistent, but it's
also much poorer. Rather than impoverishing Wikidata's class system, would
it not be better to find a way to avoid horrors like "actor is a subclass
of person" . I would be
interested to know if there are researchers working on the subject.

Regarding the relative sizes of DBpedia and Wikidata, I thought that
Wikidata is by nature much larger. DBpedia cannot contain more entities
than there are in the English Wikipedia (about 5 million), with its very
strict notability criteria, while Wikidata allows many more things. Am I
wrong? (I consider, of course, that DBpedia and its other language versions
are different knowledge bases, as is the case in the LOD cloud.)

2018-05-05 16:52 GMT+02:00 David Abián :

> I don't mean a technical lack of expressiveness, but the impossibility,
> and lack of intention, for Wikipedia to become a read-only interface of
> Wikidata someday.
>
>
> On 05/05/18 at 16:33, Andy Mabbett wrote:
> > On 5 May 2018 at 14:39, David Abián  wrote:
> >
> >> Both Wikidata and DBpedia surely can, and should, coexist because we'll
> >> never be able to host in Wikidata the entirety of the Wikipedias.
> >
> > Can you give an example of something that can be represented in
> > DBpedia, but not Wikidata?
> >
>
> --
> David Abián
> Wikimedia España
> https://wikimedia.es/
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Andy Mabbett
On 5 May 2018 at 14:39, David Abián  wrote:

> Both Wikidata and DBpedia surely can, and should, coexist because we'll
> never be able to host in Wikidata the entirety of the Wikipedias.

Can you give an example of something that can be represented in
DBpedia, but not Wikidata?

-- 
Andy Mabbett
@pigsonthewing
http://pigsonthewing.org.uk



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread David Abián
Both Wikidata and DBpedia surely can, and should, coexist because we'll
never be able to host in Wikidata the entirety of the Wikipedias.
However, it's clear that, the more data is in Wikidata and, therefore,
retrieved by the Wikipedias, the less data DBpedia has to extract from
these Wikipedias.

I wouldn't say that Wikidata, or DBpedia, are happy or sad to
collaborate with each other — they're complex abstractions with many
different people involved with their corresponding considerations, and
some of these people working on both projects. However, I know that
Wikidata, as a platform, is collaborative in the highest possible
degree, so not only DBpedia, but absolutely anyone, can collaborate and
build Wikidata as they actually consider.


On 05/05/18 at 15:09, Gerard Meijssen wrote:
> PS There is room enough for both Wikidata and DBpedia and if anything
> DBpedia is quite happy to collaborate, Wikidata is not collaborating;
> everything is Wikidata centred. I get the impression that it is like the
> Borg and its charm is sometimes equivalent.

-- 
David Abián
Wikimedia España
https://wikimedia.es/



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Jeff Thompson
In Wikidata, the subclass hierarchy and the way that properties are used 
are unmanaged and contradictory. Furthermore, Wikidata added statement 
qualifiers, which put the meaning of any statement in doubt. For example, 
there is a property "use" (https://www.wikidata.org/wiki/Property:P366). 
If someone qualifies a statement with "use X", what does that mean? Is 
the statement no longer generally true? Should it be omitted? The 
semantics of Wikidata qualifiers have not been defined and won't be 
enforced. It's left up to users to invent their own meanings. (In this 
way, Wikidata is still a lot like the prose in Wikipedia.)
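
One way to ground that discussion is to look at how the qualifier is actually
used. The Python sketch below (purely exploratory, against the public
query.wikidata.org endpoint, with a LIMIT so it stays cheap) samples statements
that carry a "use" (P366) qualifier; whether those usages look coherent is
exactly the open question.

    import requests

    # Sketch: sample statements qualified with "use" (P366) to inspect what
    # meanings editors attach to it in practice.
    query = """
    SELECT ?st ?useLabel WHERE {
      ?st pq:P366 ?use .
      SERVICE wikibase:label { bd:serviceParam wikibase:language "en". }
    }
    LIMIT 20
    """
    r = requests.get("https://query.wikidata.org/sparql",
                     params={"query": query, "format": "json"},
                     headers={"User-Agent": "p366-qualifier-sketch/0.1 (example)"})
    for b in r.json()["results"]["bindings"]:
        # the statement URI embeds the Q-id of the item the statement belongs to
        print(b["st"]["value"], "->", b["useLabel"]["value"])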


We need more "curated" projects like DBpedia to do the work of 
maintaining a coherent subclass hierarchy and to take a conservative 
approach to statements with qualifiers (omitting most such statements 
unless the qualifier is unambiguous).


- Jeff

On 2018/05/05 15:02, David Abián wrote:

Since the subject has come out, I leave some general impressions, which
aren't necessarily applicable to the people in charge of generating the
LOD cloud.

Many DBpedia-centered researchers are truly reluctant to mention
Wikidata. Some of them don't want people to know that Wikidata exists,
so they continue introducing DBpedia in their talks and papers as the
largest knowledge base that is available out there — which is, indeed,
no longer true. This isn't hate but an attempt to survive, an attempt to
ignore change, to continue working on the same lines of research and
"enjoying" the corresponding, sometimes poor, funding.

It's not a matter of triples. The very ideas of both projects are
different, and this point is what makes DBpedia potentially obsolete.
DBpedia is a non-collaborative project — as we understand collaboration
in the Wikimedia movement — that emerged from academia with the aim of
*extracting* information from Wikipedia. Similarly to Wikipedia, it can
be confusing to talk about DBpedia in the singular because there are
several DBpedias, each one mainly oriented, and limited, to a language,
and not very well interlinked. There's, however, a single multilingual
Wikidata that makes the idea of extracting information from Wikipedia
less meaningful. Most relevant structured data are already centralized
here, in Wikidata, which *provides* them to Wikipedia. Moreover, the
data in Wikidata are referenced... sometimes :), and they are more
fine-grained and better structured than those in DBpedia.

Researchers should have nothing to fear from Wikidata, and some of them,
mainly the young ones, do start to work on our project. In my humble
opinion, we need the help of universities and research centers to fill
some gaps and to produce and apply theory. I think these needs should be
better communicated to researchers and fears should be mitigated. Our
project isn't "that new" today.

Hopefully, Wikidata will appear soon in the LOD cloud... O:)


On 04/05/18 at 18:33, Maarten Dammers wrote:

It almost feels like someone doesn’t want Wikidata in there? Maybe that
website is maintained by DBpedia fans? Just thinking out loud here
because DBpedia is very popular in the academic world and Wikidata a
huge threat for that popularity.

Maarten

On 4 May 2018 at 17:20, Denny Vrandečić wrote:


I'm pretty sure that Wikidata is doing better than 90% of the current
bubbles in the diagram.

If they wanted to have Wikidata in the diagram it would have been
there before it was too small to read it. :)







Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread Gerard Meijssen
Hoi,
Given that DBpedia includes Wikidata and has its own processes to get data
from the Wikipedias, it must be bigger than Wikidata.

PS: There is room enough for both Wikidata and DBpedia, and if anything
DBpedia is quite happy to collaborate; Wikidata is not collaborating -
everything is Wikidata-centred. I get the impression that it is like the
Borg and its charm is sometimes equivalent.
Thanks,
   GerardM

On 5 May 2018 at 15:02, David Abián  wrote:

> Since the subject has come out, I leave some general impressions, which
> aren't necessarily applicable to the people in charge of generating the
> LOD cloud.
>
> Many DBpedia-centered researchers are truly reluctant to mention
> Wikidata. Some of them don't want people to know that Wikidata exists,
> so they continue introducing DBpedia in their talks and papers as the
> largest knowledge base that is available out there — which is, indeed,
> no longer true. This isn't hate but an attempt to survive, an attempt to
> ignore change, to continue working on the same lines of research and
> "enjoying" the corresponding, sometimes poor, funding.
>
> It's not a matter of triples. The very ideas of both projects are
> different, and this point is what makes DBpedia potentially obsolete.
> DBpedia is a non-collaborative project — as we understand collaboration
> in the Wikimedia movement — that emerged from academia with the aim of
> *extracting* information from Wikipedia. Similarly to Wikipedia, it can
> be confusing to talk about DBpedia in the singular because there are
> several DBpedias, each one mainly oriented, and limited, to a language,
> and not very well interlinked. There's, however, a single multilingual
> Wikidata that makes the idea of extracting information from Wikipedia
> less meaningful. Most relevant structured data are already centralized
> here, in Wikidata, which *provides* them to Wikipedia. Moreover, the
> data in Wikidata are referenced... sometimes :), and they are more
> fine-grained and better structured than those in DBpedia.
>
> Researchers should have nothing to fear from Wikidata, and some of them,
> mainly the young ones, do start to work on our project. In my humble
> opinion, we need the help of universities and research centers to fill
> some gaps and to produce and apply theory. I think these needs should be
> better communicated to researchers and fears should be mitigated. Our
> project isn't "that new" today.
>
> Hopefully, Wikidata will appear soon in the LOD cloud... O:)
>
>
> On 04/05/18 at 18:33, Maarten Dammers wrote:
> > It almost feels like someone doesn’t want Wikidata in there? Maybe that
> > website is maintained by DBpedia fans? Just thinking out loud here
> > because DBpedia is very popular in the academic world and Wikidata a
> > huge threat for that popularity.
> >
> > Maarten
> >
> > On 4 May 2018 at 17:20, Denny Vrandečić wrote:
> >
> >> I'm pretty sure that Wikidata is doing better than 90% of the current
> >> bubbles in the diagram.
> >>
> >> If they wanted to have Wikidata in the diagram it would have been
> >> there before it was too small to read it. :)
>
>
> --
> David Abián
> Wikimedia España
> https://wikimedia.es/
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-05 Thread David Abián
Since the subject has come out, I leave some general impressions, which
aren't necessarily applicable to the people in charge of generating the
LOD cloud.

Many DBpedia-centered researchers are truly reluctant to mention
Wikidata. Some of them don't want people to know that Wikidata exists,
so they continue introducing DBpedia in their talks and papers as the
largest knowledge base that is available out there — which is, indeed,
no longer true. This isn't hate but an attempt to survive, an attempt to
ignore change, to continue working on the same lines of research and
"enjoying" the corresponding, sometimes poor, funding.

It's not a matter of triples. The very ideas of both projects are
different, and this point is what makes DBpedia potentially obsolete.
DBpedia is a non-collaborative project — as we understand collaboration
in the Wikimedia movement — that emerged from academia with the aim of
*extracting* information from Wikipedia. Similarly to Wikipedia, it can
be confusing to talk about DBpedia in the singular because there are
several DBpedias, each one mainly oriented, and limited, to a language,
and not very well interlinked. There's, however, a single multilingual
Wikidata that makes the idea of extracting information from Wikipedia
less meaningful. Most relevant structured data are already centralized
here, in Wikidata, which *provides* them to Wikipedia. Moreover, the
data in Wikidata are referenced... sometimes :), and they are more
fine-grained and better structured than those in DBpedia.

Researchers should have nothing to fear from Wikidata, and some of them,
mainly the young ones, do start to work on our project. In my humble
opinion, we need the help of universities and research centers to fill
some gaps and to produce and apply theory. I think these needs should be
better communicated to researchers and fears should be mitigated. Our
project isn't "that new" today.

Hopefully, Wikidata will appear soon in the LOD cloud... O:)


On 04/05/18 at 18:33, Maarten Dammers wrote:
> It almost feels like someone doesn’t want Wikidata in there? Maybe that
> website is maintained by DBpedia fans? Just thinking out loud here
> because DBpedia is very popular in the academic world and Wikidata a
> huge threat for that popularity.
> 
> Maarten
> 
> On 4 May 2018 at 17:20, Denny Vrandečić wrote:
> 
>> I'm pretty sure that Wikidata is doing better than 90% of the current
>> bubbles in the diagram.
>>
>> If they wanted to have Wikidata in the diagram it would have been
>> there before it was too small to read it. :)


-- 
David Abián
Wikimedia España
https://wikimedia.es/



Re: [Wikidata] Wikiata and the LOD cloud

2018-05-04 Thread Daniel Mietchen
They go by the number of triples, of which Wikidata has ca. 5 billion vs.
9.5 million in DBpedia - see https://lod-cloud.net/dataset/dbpedia.
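
For what it's worth, the live figure can be re-checked against the query
service rather than the registry metadata. A rough sketch; a full count like
this may well hit the public endpoint's timeout, so treat it as illustrative
only:

curl -G 'https://query.wikidata.org/sparql' \
     -H 'Accept: application/sparql-results+json' \
     --data-urlencode 'query=SELECT (COUNT(*) AS ?triples) WHERE { ?s ?p ?o }'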

On Sat, May 5, 2018 at 1:45 AM, Fariz Darari  wrote:
> Since DBpedia was mentioned, wondering which is bigger in the LOD Cloud,
> Wikidata or DBpedia?
>
> Wikidata in its current state contains at least 2700 external ID properties,
> which reaffirms Wikidata as a hub of data.
>
> -fariz
>
> On Sat, May 5, 2018, 00:10 Peter F. Patel-Schneider 
> wrote:
>>
>> Yeah, that would be nice.
>>
>>
>> You can zoom in on the image, and search for the labels in it.
>> Unfortunately
>> many of the labels are truncated, e.g., WordNe
>>
>>
>> Clicking on a node gets the raw data backing up the image, but I don't see
>> how
>> to get the processed data.  The data for some of the nodes either doesn't
>> have
>> enough information to determine whether the source actually satisfies the
>> requirements to be in the LOD Cloud (Wordnet,
>> universal-dependencies-treebank-hebrew) or something about the source
>> doesn't
>> work anymore (Freebase).
>>
>>
>> peter
>>
>>
>>
>> On 05/04/2018 09:52 AM, Bruce Whealton wrote:
>> > Is there an easy way to navigate this?  I was wondering if there was a
>> > way
>> > to zoom-in on a certain area and then see connections from that image.
>> > When
>> > I clicked on something I got a JSON view.  I don't know how much coding
>> > it
>> > would take to have something like the Visual Thesaurus where clicking on
>> > links brings that circle into focus with its first degree connections.
>> > Maybe I need a magnifier on my 4k monitor.
>> >
>> > Bruce
>> >
>> > On Mon, Apr 30, 2018 at 3:17 PM, Ettore RIZZA wrote:
>> >
>> > Hi all,
>> >
>> > The new version of the "Linked Open Data Cloud" graph is out ... and
>> > still no Wikidata in it. According to this Twitter discussion, this
>> > would be due to a lack of metadata on Wikidata. No way to fix that
>> > easily? The LOD cloud is cited in many scientific papers; it is not a
>> > simple gadget.
>> >
>> > Cheers,
>> >
>> > Ettore Rizza
>> >
>> >
>> >
>> >
>> >
>> > --
>> > Bruce M Whealton Jr.
>> > My Online Resume:
>> > http://fwwebdev.com/myresume/bruce-whealton-resume-view
>> > I do business as Future Wave Web Development
>> > http://futurewavewebdevelopment.com
>> > Providing Web Development & Design, as well as Programming/Software
>> > Engineering
>> >
>> >


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-04 Thread Fariz Darari
Since DBpedia was mentioned, wondering which is bigger in the LOD Cloud,
Wikidata or DBpedia?

Wikidata in its current state contains at least 2700 external ID
properties, which reaffirms Wikidata as a hub of data.
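
That number is easy to re-derive from the query service; a sketch, relying on
the wikibase: prefix that query.wikidata.org predefines:

curl -G 'https://query.wikidata.org/sparql' \
     -H 'Accept: application/sparql-results+json' \
     --data-urlencode 'query=SELECT (COUNT(DISTINCT ?p) AS ?externalIdProps) WHERE { ?p wikibase:propertyType wikibase:ExternalId }'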

-fariz

On Sat, May 5, 2018, 00:10 Peter F. Patel-Schneider 
wrote:

> Yeah, that would be nice.
>
>
> You can zoom in on the image, and search for the labels in it.
> Unfortunately
> many of the labels are truncated, e.g., WordNe
>
>
> Clicking on a node gets the raw data backing up the image, but I don't see
> how
> to get the processed data.  The data for some of the nodes either doesn't
> have
> enough information to determine whether the source actually satisfies the
> requirements to be in the LOD Cloud (Wordnet,
> universal-dependencies-treebank-hebrew) or something about the source
> doesn't
> work anymore (Freebase).
>
>
> peter
>
>
>
> On 05/04/2018 09:52 AM, Bruce Whealton wrote:
> > Is there an easy way to navigate this?  I was wondering if there was a
> way
> > to zoom-in on a certain area and then see connections from that image.
> When
> > I clicked on something I got a JSON view.  I don't know how much coding
> it
> > would take to have something like the Visual Thesaurus where clicking on
> > links brings that circle into focus with its first degree connections.
> > Maybe I need a magnifier on my 4k monitor.
> >
> > Bruce
> >
> > On Mon, Apr 30, 2018 at 3:17 PM, Ettore RIZZA wrote:
> >
> > Hi all,
> >
> > The new version of the "Linked Open Data Cloud" graph is out ... and
> > still no Wikidata in it. According to this Twitter discussion, this
> > would be due to a lack of metadata on Wikidata. No way to fix that
> > easily? The LOD cloud is cited in many scientific papers; it is not a
> > simple gadget.
> >
> > Cheers,
> >
> > Ettore Rizza
> >
> >
> >
> >
> >
> > --
> > Bruce M Whealton Jr.
> > My Online Resume:
> http://fwwebdev.com/myresume/bruce-whealton-resume-view
> > I do business as Future Wave Web Development
> > http://futurewavewebdevelopment.com
> > Providing Web Development & Design, as well as Programming/Software
> Engineering
> >
> >


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-04 Thread Peter F. Patel-Schneider
Yeah, that would be nice.


You can zoom in on the image, and search for the labels in it.  Unfortunately
many of the labels are truncated, e.g., WordNe


Clicking on a node gets the raw data backing up the image, but I don't see how
to get the processed data.  The data for some of the nodes either doesn't have
enough information to determine whether the source actually satisfies the
requirements to be in the LOD Cloud (Wordnet,
universal-dependencies-treebank-hebrew) or something about the source doesn't
work anymore (Freebase).
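
If the site still publishes the full machine-readable dump behind the diagram
(I am assuming the old lod-cloud.net/lod-data.json location and a JSON object
keyed by dataset identifier), a couple of jq one-liners get at the data
without clicking node by node:

curl -s -o lod-data.json https://lod-cloud.net/lod-data.json
jq 'keys | length' lod-data.json                      # how many datasets are listed
jq 'keys[] | select(test("dbpedia"))' lod-data.json   # which entries mention DBpedia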


peter



On 05/04/2018 09:52 AM, Bruce Whealton wrote:
> Is there an easy way to navigate this?  I was wondering if there was a way
> to zoom-in on a certain area and then see connections from that image.  When
> I clicked on something I got a JSON view.  I don't know how much coding it
> would take to have something like the Visual Thesaurus where clicking on
> links brings that circle into focus with its first degree connections. 
> Maybe I need a magnifier on my 4k monitor.
>
> Bruce 
>
> On Mon, Apr 30, 2018 at 3:17 PM, Ettore RIZZA wrote:
>
> Hi all,
>
> The new version of the "Linked Open Data Cloud" graph is out ... and still
> no Wikidata in it. According to this Twitter discussion, this would be due
> to a lack of metadata on Wikidata. No way to fix that easily? The LOD cloud
> is cited in many scientific papers; it is not a simple gadget.
>
> Cheers,
>
> Ettore Rizza
>
>
>
>
>
> -- 
> Bruce M Whealton Jr.
> My Online Resume: http://fwwebdev.com/myresume/bruce-whealton-resume-view
> I do business as Future Wave Web Development
> http://futurewavewebdevelopment.com
> Providing Web Development & Design, as well as Programming/Software 
> Engineering
>
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-04 Thread Bruce Whealton
Is there an easy way to navigate this?  I was wondering if there was a way
to zoom in on a certain area and then see connections from that image.
When I clicked on something I got a JSON view.  I don't know how much
coding it would take to have something like the Visual Thesaurus where
clicking on links brings that circle into focus with its first degree
connections.  Maybe I need a magnifier on my 4k monitor.

Bruce

On Mon, Apr 30, 2018 at 3:17 PM, Ettore RIZZA  wrote:

> Hi all,
>
> The new version of the "Linked Open Data Cloud" graph is out ... and still
> no Wikidata in it. According to this Twitter discussion, this would be due
> to a lack of metadata on Wikidata. No way to fix that easily? The LOD cloud
> is cited in many scientific papers; it is not a simple gadget.
>
> Cheers,
>
> Ettore Rizza
>
>
>


-- 
Bruce M Whealton Jr.
My Online Resume: http://fwwebdev.com/myresume/bruce-whealton-resume-view
I do business as Future Wave Web Development
http://futurewavewebdevelopment.com
Providing Web Development & Design, as well as Programming/Software
Engineering


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-04 Thread Maarten Dammers
It almost feels like someone doesn’t want Wikidata in there? Maybe that website 
is maintained by DBpedia fans? Just thinking out loud here because DBpedia is 
very popular in the academic world and Wikidata a huge threat to that
popularity.

Maarten

> On 4 May 2018 at 17:20, Denny Vrandečić wrote the following:
> 
> I'm pretty sure that Wikidata is doing better than 90% of the current bubbles 
> in the diagram.
> 
> If they wanted to have Wikidata in the diagram it would have been there 
> before it was too small to read it. :)
> 
>> On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider wrote:
>> Thanks for the corrections.
>> 
>> So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for Douglas
>> Adams.  Retrieving from this IRI results in a 303 See Other to
>> https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess) is the
>> main IRI for representations of Douglas Adams and other pages with
>> information about him.
>> 
>> From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
>> negotiation can be used to get the JSON representation (the default), other
>> representations including Turtle, and human-readable information.  (Well
>> actually I'm not sure that this is really correct.  It appears that instead
>> of directly using content negotiation, another 303 See Other is used to
>> provide an IRI for a document in the requested format.)
>> 
>> https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
>> https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
>> machine-readable documents containing the Wikidata information about Douglas
>> Adams.  Content negotiation is not possible on these pages.
>> 
>> https://www.wikidata.org/wiki/Q42 is the IRI that produces a human-readable
>> version of the information about Douglas Adams.  Content negotiation is not
>> possible on this page, but it does have link rel="alternate" to the
>> machine-readable pages.
>> 
>> Strangely this page has a link rel="canonical" to itself.  Shouldn't that
>> link be to https://www.wikidata.org/entity/Q42?  There is a human-visible
>> link to this IRI, but there doesn't appear to be any machine-readable link.
>> 
>> RDF links to other IRIs for Douglas Adams are given in RDF pages by
>> properties in the wdtn namespace.  Many, but not all, identifiers are
>> handled this way.  (Strangely ISNI (P213) isn't even though it is linked on
>> the human-readable page.)
>> 
>> So it looks as if Wikidata can be considered as Linked Open Data but maybe
>> some improvements can be made.
>> 
>> 
>> peter
>> 
>> 
>> 
>> On 05/01/2018 01:03 AM, Antoine Zimmermann wrote:
>> > On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:
>> >> As far as I can tell real IRIs for Wikidata are https URIs.  The http IRIs
>> >> redirect to https IRIs.
>> >
>> > That's right.
>> >
>> >>   As far as I can tell no content negotiation is
>> >> done.
>> >
>> > No, you're mistaken. You tried the URL of a wikipage in your curl command.
>> > Those are for human consumption, thus not available in turtle.
>> >
>> > The "real IRIs" of Wikidata entities are like this:
>> > https://www.wikidata.org/entity/Q{NUMBER}
>> >
>> > However, they 303 redirect to
>> > https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}
>> >
>> > which is the identifier of a schema:Dataset. Then, if you HTTP GET these
>> > URIs, you can content negotiate them to JSON
>> > (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to
>> > turtle (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).
>> >
>> >
>> > Surprisingly, there is no connection between the entity IRIs and the
>> > wikipage
>> > URLs. If one was given the IRI of an entity from Wikidata, and had no
>> > further information about how Wikidata works, they would not be able to
>> > retrieve HTML content about the entity.
>> >
>> >
>> > BTW, I'm not sure the implementation of content negotiation in Wikidata is
>> > correct because the server does not tell me the format of the resource to
>> > which it redirects (as opposed to what DBpedia does, for instance).
>> >
>> >
>> > --AZ
>> 
>> 


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-04 Thread Denny Vrandečić
I'm pretty sure that Wikidata is doing better than 90% of the current
bubbles in the diagram.

If they wanted to have Wikidata in the diagram it would have been there
before it was too small to read it. :)

On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider <
pfpschnei...@gmail.com> wrote:

> Thanks for the corrections.
>
> So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for Douglas
> Adams.  Retrieving from this IRI results in a 303 See Other to
> https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess) is
> the
> main IRI for representations of Douglas Adams and other pages with
> information about him.
>
> From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
> negotiation can be used to get the JSON representation (the default), other
> representations including Turtle, and human-readable information.  (Well
> actually I'm not sure that this is really correct.  It appears that instead
> of directly using content negotiation, another 303 See Other is used to
> provide an IRI for a document in the requested format.)
>
> https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
> https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
> machine-readable documents containing the Wikidata information about
> Douglas
> Adams.  Content negotiation is not possible on these pages.
>
> https://www.wikidata.org/wiki/Q42 is the IRI that produces a
> human-readable
> version of the information about Douglas Adams.  Content negotiation is not
> possible on this page, but it does have link rel="alternate" to the
> machine-readable pages.
>
> Strangely this page has a link rel="canonical" to itself.  Shouldn't that
> link be to https://www.wikidata.org/entity/Q42?  There is a human-visible
> link to this IRI, but there doesn't appear to be any machine-readable link.
>
> RDF links to other IRIs for Douglas Adams are given in RDF pages by
> properties in the wdtn namespace.  Many, but not all, identifiers are
> handled this way.  (Strangely ISNI (P213) isn't even though it is linked on
> the human-readable page.)
>
> So it looks as if Wikidata can be considered as Linked Open Data but maybe
> some improvements can be made.
>
>
> peter
>
>
>
> On 05/01/2018 01:03 AM, Antoine Zimmermann wrote:
> > On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:
> >> As far as I can tell real IRIs for Wikidata are https URIs.  The http
> IRIs
> >> redirect to https IRIs.
> >
> > That's right.
> >
> >>   As far as I can tell no content negotiation is
> >> done.
> >
> > No, you're mistaken. You tried the URL of a wikipage in your curl
> command.
> > Those are for human consumption, thus not available in turtle.
> >
> > The "real IRIs" of Wikidata entities are like this:
> > https://www.wikidata.org/entity/Q{NUMBER}
> >
> > However, they 303 redirect to
> > https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}
> >
> > which is the identifier of a schema:Dataset. Then, if you HTTP GET these
> > URIs, you can content negotiate them to JSON
> > (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to
> > turtle (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).
> >
> >
> > Surprisingly, there is no connection between the entity IRIs and the
> wikipage
> > URLs. If one was given the IRI of an entity from Wikidata, and had no
> > further information about how Wikidata works, they would not be able to
> > retrieve HTML content about the entity.
> >
> >
> > BTW, I'm not sure the implementation of content negotiation in Wikidata
> is
> > correct because the server does not tell me the format of the resource to
> > which it redirects (as opposed to what DBpedia does, for instance).
> >
> >
> > --AZ
>
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-05-01 Thread Peter F. Patel-Schneider
Thanks for the corrections.

So https://www.wikidata.org/entity/Q42 is *the* Wikidata IRI for Douglas
Adams.  Retrieving from this IRI results in a 303 See Other to
https://www.wikidata.org/wiki/Special:EntityData/Q42, which (I guess) is the
main IRI for representations of Douglas Adams and other pages with
information about him.

From https://www.wikidata.org/wiki/Special:EntityData/Q42 content
negotiation can be used to get the JSON representation (the default), other
representations including Turtle, and human-readable information.  (Well
actually I'm not sure that this is really correct.  It appears that instead
of directly using content negotiation, another 303 See Other is used to
provide an IRI for a document in the requested format.)
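
That guess is easy to check from the command line; a sketch that just follows
the redirects and reports where they end up (the exact hops may of course
change over time):

curl -s -o /dev/null -L -H 'Accept: text/turtle' \
     -w 'final URL: %{url_effective}\nstatus: %{http_code}\n' \
     https://www.wikidata.org/wiki/Special:EntityData/Q42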

https://www.wikidata.org/wiki/Special:EntityData/Q42.json and
https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl are the useful
machine-readable documents containing the Wikidata information about Douglas
Adams.  Content negotiation is not possible on these pages.

https://www.wikidata.org/wiki/Q42 is the IRI that produces a human-readable
version of the information about Douglas Adams.  Content negotiation is not
possible on this page, but it does have link rel="alternate" to the
machine-readable pages.
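
Those alternate links can be pulled out directly; a sketch (the attribute
order in the generated HTML is an assumption, hence the loose pattern):

curl -s https://www.wikidata.org/wiki/Q42 | grep -o '<link[^>]*rel="alternate"[^>]*>'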

Strangely this page has a link rel="canonical" to itself.  Shouldn't that
link be to https://www.wikidata.org/entity/Q42?  There is a human-visible
link to this IRI, but there doesn't appear to be any machine-readable link.

RDF links to other IRIs for Douglas Adams are given in RDF pages by
properties in the wdtn namespace.  Many, but not all, identifiers are
handled this way.  (Strangely ISNI (P213) isn't even though it is linked on
the human-readable page.)
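
A crude way to see which identifiers get the normalized form is to grep the
Turtle export for the wdtn: prefix; a sketch:

curl -sL -o Q42.ttl https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl
grep -o 'wdtn:P[0-9]*' Q42.ttl | sort -u   # properties that do get a normalized IRI
grep -c 'wdtn:P213' Q42.ttl                # 0 if ISNI is indeed missing, as noted above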

So it looks as if Wikidata can be considered as Linked Open Data but maybe
some improvements can be made.


peter



On 05/01/2018 01:03 AM, Antoine Zimmermann wrote:
> On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:
>> As far as I can tell real IRIs for Wikidata are https URIs.  The http IRIs
>> redirect to https IRIs.
>
> That's right.
>
>>   As far as I can tell no content negotiation is
>> done.
>
> No, you're mistaken. You tried the URL of a wikipage in your curl command.
> Those are for human consumption, thus not available in turtle.
>
> The "real IRIs" of Wikidata entities are like this:
> https://www.wikidata.org/entity/Q{NUMBER}
>
> However, they 303 redirect to
> https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}
>
> which is the identifier of a schema:Dataset. Then, if you HTTP GET these
> URIs, you can content negotiate them to JSON
> (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to
> turtle (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).
>
>
> Surprisingly, there is no connection between the entity IRIs and the wikipage
> URLs. If one was given the IRI of an entity from Wikidata, and had no
> further information about how Wikidata works, they would not be able to
> retrieve HTML content about the entity.
>
>
> BTW, I'm not sure the implementation of content negotiation in Wikidata is
> correct because the server does not tell me the format of the resource to
> which it redirects (as opposed to what DBpedia does, for instance).
>
>
> --AZ




Re: [Wikidata] Wikiata and the LOD cloud

2018-05-01 Thread Lucas Werkmeister
On 01.05.2018 10:03, Antoine Zimmermann wrote:
> On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:
>> As far as I can tell real IRIs for Wikidata are https URIs.  The http
>> IRIs
>> redirect to https IRIs.
> 
> That's right.
> 
>>   As far as I can tell no content negotiation is
>> done.
> 
> No, you're mistaken. You tried the URL of a wikipage in your curl
> command. Those are for human consumption, thus not available in turtle.
> 
> The "real IRIs" of Wikidata entities are like this:
> https://www.wikidata.org/entity/Q{NUMBER}
> 
> However, they 303 redirect to
> https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}
> 
> which is the identifier of a schema:Dataset. Then, if you HTTP GET these
> URIs, you can content negotiate them to JSON
> (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to
> turtle (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).
> 
> 
> Surprisingly, there is no connection between the entity IRIs and the
> wikipage URLs. If one was given the IRI of an entity from Wikidata, and
> had no further information about how Wikidata works, they would not be
> able to retrieve HTML content about the entity.

There is a “concept URI” link in the sidebar on the left (between “Page
information” and “Cite this page”), which is a hyperlink to the entity
URI. I also seem to recall a Phabricator task for adding JSON-LD data to
the wiki page, but I can’t find that right now – however, there is a
task to “make export formats more visible”:
https://phabricator.wikimedia.org/T109420

> 
> 
> BTW, I'm not sure the implementation of content negotiation in Wikidata
> is correct because the server does not tell me the format of the
> resource to which it redirects (as opposed to what DBpedia does, for
> instance).

It sends a Content-Type header?
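
Either way, the headers are quick to compare; a sketch that looks only at the
first response (and assumes dbpedia.org still 303-redirects its resource
URIs):

curl -sI -H 'Accept: text/turtle' https://www.wikidata.org/wiki/Special:EntityData/Q42 | grep -iE '^(HTTP|location|content-type)'
curl -sI -H 'Accept: text/turtle' http://dbpedia.org/resource/Douglas_Adams | grep -iE '^(HTTP|location|content-type)'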

Cheers,
Lucas

> 
> 
> --AZ
> 
>>
>> peter
>>
>>
>>
>> idefix merging> curl -I http://www.wikidata.org/wiki/Q5200
>> HTTP/1.1 301 TLS Redirect
>> Date: Tue, 01 May 2018 01:13:09 GMT
>> Server: Varnish
>> X-Varnish: 227838359
>> X-Cache: cp1068 int
>> X-Cache-Status: int-front
>> Set-Cookie:
>> WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
>> Jun 2018 00:00:00 GMT
>> Set-Cookie:
>> WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
>>
>> 02 Jun 2018 00:00:00 GMT
>> X-Client-IP: 199.4.160.88
>> Location: https://www.wikidata.org/wiki/Q5200
>> Content-Length: 0
>> Connection: keep-alive
>>
>>
>> idefix merging> curl -I https://www.wikidata.org/wiki/Q5200
>> HTTP/2 200
>> date: Tue, 01 May 2018 01:14:58 GMT
>> content-type: text/html; charset=UTF-8
>> server: mw1252.eqiad.wmnet
>> x-content-type-options: nosniff
>> p3p: CP="This is not a P3P policy! See
>> https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more
>> info."
>> x-powered-by: HHVM/3.18.6-dev
>> content-language: en
>> link:
>> ;rel=preload;as=image
>> vary: Accept-Encoding,Cookie,Authorization
>> x-ua-compatible: IE=Edge
>> backend-timing: D=75094 t=1525107829593021
>> x-varnish: 754403290 624210434, 194797954 924438274
>> via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
>> age: 29467
>> x-cache: cp1067 hit/8, cp1068 hit/9
>> x-cache-status: hit-front
>> set-cookie: CP=H2; Path=/; secure
>> strict-transport-security: max-age=106384710; includeSubDomains; preload
>> set-cookie:
>> WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
>> Jun 2018 00:00:00 GMT
>> set-cookie:
>> WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
>>
>> 02 Jun 2018 00:00:00 GMT
>> x-analytics: ns=0;page_id=52899665;https=1;nocookies=1
>> x-client-ip: 199.4.160.88
>> cache-control: private, s-maxage=0, max-age=0, must-revalidate
>> set-cookie: GeoIP=US:MA:Woburn:42.49:-71.16:v4; Path=/; secure;
>> Domain=.wikidata.org
>> accept-ranges: bytes
>>
>> idefix merging> curl -I -H "Accept: text/turtle"
>> https://www.wikidata.org/wiki/Q5200
>> HTTP/2 200
>> date: Tue, 01 May 2018 01:15:52 GMT
>> content-type: text/html; charset=UTF-8
>> server: mw1252.eqiad.wmnet
>> x-content-type-options: nosniff
>> p3p: CP="This is not a P3P policy! See
>> https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more
>> info."
>> x-powered-by: HHVM/3.18.6-dev
>> content-language: en
>> link:
>> ;rel=preload;as=image
>> vary: Accept-Encoding,Cookie,Authorization
>> x-ua-compatible: IE=Edge
>> backend-timing: D=75094 t=1525107829593021
>> x-varnish: 754403290 624210434, 160015159 924438274
>> via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
>> age: 29522
>> x-cache: cp1067 hit/8, cp1068 hit/10
>> x-cache-status: hit-front
>> set-cookie: CP=H2; Path=/; secure
>> strict-transport-security: max-age=106384710; includeSubDomains; preload
>> set-cookie:
>> WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
>> Jun 2018 00:00:00 GMT
>> set-cookie:
>> 

Re: [Wikidata] Wikiata and the LOD cloud

2018-05-01 Thread Antoine Zimmermann

On 01/05/2018 10:21, Thomas Pellissier Tanon wrote:

Surprisingly, there is no connection between the entity IRIs and the wikipage
URLs. If one was given the IRI of an entity from Wikidata, and had no further 
information about how Wikidata works, they would not be able to retrieve HTML 
content about the entity.


If you request the "text/html" MIME type, you are going to be redirected to the
HTML content. For example, you could try to go to https://www.wikidata.org/entity/Q42 
using your favorite browser or execute:

curl --header 'Accept: text/html' 
https://www.wikidata.org/wiki/Special:EntityData/Q42 -v


You're right. I thought I tried it, but apparently I made a mistake in 
my command.


--AZ




Cheers,

Thomas


On 1 May 2018 at 10:03, Antoine Zimmermann wrote:

On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:

As far as I can tell real IRIs for Wikidata are https URIs.  The http IRIs
redirect to https IRIs.


That's right.


   As far as I can tell no content negotiation is
done.


No, you're mistaken. You tried the URL of a wikipage in your curl command.
Those are for human consumption, thus not available in turtle.

The "real IRIs" of Wikidata entities are like this: 
https://www.wikidata.org/entity/Q{NUMBER}

However, they 303 redirect to 
https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}

which is the identifier of a schema:Dataset. Then, if you HTTP GET these URIs, 
you can content negotiate them to JSON 
(https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to turtle 
(https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).


Surprisingly, there is no connection between the entity IRIs and the wikipage
URLs. If one was given the IRI of an entity from Wikidata, and had no further 
information about how Wikidata works, they would not be able to retrieve HTML 
content about the entity.


BTW, I'm not sure the implementation of content negotiation in Wikidata is 
correct because the server does not tell me the format of the resource to which 
it redirects (as opposed to what DBpedia does, for instance).


--AZ


peter
idefix merging> curl -I http://www.wikidata.org/wiki/Q5200
HTTP/1.1 301 TLS Redirect
Date: Tue, 01 May 2018 01:13:09 GMT
Server: Varnish
X-Varnish: 227838359
X-Cache: cp1068 int
X-Cache-Status: int-front
Set-Cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
Jun 2018 00:00:00 GMT
Set-Cookie:
WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
02 Jun 2018 00:00:00 GMT
X-Client-IP: 199.4.160.88
Location: https://www.wikidata.org/wiki/Q5200
Content-Length: 0
Connection: keep-alive
idefix merging> curl -I https://www.wikidata.org/wiki/Q5200
HTTP/2 200
date: Tue, 01 May 2018 01:14:58 GMT
content-type: text/html; charset=UTF-8
server: mw1252.eqiad.wmnet
x-content-type-options: nosniff
p3p: CP="This is not a P3P policy! See
https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more info."
x-powered-by: HHVM/3.18.6-dev
content-language: en
link: ;rel=preload;as=image
vary: Accept-Encoding,Cookie,Authorization
x-ua-compatible: IE=Edge
backend-timing: D=75094 t=1525107829593021
x-varnish: 754403290 624210434, 194797954 924438274
via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
age: 29467
x-cache: cp1067 hit/8, cp1068 hit/9
x-cache-status: hit-front
set-cookie: CP=H2; Path=/; secure
strict-transport-security: max-age=106384710; includeSubDomains; preload
set-cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
Jun 2018 00:00:00 GMT
set-cookie:
WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
02 Jun 2018 00:00:00 GMT
x-analytics: ns=0;page_id=52899665;https=1;nocookies=1
x-client-ip: 199.4.160.88
cache-control: private, s-maxage=0, max-age=0, must-revalidate
set-cookie: GeoIP=US:MA:Woburn:42.49:-71.16:v4; Path=/; secure;
Domain=.wikidata.org
accept-ranges: bytes
idefix merging> curl -I -H "Accept: text/turtle"
https://www.wikidata.org/wiki/Q5200
HTTP/2 200
date: Tue, 01 May 2018 01:15:52 GMT
content-type: text/html; charset=UTF-8
server: mw1252.eqiad.wmnet
x-content-type-options: nosniff
p3p: CP="This is not a P3P policy! See
https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more info."
x-powered-by: HHVM/3.18.6-dev
content-language: en
link: ;rel=preload;as=image
vary: Accept-Encoding,Cookie,Authorization
x-ua-compatible: IE=Edge
backend-timing: D=75094 t=1525107829593021
x-varnish: 754403290 624210434, 160015159 924438274
via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
age: 29522
x-cache: cp1067 hit/8, cp1068 hit/10
x-cache-status: hit-front
set-cookie: CP=H2; Path=/; secure
strict-transport-security: max-age=106384710; includeSubDomains; preload
set-cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
Jun 2018 00:00:00 GMT
set-cookie:

Re: [Wikidata] Wikiata and the LOD cloud

2018-05-01 Thread Thomas Pellissier Tanon
> Surprisingly, there is no connection between the entity IRIs and the wikipage
> URLs. If one was given the IRI of an entity from Wikidata, and had no further 
> information about how Wikidata works, they would not be able to retrieve HTML 
> content about the entity.

If you request the "text/html" MIME type, you are going to be redirected to the
HTML content. For example, you could try to go to 
https://www.wikidata.org/entity/Q42 using your favorite browser or execute:

curl --header 'Accept: text/html' 
https://www.wikidata.org/wiki/Special:EntityData/Q42 -v

Cheers,

Thomas

> On 1 May 2018 at 10:03, Antoine Zimmermann wrote:
> 
> On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:
>> As far as I can tell real IRIs for Wikidata are https URIs.  The http IRIs
>> redirect to https IRIs.
> 
> That's right.
> 
>>   As far as I can tell no content negotiation is
>> done.
> 
> No, you're mistaken. You tried the URL of a wikipage in your curl command.
> Those are for human consumption, thus not available in turtle.
> 
> The "real IRIs" of Wikidata entities are like this: 
> https://www.wikidata.org/entity/Q{NUMBER}
> 
> However, they 303 redirect to 
> https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}
> 
> which is the identifier of a schema:Dataset. Then, if you HTTP GET these 
> URIs, you can content negotiate them to JSON 
> (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to 
> turtle (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).
> 
> 
> Surprisingly, there is no connection between the entity IRIs and the wikipage
> URLs. If one was given the IRI of an entity from Wikidata, and had no further 
> information about how Wikidata works, they would not be able to retrieve HTML 
> content about the entity.
> 
> 
> BTW, I'm not sure the implementation of content negotiation in Wikidata is 
> correct because the server does not tell me the format of the resource to 
> which it redirects (as opposed to what DBpedia does, for instance).
> 
> 
> --AZ
> 
>> peter
>> idefix merging> curl -I http://www.wikidata.org/wiki/Q5200
>> HTTP/1.1 301 TLS Redirect
>> Date: Tue, 01 May 2018 01:13:09 GMT
>> Server: Varnish
>> X-Varnish: 227838359
>> X-Cache: cp1068 int
>> X-Cache-Status: int-front
>> Set-Cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 
>> 02
>> Jun 2018 00:00:00 GMT
>> Set-Cookie:
>> WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
>> 02 Jun 2018 00:00:00 GMT
>> X-Client-IP: 199.4.160.88
>> Location: https://www.wikidata.org/wiki/Q5200
>> Content-Length: 0
>> Connection: keep-alive
>> idefix merging> curl -I https://www.wikidata.org/wiki/Q5200
>> HTTP/2 200
>> date: Tue, 01 May 2018 01:14:58 GMT
>> content-type: text/html; charset=UTF-8
>> server: mw1252.eqiad.wmnet
>> x-content-type-options: nosniff
>> p3p: CP="This is not a P3P policy! See
>> https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more info."
>> x-powered-by: HHVM/3.18.6-dev
>> content-language: en
>> link: ;rel=preload;as=image
>> vary: Accept-Encoding,Cookie,Authorization
>> x-ua-compatible: IE=Edge
>> backend-timing: D=75094 t=1525107829593021
>> x-varnish: 754403290 624210434, 194797954 924438274
>> via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
>> age: 29467
>> x-cache: cp1067 hit/8, cp1068 hit/9
>> x-cache-status: hit-front
>> set-cookie: CP=H2; Path=/; secure
>> strict-transport-security: max-age=106384710; includeSubDomains; preload
>> set-cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 
>> 02
>> Jun 2018 00:00:00 GMT
>> set-cookie:
>> WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
>> 02 Jun 2018 00:00:00 GMT
>> x-analytics: ns=0;page_id=52899665;https=1;nocookies=1
>> x-client-ip: 199.4.160.88
>> cache-control: private, s-maxage=0, max-age=0, must-revalidate
>> set-cookie: GeoIP=US:MA:Woburn:42.49:-71.16:v4; Path=/; secure;
>> Domain=.wikidata.org
>> accept-ranges: bytes
>> idefix merging> curl -I -H "Accept: text/turtle"
>> https://www.wikidata.org/wiki/Q5200
>> HTTP/2 200
>> date: Tue, 01 May 2018 01:15:52 GMT
>> content-type: text/html; charset=UTF-8
>> server: mw1252.eqiad.wmnet
>> x-content-type-options: nosniff
>> p3p: CP="This is not a P3P policy! See
>> https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more info."
>> x-powered-by: HHVM/3.18.6-dev
>> content-language: en
>> link: ;rel=preload;as=image
>> vary: Accept-Encoding,Cookie,Authorization
>> x-ua-compatible: IE=Edge
>> backend-timing: D=75094 t=1525107829593021
>> x-varnish: 754403290 624210434, 160015159 924438274
>> via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
>> age: 29522
>> x-cache: cp1067 hit/8, cp1068 hit/10
>> x-cache-status: hit-front
>> set-cookie: CP=H2; Path=/; secure
>> strict-transport-security: max-age=106384710; includeSubDomains; preload
>> set-cookie: 

Re: [Wikidata] Wikiata and the LOD cloud

2018-05-01 Thread Antoine Zimmermann

On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:

As far as I can tell real IRIs for Wikidata are https URIs.  The http IRIs
redirect to https IRIs.


That's right.


  As far as I can tell no content negotiation is
done.


No, you're mistaken. You tried the URL of a wikipage in your curl
command. Those are for human consumption, thus not available in turtle.


The "real IRIs" of Wikidata entities are like this: 
https://www.wikidata.org/entity/Q{NUMBER}


However, they 303 redirect to 
https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}


which is the identifier of a schema:Dataset. Then, if you HTTP GET these 
URIs, you can content negotiate them to JSON 
(https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to 
turtle (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).



Surprisingly, there is no connection between the entity IRIs and the
wikipage URLs. If one was given the IRI of an entity from Wikidata, and 
had no further information about how Wikidata works, they would not be 
able to retrieve HTML content about the entity.



BTW, I'm not sure the implementation of content negotiation in Wikidata 
is correct because the server does not tell me the format of the 
resource to which it redirects (as opposed to what DBpedia does, for 
instance).



--AZ



peter



idefix merging> curl -I http://www.wikidata.org/wiki/Q5200
HTTP/1.1 301 TLS Redirect
Date: Tue, 01 May 2018 01:13:09 GMT
Server: Varnish
X-Varnish: 227838359
X-Cache: cp1068 int
X-Cache-Status: int-front
Set-Cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
Jun 2018 00:00:00 GMT
Set-Cookie:
WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
02 Jun 2018 00:00:00 GMT
X-Client-IP: 199.4.160.88
Location: https://www.wikidata.org/wiki/Q5200
Content-Length: 0
Connection: keep-alive


idefix merging> curl -I https://www.wikidata.org/wiki/Q5200
HTTP/2 200
date: Tue, 01 May 2018 01:14:58 GMT
content-type: text/html; charset=UTF-8
server: mw1252.eqiad.wmnet
x-content-type-options: nosniff
p3p: CP="This is not a P3P policy! See
https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more info."
x-powered-by: HHVM/3.18.6-dev
content-language: en
link: ;rel=preload;as=image
vary: Accept-Encoding,Cookie,Authorization
x-ua-compatible: IE=Edge
backend-timing: D=75094 t=1525107829593021
x-varnish: 754403290 624210434, 194797954 924438274
via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
age: 29467
x-cache: cp1067 hit/8, cp1068 hit/9
x-cache-status: hit-front
set-cookie: CP=H2; Path=/; secure
strict-transport-security: max-age=106384710; includeSubDomains; preload
set-cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
Jun 2018 00:00:00 GMT
set-cookie:
WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
02 Jun 2018 00:00:00 GMT
x-analytics: ns=0;page_id=52899665;https=1;nocookies=1
x-client-ip: 199.4.160.88
cache-control: private, s-maxage=0, max-age=0, must-revalidate
set-cookie: GeoIP=US:MA:Woburn:42.49:-71.16:v4; Path=/; secure;
Domain=.wikidata.org
accept-ranges: bytes

idefix merging> curl -I -H "Accept: text/turtle"
https://www.wikidata.org/wiki/Q5200
HTTP/2 200
date: Tue, 01 May 2018 01:15:52 GMT
content-type: text/html; charset=UTF-8
server: mw1252.eqiad.wmnet
x-content-type-options: nosniff
p3p: CP="This is not a P3P policy! See
https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more info."
x-powered-by: HHVM/3.18.6-dev
content-language: en
link: ;rel=preload;as=image
vary: Accept-Encoding,Cookie,Authorization
x-ua-compatible: IE=Edge
backend-timing: D=75094 t=1525107829593021
x-varnish: 754403290 624210434, 160015159 924438274
via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
age: 29522
x-cache: cp1067 hit/8, cp1068 hit/10
x-cache-status: hit-front
set-cookie: CP=H2; Path=/; secure
strict-transport-security: max-age=106384710; includeSubDomains; preload
set-cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
Jun 2018 00:00:00 GMT
set-cookie:
WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
02 Jun 2018 00:00:00 GMT
x-analytics: ns=0;page_id=52899665;https=1;nocookies=1
x-client-ip: 199.4.160.88
cache-control: private, s-maxage=0, max-age=0, must-revalidate
set-cookie: GeoIP=US:MA:Woburn:42.49:-71.16:v4; Path=/; secure;
Domain=.wikidata.org
accept-ranges: bytes



On 04/30/2018 02:53 PM, Lucas Werkmeister wrote:

The real URI (without scare quotes :) ) is not
https://www.wikidata.org/wiki/Q5200 but
http://www.wikidata.org/entity/Q5200 – and depending on your Accept
header, that will redirect you to the wiki page, JSON dump, or RDF data
(in XML or Turtle formats). Since the LOD Cloud criteria explicitly
mentions content negotiation, I think we’re good :)

Cheers,
Lucas

On 30.04.2018 23:08, Peter F. Patel-Schneider wrote:

Does it?  The point is not just that 

Re: [Wikidata] Wikiata and the LOD cloud

2018-04-30 Thread Peter F. Patel-Schneider
As far as I can tell real IRIs for Wikidata are https URIs.  The http IRIs
redirect to https IRIs.  As far as I can tell no content negotiation is
done.

peter



idefix merging> curl -I http://www.wikidata.org/wiki/Q5200
HTTP/1.1 301 TLS Redirect
Date: Tue, 01 May 2018 01:13:09 GMT
Server: Varnish
X-Varnish: 227838359
X-Cache: cp1068 int
X-Cache-Status: int-front
Set-Cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
Jun 2018 00:00:00 GMT
Set-Cookie:
WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
02 Jun 2018 00:00:00 GMT
X-Client-IP: 199.4.160.88
Location: https://www.wikidata.org/wiki/Q5200
Content-Length: 0
Connection: keep-alive


idefix merging> curl -I https://www.wikidata.org/wiki/Q5200
HTTP/2 200
date: Tue, 01 May 2018 01:14:58 GMT
content-type: text/html; charset=UTF-8
server: mw1252.eqiad.wmnet
x-content-type-options: nosniff
p3p: CP="This is not a P3P policy! See
https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more info."
x-powered-by: HHVM/3.18.6-dev
content-language: en
link: ;rel=preload;as=image
vary: Accept-Encoding,Cookie,Authorization
x-ua-compatible: IE=Edge
backend-timing: D=75094 t=1525107829593021
x-varnish: 754403290 624210434, 194797954 924438274
via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
age: 29467
x-cache: cp1067 hit/8, cp1068 hit/9
x-cache-status: hit-front
set-cookie: CP=H2; Path=/; secure
strict-transport-security: max-age=106384710; includeSubDomains; preload
set-cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
Jun 2018 00:00:00 GMT
set-cookie:
WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
02 Jun 2018 00:00:00 GMT
x-analytics: ns=0;page_id=52899665;https=1;nocookies=1
x-client-ip: 199.4.160.88
cache-control: private, s-maxage=0, max-age=0, must-revalidate
set-cookie: GeoIP=US:MA:Woburn:42.49:-71.16:v4; Path=/; secure;
Domain=.wikidata.org
accept-ranges: bytes

idefix merging> curl -I -H "Accept: text/turtle"
https://www.wikidata.org/wiki/Q5200
HTTP/2 200
date: Tue, 01 May 2018 01:15:52 GMT
content-type: text/html; charset=UTF-8
server: mw1252.eqiad.wmnet
x-content-type-options: nosniff
p3p: CP="This is not a P3P policy! See
https://www.wikidata.org/wiki/Special:CentralAutoLogin/P3P for more info."
x-powered-by: HHVM/3.18.6-dev
content-language: en
link: ;rel=preload;as=image
vary: Accept-Encoding,Cookie,Authorization
x-ua-compatible: IE=Edge
backend-timing: D=75094 t=1525107829593021
x-varnish: 754403290 624210434, 160015159 924438274
via: 1.1 varnish (Varnish/5.1), 1.1 varnish (Varnish/5.1)
age: 29522
x-cache: cp1067 hit/8, cp1068 hit/10
x-cache-status: hit-front
set-cookie: CP=H2; Path=/; secure
strict-transport-security: max-age=106384710; includeSubDomains; preload
set-cookie: WMF-Last-Access=01-May-2018;Path=/;HttpOnly;secure;Expires=Sat, 02
Jun 2018 00:00:00 GMT
set-cookie:
WMF-Last-Access-Global=01-May-2018;Path=/;Domain=.wikidata.org;HttpOnly;secure;Expires=Sat,
02 Jun 2018 00:00:00 GMT
x-analytics: ns=0;page_id=52899665;https=1;nocookies=1
x-client-ip: 199.4.160.88
cache-control: private, s-maxage=0, max-age=0, must-revalidate
set-cookie: GeoIP=US:MA:Woburn:42.49:-71.16:v4; Path=/; secure;
Domain=.wikidata.org
accept-ranges: bytes



On 04/30/2018 02:53 PM, Lucas Werkmeister wrote:
> The real URI (without scare quotes :) ) is not
> https://www.wikidata.org/wiki/Q5200 but
> http://www.wikidata.org/entity/Q5200 – and depending on your Accept
> header, that will redirect you to the wiki page, JSON dump, or RDF data
> (in XML or Turtle formats). Since the LOD Cloud criteria explicitly
> mentions content negotiation, I think we’re good :)
>
> Cheers,
> Lucas
>
> On 30.04.2018 23:08, Peter F. Patel-Schneider wrote:
>> Does it?  The point is not just that Wikidata has real pointers to external
>> resources.  
>>
>>
>> Wikidata needs to serve RDF (e.g., in Turtle) in an accepted fashion.  Is
>> having https://www.wikidata.org/wiki/Special:EntityData/Q5200.ttl
>> available and linked to with an alternate link count when the "real" URI is
>> https://www.wikidata.org/wiki/Q5200?  I don't know enough about this
>> corner of web standards to know.
>>
>>
>> peter
>>
>>
>>
>>
>>
>>
>> On 04/30/2018 01:45 PM, Federico Leva (Nemo) wrote:
>>> Peter F. Patel-Schneider, 30/04/2018 23:32:
>>>> Does the way that Wikidata serves RDF
>>>> (https://www.wikidata.org/wiki/Special:EntityData/Q5200.rdf) satisfy this
>>>> requirement?
>>> I think that part was already settled with:
>>> https://lists.wikimedia.org/pipermail/wikidata/2017-October/011314.html
>>>
>>> More information:
>>> https://phabricator.wikimedia.org/T85444
>>>
>>> Federico
>>

Re: [Wikidata] Wikiata and the LOD cloud

2018-04-30 Thread Fariz Darari
Really looking forward to finally seeing WD in the LOD Cloud!

-fariz

On Tue, May 1, 2018, 04:53 Lucas Werkmeister 
wrote:

> The real URI (without scare quotes :) ) is not
> https://www.wikidata.org/wiki/Q5200 but
> http://www.wikidata.org/entity/Q5200 – and depending on your Accept
> header, that will redirect you to the wiki page, JSON dump, or RDF data
> (in XML or Turtle formats). Since the LOD Cloud criteria explicitly
> mentions content negotiation, I think we’re good :)
>
> Cheers,
> Lucas
>
> On 30.04.2018 23:08, Peter F. Patel-Schneider wrote:
> > Does it?  The point is not just that Wikidata has real pointers to
> external
> > resources.
> >
> >
> > Wikidata needs to serve RDF (e.g., in Turtle) in an accepted fashion.  Is
> > having https://www.wikidata.org/wiki/Special:EntityData/Q5200.ttl
> > available and linked to with an alternate link count when the "real" URI
> is
> > https://www.wikidata.org/wiki/Q5200?  I don't know enough about this
> > corner of web standards to know.
> >
> >
> > peter
> >
> >
> >
> >
> >
> >
> > On 04/30/2018 01:45 PM, Federico Leva (Nemo) wrote:
> >> Peter F. Patel-Schneider, 30/04/2018 23:32:
> >>> Does the way that Wikidata serves RDF
> >>> (https://www.wikidata.org/wiki/Special:EntityData/Q5200.rdf)
> satisfy this
> >>> requirement?
> >>
> >> I think that part was already settled with:
> >> https://lists.wikimedia.org/pipermail/wikidata/2017-October/011314.html
> >>
> >> More information:
> >> https://phabricator.wikimedia.org/T85444
> >>
> >> Federico
> >
> >


Re: [Wikidata] Wikiata and the LOD cloud

2018-04-30 Thread Lucas Werkmeister
The real URI (without scare quotes :) ) is not
https://www.wikidata.org/wiki/Q5200 but
http://www.wikidata.org/entity/Q5200 – and depending on your Accept
header, that will redirect you to the wiki page, JSON dump, or RDF data
(in XML or Turtle formats). Since the LOD Cloud criteria explicitly
mentions content negotiation, I think we’re good :)
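
A minimal sketch of that behavior, looping over a few Accept values and
printing only the redirect targets (https is used here to skip the TLS
redirect; the exact Location values are whatever the server currently sends):

for accept in 'text/html' 'application/json' 'text/turtle'; do
  echo "== Accept: $accept"
  curl -sIL -H "Accept: $accept" https://www.wikidata.org/entity/Q5200 | grep -i '^location'
done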

Cheers,
Lucas

On 30.04.2018 23:08, Peter F. Patel-Schneider wrote:
> Does it?  The point is not just that Wikidata has real pointers to external
> resources.  
> 
> 
> Wikidata needs to serve RDF (e.g., in Turtle) in an accepted fashion.  Is
> having https://www.wikidata.org/wiki/Special:EntityData/Q5200.ttl
> available and linked to with an alternate link count when the "real" URI is
> https://www.wikidata.org/wiki/Q5200?  I don't know enough about this
> corner of web standards to know.
> 
> 
> peter
> 
> 
> 
> 
> 
> 
> On 04/30/2018 01:45 PM, Federico Leva (Nemo) wrote:
>> Peter F. Patel-Schneider, 30/04/2018 23:32:
>>> Does the way that Wikidata serves RDF
>>> (https://www.wikidata.org/wiki/Special:EntityData/Q5200.rdf) satisfy 
>>> this
>>> requirement?
>>
>> I think that part was already settled with:
>> https://lists.wikimedia.org/pipermail/wikidata/2017-October/011314.html
>>
>> More information:
>> https://phabricator.wikimedia.org/T85444
>>
>> Federico
> 
> 


Re: [Wikidata] Wikiata and the LOD cloud

2018-04-30 Thread Federico Leva (Nemo)

Peter F. Patel-Schneider, 30/04/2018 23:32:

Does the way that Wikidata serves RDF
(https://www.wikidata.org/wiki/Special:EntityData/Q5200.rdf) satisfy this
requirement?


I think that part was already settled with:
https://lists.wikimedia.org/pipermail/wikidata/2017-October/011314.html

More information:
https://phabricator.wikimedia.org/T85444

Federico



Re: [Wikidata] Wikiata and the LOD cloud

2018-04-30 Thread Peter F. Patel-Schneider
Yes, it would be nice to have Wikidata there, provided that Wikidata satisfies
the requirements.  There are already several mentions of Wikidata in the data
behind the diagram.


I don't think that Freebase satisfies the stated requirement because its URIs
no longer "resolve, with or without content negotiation, to /RDF data/ in one
of the popular RDF formats (RDFa, RDF/XML, Turtle, N-Triples)".  I wonder why
Freebase is still in the diagram.
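
That is easy to spot-check; a sketch with a placeholder MID (substitute any
real one), the point being simply that whatever comes back is not RDF:

MID='m.0xxxx'   # placeholder Freebase MID, not a real identifier
curl -sIL -H 'Accept: text/turtle' "http://rdf.freebase.com/ns/$MID" | grep -iE '^(HTTP|content-type|location)'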


Does the way that Wikidata serves RDF
(https://www.wikidata.org/wiki/Special:EntityData/Q5200.rdf) satisfy this
requirement?  (If it doesn't, it might be easy to change.)


peter



On 04/30/2018 12:17 PM, Ettore RIZZA wrote:
> Hi all,
>
> The new version of the "Linked Open Data Cloud" graph is out ... and still
> no Wikidata in it. According to this Twitter discussion, this would be due
> to a lack of metadata on Wikidata. No way to fix that easily? The LOD cloud
> is cited in many scientific papers; it is not a simple gadget.
>
> Cheers,
>
> Ettore Rizza
>
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-04-30 Thread Ettore RIZZA
>
> The suspense is killing me :D


Me too! :D

Thanks Lydia, and Lucas of course; looking forward to seeing a big Wikidata
bubble in the middle of this cloud.

Cheers,

Ettore Rizza

2018-04-30 22:13 GMT+02:00 Lydia Pintscher :

> On Mon, Apr 30, 2018 at 9:19 PM Ettore RIZZA 
> wrote:
>
> > Hi all,
>
> > The new version of the "Linked Open Data Cloud" graph is out ... and
> still no Wikidata in it. According to this Twitter discussion, this would
> be due to a lack of metadata on Wikidata. No way to fix that easily? The
> LOD cloud is cited in many scientific papers; it is not a simple gadget.
>
> When I last talked to them about getting Wikidata included it wasn't
> possible because the website handling the datasets was changed and no
> longer worked for it. Seems they've changed that now. Lucas is in touch to
> figure out what's needed now. Let's hope we can finally get this solved now
> and see where Wikidata ends up in the cloud. The suspense is killing me :D
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Product Manager for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Registered in the register of associations of the Amtsgericht
> Berlin-Charlottenburg under number 23855 Nz. Recognized as a charitable
> organization by the Finanzamt für Körperschaften I Berlin, tax number
> 27/029/42207.
>


Re: [Wikidata] Wikiata and the LOD cloud

2018-04-30 Thread Lydia Pintscher
On Mon, Apr 30, 2018 at 9:19 PM Ettore RIZZA  wrote:

> Hi all,

> The new version of the "Linked Open Data Cloud" graph is out ... and
still no Wikidata in it. According to this Twitter discussion, this would
be due to a lack of metadata on Wikidata. No way to fix that easily? The
LOD cloud is cited in many scientific papers; it is not a simple gadget.

When I last talked to them about getting Wikidata included it wasn't
possible because the website handling the datasets was changed and no
longer worked for it. Seems they've changed that now. Lucas is in touch to
figure out what's needed now. Let's hope we can finally get this solved now
and see where Wikidata ends up in the cloud. The suspense is killing me :D


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 Nz. Recognized as a charitable
organization by the Finanzamt für Körperschaften I Berlin, tax number
27/029/42207.



[Wikidata] Wikiata and the LOD cloud

2018-04-30 Thread Ettore RIZZA
Hi all,

The new version of the "Linked Open Data Cloud" graph is out ... and still no
Wikidata in it. According to this Twitter discussion, this would be due to a
lack of metadata on Wikidata. No way to fix that easily? The LOD cloud is
cited in many scientific papers; it is not a simple gadget.

Cheers,

Ettore Rizza