Hi Lucas, Denny,
all you need to do is update your entry on old.datahub.io:
https://old.datahub.io/dataset/wikidata
It was edited by Lucie-Aimée Kaffee two years ago. You need to contact
her, as she created the Wikimedia org in Datahub. I might be able to
have someone switch ownership of the org to a new account.
But a lot of essential metadata is missing:
Compare with the DBpedia entry: https://old.datahub.io/dataset/dbpedia
especially the links and the triple count at the bottom. You need to
keep this entry updated in order to appear in the LOD cloud.
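For illustration only, this is roughly the kind of entry the harvester reads. The field names below are assumptions modelled on the DBpedia entry, not an authoritative schema; the counts are placeholders.

    # Hypothetical sketch of a datahub entry for Wikidata; field names are
    # assumptions based on the DBpedia entry, not a fixed schema.
    wikidata_entry = {
        "name": "wikidata",
        "title": "Wikidata",
        "license": "CC0-1.0",
        "resources": [
            {"format": "api/sparql",
             "url": "https://query.wikidata.org/sparql"},
            {"format": "RDF dump",
             "url": "https://dumps.wikimedia.org/wikidatawiki/entities/"},
        ],
        "extras": {
            "triples": "0",        # placeholder; fill in the real triple count
            "links:dbpedia": "0",  # placeholder; link counts to datasets in the cloud
        },
    }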
Please tell me if you can't edit it. I know a former admin from the time
datahub.io was first created ten years ago in the LOD2 and LATC EU projects;
he might be able to do something in case nobody answers because
datahub.io has moved to a new system.
All the best,
Sebastian
On 07.05.2018 22:35, Lucas Werkmeister wrote:
Folks, I’m already in contact with John, there’s no need to contact
him again :)
Cheers, Lucas
On Mon, 7 May 2018 at 19:32, Denny Vrandečić <[email protected]> wrote:
Well, then, we have tried several times to get into that diagram,
and it never worked out.
So, the page you linked says:
Contributing to the Diagram
First, make sure that you publish data according to the Linked
Data principles <http://www.w3.org/DesignIssues/LinkedData.html>.
We interpret this as:
* There must be /resolvable http:// (or https://) URIs/.
* They must resolve, with or without content negotiation, to
/RDF data/ in one of the popular RDF formats (RDFa, RDF/XML,
Turtle, N-Triples).
* The dataset must contain /at least 1000 triples/. (Hence, your
FOAF file most likely does not qualify.)
* The dataset must be connected via /RDF links/ to a dataset
that is already in the diagram. This means, either your
dataset must use URIs from the other dataset, or vice versa.
We arbitrarily require at least 50 links.
* Access of the /entire/ dataset must be possible via /RDF
crawling/, via an /RDF dump/, or via a /SPARQL endpoint/.
The process for adding datasets is still under development; please
contact John P. McCrae <[email protected]> to add a new dataset.
Wikidata fulfills all the conditions easily. So, here we go, I am
adding John to this thread - although I know he already knows
about this request - and I am asking officially to enter Wikidata
into the LOD diagram.
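For instance, the "at least 50 RDF links" criterion can be spot-checked against the public Wikidata SPARQL endpoint. A minimal Python sketch; the choice of VIAF via wdtn:P214 is just one illustrative target dataset, not the only possible one:

    import requests

    # Spot-check the ">= 50 RDF links" criterion against the public Wikidata
    # SPARQL endpoint. wdtn:P214 holds normalized VIAF IRIs; VIAF is already
    # in the diagram (the choice of target dataset is illustrative).
    ENDPOINT = "https://query.wikidata.org/sparql"
    QUERY = """
    PREFIX wdtn: <http://www.wikidata.org/prop/direct-normalized/>
    SELECT ?item ?viaf WHERE { ?item wdtn:P214 ?viaf } LIMIT 100
    """

    resp = requests.get(ENDPOINT,
                        params={"query": QUERY},
                        headers={"Accept": "application/sparql-results+json"})
    resp.raise_for_status()
    bindings = resp.json()["results"]["bindings"]
    print(f"sampled {len(bindings)} outgoing links to viaf.org;"
          f" >= 50 links: {len(bindings) >= 50}")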
Let's keep it all open, and see where it goes from here.
Cheers,
Denny
On Mon, May 7, 2018 at 4:15 AM Sebastian Hellmann
<[email protected]> wrote:
Hi Denny, Maarten,
you should read your own emails. In fact it is quite easy to
join the LOD cloud diagram.
The most important step is to follow the instructions on the page
http://lod-cloud.net under "how to contribute" and then
add the metadata.
Some years ago I set up a WordPress site with Linked Data enabled:
http://www.klappstuhlclub.de/wp/ Even this is included, because I
simply added the metadata entry.
Do you really think John McCrae added a line in the code that
says "if (dataset==wikidata) skip; " ?
You just need to add it like everybody else in the LOD cloud; DBpedia
also created its entry and updates it now and then. The same
goes for http://lov.okfn.org: somebody from Wikidata needs
to upload the Wikidata properties as OWL. If nobody does it,
it will not be in there.
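As a rough illustration of what "the Wikidata properties as OWL" could look like, here is a minimal rdflib sketch for a single property. The modelling choices (owl:ObjectProperty, the English label and comment) are my assumptions, not LOV's required format:

    from rdflib import Graph, Literal, Namespace
    from rdflib.namespace import OWL, RDF, RDFS

    # Sketch: describe one Wikidata property (P31, "instance of") as OWL,
    # the kind of vocabulary file that could be uploaded to LOV.
    WD = Namespace("http://www.wikidata.org/entity/")

    g = Graph()
    g.bind("owl", OWL)
    g.bind("wd", WD)
    g.add((WD.P31, RDF.type, OWL.ObjectProperty))
    g.add((WD.P31, RDFS.label, Literal("instance of", lang="en")))
    g.add((WD.P31, RDFS.comment,
           Literal("The class of which this subject is a particular example "
                   "and member.", lang="en")))

    print(g.serialize(format="turtle"))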
All the best,
Sebastian
On 04.05.2018 18:33, Maarten Dammers wrote:
It almost feels like someone doesn’t want Wikidata in there?
Maybe that website is maintained by DBpedia fans? Just
thinking out loud here because DBpedia is very popular in the
academic world and Wikidata is a huge threat to that popularity.
Maarten
On 4 May 2018 at 17:20, Denny Vrandečić
<[email protected]> wrote:
I'm pretty sure that Wikidata is doing better than 90% of
the current bubbles in the diagram.
If they wanted to have Wikidata in the diagram, it would have
been there before the diagram became too small to read. :)
On Tue, May 1, 2018 at 7:47 AM Peter F. Patel-Schneider
<[email protected]> wrote:
Thanks for the corrections.
So https://www.wikidata.org/entity/Q42 is *the* Wikidata
IRI for Douglas
Adams. Retrieving from this IRI results in a 303 See
Other to
https://www.wikidata.org/wiki/Special:EntityData/Q42,
which (I guess) is the
main IRI for representations of Douglas Adams and other
pages with
information about him.
From
https://www.wikidata.org/wiki/Special:EntityData/Q42 content
negotiation can be used to get the JSON representation
(the default), other
representations including Turtle, and human-readable
information. (Well
actually I'm not sure that this is really correct. It
appears that instead
of directly using content negotiation, another 303 See
Other is used to
provide an IRI for a document in the requested format.)
https://www.wikidata.org/wiki/Special:EntityData/Q42.json
and
https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl
are the useful
machine-readable documents containing the Wikidata
information about Douglas
Adams. Content negotiation is not possible on these pages.
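That redirect chain can be made visible with a small Python sketch using the requests library; the comments describe what I would expect to see, not guaranteed output:

    import requests

    # Ask for Turtle at the entity IRI and watch the 303 redirect chain.
    resp = requests.get("https://www.wikidata.org/entity/Q42",
                        headers={"Accept": "text/turtle"})
    for hop in resp.history:
        print(hop.status_code, hop.url)        # the 303 hops
    print(resp.status_code, resp.url)          # final document, e.g. .../Special:EntityData/Q42.ttl
    print(resp.headers.get("Content-Type"))    # expected to be a Turtle media type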
https://www.wikidata.org/wiki/Q42 is the IRI that
produces a human-readable
version of the information about Douglas Adams. Content
negotiation is not
possible on this page, but it does have link
rel="alternate" to the
machine-readable pages.
Strangely this page has a link rel="canonical" to
itself. Shouldn't that
link be to https://www.wikidata.org/entity/Q42? There is
a human-visible
link to this IRI, but there doesn't appear to be any
machine-readable link.
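A quick way to inspect those link elements, as a sketch using requests plus the Python standard-library HTML parser:

    import requests
    from html.parser import HTMLParser

    # Collect the <link rel=...> elements of the human-readable page to see
    # the rel="alternate" (machine-readable formats) and rel="canonical" targets.
    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "link":
                self.links.append(dict(attrs))

    html = requests.get("https://www.wikidata.org/wiki/Q42").text
    collector = LinkCollector()
    collector.feed(html)
    for link in collector.links:
        if link.get("rel") in ("alternate", "canonical"):
            print(link.get("rel"), link.get("type", ""), link.get("href"))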
RDF links to other IRIs for Douglas Adams are given in
RDF pages by
properties in the wdtn namespace. Many, but not all,
identifiers are
handled this way. (Strangely ISNI (P213) isn't, even
though it is linked on
the human-readable page.)
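For illustration, a small rdflib sketch that lists the wdtn links present in the Q42 Turtle document:

    from rdflib import Graph

    # Parse the Turtle document for Q42 and list the normalized
    # external-identifier links published under the wdtn namespace.
    WDTN = "http://www.wikidata.org/prop/direct-normalized/"

    g = Graph()
    g.parse("https://www.wikidata.org/wiki/Special:EntityData/Q42.ttl",
            format="turtle")
    for s, p, o in g:
        if str(p).startswith(WDTN):
            print(p, "->", o)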
So it looks as if Wikidata can be considered as Linked
Open Data but maybe
some improvements can be made.
peter
On 05/01/2018 01:03 AM, Antoine Zimmermann wrote:
> On 01/05/2018 03:25, Peter F. Patel-Schneider wrote:
>> As far as I can tell real IRIs for Wikidata are https
URIs. The http IRIs
>> redirect to https IRIs.
>
> That's right.
>
>> As far as I can tell no content negotiation is
>> done.
>
> No, you're mistaken. You tried the URL of a wikipage
in your curl command.
> Those are for human consumption, thus not available in
turtle.
>
> The "real IRIs" of Wikidata entities are like this:
> https://www.wikidata.org/entity/Q{NUMBER}
<https://www.wikidata.org/entity/Q%7BNUMBER%7D>
>
> However, they 303 redirect to
>
https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}
<https://www.wikidata.org/wiki/Special:EntityData/Q%7BNUMBER%7D>
>
> which is the identifier of a schema:Dataset. Then, if you HTTP GET these
> URIs, you can content negotiate them to JSON
> (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.json) or to
> turtle (https://www.wikidata.org/wiki/Special:EntityData/Q{NUMBER}.ttl).
>
>
> Surprisingly, there is no connection between the entity
IRIs and the wikipage
> URLs. If one was given the IRI of an entity from
Wikidata, and had no
> further information about how Wikidata works, they
would not be able to
> retrieve HTML content about the entity.
>
>
> BTW, I'm not sure the implementation of content
negotiation in Wikidata is
> correct because the server does not tell me the format
of the resource to
> which it redirects (as opposed to what DBpedia does,
for instance).
>
>
> --AZ
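To illustrate Antoine's last point, a small Python sketch that looks only at the first response without following the redirect; as far as I can tell, the 303 carries a Location header but no hint about the format of the target document:

    import requests

    # Look only at the first response (do not follow the redirect) to see
    # what the 303 itself announces about the resource it points to.
    resp = requests.get("https://www.wikidata.org/entity/Q42",
                        headers={"Accept": "text/turtle"},
                        allow_redirects=False)
    print(resp.status_code)                  # expected: 303
    print(resp.headers.get("Location"))      # where we are being redirected
    print(resp.headers.get("Content-Type"))  # nothing here states the target's format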
--
Lucas Werkmeister
Software Developer (Intern)
Wikimedia Deutschland e. V. | Tempelhofer Ufer 23-24 | 10963 Berlin
Phone: +49 (0)30 219 158 26-0
https://wikimedia.de
Imagine a world in which every single human being can freely share in
the sum of all knowledge. That's our commitment.
Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
Registered in the register of associations of the Amtsgericht
Berlin-Charlottenburg under number 23855 B. Recognized as charitable by
the Finanzamt für Körperschaften I Berlin, tax number 27/029/42207.
--
All the best,
Sebastian Hellmann
Director of Knowledge Integration and Linked Data Technologies (KILT)
Competence Center
at the Institute for Applied Informatics (InfAI) at Leipzig University
Executive Director of the DBpedia Association
Projects: http://dbpedia.org, http://nlp2rdf.org,
http://linguistics.okfn.org, https://www.w3.org/community/ld4lt
Homepage: http://aksw.org/SebastianHellmann
Research Group: http://aksw.org
_______________________________________________
Wikidata mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikidata