Re: [Wikidata-l] Super Lachaise, a mobile app based on Wikidata

2014-10-29 Thread Finn Årup Nielsen


Dear Pierre-Yves,

On 10/28/2014 05:41 PM, Pierre-Yves Beaudouin wrote:


I don't know because I'm not the developer of the app and my knowledge
in this area is limited. For many years now, I have been collecting data
(information, photos, coordinates) about the cemetery. I've published
everything on Commons, Wikidata and OSM, so developers can do something
smart with it ;)


How do you get the geocoordinates for the individual graves? Looking at
http://www.superlachaise.fr/ I see Guillaume Apollinaire. His Wikidata item 
https://www.wikidata.org/wiki/Q133855 has no geodata, and neither the 
cemetery link nor Findagrave seems to have geodata either.
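For what it's worth, checking an item for geodata can be done programmatically. Here is a small Python sketch (just an illustration, not part of the app) against the entity JSON that Special:EntityData serves; P625 is the coordinate-location property:

```python
import json
import urllib.request

WIKIDATA_ENTITY_URL = "https://www.wikidata.org/wiki/Special:EntityData/{}.json"

def fetch_entity(item_id):
    """Fetch the entity JSON for one item via Special:EntityData."""
    with urllib.request.urlopen(WIKIDATA_ENTITY_URL.format(item_id)) as resp:
        return json.load(resp)["entities"][item_id]

def get_coordinates(entity):
    """Return (lat, lon) from the item's first coordinate location (P625)
    claim, or None if the item carries no geodata."""
    for claim in entity.get("claims", {}).get("P625", []):
        value = claim.get("mainsnak", {}).get("datavalue", {}).get("value")
        if value is not None:
            return value["latitude"], value["longitude"]
    return None

# e.g. get_coordinates(fetch_entity("Q133855")) -- returns None as long
# as the item has no P625 statement.
```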



- Finn Årup Nielsen

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] Wikidata turns two!

2014-10-29 Thread Lydia Pintscher
Hey folks :)

Today Wikidata is turning two. It amazes me what we've achieved in
just two years. We've built an incredible project that sets out to
change the world. Thank you to everyone who has been a part of this so
far.
We've put together some notes and opinions. And there are presents as
well! Check them out and leave your birthday wishes:
https://www.wikidata.org/wiki/Wikidata:Second_Birthday


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


[Wikidata-l] Birthday gift: Missing Wikipedia links (was Re: Wikidata turns two!)

2014-10-29 Thread Denny Vrandečić
Folks,

as you know, many Googlers are huge fans of Wikipedia. So here’s a little
gift for Wikidata’s second birthday.

Some of my smart colleagues at Google have run a few heuristics and
algorithms to discover Wikipedia articles in different languages that
cover the same topic but are missing language links between them.
The results contain more than 35,000 missing links with high confidence
according to these algorithms. We estimate a precision above 92% (i.e.,
based on our evaluation, we expect fewer than 8% of them to be wrong).
The dataset covers 60 Wikipedia language editions.

Here are the missing links, available for download from the WMF labs
servers:

https://tools.wmflabs.org/yichengtry/merge_candidate.20141028.csv

The data is published under CC-0.
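If you want a quick first look at the file, here is a minimal Python sketch. The column names used below are purely an assumption for illustration -- this mail doesn't document the CSV schema, so inspect the actual header before relying on them:

```python
import csv
from collections import Counter

def count_language_pairs(fileobj):
    """Tally merge candidates per language pair.

    ASSUMPTION: the columns 'lang1', 'title1', 'lang2', 'title2' are
    hypothetical placeholder names -- check the real CSV header first.
    """
    reader = csv.DictReader(fileobj)
    return Counter((row["lang1"], row["lang2"]) for row in reader)
```

Run over the downloaded file, this would give a rough sense of which language pairs dominate the 35,000+ candidates.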

What can you do with the data? Since it is CC-0, you can do anything you
want, obviously, but here are a few suggestions:

There’s a small tool on WMF labs that you can use to verify the links (it
displays the articles side by side from a language pair you select, and
then you can confirm or contradict the merge):

https://tools.wmflabs.org/yichengtry

The tool does not make the change in Wikidata itself, though (we thought
that would be too invasive). Instead, the results of the human
evaluation are saved on WMF labs. You are welcome to take the tool and
extend it with the possibility to upload changes directly to Wikidata,
if you so wish, or, once the data is verified, to upload the results.

Also, Magnus Manske is already busy uploading the data to the Wikidata
game, so you can very soon also play the merge game on the data directly.
He is also creating the missing items on Wikidata. Thanks Magnus for a very
pleasant cooperation!

I want to give a shout-out to my colleagues at Google who created the
dataset - Jiang Bian and Si Li - and to Yicheng Huang, the intern who
developed the tool on labs.

I hope that this small data release can help a little with further
improving the quality of Wikidata and Wikipedia! Thank you all, you are
awesome!

Cheers,
Denny



___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Birthday gift: Missing Wikipedia links (was Re: Wikidata turns two!)

2014-10-29 Thread James Forrester
On Wed Oct 29 2014 at 10:56:42 Denny Vrandečić vrande...@google.com wrote:

 There’s a small tool on WMF labs that you can use to verify the links (it
 displays the articles side by side from a language pair you select, and
 then you can confirm or contradict the merge):

 https://tools.wmflabs.org/yichengtry


This is really fun, and so useful too. Thank you so much, Denny, Jiang
Bian, Si Li, and Yicheng Huang – Denny and the Googlers is a new band
name if ever there was one.
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Wikidata RDF

2014-10-29 Thread Markus Krötzsch

Martynas,

Denny is right. You could set up a Virtuoso endpoint based on our RDF 
exports. This would be quite nice to have; it's one important reason 
why we created the exports, and I really hope we will soon see it 
happen. We are dealing here with a very large project, and the 
decision for or against a technology is not just a matter of our 
personal preference. If RDF can demonstrate added value, then there will 
surely be resources to further extend the support for it. So far, we are 
in the lead: we provide close to one billion (!) triples of Wikidata 
knowledge to the world, yet there is no known use of this data. We 
need to go step by step: some support from us, some practical usage from 
the RDF community, some more support from us, ...
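To make the "practical usage" step concrete, here is a small Python sketch of the kind of query such a self-hosted endpoint could answer. The endpoint itself is hypothetical (whoever loads the dumps picks the host), but the http://www.wikidata.org/entity/ URI prefix is the one used in the RDF exports:

```python
def build_label_query(item_id, language="en", limit=10):
    """Build a SPARQL query asking for an item's labels.

    Intended for a hypothetical endpoint loaded from the Wikidata RDF
    exports; nothing here talks to a live server, it only assembles
    the query string.
    """
    return (
        "PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>\n"
        "SELECT ?label WHERE {\n"
        f"  <http://www.wikidata.org/entity/{item_id}> rdfs:label ?label .\n"
        f'  FILTER(LANG(?label) = "{language}")\n'
        f"}} LIMIT {limit}"
    )
```

The resulting string can be POSTed to whatever SPARQL endpoint one sets up over the dumps (Virtuoso or otherwise).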


In reply to your initial email, Martynas, I have to say that you seem to 
have very little knowledge of what is going on in Wikidata. If you 
followed the development reports more closely, you would know that 
most of the work is going into components that RDF does not replace at 
all. Querying with SPARQL is nice, but we are still more focussed on UI 
issues, history management, infrastructure integration (such as pushing 
changes to other sites), and many more things which are completely 
unrelated to RDF. Your suggestion that a single file format 
would somehow magically make the construction of one of the 
world's largest community-edited knowledge bases a piece of cake is just 
naive.


Now don't get me wrong: naive thinking has its place in Wikidata -- 
it's always naive to try what others consider impossible -- but it 
should be combined with a positive, forward-thinking attitude. I hope 
that our challenge to show the power of RDF to us can unleash some 
positive energies in you :-) I am looking forward to your results (and 
happy to help if you need more details about the RDF dumps etc.).


Best wishes,

Markus


On 29.10.2014 18:26, Denny Vrandečić wrote:

Martynas,

we had this discussion on this list previously, and again I am
irked by your claim that we could just use standard RDF tools out of the
box for Wikidata.

I will shut up and concede that you are right if you manage to set up a
standard open source RDF tool on an open source stack that contains the
Wikidata knowledge base, is keeping up to date with the rate of changes
that we have, and is able to answer queries from the public without
choking and dying for 24 hours, before this year is over. Announce a few
days in advance on this list when you will make the experiment.

Technology has advanced by three years since we made the decision not to
use standard RDF tools, so I am sure it should be much easier today. But
last time I talked with people writing such tools, they were rather
cautious due to our requirements.

We still wouldn't have proven that it could deal with the expected QPS
Wikidata will have, but heck, I would be surprised and I would admit
that I was wrong with my decision if you can do that.

Seriously, we did not snub RDF and SPARQL because we don't like it or
don't know it. We decided against it *because* we know it so well and we
realized it does not fulfill our requirements.

Cheers,
Denny

On Mon Oct 27 2014 at 6:47:05 PM Martynas Jusevičius
marty...@graphity.org wrote:

Hey all,

so I see there is some work being done on mapping Wikidata data model
to RDF [1].

Just a thought: what if you actually used RDF and Wikidata's concepts
modeled in it right from the start? And used standard RDF tools, APIs,
query language (SPARQL) instead of building the whole thing from
scratch?

Is it just me or was this decision really a colossal waste of resources?


[1] http://korrekt.org/papers/Wikidata-RDF-export-2014.pdf

Martynas
http://graphityhq.com








___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Wikidata RDF

2014-10-29 Thread Phillip Rhodes
FWIW, put me in the camp of people who want to see Wikidata available
via RDF as well. I won't argue that RDF needs to be the *native*
format for Wikidata, but I think it would be a crying shame for such a
large knowledge base to be cut off from seamless integration with the
rest of the Linked Data world.

That said, I don't really care if RDF/SPARQL support comes later and
is treated as an add-on, but I do think Wikidata should at least
have that as an eventual goal. And if I can help make that
happen, I'll try to pitch in however I can. I have some experiments
going now, working on new approaches to scaling RDF
triplestores, so the Wikidata data may be an interesting testbed
for that down the road.

And on a related note - and apologies if this has been discussed to
death, but I haven't been on the list since the beginning - I am
curious whether there is any formal collaboration
(in place, proposed, or possible) between DBpedia and Wikidata?


Phil

This message optimized for indexing by NSA PRISM



Re: [Wikidata-l] Birthday gift: Missing Wikipedia links (was Re: Wikidata turns two!)

2014-10-29 Thread Amir Ladsgroup
I can connect all of them by bot, but I'm not sure it should be done
automatically.

Happy birthday Wikidata :)

On 10/29/14, James Forrester jdforres...@gmail.com wrote:
 On Wed Oct 29 2014 at 10:56:42 Denny Vrandečić vrande...@google.com
 wrote:

 There’s a small tool on WMF labs that you can use to verify the links (it
 displays the articles side by side from a language pair you select, and
 then you can confirm or contradict the merge):

 https://tools.wmflabs.org/yichengtry


 This is really fun, and so useful too. Thank you so much, Denny, Jiang
 Bian, Si Li, and Yicheng Huang – Denny and the Googlers is a new band
 name if ever there was one.



-- 
Amir

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Birthday gift: Missing Wikipedia links (was Re: Wikidata turns two!)

2014-10-29 Thread Asaf Bartov
This is great, thanks!

   A.

On Wed, Oct 29, 2014 at 11:10 AM, James Forrester jdforres...@gmail.com
wrote:

 On Wed Oct 29 2014 at 10:56:42 Denny Vrandečić vrande...@google.com
 wrote:

 There’s a small tool on WMF labs that you can use to verify the links (it
 displays the articles side by side from a language pair you select, and
 then you can confirm or contradict the merge):

 https://tools.wmflabs.org/yichengtry


 This is really fun, and so useful too. Thank you so much, Denny, Jiang
 Bian, Si Li, and Yicheng Huang – Denny and the Googlers is a new band
 name if ever there was one.






-- 
Asaf Bartov
Wikimedia Foundation http://www.wikimediafoundation.org

Imagine a world in which every single human being can freely share in the
sum of all knowledge. Help us make it a reality!
https://donate.wikimedia.org
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Birthday gift: Missing Wikipedia links (was Re: Wikidata turns two!)

2014-10-29 Thread Stryn@Wikimedia
It's more fun to do manually ;)
Anyway, it's a very nice tool!
Thanks to everyone who developed it.

*Stryn*

2014-10-29 21:37 GMT+02:00 Amir Ladsgroup ladsgr...@gmail.com:

 I can connect all of them by bot but I'm not sure it should be done
 automatically.

 Happy birthday Wikidata :)

 On 10/29/14, James Forrester jdforres...@gmail.com wrote:
  On Wed Oct 29 2014 at 10:56:42 Denny Vrandečić vrande...@google.com
  wrote:
 
  There’s a small tool on WMF labs that you can use to verify the links
 (it
  displays the articles side by side from a language pair you select, and
  then you can confirm or contradict the merge):
 
  https://tools.wmflabs.org/yichengtry
 
 
  This is really fun, and so useful too. Thank you so much, Denny, Jiang
  Bian, Si Li, and Yicheng Huang – Denny and the Googlers is a new band
  name if ever there was one.
 


 --
 Amir


___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Birthday gift: Missing Wikipedia links (was Re: Wikidata turns two!)

2014-10-29 Thread Jeroen De Dauw
Hey,

Does this mean we can also shoot a TODO list in the direction of Google? :)

Cheers

--
Jeroen De Dauw - http://www.bn2vs.com
Software craftsmanship advocate
Evil software architect at Wikimedia Germany
~=[,,_,,]:3
___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Birthday gift: Missing Wikipedia links (was Re: Wikidata turns two!)

2014-10-29 Thread Denny Vrandečić
Sure, you can keep all your todos with Google ;)

https://www.gmail.com/mail/help/tasks/

Cheers,
Denny


On Wed Oct 29 2014 at 2:58:03 PM Jeroen De Dauw jeroended...@gmail.com
wrote:

 Hey,

 Does this mean we can also shoot a TODO list in the direction of Google? :)

 Cheers

 --
 Jeroen De Dauw - http://www.bn2vs.com
 Software craftsmanship advocate
 Evil software architect at Wikimedia Germany
 ~=[,,_,,]:3

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l


Re: [Wikidata-l] Wikidata RDF

2014-10-29 Thread Lydia Pintscher
Hey Phillip :)

On Wed, Oct 29, 2014 at 7:41 PM, Phillip Rhodes
motley.crue@gmail.com wrote:
 FWIW, put me in the camp of people who want to see wikidata available
 via RDF as well.  I won't argue that RDF needs to be the *native*
 format for Wikidata, but I think it would be a crying shame for such a
 large knowledgebase to be cut off from seamless integration with the
 rest of the LinkedData world.

 That said, I don't really care if RDF/SPARQL support come later and
 are treated as an add on, but I do think Wikidata should at least
 have that as a goal for eventually.  And if I can help make that
 happen, I'll try to pitch in however I can.   I have some experiments
 I'm doing now, working on some new approaches to scaling RDF
 triplestores, so using the Wikidata data may be an interesting testbed
 for that down the road.

 And on a related note - and apologies if this has been discussed to
 death, but I haven't been on the list since the beginning - but I am
 curious if there is any formal collaboration
 (in-place|proposed|possible) between dbpedia and wikidata?

Help with this would be awesome and totally welcome. The tracking bug
is at https://bugzilla.wikimedia.org/show_bug.cgi?id=48143


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikidata-l mailing list
Wikidata-l@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-l