Re: [Wikidata] Why do these two SPARQL queries take such different times to run?

2015-09-09 Thread Kingsley Idehen
On 9/9/15 4:07 PM, Stas Malyshev wrote:
> Hi!
>
>> > here's a query to find multiple humans with nationality:Greece that have
>> > the same day of birth and day of death:
>> >   http://tinyurl.com/ow6lpen
>> > It produces one pair, and executes in about 0.6 seconds.
>> > 
>> > Here's a query to try to add item numbers and labels to the previous
>> > search:
>> >   http://tinyurl.com/ovjwzc9
>> > 
>> > It *just* completes, taking just over 60 seconds to execute.
> It looks like some issue with nested queries in Blazegraph, I've sent a
> report to them and will see what they say.
>

What's the URL of the dataset loaded into your Blazegraph DBMS? Ideally,
we should have this data available from a variety of SPARQL Query
Services [1].

I know we loaded some Wikidata into the latest DBpedia 2015-4 release,
but I am not 100% sure we have all the datasets that are currently
available.

[1] https://www.wikidata.org/wiki/Wikidata:Data_access#SPARQL_endpoints

-- 
Regards,

Kingsley Idehen   
Founder & CEO 
OpenLink Software 
Company Web: http://www.openlinksw.com
Personal Weblog 1: http://kidehen.blogspot.com
Personal Weblog 2: http://www.openlinksw.com/blog/~kidehen
Twitter Profile: https://twitter.com/kidehen
Google+ Profile: https://plus.google.com/+KingsleyIdehen/about
LinkedIn Profile: http://www.linkedin.com/in/kidehen
Personal WebID: http://kingsley.idehen.net/dataspace/person/kidehen#this



___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Units are live! \o/

2015-09-09 Thread Navino Evans
So much exciting news lately! Many congratulations! :)

On 9 September 2015 at 22:29, David Cuenca Tudela  wrote:

> Just created "area" after two years of waiting! yay! congratulations! \o/
>
> On Wed, Sep 9, 2015 at 11:08 PM, Lydia Pintscher <
> lydia.pintsc...@wikimedia.de> wrote:
>
>> On Wed, Sep 9, 2015 at 9:49 PM, Lydia Pintscher
>>  wrote:
>> > Hey everyone :)
>> >
>> > As promised we just enabled support for quantities with units on
>> > Wikidata. So from now on you'll be able to store fancy things like the
>> > height of a mountain or the boiling point of an element.
>> >
>> > Quite a few properties have been waiting on unit support before they
>> > are created. I assume they will be created in the next hours and then
>> > you can go ahead and add all of the measurements.
>>
>> For anyone who is curious: Here is the list of properties already
>> created since unit support is available:
>>
>> https://www.wikidata.org/w/index.php?title=Special:ListProperties/quantity&limit=50&offset=103
>> and here is the list of properties that were waiting on unit support:
>> https://www.wikidata.org/wiki/Wikidata:Property_proposal/Pending/2
>> Those should change over the next hours/days.
>>
>>
>> Cheers
>> Lydia
>>
>> --
>> Lydia Pintscher - http://about.me/lydia.pintscher
>> Product Manager for Wikidata
>>
>> Wikimedia Deutschland e.V.
>> Tempelhofer Ufer 23-24
>> 10963 Berlin
>> www.wikimedia.de
>>
>> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>>
>> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
>> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
>> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>>
>> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>
>
>
>
> --
> Etiamsi omnes, ego non
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>


-- 
___

The Timeline of Everything

www.histropedia.com

Twitter | Facebook | Google+ | LinkedIn

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Units are live! \o/

2015-09-09 Thread David Cuenca Tudela
Just created "area" after two years of waiting! yay! congratulations! \o/

On Wed, Sep 9, 2015 at 11:08 PM, Lydia Pintscher <
lydia.pintsc...@wikimedia.de> wrote:

> On Wed, Sep 9, 2015 at 9:49 PM, Lydia Pintscher
>  wrote:
> > Hey everyone :)
> >
> > As promised we just enabled support for quantities with units on
> > Wikidata. So from now on you'll be able to store fancy things like the
> > height of a mountain or the boiling point of an element.
> >
> > Quite a few properties have been waiting on unit support before they
> > are created. I assume they will be created in the next hours and then
> > you can go ahead and add all of the measurements.
>
> For anyone who is curious: Here is the list of properties already
> created since unit support is available:
>
> https://www.wikidata.org/w/index.php?title=Special:ListProperties/quantity&limit=50&offset=103
> and here is the list of properties that were waiting on unit support:
> https://www.wikidata.org/wiki/Wikidata:Property_proposal/Pending/2
> Those should change over the next hours/days.
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Product Manager for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>



-- 
Etiamsi omnes, ego non
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Units are live! \o/

2015-09-09 Thread Lydia Pintscher
On Wed, Sep 9, 2015 at 9:49 PM, Lydia Pintscher
 wrote:
> Hey everyone :)
>
> As promised we just enabled support for quantities with units on
> Wikidata. So from now on you'll be able to store fancy things like the
> height of a mountain or the boiling point of an element.
>
> Quite a few properties have been waiting on unit support before they
> are created. I assume they will be created in the next hours and then
> you can go ahead and add all of the measurements.

For anyone who is curious: Here is the list of properties already
created since unit support is available:
https://www.wikidata.org/w/index.php?title=Special:ListProperties/quantity&limit=50&offset=103
and here is the list of properties that were waiting on unit support:
https://www.wikidata.org/wiki/Wikidata:Property_proposal/Pending/2
Those should change over the next hours/days.


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Units are live! \o/

2015-09-09 Thread Leila Zia
This is very good news, congratulations! :-)

On Wed, Sep 9, 2015 at 1:26 PM, Amir Ladsgroup  wrote:

> I probably won't sleep tonight :)
>
> Best
>
> On Thu, Sep 10, 2015 at 12:48 AM Stryn  wrote:
>
>> Wow, finally, great!
>> I've been waiting for units so long.  I'm already in my bed, so will try
>> tomorrow then :-)
>>
>> Stryn 🐼
>> Sent from Windows Phone
>> --
>> From: Lydia Pintscher 
>> Sent: 9.9.2015 23:00
>> To: Discussion list for the Wikidata project.
>> 
>> Subject: [Wikidata] Units are live! \o/
>>
>> Hey everyone :)
>>
>> As promised we just enabled support for quantities with units on
>> Wikidata. So from now on you'll be able to store fancy things like the
>> height of a mountain or the boiling point of an element.
>>
>> Quite a few properties have been waiting on unit support before they
>> are created. I assume they will be created in the next hours and then
>> you can go ahead and add all of the measurements.
>>
>>
>> Cheers
>> Lydia
>>
>> --
>> Lydia Pintscher - http://about.me/lydia.pintscher
>> Product Manager for Wikidata
>>
>> Wikimedia Deutschland e.V.
>> Tempelhofer Ufer 23-24
>> 10963 Berlin
>> www.wikimedia.de
>>
>> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>>
>> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
>> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
>> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>>
>> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Units are live! \o/

2015-09-09 Thread Amir Ladsgroup
I probably won't sleep tonight :)

Best

On Thu, Sep 10, 2015 at 12:48 AM Stryn  wrote:

> Wow, finally, great!
> I've been waiting for units so long.  I'm already in my bed, so will try
> tomorrow then :-)
>
> Stryn 🐼
> Sent from Windows Phone
> --
> From: Lydia Pintscher 
> Sent: 9.9.2015 23:00
> To: Discussion list for the Wikidata project.
> 
> Subject: [Wikidata] Units are live! \o/
>
> Hey everyone :)
>
> As promised we just enabled support for quantities with units on
> Wikidata. So from now on you'll be able to store fancy things like the
> height of a mountain or the boiling point of an element.
>
> Quite a few properties have been waiting on unit support before they
> are created. I assume they will be created in the next hours and then
> you can go ahead and add all of the measurements.
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Product Manager for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Units are live! \o/

2015-09-09 Thread Stryn
Wow, finally, great!
I've been waiting for units so long.  I'm already in my bed, so will try 
tomorrow then :-)

Stryn 🐼 
Sent from Windows Phone

- Original message -
From: "Lydia Pintscher" 
Sent: 9.9.2015 23:00
To: "Discussion list for the Wikidata project." 

Subject: [Wikidata] Units are live! \o/

Hey everyone :)

As promised we just enabled support for quantities with units on
Wikidata. So from now on you'll be able to store fancy things like the
height of a mountain or the boiling point of an element.

Quite a few properties have been waiting on unit support before they
are created. I assume they will be created in the next hours and then
you can go ahead and add all of the measurements.


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Why do these two SPARQL queries take such different times to run?

2015-09-09 Thread Stas Malyshev
Hi!

> here's a query to find multiple humans with nationality:Greece that have
> the same day of birth and day of death:
>   http://tinyurl.com/ow6lpen
> It produces one pair, and executes in about 0.6 seconds.
> 
> Here's a query to try to add item numbers and labels to the previous
> search:
>   http://tinyurl.com/ovjwzc9
> 
> It *just* completes, taking just over 60 seconds to execute.

It looks like some issue with nested queries in Blazegraph, I've sent a
report to them and will see what they say.

> Obviously the second query as written at the moment involves a
> sub-query, which inevitably must make it a bit slower -- but given the
> solution set of the sub-query only has two rows, and an exact date for a
> given property ought to be a fairly quick key to look up, why is the
> second query taking 100 times longer than the first ?

Yes, in theory it should be fast, so I suspect some kind of bug.
-- 
Stas Malyshev
smalys...@wikimedia.org

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Units are live! \o/

2015-09-09 Thread Gerard Meijssen
Hoi,
It is a good day :)
Thank you
GerardM

On 9 September 2015 at 21:49, Lydia Pintscher 
wrote:

> Hey everyone :)
>
> As promised we just enabled support for quantities with units on
> Wikidata. So from now on you'll be able to store fancy things like the
> height of a mountain or the boiling point of an element.
>
> Quite a few properties have been waiting on unit support before they
> are created. I assume they will be created in the next hours and then
> you can go ahead and add all of the measurements.
>
>
> Cheers
> Lydia
>
> --
> Lydia Pintscher - http://about.me/lydia.pintscher
> Product Manager for Wikidata
>
> Wikimedia Deutschland e.V.
> Tempelhofer Ufer 23-24
> 10963 Berlin
> www.wikimedia.de
>
> Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.
>
> Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
> unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
> Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] Units are live! \o/

2015-09-09 Thread Lydia Pintscher
Hey everyone :)

As promised we just enabled support for quantities with units on
Wikidata. So from now on you'll be able to store fancy things like the
height of a mountain or the boiling point of an element.
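
For the query-minded: once such statements exist, both the amount and the
unit should become queryable. A rough sketch only (assuming the RDF export
exposes them via the Wikibase ontology, and using P2044, elevation above
sea level, purely as an example property id):

  SELECT ?mountain ?amount ?unit WHERE {
    ?mountain p:P2044 ?statement .          # statement node for the height claim
    ?statement psv:P2044 ?valueNode .       # full value node of the statement
    ?valueNode wikibase:quantityAmount ?amount ;
               wikibase:quantityUnit ?unit .
  }
  LIMIT 10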

Quite a few properties have been waiting on unit support before they
are created. I assume they will be created in the next hours and then
you can go ahead and add all of the measurements.


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Wikidata needs your votes

2015-09-09 Thread Lydia Pintscher
On Wed, Sep 9, 2015 at 10:01 AM, Ricordisamoa
 wrote:
> The 2nd round starts today! You can vote from
> http://hauptvoting.welt.de/mainvoting/list

Indeed :)
As in the previous round, everyone gets one vote per day. We have some
good competition so every vote counts.


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] more data for more sister projects coming right up

2015-09-09 Thread Lydia Pintscher
Hey folks :)

It is time to give some more sister projects access to the sitelink or
data part of Wikidata. The schedule is as follows:

* September 22nd: Wikibooks will get access to statements (aka Phase 2)
* October 20th: Meta, MediaWiki and Wikispecies will get access to the
sitelinks (aka Phase 1)

Please give them a warm welcome and watch
https://www.wikidata.org/wiki/Wikidata:Wikibooks,
https://www.wikidata.org/wiki/Wikidata:Meta-Wiki and
https://www.wikidata.org/wiki/Wikidata:Wikispecies for questions.


Cheers
Lydia

-- 
Lydia Pintscher - http://about.me/lydia.pintscher
Product Manager for Wikidata

Wikimedia Deutschland e.V.
Tempelhofer Ufer 23-24
10963 Berlin
www.wikimedia.de

Wikimedia Deutschland - Gesellschaft zur Förderung Freien Wissens e. V.

Eingetragen im Vereinsregister des Amtsgerichts Berlin-Charlottenburg
unter der Nummer 23855 Nz. Als gemeinnützig anerkannt durch das
Finanzamt für Körperschaften I Berlin, Steuernummer 27/681/51985.

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Why do these two SPARQL queries take such different times to run?

2015-09-09 Thread James Heald

Are you sure?  121 ms ??

It's consistently not working for me.  (Whereas the first query 
consistently does; but even that just took 457 ms).


However, I have been finding Reasonator's date lookup feature, e.g.
   https://tools.wmflabs.org/reasonator/?date=1889-11-10
a very useful way to work around this.

All best,

  James.


On 09/09/2015 16:06, Magnus Manske wrote:

Your "labeled" example just ran for me in 121ms.

Maybe the server gets overloaded at times and goes into disk swap? Nothing
to do with the query?

On Wed, Sep 9, 2015 at 2:06 PM James Heald  wrote:


Prompted by this thread at Project Chat,
https://www.wikidata.org/wiki/Wikidata:Project_chat#Identical_data_sets

here's a query to find multiple humans with nationality:Greece that have
the same day of birth and day of death:
http://tinyurl.com/ow6lpen
It produces one pair, and executes in about 0.6 seconds.

Here's a query to try to add item numbers and labels to the previous
search:
http://tinyurl.com/ovjwzc9

It *just* completes, taking just over 60 seconds to execute.

(Please don't merge the two items yet, because that will destroy the
example).

Analogous queries with lookups for France (71 apparent sets of
duplicates), UK (32), and Italy(14) fail to complete.


Two questions therefore:
(1)  Why are the two queries taking such different times to run ?
(2)  Is there a good way to rewrite the second to make it faster ?


Obviously the second query as written at the moment involves a
sub-query, which inevitably must make it a bit slower -- but given the
solution set of the sub-query only has two rows, and an exact date for a
given property ought to be a fairly quick key to look up, why is the
second query taking 100 times longer than the first ?

And is there a better way I should be doing this, since the query does
appear to be producing useful real matches ?

Thanks,

 James.


___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata





___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata




___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Why do these two SPARQL queries take such different times to run?

2015-09-09 Thread Magnus Manske
Your "labeled" example just ran for me in 121ms.

Maybe the server gets overloaded at times and goes into disk swap? Nothing
to do with the query?

On Wed, Sep 9, 2015 at 2:06 PM James Heald  wrote:

> Prompted by this thread at Project Chat,
>https://www.wikidata.org/wiki/Wikidata:Project_chat#Identical_data_sets
>
> here's a query to find multiple humans with nationality:Greece that have
> the same day of birth and day of death:
>http://tinyurl.com/ow6lpen
> It produces one pair, and executes in about 0.6 seconds.
>
> Here's a query to try to add item numbers and labels to the previous
> search:
>http://tinyurl.com/ovjwzc9
>
> It *just* completes, taking just over 60 seconds to execute.
>
> (Please don't merge the two items yet, because that will destroy the
> example).
>
> Analogous queries with lookups for France (71 apparent sets of
> duplicates), UK (32), and Italy(14) fail to complete.
>
>
> Two questions therefore:
> (1)  Why are the two queries taking such different times to run ?
> (2)  Is there a good way to rewrite the second to make it faster ?
>
>
> Obviously the second query as written at the moment involves a
> sub-query, which inevitably must make it a bit slower -- but given the
> solution set of the sub-query only has two rows, and an exact date for a
> given property ought to be a fairly quick key to look up, why is the
> second query taking 100 times longer than the first ?
>
> And is there a better way I should be doing this, since the query does
> appear to be producing useful real matches ?
>
> Thanks,
>
> James.
>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] Why do these two SPARQL queries take such different times to run?

2015-09-09 Thread James Heald

Prompted by this thread at Project Chat,
  https://www.wikidata.org/wiki/Wikidata:Project_chat#Identical_data_sets

here's a query to find multiple humans with nationality:Greece that have 
the same day of birth and day of death:

  http://tinyurl.com/ow6lpen
It produces one pair, and executes in about 0.6 seconds.
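
For anyone who can't follow the shortened link, here is a rough sketch of
the kind of query being described (not the exact one; it assumes P31 =
instance of, Q5 = human, P27 = country of citizenship, Q41 = Greece,
P569 = date of birth, P570 = date of death):

  SELECT ?item1 ?item2 ?dob ?dod WHERE {
    ?item1 wdt:P31 wd:Q5 ; wdt:P27 wd:Q41 ;
           wdt:P569 ?dob ; wdt:P570 ?dod .
    ?item2 wdt:P31 wd:Q5 ; wdt:P27 wd:Q41 ;
           wdt:P569 ?dob ; wdt:P570 ?dod .
    # keep each pair only once and drop self-matches
    FILTER(STR(?item1) < STR(?item2))
  }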

Here's a query to try to add item numbers and labels to the previous search:
  http://tinyurl.com/ovjwzc9

It *just* completes, taking just over 60 seconds to execute.
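
Again only a sketch of the shape (the real query is behind the link): the
same match wrapped in a sub-select, with labels added on top, e.g. via the
query service's label service:

  SELECT ?item1 ?item1Label ?item2 ?item2Label ?dob ?dod WHERE {
    {
      SELECT ?item1 ?item2 ?dob ?dod WHERE {
        ?item1 wdt:P31 wd:Q5 ; wdt:P27 wd:Q41 ; wdt:P569 ?dob ; wdt:P570 ?dod .
        ?item2 wdt:P31 wd:Q5 ; wdt:P27 wd:Q41 ; wdt:P569 ?dob ; wdt:P570 ?dod .
        FILTER(STR(?item1) < STR(?item2))
      }
    }
    # fills ?item1Label / ?item2Label automatically
    SERVICE wikibase:label { bd:serviceParam wikibase:language "en" . }
  }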

(Please don't merge the two items yet, because that will destroy the 
example).


Analogous queries with lookups for France (71 apparent sets of 
duplicates), UK (32), and Italy (14) fail to complete.



Two questions therefore:
(1)  Why are the two queries taking such different times to run ?
(2)  Is there a good way to rewrite the second to make it faster ?


Obviously the second query as written at the moment involves a 
sub-query, which inevitably must make it a bit slower -- but given the 
solution set of the sub-query only has two rows, and an exact date for a 
given property ought to be a fairly quick key to look up, why is the 
second query taking 100 times longer than the first ?


And is there a better way I should be doing this, since the query does 
appear to be producing useful real matches ?


Thanks,

   James.


___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] [ANNOUNCEMENT] first StrepHit dataset for the primary sources tool

2015-09-09 Thread Marco Fossati

Hi Markus, everyone,

The project proposal is currently in active development.
I would like to focus now on the dissemination of the idea and the 
engagement of the Wikidata community.

Hence, I would love to gather feedback on the following question:

Does StrepHit sound interesting and useful to you?

It would be great if you could report your thoughts on the project talk 
page:

https://meta.wikimedia.org/wiki/Grants_talk:IEG/StrepHit:_Wikidata_Statements_Validation_via_References

Cheers!

On 9/8/15 2:02 PM, wikidata-requ...@lists.wikimedia.org wrote:

Date: Mon, 07 Sep 2015 16:47:16 +0200
From: Markus Krötzsch
To: "Discussion list for the Wikidata project."

Subject: Re: [Wikidata] [ANNOUNCEMENT] first StrepHit dataset for the
primary sources tool
Message-ID:<55eda374.2090...@semantic-mediawiki.org>
Content-Type: text/plain; charset=utf-8; format=flowed

Dear Marco,

Sounds interesting, but the project page still has a lot of gaps. Will
you notify us again when you are done? It is a bit tricky to endorse a
proposal that is not finished yet;-)

Markus

On 04.09.2015 17:01, Marco Fossati wrote:

>[Begging pardon if you have already read this in the Wikidata project chat]
>
>Hi everyone,
>
>As Wikidatans, we all know how much data quality matters.
>We all know what high quality stands for: statements need to be
>validated via references to external, non-wiki, sources.
>
>That's why the primary sources tool is being developed:
>https://www.wikidata.org/wiki/Wikidata:Primary_sources_tool
>And that's why I am preparing the StrepHit IEG proposal:
>https://meta.wikimedia.org/wiki/Grants:IEG/StrepHit:_Wikidata_Statements_Validation_via_References
>
>
>StrepHit (pronounced "strep hit", means "Statement? repherence it!") is
>a Natural Language Processing pipeline that understands human language,
>extracts structured data from raw text and produces Wikidata statements
>with reference URLs.
>
>As a demonstration to support the IEG proposal, you can find the
>**FBK-strephit-soccer** dataset uploaded to the primary sources tool
>backend.
>It's a small dataset serving the soccer domain use case.
>Please follow the instructions on the project page to activate it and
>start playing with the data.
>
>What is the biggest difference that sets StrepHit datasets apart from
>the currently uploaded ones?
>At least one reference URL is always guaranteed for each statement.
>This means that if StrepHit finds some new statement that was not there
>in Wikidata before, it will always propose its external references.
>We do not want to manually reject all the new statements with no
>reference, right?
>
>If you like the idea, please endorse the StrepHit IEG proposal!


--
Marco Fossati
http://about.me/marco.fossati
Twitter: @hjfocs
Skype: hell_j

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Wikidata Live RDF?

2015-09-09 Thread Markus Krötzsch

On 09.09.2015 08:30, Stas Malyshev wrote:

Hi!


Now that the SPARQL endpoint is "official", will the live RDF data
(which you get, e.g., via Special:EntityData) also be switched to show
the content used in SPARQL? Since this is already implemented, I guess


I think it might be a good idea.
We have a number of "flavors" in the data now, that include different
aspects of RDF. E.g.
https://www.wikidata.org/wiki/Special:EntityData/Q1.ttl?flavor=simple
produces only "simple" statements, flavor=full produces everything we
know, and flavor=dump produces the same things we have in RDF dump. We
can of course create new flavors with different aspects included or
excluded. So what should default RDF content display?

We also have a tracking bug for this:
https://phabricator.wikimedia.org/T101837 so maybe we should discuss it
there.


Ah, thanks for reminding me. I have commented there. I think the default 
should be to simply return all data that is in the dumps.


Markus




___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Announcing the release of the Wikidata Query Service

2015-09-09 Thread Markus Krötzsch

Good morning :-)

On 09.09.2015 00:45, Stas Malyshev wrote:

Hi!


P.S. I am not convinced yet of this non-standard extension of SPARQL to
fetch labels. Its behaviour based on the variables given in SELECT seems


You don't have to use variables in SELECT, it's just a shortcut.


What I meant by my comment is that the SPARQL semantics does not allow 
for extensions that modify the query semantics based on the selected 
variables. Even if this is optional, it changes some fundamental 
assumptions about how SPARQL works.
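
For comparison, the same labels can of course be obtained in plain,
standards-only SPARQL, just more verbosely; a minimal sketch:

  SELECT ?item ?itemLabel WHERE {
    ?item wdt:P31 wd:Q5 .                 # whatever pattern produces ?item
    OPTIONAL {
      ?item rdfs:label ?itemLabel .
      FILTER(LANG(?itemLabel) = "en")
    }
  }
  LIMIT 10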


...


Nothing prevents us from creating such UI, but for many purposes having
to do extra step to see labels does not seem optimal for me, especially
if data is intended for human consumption.


I agree that creating such a UI should not be left to WMF or WMDE 
developers. The SPARQL web API is there for everybody to use. One could 
also start from general SPARQL tools such as YASGUI (about.yasgui.org) 
as a basis for a SPARQL editor.


An extra step should not be needed. Users would just use a query page 
like the one we have now. Only the display of the result table would be 
modified so that there is a language selector above the table: if a 
language is selected, all URIs that refer to Wikidata entities will get 
a suitable label as their anchor text. One could also have the option 
to select "no language" where only the item ids are shown.





results), one could as well have some JavaScript there to beautify the
resulting item URIs based on client-side requests. Maybe some consumers


I'm not sure what you mean by "beautify". If you mean to fetch labels,
querying labels separately would slow things down significantly.


There should be no slowdown in the SPARQL service, since the labels 
would be fetched client-side. There would be a short delay between the 
arrival of the results and the fetching of the labels. We already have 
similar delays when opening a Wikidata page (site names, for example, 
take a moment to fetch). Wikidata Query/Autolist also uses this method 
to fetch labels client-side, and the delay is not too big (it is 
possible to fetch up to 50 labels in one request).
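
Such a client-side lookup could, for instance, go through the wbgetentities
API (one request per batch of up to 50 ids), along the lines of:

  https://www.wikidata.org/w/api.php?action=wbgetentities&ids=Q42|Q64&props=labels&languages=en&format=json

(Q42 and Q64 are just placeholder ids here.)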


With beautify I mean that one could do further things, such as having a 
tooltip when hovering over entities in results that shows more 
information (or maybe fetches an additional description). That's where 
people can be creative. I think these kinds of features are best placed 
in the hands of community members. We can easily have several SPARQL UIs.





really need to get labels from SPARQL, but at least the users who want
to see results right away would not need this.


Then why wouldn't these users just ignore the label service altogether?


Because all examples are using it, and many users are learning SPARQL 
now from the examples. This is the main reason why I care at all. After 
all, every SPARQL processor has some built-in extensions that deviate 
from the standard.


Markus

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] Wikidata needs your votes

2015-09-09 Thread Ricordisamoa
The 2nd round starts today! You can vote from 
http://hauptvoting.welt.de/mainvoting/list


Il 12/08/2015 14:47, Lydia Pintscher ha scritto:

Hey folks,

As you know Wikidata is one of the 100 winners of the "Germany - Land
der Ideen" competition 2015. We now have the chance to win the
Public’s Choice Award as well! At the moment, Wikidata is ranked 15th.
We need to be among the first 10 to enter the next round and win the
Public’s Choice Award. Voting is open until 23 August.

The information about the competition and projects is available in
English 
(https://www.land-der-ideen.de/en/projects-germany/landmarks-land-ideas/competition-2015-urban-space-rural-space-cyberspace
and https://www.land-der-ideen.de/en/projects-germany/landmarks-land-ideas/2015-award-recipients/category-science/wikidata)
but the voting process only in German. But it’s pretty easy to vote
anyway:
* Go to the Wikidata voting page at
https://www.land-der-ideen.de/ausgezeichnete-orte/preistraeger/wikidata
* Click the yellow button on the right ("Jetzt abstimmen").
* Type in your email address and tick the box to agree to the voting rules.
* You will receive a link via email. Click the link within 24 hours.

You can repeat this daily until 23 August. You have one vote every day.

Let's win this! :D


Cheers
Lydia




___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] [Dbpedia-discussion] Fwd: Announcing the release of the Wikidata Query Service

2015-09-09 Thread Stas Malyshev
Hi!

> I don't know what the implementation status is for generated URIs like
> http://www.wikidata.org/value/8000228965cf554cf1baf641980f657d, since we
> cannot resolve them so easily. One way would be to redirect to a
> suitable DESCRIBE query result on the SPARQL endpoint. But this may
> require some more implementation work first. I think our main goal
> should be to have proper RDF replies for items and (all URI variants of)
> properties.

That's an interesting idea, did not think about it this way. I'll try
and see if it will work.
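
The idea, roughly: a request for such a value URI would be redirected to a
standard DESCRIBE query on the endpoint, e.g.

  DESCRIBE <http://www.wikidata.org/value/8000228965cf554cf1baf641980f657d>

which returns an RDF description of that node.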

-- 
Stas Malyshev
smalys...@wikimedia.org

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] [Dbpedia-discussion] Fwd: Announcing the release of the Wikidata Query Service

2015-09-09 Thread Stas Malyshev
Hi!

> One more thing, if you click on [1] and then click on the HTTP URIs in
> the solution, none of them resolve (bar schema.org). Is this an
> oversight regarding Linked Data deployment?

It's not oversight in the meaning we know about it, we just didn't do it
yet. See: https://phabricator.wikimedia.org/T97195

-- 
Stas Malyshev
smalys...@wikimedia.org

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] (Almost) empty items

2015-09-09 Thread Addshore
I know.

I was simply explaining what happens and why it happens in case the
community feels like something needs to be addressed here, either a change
in the tool or a change in the API.

Addshore

On 9 September 2015 at 08:33, Gerard Meijssen 
wrote:

> Hoi,
> The point by Magnus is that he uses an API. When his software is to blame,
> blame the API. That is all.
> Thanks,
>  GerardM
>
> On 8 September 2015 at 13:24, Addshore  wrote:
>
>> Using Q19921058 as an example as the item is not empty a redirect will
>> not / can not be created.
>>
>> On 8 September 2015 at 12:06, Magnus Manske 
>> wrote:
>>
>>> WiDaR uses the Wikidata API to merge. Redirect (pun intended) all blame
>>> there.
>>>
>>> On Tue, Sep 8, 2015 at 10:00 AM Andre Engels 
>>> wrote:
>>>
 WiDaR (which is used by the Wikidata game, among others) creates
 (almost) empty items when it is used to merge two items with different
 descriptions in some language. In that case all other stuff is merged, but
 the item that should become a redirect is left behind with only the
 problematic description and nothing else. See Q19921058 for a case where
 this issue created such an item after a recent merge attempt of mine.
 ___
 Wikidata mailing list
 Wikidata@lists.wikimedia.org
 https://lists.wikimedia.org/mailman/listinfo/wikidata

>>>
>>> ___
>>> Wikidata mailing list
>>> Wikidata@lists.wikimedia.org
>>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>>
>>>
>>
>>
>> --
>> Addshore
>>
>> ___
>> Wikidata mailing list
>> Wikidata@lists.wikimedia.org
>> https://lists.wikimedia.org/mailman/listinfo/wikidata
>>
>>
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata
>
>


-- 
Addshore
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata