[Wikidata-bugs] [Maniphest] [Updated] T241169: Create database dump of new Wikibase term store

2019-12-22 Thread ArielGlenn
ArielGlenn closed this task as a duplicate of T226167: audit public tables and 
make sure we dump them all.

TASK DETAIL
  https://phabricator.wikimedia.org/T241169

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: ArielGlenn
Cc: ArielGlenn, Bugreporter, darthmon_wmde, Nandana, Lahi, Gq86, 
GoranSMilovanovic, Lunewa, QZanden, LawExplorer, _jensen, rosalieper, 
Scott_WUaS, gnosygnu, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Updated] T241169: Create database dump of new Wikibase term store

2019-12-22 Thread ArielGlenn
ArielGlenn added a comment.


  This is already covered in T226167, and a patch set is waiting to be merged 
once migration is complete :-)

TASK DETAIL
  https://phabricator.wikimedia.org/T241169

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: ArielGlenn
Cc: ArielGlenn, Bugreporter, darthmon_wmde, Nandana, Lahi, Gq86, 
GoranSMilovanovic, Lunewa, QZanden, LawExplorer, _jensen, rosalieper, 
Scott_WUaS, gnosygnu, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Updated] T241338: Allow linking to specific statements in Structured data on Commons

2019-12-22 Thread Jarekt
Jarekt added a subscriber: dcausse.
Jarekt added a comment.


  In T222321, @dcausse also thought we should have
  
  > direct reference to a particular statement (like the statement id anchor on 
wikibase entity e.g. 
https://www.wikidata.org/wiki/Q1#Q1$8983b0ea-4a9c-0902-c0db-785db33f767c)
  
  I would like it so that if an infobox pulls some information from SDC, I 
could add an edit icon to it which, when clicked, would lead to the statement 
the information comes from.

TASK DETAIL
  https://phabricator.wikimedia.org/T241338

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Jarekt
Cc: dcausse, Jarekt, Aklapper, darthmon_wmde, Nandana, JKSTNK, Lahi, 
PDrouin-WMF, Gq86, E1presidente, Ramsey-WMF, Cparle, Anooprao, SandraF_WMF, 
GoranSMilovanovic, QZanden, Tramullas, Acer, LawExplorer, Salgo60, Silverfish, 
Poyekhali, _jensen, rosalieper, Taiwania_Justo, Scott_WUaS, Susannaanas, 
Ixocactus, Wong128hk, Jane023, Wikidata-bugs, Base, matthiasmullie, aude, 
El_Grafo, Dinoguy1000, Ricordisamoa, Wesalius, Lydia_Pintscher, Fabrice_Florin, 
Raymond, Steinsplitter, Mbch331, Keegan
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Created] T241338: Allow linking to specific statements in Structured data on Commons

2019-12-22 Thread Jarekt
Jarekt created this task.
Jarekt added projects: Commons, SDC General.
Restricted Application added a subscriber: Aklapper.
Restricted Application added a project: Wikidata.

TASK DESCRIPTION
  On Wikidata you can have a link like 
https://www.wikidata.org/wiki/Q56051850#P6216 that links to the P6216 property 
of the Q56051850 item. On Commons, a link like 
https://commons.wikimedia.org/wiki/File:Arbeiderswoning_Oostwold_4.jpg#P6216 
should likewise link to the P6216 property of the 
Arbeiderswoning_Oostwold_4.jpg file.

TASK DETAIL
  https://phabricator.wikimedia.org/T241338

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Jarekt
Cc: Jarekt, Aklapper, darthmon_wmde, Nandana, JKSTNK, Lahi, PDrouin-WMF, Gq86, 
E1presidente, Ramsey-WMF, Cparle, Anooprao, SandraF_WMF, GoranSMilovanovic, 
QZanden, Tramullas, Acer, LawExplorer, Salgo60, Silverfish, Poyekhali, _jensen, 
rosalieper, Taiwania_Justo, Scott_WUaS, Susannaanas, Ixocactus, Wong128hk, 
Jane023, Wikidata-bugs, Base, matthiasmullie, aude, El_Grafo, Dinoguy1000, 
Ricordisamoa, Wesalius, Lydia_Pintscher, Fabrice_Florin, Raymond, 
Steinsplitter, Mbch331, Keegan
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


Re: [Wikidata] [ANN] nomunofu v0.1.0

2019-12-22 Thread Ted Thibodeau Jr

On Dec 22, 2019, at 03:17 PM, Amirouche Boubekki wrote:
> 
> Hello all ;-)
> 
> 
> I ported the code to Chez Scheme to do an apple-to-apple comparison
> between GNU Guile and Chez and took the time to launch a few queries
> against Virtuoso available in Ubuntu 18.04 (LTS).

Hi, Amirouche --

Kingsley's points about tuning Virtuoso to use available 
RAM [1] and other system resources are worth looking into, 
but a possibly more important first question is --

   Exactly what version of Virtuoso are you testing?

If you followed the common script on Ubuntu 18.04, i.e., --

   sudo apt update

   sudo apt install virtuoso-opensource

-- then you likely have version 6.1.6 of VOS, the Open Source 
Edition of Virtuoso, which shipped 2012-08-02 [2], and is far
behind the latest version of both VOS (v7.2.5+) and Enterprise 
Edition (v8.3+)!

The easiest way to confirm what you're running is to review 
the first "paragraph" of output from the command corresponding 
to the name of your Virtuoso binary --

   virtuoso-t -?

   virtuoso-iodbc-t -?

If I'm right, and you're running 6.x, you'll get much better
test results just by running a current version of Virtuoso.

You can build VOS 7.2.6+ from source [3] (we'd recommend the 
develop/7 branch [4] for the absolute latest), or download a 
precompiled binary [5] of VOS 7.2.5.1 or 7.2.6.dev.
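
If you go the source route, the steps are roughly as follows (a sketch 
only -- configure options vary by system; see [3] for the authoritative 
instructions):

   git clone https://github.com/openlink/virtuoso-opensource.git
   cd virtuoso-opensource
   git checkout develop/7    # the branch recommended above
   ./autogen.sh              # needs autoconf, automake, libtool
   ./configure               # add --prefix=... as desired
   make && sudo make install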

You can also try Enterprise Edition at no cost for 30 days [5].



[1] http://vos.openlinksw.com/owiki/wiki/VOS/VirtRDFPerformanceTuning

[2] 
http://vos.openlinksw.com/owiki/wiki/VOS/VOSNews2012#2012-08-02%20--%20Announcing%20Virtuoso%20Open-Source%20Edition%20v6.1.6.

[3] http://vos.openlinksw.com/owiki/wiki/VOS/VOSBuild

[4] https://github.com/openlink/virtuoso-opensource/tree/develop/7

[5] https://sourceforge.net/projects/virtuoso/files/virtuoso/ 





> Spoiler: the new code is always faster.
> 
> The hard disk is SATA, and the CPU is dubbed: Intel(R) Xeon(R) CPU
> E3-1220 V2 @ 3.10GHz
> 
> I imported latest-lexeme.nt (6GB) using guile-nomunofu, chez-nomunofu
> and Virtuoso:
> 
> - Chez takes 40 minutes to import 6GB
> - Chez is 3 to 5 times faster than Guile
> - Chez is 11% faster than Virtuoso


How did you load the data?  Did you use Virtuoso's bulk-load
facilities?  This is the recommended method [6].

[6] http://vos.openlinksw.com/owiki/wiki/VOS/VirtBulkRDFLoader
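
For reference, a minimal bulk-load session might look like the following 
(a sketch only: the data directory and graph IRI are placeholders, the isql 
port and credentials assume a default install, and the directory must be 
listed in DirsAllowed in the INI):

   isql 1111 dba dba <<'EOF'
   -- register matching files in that directory with the loader
   ld_dir('/data/dumps', 'latest-lexemes.nt', 'http://example.org/lexemes');
   -- run the loader; several sessions may run this in parallel
   rdf_loader_run();
   -- make the loaded data durable
   checkpoint;
   EOF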


> Regarding query time, Chez is still faster than Virtuoso with or
> without cache.  The query I am testing is the following:
> 
> SELECT ?s ?p ?o
> FROM 
> WHERE {
>  ?s   
> .
>  ?s 
>  .
>  ?s  ?o
> };
> 
> Virtuoso first query takes: 1295 msec.
> The second query takes: 331 msec.
> Then it stabilize around: 200 msec.
> 
> chez nomunofu takes around 200ms without cache.
> 
> There is still an optimization I can do to speed up nomunofu a little.
> 
> 
> Happy hacking!


I'll be interested to hear your new results, with a current build,
and with proper INI tuning in place.
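
(For concreteness, the tuning guide's starting point for a machine with 
roughly 32 GB RAM is below -- a sketch to scale to your own hardware, not a 
definitive setting. Buffers cache 8K pages, and the rule of thumb is to give 
them about two thirds of free RAM.)

   ; in the [Parameters] section of virtuoso.ini
   NumberOfBuffers = 2720000
   MaxDirtyBuffers = 2000000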

Regards,

Ted



--
A: Yes.  http://www.idallen.com/topposting.html
| Q: Are you sure?   
| | A: Because it reverses the logical flow of conversation.
| | | Q: Why is top posting frowned upon?

Ted Thibodeau, Jr.   //   voice +1-781-273-0900 x32
Senior Support & Evangelism  //mailto:tthibod...@openlinksw.com
 //  http://twitter.com/TallTed
OpenLink Software, Inc.  //  http://www.openlinksw.com/
 20 Burlington Mall Road, Suite 322, Burlington MA 01803
 Weblog-- http://www.openlinksw.com/blogs/
 Community -- https://community.openlinksw.com/
 LinkedIn  -- http://www.linkedin.com/company/openlink-software/
 Twitter   -- http://twitter.com/OpenLink
 Facebook  -- http://www.facebook.com/OpenLinkSoftware
Universal Data Access, Integration, and Management Technology Providers






smime.p7s
Description: S/MIME cryptographic signature
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata-bugs] [Maniphest] [Updated] T240862: Can't do shallow clone from phabricator

2019-12-22 Thread Legoktm
Legoktm added a project: Regression.
Legoktm added a comment.


  This is definitely a regression, as codesearch used to be able to do shallow 
clones fine.
  
  Is there an error log on the server for the HTTP 500 error?

TASK DETAIL
  https://phabricator.wikimedia.org/T240862

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Legoktm
Cc: Legoktm, mmodell, Addshore, Aklapper, Ladsgroup, darthmon_wmde, Jelabra, 
DannyS712, Nandana, Majesticalreaper22, Lahi, Gq86, GoranSMilovanovic, 
Jayprakash12345, QZanden, LawExplorer, JJMC89, _jensen, rosalieper, Scott_WUaS, 
Vedmaka, Wong128hk, Luke081515, Asahiko, Wikidata-bugs, aude, Mbch331, Jay8g, 
Krenair
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


Re: [Wikidata] Contd: [ANN] nomunofu v0.1.0

2019-12-22 Thread Amirouche Boubekki
On Sun, Dec 22, 2019 at 21:23, Kingsley Idehen wrote:
>
> On 12/22/19 4:17 PM, Kingsley Idehen wrote:
>
> On 12/22/19 3:17 PM, Amirouche Boubekki wrote:
>
> Hello all ;-)
>
>
> I ported the code to Chez Scheme to do an apple-to-apple comparison
> between GNU Guile and Chez and took the time to launch a few queries
> against Virtuoso available in Ubuntu 18.04 (LTS).
>
> Spoiler: the new code is always faster.
>
> The hard disk is SATA, and the CPU is dubbed: Intel(R) Xeon(R) CPU
> E3-1220 V2 @ 3.10GHz
>
> I imported latest-lexeme.nt (6GB) using guile-nomunofu, chez-nomunofu
> and Virtuoso:
>
> - Chez takes 40 minutes to import 6GB
> - Chez is 3 to 5 times faster than Guile
> - Chez is 11% faster than Virtuoso
>
> Regarding query time, Chez is still faster than Virtuoso with or
> without cache.  The query I am testing is the following:
>
> SELECT ?s ?p ?o
> FROM 
> WHERE {
>   ?s  
>  .
>   ?s 
>  .
>   ?s  ?o
> };
>
> Virtuoso first query takes: 1295 msec.
> The second query takes: 331 msec.
> Then it stabilize around: 200 msec.
>
> chez nomunofu takes around 200ms without cache.
>
> There is still an optimization I can do to speed up nomunofu a little.
>
>
> Happy hacking!
>
> If you are going to make claims about Virtuoso, please shed light on
> your Virtuoso configuration and host machine.
>
> How much memory do you have on this machine? What the CPU Affinity re
> CPUs available.

I did not set up CPU affinity... I am not sure what it is.

>
> Is there a URL for sample data used in your tests?

The sample data is
https://dumps.wikimedia.org/wikidatawiki/entities/latest-lexemes.nt.bz2

>
>
> Looking at 
> https://ark.intel.com/content/www/us/en/ark/products/65734/intel-xeon-processor-e3-1220-v2-8m-cache-3-10-ghz.html,
>  your Virtuoso INI settings are even more important due to the fact that we 
> have CPU Affinity of 4 in play i.e., you need configure Virtuoso such that it 
> optimizes behavior for this setup.
>

Thanks a lot for the input. My .ini file is empty. There are 4 CPUs/cores
on the machine I am using, with 32GB of RAM. What should I do?

Thanks in advance!

> --
> Regards,
>
> Kingsley Idehen
> Founder & CEO
> OpenLink Software
> Home Page: http://www.openlinksw.com
> Community Support: https://community.openlinksw.com
> Weblogs (Blogs):
> Company Blog: https://medium.com/openlink-software-blog
> Virtuoso Blog: https://medium.com/virtuoso-blog
> Data Access Drivers Blog: 
> https://medium.com/openlink-odbc-jdbc-ado-net-data-access-drivers
>
> Personal Weblogs (Blogs):
> Medium Blog: https://medium.com/@kidehen
> Legacy Blogs: http://www.openlinksw.com/blog/~kidehen/
>   http://kidehen.blogspot.com
>
> Profile Pages:
> Pinterest: https://www.pinterest.com/kidehen/
> Quora: https://www.quora.com/profile/Kingsley-Uyi-Idehen
> Twitter: https://twitter.com/kidehen
> Google+: https://plus.google.com/+KingsleyIdehen/about
> LinkedIn: http://www.linkedin.com/in/kidehen
>
> Web Identities (WebID):
> Personal: http://kingsley.idehen.net/public_home/kidehen/profile.ttl#i
> : 
> http://id.myopenlink.net/DAV/home/KingsleyUyiIdehen/Public/kingsley.ttl#this
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata



-- 
Amirouche ~ https://hyper.dev

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata] Contd: [ANN] nomunofu v0.1.0

2019-12-22 Thread Kingsley Idehen
On 12/22/19 4:17 PM, Kingsley Idehen wrote:
> On 12/22/19 3:17 PM, Amirouche Boubekki wrote:
>> Hello all ;-)
>>
>>
>> I ported the code to Chez Scheme to do an apple-to-apple comparison
>> between GNU Guile and Chez and took the time to launch a few queries
>> against Virtuoso available in Ubuntu 18.04 (LTS).
>>
>> Spoiler: the new code is always faster.
>>
>> The hard disk is SATA, and the CPU is dubbed: Intel(R) Xeon(R) CPU
>> E3-1220 V2 @ 3.10GHz
>>
>> I imported latest-lexeme.nt (6GB) using guile-nomunofu, chez-nomunofu
>> and Virtuoso:
>>
>> - Chez takes 40 minutes to import 6GB
>> - Chez is 3 to 5 times faster than Guile
>> - Chez is 11% faster than Virtuoso
>>
>> Regarding query time, Chez is still faster than Virtuoso with or
>> without cache.  The query I am testing is the following:
>>
>> SELECT ?s ?p ?o
>> FROM 
>> WHERE {
>>   ?s  
>>  .
>>   ?s 
>>  .
>>   ?s  ?o
>> };
>>
>> Virtuoso first query takes: 1295 msec.
>> The second query takes: 331 msec.
>> Then it stabilize around: 200 msec.
>>
>> chez nomunofu takes around 200ms without cache.
>>
>> There is still an optimization I can do to speed up nomunofu a little.
>>
>>
>> Happy hacking!
> If you are going to make claims about Virtuoso, please shed light on
> your Virtuoso configuration and host machine.
>
> How much memory do you have on this machine? What the CPU Affinity re
> CPUs available.
>
> Is there a URL for sample data used in your tests?


Looking at
https://ark.intel.com/content/www/us/en/ark/products/65734/intel-xeon-processor-e3-1220-v2-8m-cache-3-10-ghz.html,
your Virtuoso INI settings are even more important, due to the fact that
we have a CPU affinity of 4 in play; i.e., you need to configure Virtuoso
such that it optimizes behavior for this setup.



-- 
Regards,

Kingsley Idehen   
Founder & CEO 
OpenLink Software   
Home Page: http://www.openlinksw.com
Community Support: https://community.openlinksw.com
Weblogs (Blogs):
Company Blog: https://medium.com/openlink-software-blog
Virtuoso Blog: https://medium.com/virtuoso-blog
Data Access Drivers Blog: 
https://medium.com/openlink-odbc-jdbc-ado-net-data-access-drivers

Personal Weblogs (Blogs):
Medium Blog: https://medium.com/@kidehen
Legacy Blogs: http://www.openlinksw.com/blog/~kidehen/
  http://kidehen.blogspot.com

Profile Pages:
Pinterest: https://www.pinterest.com/kidehen/
Quora: https://www.quora.com/profile/Kingsley-Uyi-Idehen
Twitter: https://twitter.com/kidehen
Google+: https://plus.google.com/+KingsleyIdehen/about
LinkedIn: http://www.linkedin.com/in/kidehen

Web Identities (WebID):
Personal: http://kingsley.idehen.net/public_home/kidehen/profile.ttl#i
: 
http://id.myopenlink.net/DAV/home/KingsleyUyiIdehen/Public/kingsley.ttl#this



smime.p7s
Description: S/MIME Cryptographic Signature
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] [ANN] nomunofu v0.1.0

2019-12-22 Thread Kingsley Idehen
On 12/22/19 3:17 PM, Amirouche Boubekki wrote:
> Hello all ;-)
>
>
> I ported the code to Chez Scheme to do an apple-to-apple comparison
> between GNU Guile and Chez and took the time to launch a few queries
> against Virtuoso available in Ubuntu 18.04 (LTS).
>
> Spoiler: the new code is always faster.
>
> The hard disk is SATA, and the CPU is dubbed: Intel(R) Xeon(R) CPU
> E3-1220 V2 @ 3.10GHz
>
> I imported latest-lexeme.nt (6GB) using guile-nomunofu, chez-nomunofu
> and Virtuoso:
>
> - Chez takes 40 minutes to import 6GB
> - Chez is 3 to 5 times faster than Guile
> - Chez is 11% faster than Virtuoso
>
> Regarding query time, Chez is still faster than Virtuoso with or
> without cache.  The query I am testing is the following:
>
> SELECT ?s ?p ?o
> FROM 
> WHERE {
>   ?s  
>  .
>   ?s 
>  .
>   ?s  ?o
> };
>
> Virtuoso first query takes: 1295 msec.
> The second query takes: 331 msec.
> Then it stabilize around: 200 msec.
>
> chez nomunofu takes around 200ms without cache.
>
> There is still an optimization I can do to speed up nomunofu a little.
>
>
> Happy hacking!


If you are going to make claims about Virtuoso, please shed light on
your Virtuoso configuration and host machine.

How much memory do you have on this machine? What is the CPU affinity re
the available CPUs?

Is there a URL for sample data used in your tests?

-- 

Regards,

Kingsley Idehen   
Founder & CEO 
OpenLink Software   
Home Page: http://www.openlinksw.com
Community Support: https://community.openlinksw.com
Weblogs (Blogs):
Company Blog: https://medium.com/openlink-software-blog
Virtuoso Blog: https://medium.com/virtuoso-blog
Data Access Drivers Blog: 
https://medium.com/openlink-odbc-jdbc-ado-net-data-access-drivers

Personal Weblogs (Blogs):
Medium Blog: https://medium.com/@kidehen
Legacy Blogs: http://www.openlinksw.com/blog/~kidehen/
  http://kidehen.blogspot.com

Profile Pages:
Pinterest: https://www.pinterest.com/kidehen/
Quora: https://www.quora.com/profile/Kingsley-Uyi-Idehen
Twitter: https://twitter.com/kidehen
Google+: https://plus.google.com/+KingsleyIdehen/about
LinkedIn: http://www.linkedin.com/in/kidehen

Web Identities (WebID):
Personal: http://kingsley.idehen.net/public_home/kidehen/profile.ttl#i
: 
http://id.myopenlink.net/DAV/home/KingsleyUyiIdehen/Public/kingsley.ttl#this




smime.p7s
Description: S/MIME Cryptographic Signature
___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


Re: [Wikidata] [ANN] nomunofu v0.1.0

2019-12-22 Thread Amirouche Boubekki
Hello all ;-)


I ported the code to Chez Scheme to do an apples-to-apples comparison
between GNU Guile and Chez, and took the time to launch a few queries
against the Virtuoso available in Ubuntu 18.04 (LTS).

Spoiler: the new code is always faster.

The hard disk is SATA, and the CPU is dubbed: Intel(R) Xeon(R) CPU
E3-1220 V2 @ 3.10GHz

I imported latest-lexemes.nt (6GB) using guile-nomunofu, chez-nomunofu
and Virtuoso:

- Chez takes 40 minutes to import 6GB
- Chez is 3 to 5 times faster than Guile
- Chez is 11% faster than Virtuoso

Regarding query time, Chez is still faster than Virtuoso with or
without cache.  The query I am testing is the following:

SELECT ?s ?p ?o
FROM 
WHERE {
  ?s   .
  ?s 
 .
  ?s  ?o
};

Virtuoso's first query takes 1295 msec.
The second query takes 331 msec.
Then it stabilizes around 200 msec.

Chez nomunofu takes around 200 msec without cache.

There is still an optimization I can do to speed up nomunofu a little.


Happy hacking!

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata-bugs] [Maniphest] [Updated] T241299: Edit summaries are broken

2019-12-22 Thread Lydia_Pintscher
Lydia_Pintscher edited projects, added Wikidata-Campsite 
(Wikidata-Campsite-Iteration-∞); removed Wikidata-Campsite.

TASK DETAIL
  https://phabricator.wikimedia.org/T241299

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Lydia_Pintscher
Cc: Ricordisamoa, Ammarpad, Lucas_Werkmeister_WMDE, Lydia_Pintscher, 
Jakob_WMDE, Addshore, Aklapper, Ladsgroup, Hook696, Daryl-TTMG, RomaAmorRoma, 
0010318400, E.S.A-Sheild, Iflorez, darthmon_wmde, alaa_wmde, Meekrab2012, 
joker88john, CucyNoiD, Nandana, NebulousIris, Gaboe420, Versusxo, 
Majesticalreaper22, Giuliamocci, Adrian1985, Cpaulf30, Lahi, Gq86, Af420, 
Darkminds3113, Bsandipan, Lordiis, GoranSMilovanovic, Adik2382, Th3d3v1ls, 
Ramalepe, Liugev6, QZanden, LawExplorer, WSH1906, Lewizho99, Maathavan, 
_jensen, rosalieper, Scott_WUaS, Jonas, Wikidata-bugs, aude, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Claimed] T229982: SuggesterParamsParser: provide either entity-id parameter 'entity' or a list of properties 'properties'

2019-12-22 Thread Ladsgroup
Ladsgroup claimed this task.
Ladsgroup edited projects, added Wikidata-Campsite 
(Wikidata-Campsite-Iteration-∞); removed Wikidata-Campsite.
Restricted Application added a project: User-Ladsgroup.

TASK DETAIL
  https://phabricator.wikimedia.org/T229982

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Ladsgroup
Cc: Krinkle, Aklapper, brennen, Hook696, Daryl-TTMG, RomaAmorRoma, 0010318400, 
E.S.A-Sheild, Iflorez, darthmon_wmde, alaa_wmde, Meekrab2012, joker88john, 
CucyNoiD, Nandana, NebulousIris, Gaboe420, Versusxo, Majesticalreaper22, 
Giuliamocci, Adrian1985, Cpaulf30, Lahi, Gq86, Af420, Darkminds3113, Bsandipan, 
Lordiis, GoranSMilovanovic, Adik2382, Th3d3v1ls, Ramalepe, Liugev6, QZanden, 
LawExplorer, WSH1906, Lewizho99, Maathavan, _jensen, rosalieper, Scott_WUaS, 
Jonas, Wikidata-bugs, aude, Lydia_Pintscher, Sjoerddebruin, Jdforrester-WMF, 
Mbch331, Rxy, Jay8g, Krenair
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Updated] T229982: SuggesterParamsParser: provide either entity-id parameter 'entity' or a list of properties 'properties'

2019-12-22 Thread gerritbot
gerritbot added a project: Patch-For-Review.

TASK DETAIL
  https://phabricator.wikimedia.org/T229982

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: gerritbot
Cc: Krinkle, Aklapper, brennen, Hook696, Daryl-TTMG, RomaAmorRoma, 0010318400, 
E.S.A-Sheild, darthmon_wmde, Meekrab2012, joker88john, CucyNoiD, Nandana, 
NebulousIris, Gaboe420, Versusxo, Majesticalreaper22, Giuliamocci, Adrian1985, 
Cpaulf30, Lahi, Gq86, Af420, Darkminds3113, Bsandipan, Lordiis, 
GoranSMilovanovic, Adik2382, Th3d3v1ls, Ramalepe, Liugev6, QZanden, 
LawExplorer, WSH1906, Lewizho99, Maathavan, _jensen, rosalieper, Scott_WUaS, 
Jonas, Wikidata-bugs, aude, Lydia_Pintscher, Sjoerddebruin, Jdforrester-WMF, 
Mbch331, Rxy, Jay8g, Krenair
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Commented On] T229982: SuggesterParamsParser: provide either entity-id parameter 'entity' or a list of properties 'properties'

2019-12-22 Thread gerritbot
gerritbot added a comment.


  Change 560209 had a related patch set uploaded (by Ladsgroup; owner: 
Ladsgroup):
  [mediawiki/extensions/PropertySuggester@master] Catch bad params and throw 
error properly
  
  https://gerrit.wikimedia.org/r/560209

TASK DETAIL
  https://phabricator.wikimedia.org/T229982

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: gerritbot
Cc: Krinkle, Aklapper, brennen, darthmon_wmde, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, 
Jonas, Wikidata-bugs, aude, Lydia_Pintscher, Sjoerddebruin, Jdforrester-WMF, 
Mbch331, Rxy, Jay8g, Krenair
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Commented On] T240694: "OutOfRangeException: Latitude needs to be between -360 and 360" from action=wbparsevalue API

2019-12-22 Thread Ladsgroup
Ladsgroup added a comment.


  This will get fixed after upgrading to geo 4.2.1.

TASK DETAIL
  https://phabricator.wikimedia.org/T240694

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Ladsgroup
Cc: Ladsgroup, Aklapper, Michael, darthmon_wmde, Nandana, Lahi, Gq86, 
GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, Scott_WUaS, 
Jonas, Wikidata-bugs, aude, Lydia_Pintscher, Jdforrester-WMF, Mbch331, Rxy, 
Jay8g, Krenair
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Commented On] T46581: Partial dumps

2019-12-22 Thread Ladsgroup
Ladsgroup added a comment.


  I might be wrong, but I feel there's large demand for a couple of types of 
dumps, plus a long tail that we can't afford to support. For example, having a 
dump of all humans (no pun intended) is very, very useful (even I need it for 
one of my tools), and there might be requests for dumps that can be easily 
handled through WDQS plus a scraper; but, for example, just getting the list 
of all humans times out in WDQS (understandably).
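  
  (As a rough illustration, such a filter can already be run offline against 
the JSON dumps -- a sketch assuming their one-entity-per-line layout; file 
names are placeholders:)
  
    # strip the array brackets and trailing commas so jq sees one
    # JSON object per line, then keep entities with P31 = Q5 (human)
    bzcat latest-all.json.bz2 \
      | sed -e 's/,$//' -e '/^[][]$/d' \
      | jq -c 'select([.claims.P31[]?.mainsnak.datavalue.value.id] | index("Q5"))' \
      > humans.ndjson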

TASK DETAIL
  https://phabricator.wikimedia.org/T46581

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Ladsgroup
Cc: Ladsgroup, Aklapper, Nikki, abian, Bugreporter, Lucie, PokestarFan, hoo, 
JanZerebecki, jkroll, Wikidata-bugs, Denny, Lydia_Pintscher, daniel, 
darthmon_wmde, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, 
_jensen, rosalieper, Scott_WUaS, aude, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Commented On] T241211: Identify holes in the items terms tables for prod as of 17:00 on the 19th December for rebuilding

2019-12-22 Thread Ladsgroup
Ladsgroup added a comment.


  Given that almost a day passed between the time the bug got fixed and when 
the sqoop happened, the maintenance script got to fix up to around Q62 million, 
and there's still 17% inconsistency there. It's way better than ~60%, though.

TASK DETAIL
  https://phabricator.wikimedia.org/T241211

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Ladsgroup
Cc: Ladsgroup, Addshore, Aklapper, Iflorez, darthmon_wmde, alaa_wmde, Nandana, 
Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, rosalieper, 
Scott_WUaS, Jonas, Wikidata-bugs, aude, Lydia_Pintscher, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Commented On] T46581: Partial dumps

2019-12-22 Thread daniel
daniel added a comment.


  Filtering dumps by area of interest is convenient, if a good criterion can be 
found to identify items relevant to the topic. It would probably make sense to 
also include any items directly referenced, to provide the immediate context of 
the items.
  
  However, for this to be useful, a great many such specialized "area of 
interest" dumps would have to exist, with substantial overlap. If WMF can 
afford that in terms of resources, it would sure be nice to have.
  
  But perhaps there is a different way to slice this: create a stub dump that 
filters out most of the statements (and sitelinks?), providing labels, 
descriptions, and aliases, plus instance-of and subclass.
  
  Another approach would be to focus on structure rather than topic: e.g. 
export all items that have (or are the subject of) a parent taxon property, 
and include only terms and maybe a very limited set of properties. Similarly, 
dumps that contain the geographical inclusion structure, the genealogical 
structure, or a historical timeline may be useful.
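  
  (A back-of-the-envelope sketch of that stub filter over the JSON dumps, 
assuming their one-entity-per-line layout; file names are placeholders:)
  
    # keep terms plus the instance-of (P31) and subclass-of (P279) claims
    bzcat latest-all.json.bz2 \
      | sed -e 's/,$//' -e '/^[][]$/d' \
      | jq -c '{id, labels, descriptions, aliases,
                claims: (.claims | {P31, P279} | with_entries(select(.value != null)))}' \
      > stubs.ndjson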

TASK DETAIL
  https://phabricator.wikimedia.org/T46581

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: daniel
Cc: Ladsgroup, Aklapper, Nikki, abian, Bugreporter, Lucie, PokestarFan, hoo, 
JanZerebecki, jkroll, Wikidata-bugs, Denny, Lydia_Pintscher, daniel, 
darthmon_wmde, Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, 
_jensen, rosalieper, Scott_WUaS, aude, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Updated] T157014: CONSULTATION/PLAN: Managing Complex State and GUI on MediaWiki (e.g. for Wikidata/Wikibase UI)

2019-12-22 Thread Bugreporter
Bugreporter added a parent task: T54136: [Epic] Redesign Item UI for Wikidata 
repo.

TASK DETAIL
  https://phabricator.wikimedia.org/T157014

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Bugreporter
Cc: Cavila, Addshore, PokestarFan, Afnecors, Kiailandi, Rfarrand, Jakob_WMDE, 
Mooeypoo, Catrope, dr0ptp4kt, GWicke, Aleksey_WMDE, daniel, Niedzielski, 
Magnus, Milimetric, Prtksxna, Fjalapeno, phuedx, Jdlrobson, Capt_Swing, TheDJ, 
Jdforrester-WMF, SBisson, WMDE-leszek, Volker_E, Krinkle, gabriel-wmde, Jonas, 
thiemowmde, Lydia_Pintscher, Jan_Dittrich, Jhernandez, Jdrewniak, Aklapper, 
Antti.Kekki, darthmon_wmde, Nandana, A.S.Kochergin, Lahi, Gq86, 
GoranSMilovanovic, Chicocvenancio, QZanden, Orienteerix, LawExplorer, 
Flycatchr, Puik, _jensen, rosalieper, JGirault, D3r1ck01, Envlh, Scott_WUaS, 
Susannaanas, Wikidata-bugs, aude, Tobias1984, Nikerabbit, Mbch331, Jay8g
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


[Wikidata-bugs] [Maniphest] [Updated] T54136: [Epic] Redesign Item UI for Wikidata repo

2019-12-22 Thread Bugreporter
Bugreporter added a subtask: T157014: CONSULTATION/PLAN: Managing Complex State 
and GUI on MediaWiki (e.g. for Wikidata/Wikibase UI).

TASK DETAIL
  https://phabricator.wikimedia.org/T54136

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Bugreporter
Cc: Danwe, Charlie_WMDE, Ash_Crow, Agabi10, Aklapper, MGChecker, Bene, 
Nemo_bis, Yair_rand, thiemowmde, He7d3r, jayvdb, Denny, 
iecetcwcpggwqpgciazwvzpfjpwomjxn, adrianheine, aude, SJu, Snaterlicious, 
Ricordisamoa, Yamaha5, Lydia_Pintscher, Danmichaelo, darthmon_wmde, Nickleh, 
Nandana, Lahi, Gq86, GoranSMilovanovic, QZanden, LawExplorer, _jensen, 
rosalieper, Scott_WUaS, Wikidata-bugs, Dinoguy1000, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs


Re: [Wikidata] Concise/Notable Wikidata Dump

2019-12-22 Thread Amirouche Boubekki
Hello all!

On Tue, Dec 17, 2019 at 18:15, Aidan Hogan wrote:
>
> Hey all,
>
> As someone who likes to use Wikidata in their research, and likes to
> give students projects relating to Wikidata, I am finding it more and
> more difficult to (recommend to) work with recent versions of Wikidata
> due to the increasing dump sizes, where even the truthy version now
> costs considerable time and machine resources to process and handle.

Maybe that is a software problem? What tools do you use to process the dump?

> More generally, I think the growing data volumes might inadvertently
> scare people off taking the dumps and using them in their research.
>
> One idea we had recently to reduce the data size for a student project
> while keeping the most notable parts of Wikidata was to only keep claims
> that involve an item linked to Wikipedia; in other words, if the
> statement involves a Q item (in the "subject" or "object") not linked to
> Wikipedia, the statement is removed.

A similar scheme would be to only keep concepts that are part of the
Wikipedia vital articles [0] and their neighbours (to be defined).

[0] https://en.wikipedia.org/wiki/Wikipedia:Vital_articles/Level/5

Related to Wikipedia vital articles (of which I only know the English
version): the problem is that the vital article lists are not available in
structured format. A few months back I made a proposal to add that
information to Wikidata, but I had no feedback. There is
https://www.wikidata.org/wiki/Q43375360. Not sure where to go from there.
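
For the "linked to Wikipedia" idea quoted above, the first pass is easy to
sketch over the JSON dumps (one entity per line assumed; file names are
placeholders):

    # pass 1: collect the IDs of entities that have at least one sitelink
    bzcat latest-all.json.bz2 \
      | sed -e 's/,$//' -e '/^[][]$/d' \
      | jq -r 'select((.sitelinks // {}) | length > 0) | .id' \
      > linked-ids.txt
    # pass 2 (not shown) would drop statements whose object Q-ids are
    # not in linked-ids.txt; that join is easier in a small script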

> I wonder would it be possible for Wikidata to provide such a dump to
> download (e.g., in RDF) for people who prefer to work with a more
> concise sub-graph that still maintains the most "notable" parts?

The best thing would be to allow people to curate their own "vital Wikidata
concepts" lists, similar to the custom Wikipedia vital lists, taking
inspiration from the tool that was released recently.

> While
> of course one could compute this from the full-dump locally, making such
> a version available as a dump directly would save clients some
> resources, potentially encourage more research using/on Wikidata, and
> having such a version "rubber-stamped" by Wikidata would also help to
> justify the use of such a dataset for research purposes.

I agree.

> ... just an idea I thought I would float out there. Perhaps there is
> another (better) way to define a concise dump.
>
> Best,
> Aidan
>
> ___
> Wikidata mailing list
> Wikidata@lists.wikimedia.org
> https://lists.wikimedia.org/mailman/listinfo/wikidata



-- 
Amirouche ~ https://hyper.dev

___
Wikidata mailing list
Wikidata@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata


[Wikidata-bugs] [Maniphest] [Commented On] T194194: Add possibility to check constraints on unsaved statements

2019-12-22 Thread Pasleim
Pasleim added a comment.


  I've written a new service to check constraints on unsaved statements: 
https://tools.wmflabs.org/plnode/
  If a bot or tool operator plans to use it, please inform me.

TASK DETAIL
  https://phabricator.wikimedia.org/T194194

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Pasleim
Cc: abian, Jc86035, Aklapper, Agabi10, Pasleim, Lucas_Werkmeister_WMDE, 
Pintoch, darthmon_wmde, Nandana, Lahi, Gq86, GoranSMilovanovic, lisong, 
QZanden, merbst, LawExplorer, Culex, _jensen, rosalieper, Scott_WUaS, 
Wikidata-bugs, aude, Lydia_Pintscher, Mbch331
___
Wikidata-bugs mailing list
Wikidata-bugs@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs