[Wikidata-bugs] [Maniphest] T78688: [Story] Detach source references from Statements
ArthurPSmith added a subtask: T360224: Improve Wikidata handling of duplicate references in model and UI. TASK DETAIL https://phabricator.wikimedia.org/T78688 EMAIL PREFERENCES https://phabricator.wikimedia.org/settings/panel/emailpreferences/ To: ArthurPSmith Cc: Cirdan, Pintoch, Teslaton, ChristianKl, Ricordisamoa, Aklapper, daniel, Danny_Benjafield_WMDE, Astuthiodit_1, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, Nandana, lucamauri, Lahi, Gq86, GoranSMilovanovic, QZanden, KimKelting, LawExplorer, _jensen, rosalieper, Scott_WUaS, Wikidata-bugs, aude, Mbch331 ___ Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org
[Wikidata-bugs] [Maniphest] T194305: Track the number of (unique) references on an item in page_props
ArthurPSmith added a subtask: T360224: Improve Wikidata handling of duplicate references in model and UI.
[Wikidata-bugs] [Maniphest] T224333: It's possible to save a statement with duplicate references
ArthurPSmith added a subtask: T360224: Improve Wikidata handling of duplicate references in model and UI.
[Wikidata-bugs] [Maniphest] T360224: Improve Wikidata handling of duplicate references in model and UI
ArthurPSmith added parent tasks: T78688: [Story] Detach source references from Statements, T270375: Saving identical references with different retrieval dates should be more difficult, T224333: It's possible to save a statement with duplicate references, T194305: Track the number of (unique) references on an item in page_props.
[Wikidata-bugs] [Maniphest] T270375: Saving identical references with different retrieval dates should be more difficult
ArthurPSmith added a subtask: T360224: Improve Wikidata handling of duplicate references in model and UI.
[Wikidata-bugs] [Maniphest] T360224: Improve Wikidata handling of duplicate references in model and UI
ArthurPSmith created this task. ArthurPSmith added projects: MediaWiki-extensions-WikibaseRepository, Wikidata, MediaWiki-extensions-WikibaseClient.

TASK DESCRIPTION

**Feature summary** (what you would like to be able to do and where): See https://www.wikidata.org/wiki/Wikidata:Requests_for_comment/Duplicate_References_Data_Model_and_UI

1. Condense internal JSON storage for duplicate references
2. Modify the Wikidata UI for editing duplicated references

**Use case(s)** (list the steps that you performed to discover that problem, and describe the actual underlying problem which you want to solve. Do not describe only a solution): As an example, see Q21481859 in Wikidata, which has almost 3000 authors who (should) all have the same reference; the duplicated reference data accounts for over 1 MB of the item's 4.4 MB size. Wikidata items have a maximum JSON size of about 4.4 MB, so the reference duplication has made this and similar items nearly uneditable. See also the comments on the Wikidata RFC - the DuplicateReferences gadget and the "UseAsRef" script are widely used.

**Benefits** (why should this be implemented?): First, this would significantly reduce the size of many large Wikidata items, making them more usable and editable. Second, it would enable a number of UI changes to improve the experience of adding and maintaining references in Wikidata. I will also link some related tasks that may be resolved through this work.
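The "condense internal JSON storage" idea can be illustrated with a small sketch. This is not the RFC's actual proposal - the property keys and hashing scheme here are illustrative: store each distinct reference once in a shared table keyed by a content hash, and let statements point at the hash.

```python
import hashlib
import json

def reference_hash(reference):
    """Stable short hash of a reference's content (key-order independent)."""
    blob = json.dumps(reference, sort_keys=True).encode("utf-8")
    return hashlib.sha1(blob).hexdigest()[:8]

def condense(statements):
    """Replace inline references with hashes into a shared reference table."""
    table = {}
    condensed = []
    for stmt in statements:
        hashes = []
        for ref in stmt.get("references", []):
            h = reference_hash(ref)
            table[h] = ref  # identical references collapse to one table entry
            hashes.append(h)
        condensed.append({**stmt, "references": hashes})
    return condensed, table

# Toy version of the Q21481859 situation: many statements, one shared source.
# The property/item IDs in the reference are arbitrary placeholders.
shared_ref = {"P248": "Q180686", "P813": "+2024-03-18T00:00:00Z"}
statements = [{"P50": "Q%d" % i, "references": [shared_ref]} for i in range(3000)]
condensed, table = condense(statements)
print(len(table))  # the 3000 duplicate copies are stored once
```

In this toy case the per-statement cost drops from a full reference object to an 8-character hash, which is roughly the kind of saving that would shrink a 4.4 MB item dominated by duplicated references.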
[Wikidata-bugs] [Maniphest] T356773: [tracking] Community feedback for the WDQS Split the Graph project
ArthurPSmith added a comment. Ok, I got federation to work - sort of. From the main query service I can query the scholarly subgraph, but if I try to use the resulting values I always get a timeout.

SELECT ?author WHERE {
  SERVICE <https://query-scholarly-experimental.wikidata.org/sparql> {
    wd:Q56977964 wdt:P50 ?author .
  }
}

works fine, but even

SELECT ?author ?b ?c WHERE {
  SERVICE <https://query-scholarly-experimental.wikidata.org/sparql> {
    wd:Q56977964 wdt:P50 ?author .
  }
  ?author ?b ?c .
} LIMIT 1

times out. What's going on here?
[Wikidata-bugs] [Maniphest] T356773: [tracking] Community feedback for the WDQS Split the Graph project
ArthurPSmith added a comment. Hi - how does the federation work? I'm experimenting with this by trying to get the list of names of authors on a scholarly article - the article data itself is in the scholarly-article subgraph, but the human items for the authors are in the main one. So I need to do a federated query, but it's not clear how. Can you provide an example? Do I start on the main graph and federate to the scholarly one, or vice versa?
[Wikidata-bugs] [Maniphest] T56097: [Story] allow to select globe in the UI
ArthurPSmith added a comment. It certainly would be good to get this fixed. However, I think this points up a fundamental problem with some of the more complex data structures supported by Wikidata (quantity ranges are a similar case, and probably some of the lexeme structures as well). Namely, this COULD have been implemented as just a qualifier on coordinate location (P625) - either using P376 or something more directly appropriate. But instead it was implemented as an internal structure within Wikidata that requires direct coding within Wikibase to properly support, and still, after many years, it is not supported in the Wikidata UI. I think a better long-term solution here is to plan to sunset these more complex data structures that could simply be handled with properties/qualifiers. One way or another this NEEDS to be fixed though!
[Wikidata-bugs] [Maniphest] T298405: Wikidata/Wikibase Entity Draft Namespace
ArthurPSmith added a comment. I see the merit of this idea, at least for properties, but I'm wondering where you envision the property discussion taking place. On the talk page? Would that be preserved somehow? (Referring back to old proposal discussions is done very often.)
[Wikidata-bugs] [Maniphest] T295275: Dedicated section on Wikidata Item and Property pages for classifying Properties
ArthurPSmith added a comment. Good points from @MisterSynergy and others above. One other case I often run into is problems caused by item merges; if both original items had P279 statements, this can cause significant trouble (for example, it is a common source of subclass loops).
[Wikidata-bugs] [Maniphest] T295275: Dedicated section on Wikidata Item and Property pages for classifying Properties
ArthurPSmith added a comment. Hmm - I agree with the above that P2860 should not be on this list. If we are including the "partitive" properties like P361 and P527 (taxonomic in the sense that they group parts of something with the whole), what about P355 (subsidiary) and P749 (parent organization), which are used that way for organizations, or other properties of that sort? On the other hand, your list does not include the truly taxonomic property P171 (parent taxon), which is explicitly a subproperty of P279. P10019 (term in higher taxon) seems to also be a subproperty of P279.
[Wikidata-bugs] [Maniphest] T278161: Allow case-sensitive or lexical-category-restrictive lexeme search
ArthurPSmith added a comment. Hmm, another strange case is a search for L:Kelly - the 3 current matches are L404650, L361948 and L230178, none of which seem to contain the string "kelly". So there's some sort of stemming going on here in addition to the case insensitivity, which it would be nice to be able to avoid if possible.
[Wikidata-bugs] [Maniphest] T278162: gloss text entry box is too short and hard to edit
ArthurPSmith created this task. ArthurPSmith added a project: Wikidata Lexicographical data. TASK DESCRIPTION I know we want to keep the glosses short, but the box right now is too short (at least in the desktop Safari browser on a Mac). I get about 25 characters right now; at least 80 would be nice. Also, inline editing of longer glosses is currently tricky - I can't use the mouse to select text to delete or replace if it is toward the end of the gloss, because as soon as I click the mouse the view resets to the first characters of the gloss. Just making the box a bit bigger would help a lot, I think, though the editor might need some adjusting too.
[Wikidata-bugs] [Maniphest] T278161: Allow case-sensitive or lexical-category-restrictive lexeme search
ArthurPSmith created this task. ArthurPSmith added a project: Wikidata Lexicographical data. TASK DESCRIPTION Wikidata search for L:Anna matches many lexemes (due to their forms containing "anna" or "Anna"), but it would be nice to match only the "proper noun" ("Anna") versions with an upper-case first letter. That could be handled either by case sensitivity in the search or by limiting the match to lexemes with lexical category "proper noun" (Q147276).
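Until the search backend supports either option, a client could over-fetch and post-filter; here is a minimal sketch of just the filtering logic. The result-dict shape (`lemma`, `category` keys) is an assumption for illustration, not the actual search API response format.

```python
PROPER_NOUN = "Q147276"  # lexical category "proper noun"

def filter_lexemes(results, query, case_sensitive=True, category=None):
    """Post-filter lexeme search results, given dicts with 'lemma' and 'category'."""
    kept = []
    for r in results:
        if case_sensitive and query not in r["lemma"]:
            continue  # drop hits that only matched case-insensitively (or via stemming)
        if category is not None and r["category"] != category:
            continue
        kept.append(r)
    return kept

# Two hits a case-insensitive search for "Anna" might return:
results = [
    {"lemma": "Anna", "category": PROPER_NOUN},
    {"lemma": "anna", "category": "Q1084"},  # lower-case common noun
]
print(len(filter_lexemes(results, "Anna", category=PROPER_NOUN)))  # 1
```

Either constraint alone (exact-case substring, or lexical category) would exclude the lower-case form in this example; applying both is the strictest interpretation of the request.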
[Wikidata-bugs] [Maniphest] T273221: Measure and indicate Lexeme language completeness, and prompt editors with what more might need doing
ArthurPSmith moved this task from Backlog to In progress on the Wikidata-Lexicodays-2021 board. ArthurPSmith added a comment. Denny's posted this notebook: https://public.paws.wmcloud.org/User:DVrandecic_(WMF)/Lexicographic%20coverage.ipynb which does pretty much the above for the language Wikipedia corpora. Results are at https://www.wikidata.org/wiki/Wikidata:Lexicographical_coverage However, I don't think he wants to keep running it, so can we move it somewhere where it will be run regularly by a bot or something? The coverage/completeness data is helpful, and the "missing" page for each language is a great guide for editors, if it can be kept reasonably up to date.
[Wikidata-bugs] [Maniphest] T243701: Wikidata maxlag repeatedly over 5s since Jan 20, 2020 (primarily caused by the query service)
ArthurPSmith added a comment. Something different seems to be going on very recently - did something change on the infrastructure side, or has the usage pattern changed in the last few hours? Basically, maxlag (WDQS lag specifically) has NOT gone below 5 (5 minutes for WDQS) for more than an hour. As far as I can tell, this hasn't happened for many days, perhaps weeks or months. Typically maxlag recovers about 20-30 minutes after bots stop editing, sometimes taking almost an hour, but this is the longest delay in a long time. Specifically, around 2020-07-21 14:04 the lag went over 5, and as of 15:18 it has grown to over 16.
[Wikidata-bugs] [Maniphest] [Updated] T249687: gadget to add external ID as reference
ArthurPSmith added a comment. Thanks for creating this! I'm not sure what the standard citation reference for an external ID is, but what I've been using is:

- stated in (P248): the value of "subject item of this property" (P1629) for that external ID property, if any
- the external ID property itself, with the value from the item
- retrieved (P813): the current date

So it would be nice if this gadget could add these three statements (or two, if there is no P1629 value) as a reference with a simple interaction...
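A sketch of the snaks object such a gadget might assemble for the Wikibase `wbsetreference` API. The overall snak shape follows the Wikibase JSON format, but treat the details (and the example property/item IDs in the test usage) as an approximation rather than a drop-in payload.

```python
from datetime import date

def build_reference_snaks(ext_id_prop, ext_id_value, stated_in=None, retrieved=None):
    """Assemble reference snaks: stated in (P248), the external ID, retrieved (P813)."""
    snaks = {}
    if stated_in:  # the P1629 value of the external-ID property, if any
        snaks["P248"] = [{
            "snaktype": "value", "property": "P248",
            "datavalue": {"type": "wikibase-entityid",
                          "value": {"entity-type": "item", "id": stated_in}},
        }]
    snaks[ext_id_prop] = [{
        "snaktype": "value", "property": ext_id_prop,
        "datavalue": {"type": "string", "value": ext_id_value},
    }]
    when = (retrieved or date.today()).strftime("+%Y-%m-%dT00:00:00Z")
    snaks["P813"] = [{
        "snaktype": "value", "property": "P813",
        "datavalue": {"type": "time", "value": {
            "time": when, "precision": 11, "timezone": 0, "before": 0, "after": 0,
            "calendarmodel": "http://www.wikidata.org/entity/Q1985727",
        }},
    }]
    return snaks  # three snaks, or two if no stated-in item is known
```

The gadget would then submit this (JSON-encoded) as the `snaks` parameter of `action=wbsetreference` for the statement's GUID.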
[Wikidata-bugs] [Maniphest] [Unblock] T150939: Replace https://tools.wmflabs.org/wikidata-externalid-url by providing improved handling for external id formatter urls
ArthurPSmith closed subtask T160205: Add interstitial to wikidata-externalid-url as Declined.
[Wikidata-bugs] [Maniphest] [Declined] T160205: Add interstitial to wikidata-externalid-url
ArthurPSmith closed this task as "Declined". ArthurPSmith added a comment. Wow, was that really almost 3 years ago? There doesn't seem to be a real need for this, so I'm closing the request as declined.
[Wikidata-bugs] [Maniphest] [Commented On] T119226: Very small (or very large) quantity values (represented in scientific notation) result in error in add/update via pywikibot/wikidata API
ArthurPSmith added a comment. Sorry I never got around to looking at this further. @DD063520, do you understand the above comment from @thiemowmde about using the wbparsevalue API rather than Python internals?
[Wikidata-bugs] [Maniphest] [Commented On] T243701: Wikidata maxlag repeatedly over 5s since Jan 20, 2020 (primarily caused by the query service)
ArthurPSmith added a comment. @Bugreporter

> I think increase the factor will not make thing better, it only increase the oscillating period

Yes, that does seem to have happened - instead of a roughly 20-minute cycle, we now have about a 1-hour cycle.
[Wikidata-bugs] [Maniphest] [Commented On] T238045: Improve parallelism in WDQS updater
ArthurPSmith added a comment. Possibly relevant comment here: I believe there is also a plan to move to incremental updates (updating only the statements/triples that have changed), so it is probably important that any parallelism in updating be coordinated so that updates for the same item (Q value) are grouped together and done in the same process, so they don't clobber one another. Updates for separate items (different Q values) can be handled in parallel, as the associated RDF triples are independent (the subject of a triple is always the item, a statement on the item, or a further node derived from the item). Even without the incremental update process, grouping updates on the same item together could be a significant speed boost: under the current procedure of completely rewriting an item's triples, 5 updates for one item can be collapsed into just the last update.
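The grouping-and-collapsing idea can be sketched in a few lines of Python (the function names and event shape are mine, not the actual updater code): keep only the newest pending event per item, then route each item to a fixed worker shard so two workers never rewrite the same item's triples concurrently.

```python
import zlib

def collapse_and_shard(events, n_workers):
    """events: oldest-first list of {'item': 'Q...', 'rev': int} change events."""
    latest = {}
    for ev in events:
        latest[ev["item"]] = ev  # a later event for the same Q replaces earlier ones
    shards = [[] for _ in range(n_workers)]
    for item, ev in latest.items():
        # Stable hash so the same item always lands on the same worker.
        shards[zlib.crc32(item.encode()) % n_workers].append(ev)
    return shards

events = [
    {"item": "Q1", "rev": 10},
    {"item": "Q2", "rev": 11},
    {"item": "Q1", "rev": 12},
    {"item": "Q1", "rev": 15},  # under full-rewrite updating, only this Q1 event matters
]
shards = collapse_and_shard(events, 4)
print(sum(len(s) for s in shards))  # 2 - three Q1 events collapsed into one
```

Hashing on the Q-id (rather than round-robin) is what guarantees the "same item, same process" property the comment asks for, at the cost of possibly uneven shard sizes.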
[Wikidata-bugs] [Maniphest] [Commented On] T243701: Wikidata maxlag repeatedly over 5s since Jan 20, 2020 (primarily caused by the query service)
ArthurPSmith added a comment. In T243701#5855439 <https://phabricator.wikimedia.org/T243701#5855439>, @ArielGlenn wrote:

> In T243701#5855352 <https://phabricator.wikimedia.org/T243701#5855352>, @Lea_Lacroix_WMDE wrote:
>
>> Over the past weeks, we noticed a huge increase of content in Wikidata. Maybe that's something worth looking at?
>
> Wikidata content is growing at a fast and steady pace and has been for a few years now. For the last few months it's been expanding at a rate of around 3,500,000 new pages per month. So that seems unlikely to be connected.

That rate is a lot higher than it was for the first 7 months of 2019 (close to or less than 1 million/month), so it could be related. But given the existing size of Wikidata, I'd call it a moderate increase, not a "huge increase", unless it's much bigger in some metric other than just the number of items. On the question of GET:

> In T243701#5834792 <https://phabricator.wikimedia.org/T243701#5834792>, @Lucas_Werkmeister_WMDE wrote:
> I wonder if it would make sense to ignore query service lag on GET requests? Those requests shouldn't put any kind of load on the query service, after all.

Is the idea here to split the "lag" parameter into separate ones for GETs and edits? That makes a lot of sense to me...
[Wikidata-bugs] [Maniphest] [Commented On] T243701: Wikidata maxlag repeatedly over 5s since Jan20, 2020 (primarily caused by the query service)
ArthurPSmith added a comment. @Addshore and others - the problem has deteriorated since Saturday - see this discussion on Wikidata: https://www.wikidata.org/wiki/Wikidata:Contact_the_development_team/Query_Service_and_search#WDQS_lag
[Wikidata-bugs] [Maniphest] [Commented On] T221774: Add Wikidata query service lag to Wikidata maxlag
ArthurPSmith added a comment. In T221774#5815408 <https://phabricator.wikimedia.org/T221774#5815408>, @Addshore wrote:

> [...]
> Note that this dashboard includes metrics for both pooled and depooled servers. So whatever you read there will likely also be reporting data for servers that you can't actually query, and thus are not seeing the lag for via the query service.

However, there was definitely a significant lag on some (most?) of the wdqs servers available for querying when I noticed this problem - updates made over an hour earlier were not visible when I queried. But it seems to have resolved for now.
[Wikidata-bugs] [Maniphest] [Commented On] T221774: Add Wikidata query service lag to Wikidata maxlag
ArthurPSmith added a comment. @Bugreporter well, something must have changed early today - was it previously "mean" and is now "median"? I'm not sure which is better, but having WDQS hours out of date (we're over 4 hours now) is NOT a good situation, and exactly what this whole task was intended to avoid! @Pintoch, any thoughts on this?
[Wikidata-bugs] [Maniphest] [Commented On] T240442: Design a continuous throttling policy for Wikidata bots
ArthurPSmith added a comment. Just saw this - I'm wondering how you would implement it technically. You could generate a random number between 2.5 and 5, and deny the edit if maxlag is greater than your random number?
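A minimal sketch of that idea (the 2.5 and 5 second thresholds are the ones from the comment above; the function name and the idea of drawing a fresh random threshold per edit are assumptions, not anything the task has decided on):

```python
import random

LOW, HIGH = 2.5, 5.0  # lag band (seconds) over which edits are increasingly throttled

def edit_allowed(maxlag_seconds: float) -> bool:
    """Probabilistically deny edits as replication lag rises.

    Below LOW every edit passes; above HIGH every edit is denied;
    in between, the denial probability grows linearly with the lag.
    """
    threshold = random.uniform(LOW, HIGH)
    return maxlag_seconds <= threshold
```

Drawing a fresh threshold per edit makes the throttling continuous: a bot seeing 3 s of lag would lose roughly 20% of its attempts, while one at 4.5 s would lose about 80%, instead of the current hard cliff at maxlag=5.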
[Wikidata-bugs] [Maniphest] [Commented On] T221774: Add Wikidata query service lag to Wikidata maxlag
ArthurPSmith added a comment. Am I misreading this graph? https://grafana.wikimedia.org/d/00489/wikidata-query-service?panelId=8=1=now-12h=now=10s It looks like the query service lag for 3 of the servers has been growing steadily for roughly the past 8 hours. However, edits are going through. Did something change in the maxlag logic somewhere earlier today?
[Wikidata-bugs] [Maniphest] [Closed] T240371: Maxlag=5 for Author Disambiguator
ArthurPSmith closed this task as "Resolved". ArthurPSmith added a comment. Marking as resolved...
[Wikidata-bugs] [Maniphest] [Unblock] T240369: Chase up bot operators whose bot keeps running when the dispatch lag is higher than 5
ArthurPSmith closed subtask T240371: Maxlag=5 for Author Disambiguator as Resolved.
[Wikidata-bugs] [Maniphest] [Commented On] T240371: Maxlag=5 for Author Disambiguator
ArthurPSmith added a comment. I increased the default number of retries to 12, so it will now retry for up to an hour. I think we're good here?
[Wikidata-bugs] [Maniphest] [Commented On] T240371: Maxlag=5 for Author Disambiguator
ArthurPSmith added a comment. (A) Pintoch's patch has been applied, and (B) I also increased the retry interval from 5 seconds to 5 minutes - that still means an edit will fail after 25 minutes if maxlag doesn't drop, with only 5 retries. Is there a consensus that we should retry for an hour? Or if there's a better standard for handling retries, let me know!
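The retry scheme being discussed (a fixed interval between attempts, so 12 retries at 5 minutes covers roughly an hour) can be sketched as follows; `MaxlagError` and `do_edit` are hypothetical stand-ins for whatever the tool's API wrapper actually raises and calls:

```python
import time

class MaxlagError(Exception):
    """Raised when the API rejects an edit because maxlag is exceeded."""

def edit_with_retries(do_edit, retries=12, wait_seconds=300):
    """Retry an edit that fails on maxlag, waiting a fixed interval
    between attempts (12 retries x 5 min ~= one hour of patience)."""
    for attempt in range(retries):
        try:
            return do_edit()
        except MaxlagError:
            if attempt == retries - 1:
                raise  # give up after the final retry
            time.sleep(wait_seconds)
```

A fixed interval is the simplest reading of the comment; exponential backoff would be the other obvious design, at the cost of less predictable total wait time.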
[Wikidata-bugs] [Maniphest] [Commented On] T233763: Searching Lexeme:Danke on Wikidata breaks it
ArthurPSmith added a comment. If you go to the search page and select "Lexeme" as the only namespace you get the same error with "thanks" in the search box, but "thank" alone works fine - the two lexemes that match are L3798 (verb) and L28468 (noun).
[Wikidata-bugs] [Maniphest] [Commented On] T212843: [EPIC] Access to Wikidata's lexicographical data from Wiktionaries and other WMF sites
ArthurPSmith added a comment. The Basque collection is even more complete now! I do think some customization may be needed for Lexemes due to their different structure - the forms and senses etc. Perhaps the most useful link for a Wiktionary would be from words to senses to Wikidata items via the "item for this sense" property; that in principle allows translations to be provided, grouped by sense. One UI suggestion: when searching for a word in a Wiktionary, if it is NOT found, show any matching Wikidata forms from that or any other language. This would provide an immediate supplement to small Wiktionaries, and there may even be a few words missing from enwikt that could be found in Wikidata.
[Wikidata-bugs] [Maniphest] [Commented On] T229604: Not possible to write in language edit fields
ArthurPSmith added a comment. I see the problem also (Safari browser). When you talk about it affecting lexemes, where do you see that? I experimented with adding a form and that seemed fine.
[Wikidata-bugs] [Maniphest] [Commented On] T214680: Document statement URI format for RDF
ArthurPSmith added a comment. Can you add a test to the statement ID generation code that ensures it has an RDF-compatible format (except for the 1 character that's a problem now), and a note that this is required for RDF support?
[Wikidata-bugs] [Maniphest] [Commented On] T214680: Document statement URI format for RDF
ArthurPSmith added a comment. > promise it will always be one-to-one, no matter what happens with internal IDs Hmm - if it's NOT one-to-one, won't that break RDF? That is, if it's possible for 2 different statements to have the same ID, then you would have conflicting triples associated with the same URI. That's not good at all! So I think a one-to-one mapping to the RDF format is important if we at all consider RDF support to be a fundamental piece of what Wikidata provides...
[Wikidata-bugs] [Maniphest] [Commented On] T214680: Document statement URI format for RDF
ArthurPSmith added a comment. Another thought - even better would be if the API could be adjusted to accept the WDQS statement ID format as-is (all -'s).
[Wikidata-bugs] [Maniphest] [Commented On] T214680: Document statement URI format for RDF
ArthurPSmith added a comment. Thanks for creating this ticket! Actually, my use case is the opposite of Lucas's - I want to be able to go from the results of a WDQS query to fetching the full statement via the API, which requires the statement ID. So I would like to see the ID conversion documented in BOTH directions - and in particular the arbitrary regex replace listed above (preg_replace( '/[^\w-]/', '-', $statementID )) would NOT work for that purpose. Rather, can we just settle that the first $ or - is switched, and that's it? Or is there something else that's an issue here?
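A sketch of the bidirectional convention being proposed (swap only the first separator, assuming the API form is `Q42$<UUID>` and the WDQS form is `Q42-<UUID>`; this is an illustration of the comment's suggestion, not the documented mapping):

```python
def api_to_wdqs(statement_id: str) -> str:
    """Q42$588C...-... -> Q42-588C...-...; only the first '$' is replaced."""
    return statement_id.replace("$", "-", 1)

def wdqs_to_api(statement_id: str) -> str:
    """Q42-588C...-... -> Q42$588C...-...; only the first '-' (the
    entity/UUID separator) is replaced, leaving the UUID's own hyphens alone."""
    return statement_id.replace("-", "$", 1)
```

Unlike the `preg_replace` quoted above, which collapses every non-word character to '-' and so cannot be inverted, this single-character swap round-trips cleanly in both directions.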
[Wikidata-bugs] [Maniphest] [Commented On] T76232: [Story] nudge when editing a statement to check reference
ArthurPSmith added a comment. I didn't know about the "award token" option! Yes, we should do something along these lines. However, I think there are a number of situations to be addressed:

(1) The edit may be a clean-up with no material impact on the value of the statement (especially adjusting the precision of quantity values, or adding/removing a trailing '/' on URLs). It is probably hard for software to recognize this distinction, so despite Snipre's concern above I think we will always need an option (a checkbox or something) to let the user preserve the existing reference unchanged.

(2) For item-valued statements, an edit to fix a redirected value should definitely preserve the references. This may or may not be easy to detect automatically.

(3) The original data insertion may have been incorrect in some respect relative to the reference (for example linking to the wrong item, or for any datatype simply having the wrong value somehow). Again, a correction to match the original reference would not require any edit of the reference (other than possibly updating the "retrieved" date value).

(4) A qualifier like "end date" is added to the statement. This should perhaps be supported by a new reference, but it wouldn't invalidate the old references on the statement.

The real concern is edits that change the value in a material respect. In that case I think the default behavior should be to preserve the original value with a deprecated rank, rather than actually delete it. So the UI I think would work best is: for a statement with no references, allow edits as now. For a referenced statement, by default when the value or qualifiers are changed, preserve the original statement with its references but set its rank to 'deprecated', and remove all the old references from the new statement (and perhaps nudge the user to add new ones). But provide a checkbox to assert that this is not a material change, so that the original references are preserved and the 'deprecated' statement need not be created. Make sense?
[Wikidata-bugs] [Maniphest] [Commented On] T210495: Number of Senses is decreasing on ListeriaBot's report
ArthurPSmith added a comment. Just a note - the WDQS query gives different results, jumping up and down - sometimes 3004 (for English lexeme senses) and sometimes 2872, over about the last 10 minutes.
[Wikidata-bugs] [Maniphest] [Changed Subscribers] T210495: Number of Senses is decreasing on ListeriaBot's report
ArthurPSmith added a subscriber: Smalyshev. ArthurPSmith added a comment. @Smalyshev I'd forgotten there was a phabricator ticket for this - anyway, this is what I was referring to... Last night's update bumped the number down again, to 2718; however, when I run the query directly on WDQS I get 3004 right now. Something's not right!
[Wikidata-bugs] [Maniphest] [Commented On] T210495: Number of Senses is decreasing on ListeriaBot's report
ArthurPSmith added a comment. I ran a manual update and the total for English bumped up to 2819 - so it doesn't look as if we've actually lost lexeme senses, just that some of the query servers don't know about all of them?
[Wikidata-bugs] [Maniphest] [Commented On] T210495: Number of Senses is decreasing on ListeriaBot's report
ArthurPSmith added a comment. I wouldn't be surprised if it's a WDQS problem; this is definitely generated from an RDF query.
[Wikidata-bugs] [Maniphest] [Commented On] T160259: [Story] RDF for Lexemes, Forms and Senses
ArthurPSmith added a comment. According to https://www.mediawiki.org/wiki/Extension:WikibaseLexeme/RDF_mapping a lexeme should be "a wikibase:Lexeme" as well as "a ontolex:LexicalEntry", but in the query service I can only find things via the latter relation. Similarly for forms and "wikibase:Form". Was something left out of the dump?
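One way to check this against the live service (a sketch using the public WDQS SPARQL endpoint with only the standard library; the helper names are mine, and the counts are whatever the service returns at query time):

```python
import json
import urllib.parse
import urllib.request

ENDPOINT = "https://query.wikidata.org/sparql"

def build_count_query(rdf_type: str) -> str:
    """SPARQL that counts subjects declared with the given rdf:type."""
    return (
        "PREFIX wikibase: <http://wikiba.se/ontology#>\n"
        "PREFIX ontolex: <http://www.w3.org/ns/lemon/ontolex#>\n"
        f"SELECT (COUNT(*) AS ?n) WHERE {{ ?s a {rdf_type} }}"
    )

def count_instances(rdf_type: str) -> int:
    """Run the count against the public endpoint (needs network access)."""
    url = ENDPOINT + "?" + urllib.parse.urlencode(
        {"query": build_count_query(rdf_type), "format": "json"})
    req = urllib.request.Request(url, headers={"User-Agent": "rdf-check/0.1"})
    with urllib.request.urlopen(req) as resp:
        data = json.load(resp)
    return int(data["results"]["bindings"][0]["n"]["value"])

# If the dump matched the documented mapping, count_instances("wikibase:Lexeme")
# should be nonzero, just like count_instances("ontolex:LexicalEntry").
```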
[Wikidata-bugs] [Maniphest] [Commented On] T197145: Create special pages for lexemes
ArthurPSmith added a comment. WDQS works for me! I'm not sure where that is of course - I guess I could check Phabricator!
[Wikidata-bugs] [Maniphest] [Commented On] T197145: Create special pages for lexemes
ArthurPSmith added a comment. Does "alphabetical" ordering even make sense for words in a collection of vastly different writing systems? If this is done, I would recommend accompanying it with some filtering - by language, part of speech, grammatical features, perhaps certain properties.
[Wikidata-bugs] [Maniphest] [Commented On] T195740: Decide on a way forward for acceptable languages for lemmas and representations
ArthurPSmith added a comment. I am generally favorable to Micru's proposal, and perhaps Pamputt's elaboration of it above: for one, using Wikidata items directly allows the lemma language to be represented naturally in the user's own script/language, along with the other automatic bonuses of using items given the structured-data ethos. However, I'm a little confused about the details of how this would work. Specifically, the most commonly used lexemes would usually have the same spelling, use, etc. across all variants of a language; do we give those a more general language ("en" = Q1860, say) and only use the specific items mentioned ("en-US" = Q7976, "en-GB" = Q7979, "en-CA" = Q44676, etc.) where there really are variations? Or would it be possible to attach multiple language items to a single lexeme, to indicate it applies to several specific variants?
[Wikidata-bugs] [Maniphest] [Commented On] T193728: Solve legal uncertainty of Wikidata
ArthurPSmith added a comment. Here's a specific question that might be detailed enough in description: suppose we have a collection of facts (say the names, countries, inception dates, and official websites of a set of organizations) extracted from multiple sources, including various language Wikipedias, a CC-0 data source (for example https://grid.ac/), and a non-CC-0, non-Wikipedia data source - these sources would be indicated in Wikidata by the reference/source section on each statement. The extraction was done by users, either manually or by running bots, with the understanding that they were adding facts to a CC-0 database (Wikidata). Reconciling the facts - for example merging duplicates with slightly different names, dates, or URLs - was likewise done by users manually or semi-automatically, again with the understanding that they were contributing to a CC-0 database. Are there any copyright or other rights constraints that apply to this collection, or can it be fully considered legally CC-0?
[Wikidata-bugs] [Maniphest] [Updated] T163642: Index Wikidata strings in statements in the search engine
ArthurPSmith added a comment. Hmm, I'm not sure this is all that useful, at least as it stands. Most external IDs can be found just as easily now via the Wikidata Resolver tool - https://tools.wmflabs.org/wikidata-todo/resolver.php - However, what I would find useful is a way to locate, for example, partial street addresses - this (P969) is often entered as a qualifier on headquarters location (P159). Searching for 'haswbstatement:P969=Main' now finds something, but only because that item oddly has just 'Main' as the value for P969, and lowercasing the string ("main") finds nothing, which is definitely not what I would expect here... I don't think treating string values as if they were identifiers is the right approach; the usefulness of a search engine lies in normalizing string values so you can find them without the exact matching string. And qualifiers should be folded in somehow!
[Wikidata-bugs] [Maniphest] [Commented On] T193728: Solve legal uncertainty of Wikidata
ArthurPSmith added a comment. Hi - my most recent response followed MisterSynergy's comment on Denny's proposed questions, and specifically the meaning of "processes that in bulk extract facts from Wikipedia articles". It sounds from subsequent discussion like we are not talking solely of automated "processes", so I echo MisterSynergy's comment that the question needs to be better defined to "describe how these processes look like". On the one hand there are overall averages, with less than one "fact" per Wikipedia article; on the other hand the distribution is probably quite wide, with some articles having dozens of "facts" extracted from them. Since CC-BY-SA applies to each article individually, does extraction of too much factual data from one article potentially violate its copyright?
[Wikidata-bugs] [Maniphest] [Commented On] T193728: Solve legal uncertainty of Wikidata
ArthurPSmith added a comment. > based on the fact that we have ~42M “imported from” references and ~64M sitelinks in Wikidata Hmm, I've added likely over 1000 of those "imported from" references myself by hand, for example for organization "official website" entries. So I would say "imported from" gives us an over-count of "bot" work, if that's the main issue here. Or is thousands of individuals adding these entries by hand also a concern?
[Wikidata-bugs] [Maniphest] [Commented On] T193728: Solve legal uncertainty of Wikidata
ArthurPSmith added a comment. Some references on why CC0 is essential for a free public database:

https://wiki.creativecommons.org/wiki/CC0_use_for_data
"Databases may contain facts that, in and of themselves, are not protected by copyright law. However, the copyright laws of many jurisdictions cover creatively selected or arranged compilations of facts and creative database design and structure, and some jurisdictions like those in the European Union have enacted additional sui generis laws that restrict uses of databases without regard for applicable copyright law. CC0 is intended to cover all copyright and database rights, so that however data and databases are restricted (under copyright or otherwise), those rights are all surrendered"

https://www.nature.com/nature/journal/v461/n7261/full/461171a.html
"Although it is usual practice for major public databases to make data freely available to access and use, any restrictions on use should be strongly resisted and we endorse explicit encouragement of open sharing, for example under the newly available CC0 public domain waiver of Creative Commons."

https://blog.datadryad.org/2011/10/05/why-does-dryad-use-cc0/
"Dryad’s policy ultimately follows the recommendations of Science Commons, which discourage researchers from presuming copyright and using licenses that include “attribution” and “share-alike” conditions for scientific data. Both of these conditions can put legitimate users in awkward positions. First, specifying how “attribution” must be carried out may put a user at odds with accepted citation practice: “when you federate a query from 50,000 databases (not now, perhaps, but definitely within the 70-year duration of copyright!) will you be liable to a lawsuit if you don’t formally attribute all 50,000 owners?” (Science Commons Database Protocol FAQ) While “share-alike” conditions create their own unnecessary legal tangle: “ ‘share-alike’ licenses typically impose the condition that some or all derivative products be identically licensed. Such conditions have been known to create significant “license compatibility” problems under existing license schemes that employ them. In the context of data, license compatibility problems will likely create significant barriers for data integration and reuse for both providers and users of data.” (Science Commons Database Protocol FAQ) Thus, “… given the potential for significantly negative unintended consequences of using copyright, the size of the public domain, and the power of norms inside science, we believe that copyright licenses and contractual restrictions are simply the wrong tool [for data], even if those licenses and contracts are used with the best of intentions.” (Science Commons Database Protocol FAQ)"

https://pietercolpaert.be/open%20data/2017/02/23/cc0.html
"Requiring that you mention the source of the dataset in each application that reuses my data, still complies to the Open Definition. There is no need to argue with anyone that uses for example the CC BY license: you will only have the annoying obligation that you have to mention the name in a user interface. This is useful for datasets which are closely tied to their document or database: when for example reusing and republishing a spreadsheet, I can understand you will want that someone attributes you for created that spreadsheet. However, for data on the Web, the borders between data silos are fading and queries are evaluated over plenty of databases. Then requiring that each dataset is mentioned in the user interface is just annoying end-users."
"The share alike requirement, as the name implies, requires that when reusing a document, you share the resulting document under the same license. I like the idea for “viral” licenses and the fact that all results from this document will now also become open data. However, what does it mean exactly for an answer that is generated on the basis of 2 or more datasets? And what if one of these datasets would be a private dataset (e.g., a user profile)? It thus would make it even more unnecessarily complex to reuse data, while the goal was to maximize the reuse of our dataset."
[Wikidata-bugs] [Maniphest] [Commented On] T195382: show Lemma on Special:AllPages
ArthurPSmith added a comment. FYI I agree with VIGNERON on what it should look like - but at least something more than the id!
[Wikidata-bugs] [Maniphest] [Updated] T193728: Solve legal uncertainty of Wikidata
ArthurPSmith added a comment. It has been asserted here several times that OSM data has been wholesale imported into Wikidata - do we know that has happened? Wikidata has two properties related to OSM, one that relates wikidata items to OSM tags like "lighthouse", and one that is essentially deprecated (see T145284), so I assume those are not the issue. According to https://www.wikidata.org/wiki/Wikidata:OpenStreetMap (text which has been there since at least last September) "it is not possible to import coordinates from OpenStreetMap to Wikidata". If the issue is coordinates imported via wikipedia infoboxes that originated with OSM, I can see there might be an issue there, and maybe that should be added to Denny's suggested question in some fashion. But as far as actual importing of OSM data, the only specific cases that I noticed explicitly cited above are (A) a bot request that has been rejected, and (B) a discussion from 2013 where the copyright issue was explicitly raised right away.
[Wikidata-bugs] [Maniphest] [Updated] T171092: WDQS sync (?) issue for certain recently created items
ArthurPSmith added a comment. Herald added a subscriber: PokestarFan.

Of course, now the examples I gave are working - probably because I updated them recently. However, I found more that are not working now, or only partially - for example Q2256713:

SELECT ?item WHERE { ?item wdt:P856 <http://www.sikjm.ch/> . }
SELECT ?item WHERE { ?item wdt:P2427 'grid.483048.3' . }

return nothing, but

SELECT ?item WHERE { ?item rdfs:label "Schweizerisches Institut für Kinder- und Jugendmedien"@en . }

returns Q2256713.
[Wikidata-bugs] [Maniphest] [Created] T171092: WDQS sync (?) issue for certain recently created items
ArthurPSmith created this task. ArthurPSmith added projects: Wikidata, Discovery. Herald added a subscriber: Aklapper.

TASK DESCRIPTION
I have found the query service to be consistently (*almost* always, over the past several weeks at least) missing some items - an example is Q30252826:

SELECT ?item WHERE { ?item rdfs:label "Technology Centre"@en . }
SELECT ?item WHERE { ?item wdt:P856 . }
SELECT ?item WHERE { ?item wdt:P2427 'grid.17033.36' . }

These occasionally return a correct response with this one item, but usually return nothing - either in the WDQS GUI or via JSON download. There seem to be a handful of missing items - Q30252828 is another. However, Q30252827 is fine:

SELECT ?item WHERE { ?item rdfs:label "Mediterranean Center for Environmental Studies"@en . }

returns Q30252827 consistently. It's a little disturbing to have the query service completely missing some of what's in wikidata.
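The spot checks above could be scripted; a minimal sketch (the helper name and the batching idea are mine, not from the task) that only builds the label-lookup query string used in the report:

```python
def label_query(label, lang="en"):
    """Build a WDQS query that looks an item up by exact label,
    mirroring the manual checks in the task description."""
    return ('SELECT ?item WHERE { ?item rdfs:label "%s"@%s . }'
            % (label, lang))

# Each query string can then be POSTed to
# https://query.wikidata.org/sparql (with format=json) and the
# result set compared against what wikidata.org itself shows,
# to detect items that never made it into the query service.
query = label_query("Technology Centre")
```

Running the same query repeatedly over a day would also distinguish a sync lag from a permanently missed item.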
[Wikidata-bugs] [Maniphest] [Raised Priority] T54564: Allow sitelinks to redirect pages to fix the 'Bonnie and Clyde problem'
ArthurPSmith raised the priority of this task from "Lowest" to "Normal". ArthurPSmith added a comment. I don't understand why Multichill can unilaterally alter the priority on this request in the face of an active wikidata RFC where the voting has been 2:1 in support of this change. It would also be nice to get some actual feedback from developers - is this really "against the core data model of Wikidata"? I don't see it - particularly as the workarounds in place now prove it can be easily supported.
[Wikidata-bugs] [Maniphest] [Commented On] T170614: constraint gadget always shows an error for P279 (subclass of) statements
ArthurPSmith added a comment. Thanks! I did search through the open tasks first and didn't find anything on this.
[Wikidata-bugs] [Maniphest] [Created] T170614: constraint gadget always shows an error for P279 (subclass of) statements
ArthurPSmith created this task. ArthurPSmith added a project: Wikibase-Quality-Constraints. Herald added a subscriber: Aklapper. Herald added a project: Wikidata.

TASK DESCRIPTION
Is this the place to report bugs? Whenever I look at a wikidata item with a P279 (subclass of) statement - for example for university - https://www.wikidata.org/wiki/Q3918 - the constraint warning shows up on each claim, with the popup saying "Potential Issues" - "Conflicts with ?" - "The value for the parameter "item" must be an item, not "QQ13406463"." Items that are instances of Q13406463 are one of the cases that P279 has a conflict with, but I'm guessing the problem here is the double Q - I don't see a double Q in the constraints section on P279, though. Any idea what's going on here?
[Wikidata-bugs] [Maniphest] [Commented On] T143486: [feature request] remove sitelinks / update sitelinks on Wikidata when pages are deleted/moved on client wikis (all users)
ArthurPSmith added a comment. The dummy user solution sounds good to me. Magnus Manske is doing something like this with his QuickStatementsBot so maybe a special purpose Bot account on wikidata for this?
[Wikidata-bugs] [Maniphest] [Commented On] T150939: Replace https://tools.wmflabs.org/wikidata-externalid-url by providing improved handling for external id formatter urls
ArthurPSmith added a comment. I believe a way this could be done would be to allow the attachment of regular expressions to the formatter URL, and have the external id URL conversion code understand them. That is, if there was a qualifier property that specified "regex substitution", for example, the ISNI problem (of additional spaces within the id that must be removed for the formatter URL) would be handled by a value something like "s/\s+//g" (remove all spaces). Some of the others might need a "regex match" on the id that allows specifying a $1, $2, $3 grouping pattern, and the formatter URL then looks something like http://./$1/$2/$3 (or that could also possibly be handled by a substitution as in the ISNI case). The IMDB case is more difficult because it's essentially 4 different formatter URLs based on the first characters of the id, so it might need a "regex match" that limits the scope of each formatter URL based on the id; wikibase would then need to look through the match regexes to find a matching formatter URL and use that.
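A rough sketch of the two qualifier behaviors described above; the function names and the example.org URL are hypothetical, and only the s/\s+//g ISNI substitution and the $1/$2/$3 grouping idea come from the comment:

```python
import re

def apply_substitution(external_id, pattern, replacement):
    # Hypothetical "regex substitution" qualifier: normalize the raw
    # id before splicing it into the formatter URL. For ISNI this is
    # the equivalent of s/\s+//g (remove all internal spaces).
    return re.sub(pattern, replacement, external_id)

def format_with_groups(external_id, match_pattern, url_template):
    # Hypothetical "regex match" qualifier: capture $1, $2, $3, ...
    # groups from the id and substitute them into the formatter URL.
    # Returning None on no match is what would let wikibase fall
    # through to the next scoped formatter URL (the IMDB case).
    m = re.match(match_pattern, external_id)
    if m is None:
        return None
    url = url_template
    for i, group in enumerate(m.groups(), start=1):
        url = url.replace('$%d' % i, group)
    return url

# ISNI: strip internal spaces before building the link
clean_isni = apply_substitution('0000 0001 2096 9829', r'\s+', '')
# Grouped form: split a 12-digit id into three path segments
grouped = format_with_groups('000000012096', r'(\d{4})(\d{4})(\d{4})$',
                             'http://example.org/$1/$2/$3')
```

An id whose prefix does not match a given pattern simply yields None, which is the scoping behavior the IMDB case would need.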
[Wikidata-bugs] [Maniphest] [Commented On] T150939: Replace https://tools.wmflabs.org/wikidata-externalid-url by providing improved handling for external id formatter urls
ArthurPSmith added a comment. As background, I'm seeing about 2000 "hits" per day on this service right now, with about a dozen properties linking through it to their databases.
[Wikidata-bugs] [Maniphest] [Commented On] T160205: Add interstitial to wikidata-externalid-url
ArthurPSmith added a comment. @Esc3300 well, I developed this tool because links for IMDB and a handful of other properties were broken when we made the change from string to "external identifier" last year, where the wikidata UI started putting the links in directly (previously it had been done by a JavaScript gadget - which meant the links wouldn't be available to re-users either). So "work without this tool" would break a lot of stuff in wikidata and for everybody using it.
[Wikidata-bugs] [Maniphest] [Closed] T150803: Information leak on wikidata-externalid-url
ArthurPSmith closed this task as "Invalid". ArthurPSmith added a comment. @jeblad I'm resolving this as invalid as the initial claim of an information leak seems to be incorrect. However, you might want to open up a separate phabricator ticket with your detailed suggestion on how to do formatter URLs better; I think it's a promising approach to allow pulling components from the "regular expression" syntax.
[Wikidata-bugs] [Maniphest] [Commented On] T122706: Create a WDQS-based ElementProvider
ArthurPSmith added a comment. I see you've closed - looks good, by the way. Anyway, on the question of retaining WDQ - no, I don't think that's necessary; I think Magnus would like to shut it down eventually. I don't see that WDQ adds anything to this tool now that SPARQL is working reliably - it's fast and stable. So feel free to remove... Arthur
[Wikidata-bugs] [Maniphest] [Updated] T120452: Allow structured datasets on a central repository (CSV, TSV, JSON, GeoJSON, XML, ...)
ArthurPSmith added a comment. @Yurik and all, I'm glad to see all this work going on; I was pointed to this after I made a comment on a wikidata property proposal that I thought would be best addressed by somehow allowing a tabular data value rather than a single value. However, I'm wondering if this might be best driven by specific problem cases rather than trying to tackle generic "data" records. One of the most common needs is for time-series data: population of a city vs time, for instance, economic data by point in time, physical data like temperature vs time, etc. The simplest extension beyond the single value allowed by wikidata would be to allow a set of pairs defined by two wikidata properties (eg. P585 - "point in time", P1082 - "population"). The relation to wikidata takes care of localization (those properties have labels in many different languages) and defines the value types (time and quantity in this case), and the dataset would somehow be a statement attached to a wikidata item (eg. a particular city) so that the item and pair of properties fully define the meaning of the collection of pairs. The underlying structure of the pairs doesn't really matter much.

But there seems to be something missing here - I think it might be best addressed in wikidata itself...
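The property-pair idea can be made concrete; in this sketch the dict layout is my own illustration and the population figures are placeholders, with only the property ids (P585 "point in time", P1082 "population") taken from the comment:

```python
# A time series attached to an item is just a list of pairs, typed by
# the two Wikidata properties that define the columns. The figures
# below are illustrative placeholders, not real census data.
series = {
    "item": "Q64",                 # the city the series describes
    "columns": ("P585", "P1082"),  # point in time, population
    "rows": [
        ("1990-01-01", 3400000),
        ("2000-01-01", 3380000),
        ("2010-01-01", 3460000),
    ],
}

def column_values(dataset, prop):
    """Extract one column of the pair series by property id."""
    idx = dataset["columns"].index(prop)
    return [row[idx] for row in dataset["rows"]]
```

Because the columns are named by property ids, labels, translations, and value types all come from Wikidata itself rather than being repeated in the dataset.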
[Wikidata-bugs] [Maniphest] [Commented On] T142432: ptable app is broken again!
ArthurPSmith added a comment. Excellent, thanks! I probably should have sent you an email...
[Wikidata-bugs] [Maniphest] [Triaged] T142432: ptable app is broken again!
ArthurPSmith triaged this task as "High" priority. ArthurPSmith added a comment. So I updated to https in my local copy and that definitely fixed the problem. Not sure if @Ricordisamoa is around? I don't have permission right now to do anything with ptable, but I do have an account (apsmith) on tools.wmflabs.org so if I was in the right group I could help out here maybe...
[Wikidata-bugs] [Maniphest] [Commented On] T142432: ptable app is broken again!
ArthurPSmith added a comment. Still broken (at least 3 days now). I can't see the error messages, but I tried running my own copy and ran into: https://lists.wikimedia.org/pipermail/mediawiki-api-announce/2016-May/000110.html

The code is using http, not https:

base.py: WD_API = 'http://www.wikidata.org/w/api.php'

so I'm guessing this is the problem! Hopefully simple to fix?!
[Wikidata-bugs] [Maniphest] [Created] T142432: ptable app is broken again!
ArthurPSmith created this task. ArthurPSmith added projects: Tool-Labs-tools-Wikidata-Periodic-Table, Wikidata. Herald added a subscriber: Aklapper.

TASK DESCRIPTION
https://tools.wmflabs.org/ptable has been returning a 500 Server Error since earlier today - possibly longer. Did something recently break in its dependencies?
[Wikidata-bugs] [Maniphest] [Commented On] T112140: Provide a wrapper function in pywikibot around wbparsevalue
ArthurPSmith added a comment. Ok, the WbRepresentation superclass looks like it might help simplify this. But FilePage, ItemPage and PropertyPage (and basestring) are not subclasses of that, so I think just returning the json hash would be best there. But the function could certainly run fromWikibase for the other types, that seems pretty easy, I'll look into that.
[Wikidata-bugs] [Maniphest] [Commented On] T112140: Provide a wrapper function in pywikibot around wbparsevalue
ArthurPSmith added a comment. @Multichill - could be; I'm not familiar with WbTime other than a glance at the code. Are there edge cases (eg. 10^20 years into the future?) that would break the "int/long" assumptions? But it definitely does NOT work for WbQuantity the way things currently are. Fixing WbQuantity seemed to be out of scope here, though it does need to be done. Coordinate may have similar issues as it uses floats. From the pywikibot/page.py description of the Property class, the actual classes involved for the different object types are:

ItemPage or PropertyPage
basestring
FilePage
Coordinate
WbTime
WbQuantity
WbMonolingualText

The parser doesn't seem to do anything special for wikibase-item types; creating the object would, I think, involve another separate API query. It seems to me each case would need to be handled at least a little differently. One route would be a separate function to turn the parsed values into pywikibot objects - for example a special constructor for WbTime that takes the parsed values.
[Wikidata-bugs] [Maniphest] [Commented On] T112140: Provide a wrapper function in pywikibot around wbparsevalue
ArthurPSmith added a comment.

>>! In T112140#2435122, @Multichill wrote:
> The function should return an object. Possibilities seem to be commonsMedia, globe-coordinate, monolingualtext, quantity, string, time, url, external-id, wikibase-item, wikibase-property, math

The parse API allows a list of values to be parsed (not just one at a time), and I have written the function to return a list of the parsed "objects" in the form of just the value (for strings) or a python dict with the keys and values supplied by the wbparsevalue api. In particular, parsing a list of quantity values returns a list of dicts with the keys 'amount', 'upperBound', 'lowerBound', and possibly other keys as provided by the API (for example it returns 'unit' with value '1'). Unfortunately, some existing pywikibot classes like WbQuantity or WbTime do not work for this because they use Decimal or long/int rather than string values (however, it looks like WbMonolingualText could work). I thought making that change to the classes ought to be a separate step from providing access to the parse api.
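The return shape being described can be sketched with a hand-written stand-in for the wbparsevalue response; the exact response keys here are an assumption patterned on the comment above, not copied from the API documentation:

```python
def parse_results(api_response):
    # Hand back the raw parsed values: bare strings stay strings,
    # structured values (quantity, time, ...) stay as plain dicts
    # with whatever keys the API supplied, rather than being forced
    # into WbQuantity/WbTime which currently assume non-string numbers.
    return [r["value"] for r in api_response["results"]]

# Stand-in for a parsed quantity plus a plain string (illustrative)
response = {
    "results": [
        {"raw": "1.5+/-0.1", "type": "quantity",
         "value": {"amount": "+1.5", "upperBound": "+1.6",
                   "lowerBound": "+1.4", "unit": "1"}},
        {"raw": "hello", "type": "string", "value": "hello"},
    ],
}
parsed = parse_results(response)
```

Keeping the values as strings in dicts defers the string-vs-Decimal question to a later conversion step, which matches the "separate step" argument above.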
[Wikidata-bugs] [Maniphest] [Commented On] T112140: Provide a wrapper function in pywikibot around wbparsevalue
ArthurPSmith added a comment. See https://gerrit.wikimedia.org/r/#/c/297637/ for proposed implementation...
[Wikidata-bugs] [Maniphest] [Commented On] T119226: Very small (or very large) quantity values (represented in scientific notation) result in error in add/update via pywikibot/wikidata API
ArthurPSmith added a comment. Ok, that echoes something Tobias has said also about using strings and avoiding IEEE fp. I'm going to look at getting T112140 working first and then see if I can bring that implementation to bear on this.
[Wikidata-bugs] [Maniphest] [Claimed] T112140: Provide a wrapper function in pywikibot around wbparsevalue
ArthurPSmith claimed this task. ArthurPSmith added a comment. I'm going to have a shot at implementing this - it looks like it will be useful for a number of other open phabricator issues for pywikibot. I was figuring a function that will take all the parameters the API offers (datatype - a string, values - a list of strings, options - a dict, validate - boolean). Any other recommendations?
[Wikidata-bugs] [Maniphest] [Updated] T119226: Very small (or very large) quantity values (represented in scientific notation) result in error in add/update via pywikibot/wikidata API
ArthurPSmith added a comment. You're the one who brought up JSON! It sounds like the issue is something different, though - internal representation as strings? Anyway, are you recommending pywikibot use the wbparsevalue API for all (or at least numerical) input? That could be a good idea. Looks like there was already a phabricator ticket on this - T112140. Tobias, any thoughts?
[Wikidata-bugs] [Maniphest] [Commented On] T119226: Very small (or very large) quantity values (represented in scientific notation) result in error in add/update via pywikibot/wikidata API
ArthurPSmith added a comment. That restriction is NOT in the JSON spec: http://tools.ietf.org/html/rfc7159.html#section-6 - also, the leading plus is not required by JSON. Is there some other reason for the limitation in the wikidata code? DataValues is a wikidata-specific PHP library, right? I can't think of any good reason to keep this limitation on input values.
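If the limitation stays, a client-side workaround is to render floats in plain-decimal form before sending them; a minimal sketch, assuming the accepted pattern is an explicit sign followed by digits with no exponent (inferred from the "must match the pattern for decimal values" error, not from the DataValues source):

```python
from decimal import Decimal

def to_wikibase_decimal(value):
    # Re-render a Python float without scientific notation and with
    # an explicit leading sign, matching the assumed decimal pattern.
    d = Decimal(str(value))
    text = format(d, 'f')  # fixed-point, never exponent form
    if not text.startswith('-'):
        text = '+' + text
    return text

print(to_wikibase_decimal(1.0e-7))   # +0.0000001
print(to_wikibase_decimal(1.935e35))
print(to_wikibase_decimal(-1.5))     # -1.5
```

Going through Decimal (rather than '%f' formatting) avoids rounding the significant digits of very large or very small values.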
[Wikidata-bugs] [Maniphest] [Commented On] T119226: Very small (or very large) quantity values (represented in scientific notation) result in error in add/update via pywikibot/wikidata API
ArthurPSmith added a comment. Hmm. So is it a pywikibot problem or a wikibase API problem? Is pywikibot sending in JSON format?
[Wikidata-bugs] [Maniphest] [Commented On] T119226: Very small (or very large) quantity values (represented in scientific notation) result in error in add/update via pywikibot/wikidata API
ArthurPSmith added a comment. As far as testing goes, I have (in my own copy) added the following to the pywikibot tests/wikibase_edit_tests.py file (within the class TestWikibaseMakeClaim):

    def _check_quantity_claim(self, value, uncertainty):
        """Helper function to add and check quantity claims."""
        testsite = self.get_repo()
        item = self._clean_item(testsite, 'P64')
        # set new claim
        claim = pywikibot.page.Claim(testsite, 'P64', datatype='quantity')
        target = pywikibot.WbQuantity(value, error=uncertainty)
        claim.setTarget(target)
        item.addClaim(claim)
        item.get(force=True)
        claim = item.claims['P64'][0]
        self.assertEqual(claim.getTarget(), target)

    def test_medium_quantity_edit(self):
        """Attempt to add medium-size quantity claim."""
        self._check_quantity_claim(1.5, 0.1)

    def test_small_quantity_edit(self):
        """Attempt to add very small quantity claim."""
        self._check_quantity_claim(1.0e-7, 2.0e-8)

    def test_large_quantity_edit(self):
        """Attempt to add large quantity claim."""
        self._check_quantity_claim(1.935e35, 1e32)

    def test_negative_quantity_edit(self):
        """Attempt to add negative quantity claims."""
        self._check_quantity_claim(-1.5, 0.1)

When these tests are run via

    python pwb.py tests/wikibase_edit_tests.py -v

both test_large_quantity_edit() and test_small_quantity_edit() fail, with messages:

    Attempt to add large quantity claim. ... WARNING: API error invalid-snak: Invalid snak (Value must match the pattern for decimal values.)
    Attempt to add very small quantity claim. ... WARNING: API error invalid-snak: Invalid snak (Value must match the pattern for decimal values.)

TASK DETAIL https://phabricator.wikimedia.org/T119226
[Wikidata-bugs] [Maniphest] [Commented On] T119226: Very small (or very large) quantity values (represented in scientific notation) result in error in add/update via pywikibot/wikidata API
ArthurPSmith added a comment. Please note this is still an issue with the latest pywikibot code and current wikidata release - as of June 23, 2016. The following is the fix I have in the pywikibot core pywikibot/__init__.py file: instead of

    format(value, "+g")

we need:

    if math.fabs(value) < 0.001:
        num_str = float_fix.convert_sc_to_str(float(value))
        if value >= 0:
            num_str = '+{0}'.format(num_str)
    else:
        num_str = format(value, "+g")
    return num_str

TASK DETAIL https://phabricator.wikimedia.org/T119226
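[Editor's note: `float_fix.convert_sc_to_str` above is a helper from the commenter's local copy and is not shown in the thread. A minimal stand-in using the standard `decimal` module could look like the following sketch - this is an illustration of the technique, not the pywikibot implementation:]

```python
from decimal import Decimal

def convert_sc_to_str(value):
    """Render a float as a plain decimal string, never scientific notation."""
    d = Decimal(repr(value))
    # The 'f' presentation type on a Decimal always yields fixed-point form.
    return format(d, 'f')

def to_wikibase_decimal(value):
    """Prefix the explicit sign that the Wikibase decimal pattern expects."""
    s = convert_sc_to_str(value)
    return s if s.startswith('-') else '+' + s

print(to_wikibase_decimal(1.9e-9))  # +0.0000000019
print(to_wikibase_decimal(-1.5))    # -1.5
```

Going through `Decimal(repr(value))` preserves the float's shortest repr digits rather than expanding binary rounding error, which is why it is used instead of `Decimal(value)` directly.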
[Wikidata-bugs] [Maniphest] [Commented On] T128078: slashes in identifiers get rendered incorrectly
ArthurPSmith added a comment. Note this may be a problem just for the Freebase identifier; if you replace only the leading '%2f' with '/', the remaining '%2f' characters are correctly interpreted by the server (as they should be). The problem is that the formatter URL ends with the server name, which is probably a bad idea (but may be how Freebase needs to work?). TASK DETAIL https://phabricator.wikimedia.org/T128078
[Wikidata-bugs] [Maniphest] [Commented On] T91505: [Epic] Adding new datatypes to Wikidata
ArthurPSmith added a comment. In https://phabricator.wikimedia.org/T91505#2015282, @Ricordisamoa wrote: > In https://phabricator.wikimedia.org/T91505#2008209, @Swpb wrote: > > > this discussion <https://www.wikidata.org/wiki/Wikidata:Property_proposal/Natural_science#Right_Ascension_.2F_Declination_.2F_Distance> has raised the usefulness of a degrees-minutes-seconds datatype for angular information. > > The quantity datatype can fulfill that well once precision and custom formatters are supported properly. I don't know if that's true. Right ascension is peculiar: it is typically given in time units (hours, minutes, seconds, and decimals beyond that). Formatting could turn a numerical value (a number of seconds, say) into the right sort of string, but how do you handle data entry? Ideally we want somebody to be able to type in values as they find them in an astronomical table, which would be in hours-minutes-seconds format, not a numeric number of seconds. Is there any proposal to create a data entry formatter that could do that sort of thing? TASK DETAIL https://phabricator.wikimedia.org/T91505
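[Editor's note: the underlying conversion the comment alludes to is simple - 24 hours of right ascension span 360 degrees, so 1 hour = 15 degrees. A hypothetical parsing helper (the function name and the roughly quoted Betelgeuse coordinate are illustrative, not from the ticket) might look like:]

```python
def ra_hms_to_degrees(hours, minutes, seconds):
    """Convert right ascension given in hours-minutes-seconds to decimal
    degrees: 24h of RA correspond to 360 deg, so one hour is 15 deg."""
    return 15.0 * (hours + minutes / 60.0 + seconds / 3600.0)

# Betelgeuse's RA is roughly 5h 55m 10s, i.e. about 88.79 degrees:
print(ra_hms_to_degrees(5, 55, 10))
```

The hard part raised in the comment is not this arithmetic but the UI: letting an editor type "5h 55m 10s" directly and having a value parser do the conversion on entry.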
[Wikidata-bugs] [Maniphest] [Closed] T114547: Table of nuclides based on Wikidata
ArthurPSmith closed this task as "Resolved". ArthurPSmith claimed this task. ArthurPSmith added a comment. Herald added a subscriber: StudiesWorld. Probably should close this - it's been up live for a week or so now! Ran into a problem with query service bugs, but that seems to be resolved. See http://tools.wmflabs.org/ptable/nuclides TASK DETAIL https://phabricator.wikimedia.org/T114547
[Wikidata-bugs] [Maniphest] [Created] T126223: /ptable project is broken
ArthurPSmith created this task. ArthurPSmith added a subscriber: ArthurPSmith. ArthurPSmith added projects: Tool-Labs, Wikidata-Periodic-Table. Herald added subscribers: StudiesWorld, Aklapper. Herald added projects: Labs, Wikidata. TASK DESCRIPTION https://tools.wmflabs.org/ptable/ has been returning a 503 since Friday (Feb 5). I believe Ricordisamoa was updating something and tried to get hold of you via IRC without success... TASK DETAIL https://phabricator.wikimedia.org/T126223
[Wikidata-bugs] [Maniphest] [Changed Subscribers] T91505: [Epic] Adding new datatypes to Wikidata
ArthurPSmith added a subscriber: ArthurPSmith. TASK DETAIL https://phabricator.wikimedia.org/T91505
[Wikidata-bugs] [Maniphest] [Changed Subscribers] T67397: [Story] add a new datatype for formulae
ArthurPSmith added a subscriber: ArthurPSmith. TASK DETAIL https://phabricator.wikimedia.org/T67397
[Wikidata-bugs] [Maniphest] [Changed Subscribers] T110534: [Story] Add a new datatype for MatrixValue
ArthurPSmith added a subscriber: ArthurPSmith. TASK DETAIL https://phabricator.wikimedia.org/T110534
[Wikidata-bugs] [Maniphest] [Closed] T112130: Pywikibot crashes on items with quantities with units. Need to implement unit support in pywikibot WbQuanity
ArthurPSmith closed this task as "Resolved". TASK DETAIL https://phabricator.wikimedia.org/T112130
[Wikidata-bugs] [Maniphest] [Changed Subscribers] T112577: Make it possible to add a qualifier together with a new claim using new_claim.addQualifier()
ArthurPSmith added a subscriber: ArthurPSmith. ArthurPSmith added a comment. Herald added a subscriber: StudiesWorld. Just want to add support - this would be useful if possible! Of course it's not possible in the web interface (claim has to be added first, then qualifiers & sources in separate updates) so it may not be something that can be done easily. TASK DETAIL https://phabricator.wikimedia.org/T112577
[Wikidata-bugs] [Maniphest] [Commented On] T112130: Pywikibot crashes on items with quantities with units. Need to implement unit support in pywikibot WbQuanity
ArthurPSmith added a subscriber: ArthurPSmith. ArthurPSmith added a comment. I've been using pywikibot to handle quantities with units for the past few weeks, and it seems to work fine. I don't see what else needs to be done here? TASK DETAIL https://phabricator.wikimedia.org/T112130
[Wikidata-bugs] [Maniphest] [Created] T119226: Very small (or very large) quantity values (represented in scientific notation) result in error in add/update via pywikibot/wikidata API
ArthurPSmith created this task. ArthurPSmith added a subscriber: ArthurPSmith. ArthurPSmith added projects: Wikidata, Pywikibot-Wikidata, pywikibot-core. Herald added subscribers: pywikibot-bugs-list, StudiesWorld, Aklapper. TASK DESCRIPTION pywikibot was recently updated to better handle decimal values - see this gerrit code change: https://gerrit.wikimedia.org/r/#/c/250497/ - however, as I noted there in a comment at the end, there is a problem with very small values (and I believe also with very large ones), which the formatter converts to exponential notation. The Wikidata API does not accept numbers for quantity values formatted with exponential notation. Either the formatter on the pywikibot side needs to be smarter in converting values to a standard decimal form the API understands, or the API needs to be more generous in accepting scientific notation. Here's the symptom of the problem: I tried adding a "proportion" qualifier value of 1.9e-9. I get the following warning and stack trace:

    WARNING: API error invalid-snak: Invalid snak (Value must match the pattern for decimal values.)
    Traceback (most recent call last):
      File "pwb.py", line 248, in 
        if not main():
      ...
        claim.addQualifier(prop_qual, bot=True, summary="Adding branching fraction qualifier from NNDC.")
      File ".../core/pywikibot/page.py", line 4404, in addQualifier
        data = self.repo.editQualifier(self, qualifier, **kwargs)
      File ".../core/pywikibot/site.py", line 1297, in callee
        return fn(self, *args, **kwargs)
      File ".../core/pywikibot/site.py", line 7019, in editQualifier
        data = req.submit()
      File ".../core/pywikibot/data/api.py", line 2178, in submit
        raise APIError(**result['error'])
    pywikibot.data.api.APIError: invalid-snak: Invalid snak (Value must match the pattern for decimal values.)
    [messages:[{'parameters': [], 'name': 'wikibase-api-invalid-snak', 'html': {'*': 'Invalid snak'}}]; help:See https://www.wikidata.org/w/api.php for API usage]

I have modified the pywikibot code to format the quantity values as "+0.0000000019" rather than "+1.9e-09" and it goes through just fine. That is one solution, but it would probably be better for the API to handle scientific notation properly, as this will come up with any client that tries to provide very small (or large) values as quantities. TASK DETAIL https://phabricator.wikimedia.org/T119226
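[Editor's note: the root cause described here is easy to reproduce. pywikibot formatted quantity amounts with Python's "+g" format spec, which switches to exponent notation for very small or very large magnitudes, producing exactly the strings the Wikibase decimal pattern rejects:]

```python
# '+g' falls back to exponent notation outside a moderate magnitude range:
print(format(1.9e-9, '+g'))    # +1.9e-09   -- rejected by the Wikidata API
print(format(1.5, '+g'))       # +1.5       -- accepted
print(format(1.935e35, '+g'))  # +1.935e+35 -- rejected
```

This matches the failing values in the test results earlier in the thread: only the very small and very large quantities produce exponent-form strings.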
[Wikidata-bugs] [Maniphest] [Created] T117808: Display very large or very small quantity values using scientific notation
ArthurPSmith created this task. ArthurPSmith added a subscriber: ArthurPSmith. ArthurPSmith added a project: Wikidata. Herald added subscribers: StudiesWorld, Aklapper. TASK DESCRIPTION After entering a value for the Planck constant (https://www.wikidata.org/wiki/Q122894) in terms of its SI unit joule-seconds, the value of 6.6x10^-34 displays in wikidata as 0.00 joule-second (and the uncertainty value also disappears). This isn't very useful. In a similar vein, entering the half-life of Bi-209 (https://www.wikidata.org/wiki/Q1193) which is 1.9 +- 0.2 x 10^19 years displays in wikidata as 19,000,000,000,000,000,000±2,000,000,000,000,000,000, a little hard to read. I think python's 'g' format defaults are reasonable - for anything below 10^-4 or above 10^6 (or above the length of the number in the default precision) display in scientific notation with an 'e', otherwise display as a regular number. Or quantity display format could perhaps be a user-specified setting as with language. TASK DETAIL https://phabricator.wikimedia.org/T117808
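[Editor's note: the 'g' defaults referred to above can be seen directly. Python's 'g' presentation type uses fixed-point for moderate magnitudes and switches to scientific notation when the exponent is below -4 or at/above the precision (6 significant digits by default):]

```python
print(format(6.6e-34, 'g'))  # 6.6e-34  -- the Planck constant case
print(format(0.0001, 'g'))   # 0.0001   -- exponent -4: still fixed-point
print(format(0.00005, 'g'))  # 5e-05    -- below 1e-4: scientific
print(format(123456, 'g'))   # 123456   -- within 6 significant digits
print(format(1.9e19, 'g'))   # 1.9e+19  -- the Bi-209 half-life case
```

Applied to the examples in the description, this would render the Planck constant as 6.6e-34 and the Bi-209 half-life as 1.9e+19 rather than the unreadable forms shown.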
[Wikidata-bugs] [Maniphest] [Commented On] T114547: Table of nuclides based on Wikidata
ArthurPSmith added a comment. Ok - see https://gerrit.wikimedia.org/r/245591 for the change. TASK DETAIL https://phabricator.wikimedia.org/T114547
[Wikidata-bugs] [Maniphest] [Commented On] T114547: Table of nuclides based on Wikidata
ArthurPSmith added a comment. Thanks! I'm partially set up but I need to do a bit of reading. I will most likely get this in (with updates) Monday - hope that's ok! TASK DETAIL https://phabricator.wikimedia.org/T114547
[Wikidata-bugs] [Maniphest] [Commented On] T114547: Table of nuclides based on Wikidata
ArthurPSmith added a comment. I hacked on the periodic table code to get a very bare-bones nuclides code working... see files uploaded (nuclides.py has the main content, units.py is to do something with half-life data for now, nu_app.py runs the flask app, index.html is the template display - needs a lot of styling!) F2674065: nu_app.py <https://phabricator.wikimedia.org/F2674065> F2674070: nuclides.py <https://phabricator.wikimedia.org/F2674070> F2674076: units.py <https://phabricator.wikimedia.org/F2674076> F2674080: index.html <https://phabricator.wikimedia.org/F2674080> TASK DETAIL https://phabricator.wikimedia.org/T114547
[Wikidata-bugs] [Maniphest] [Commented On] T114547: Table of nuclides based on Wikidata
ArthurPSmith added a comment. I've never used Gerrit - I guess it's gerrit.wikimedia.org? phabricator/tools is the project? How does one get an account there? I've made a few changes but a couple of things are still in progress - it's getting closer; here's an image of what it looks like right now: F2676399: nuclides_chart_01.png <https://phabricator.wikimedia.org/F2676399> When you hover over a box in the chart it shows the label, and allows you to link to the wikidata page for that isotope. That part's nice... I need to handle the 'm' nuclides (excited state isomers), and also find some way of identifying the stable nuclides. The chart is color-coded by half-life (but not many nuclides have half-lives entered yet). TASK DETAIL https://phabricator.wikimedia.org/T114547
[Wikidata-bugs] [Maniphest] [Changed Subscribers] T114547: Table of nuclides based on Wikidata
ArthurPSmith added a subscriber: ArthurPSmith. TASK DETAIL https://phabricator.wikimedia.org/T114547