[Wikidata-bugs] [Maniphest] T367010: Wikidata Query Service example queries are missing

2024-06-09 Thread Audiodude
Audiodude created this task.
Audiodude added projects: Wikidata, Wikidata Query UI.
Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
  **Steps to replicate the issue**
  
  - Go to query.wikidata.org
  - Click the "examples" button
  
  **Expected behavior**
  
  The list of searchable examples appears.
  
  **Actual behavior**
  
  An empty modal pops up with zero example queries (the search bar is still 
present).
  
  **Software version** (on `Special:Version` page; skip for WMF-hosted wikis 
like Wikipedia):
  
  N/A
  
  **Other information** (browser name/version, screenshots, etc.):
  
  F55137206: image.png <https://phabricator.wikimedia.org/F55137206>

TASK DETAIL
  https://phabricator.wikimedia.org/T367010

EMAIL PREFERENCES
  https://phabricator.wikimedia.org/settings/panel/emailpreferences/

To: Audiodude
Cc: Aklapper, Audiodude, Danny_Benjafield_WMDE, S8321414, Astuthiodit_1, 
AWesterinen, karapayneWMDE, Invadibot, maantietaja, ItamarWMDE, Akuckartz, 
Dringsim, Nandana, Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, 
GoranSMilovanovic, Mahir256, QZanden, EBjune, KimKelting, merbst, LawExplorer, 
Salgo60, _jensen, rosalieper, Scott_WUaS, Jonas, Xmlizer, jkroll, 
Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, Lydia_Pintscher, Mbch331
___
Wikidata-bugs mailing list -- wikidata-bugs@lists.wikimedia.org
To unsubscribe send an email to wikidata-bugs-le...@lists.wikimedia.org


[Wikidata-bugs] [Maniphest] T322982: What are the expected HTTP status codes of the Wikidata Query Service?

2022-11-13 Thread Audiodude
Audiodude created this task.
Audiodude added a project: Wikidata-Query-Service.
Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
  I tried Googling this and looking through the user manual here: 
https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual
  
  I didn't see any description of what HTTP status codes the service endpoint 
might return in various situations. I'm hoping to get a list like:
  
  - 400: Returned when there is a generic problem with the query
  - 455: Returned when the query could not be parsed
  - 500: Unspecified server error
  - 501: Unknown error when trying to parse the query
  
  etc.
  
  My main goal is to design my service so that it can distinguish transient 
errors, caused by connectivity or temporary capacity issues, which can safely 
be retried, from errors caused by the query I'm sending, which will never 
succeed.
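  To make that concrete, here is a rough sketch of the retry logic such a 
list would enable. The 4xx-vs-5xx split below is my assumption, not 
documented WDQS behavior, and the helper names are mine:

```python
import time
import urllib.error
import urllib.request

# Status codes assumed to signal a transient problem (capacity or
# connectivity); the exact set is a guess pending real documentation.
RETRYABLE = {429, 500, 502, 503, 504}

def is_retryable(status: int) -> bool:
    """Assume 429 and most 5xx responses are worth retrying."""
    return status in RETRYABLE

def fetch_with_retry(url: str, attempts: int = 3, backoff: float = 1.0) -> bytes:
    """Fetch `url`, retrying only on assumed-transient HTTP errors."""
    for attempt in range(attempts):
        try:
            with urllib.request.urlopen(url) as resp:
                return resp.read()
        except urllib.error.HTTPError as err:
            if not is_retryable(err.code) or attempt == attempts - 1:
                raise  # permanent error (e.g. a bad query), or out of retries
            time.sleep(backoff * 2 ** attempt)  # exponential backoff
```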
  
  Thanks!

TASK DETAIL
  https://phabricator.wikimedia.org/T322982

To: Audiodude
Cc: Aklapper, Kelson, Audiodude, AWesterinen, MPhamWMF, CBogen, Namenlos314, 
Gq86, Lucas_Werkmeister_WMDE, EBjune, merbst, Jonas, Xmlizer, jkroll, 
Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles


[Wikidata-bugs] [Maniphest] T179879: Provide a 5-minute timeout in WDQS for trusted users using OAuth

2022-10-02 Thread Audiodude
Audiodude added a comment.


  Copied from the duplicate bug:
  
  > Is it possible to get allowlisted for a longer WDQS timeout? We're 
currently building Wikipedia On Demand, a project funded through WMF to query 
WDQS with SPARQL and generate (sometimes massive) article lists for Wikipedia 
projects, that can then be fed to a Zimfarm to generate a custom offline 
version of Wikipedia. We're not sure of the exact timeout value we need, but 60 
seconds seems too small.
  
  I've read through the comments on this (old) task, and I believe that many of 
the solutions in this comment (https://phabricator.wikimedia.org/T179879#4002776) 
could work for us. Specifically, an OAuth-gated privileged endpoint or 
offlining large tasks could both be viable.
  
  The last comment before my task was merged is from February 2021, about 18 
months ago. What can be done to get this further off the ground?
  
  Relatedly, I've considered setting up our own WDQS instance in AWS with the 
limits removed. Can someone provide pointers to the resource requirements of 
such a service? I've already looked at 
https://www.mediawiki.org/wiki/Wikidata_Query_Service/User_Manual#Standalone_service

TASK DETAIL
  https://phabricator.wikimedia.org/T179879

To: Audiodude
Cc: Kelson, Audiodude, Nikki, So9q, vitaly-zdanevich, NoInkling, Jheald, 
Bawolff, Bugreporter, Manu1400, Liuxinyu970226, MichaelSchoenitzer, Edgars2007, 
chasemp, Lydia_Pintscher, Magnus, MichaelSchoenitzer_WMDE, MisterSynergy, 
doctaxon, Jonas, Ash_Crow, Daniel_Mietchen, Lucas_Werkmeister_WMDE, Jane023, 
Base, Gehel, Smalyshev, Ijon, Aklapper, Astuthiodit_1, AWesterinen, bking, 
karapayneWMDE, Invadibot, MPhamWMF, maantietaja, CBogen, ItamarWMDE, Akuckartz, 
ET4Eva, Nandana, Namenlos314, Lahi, Gq86, GoranSMilovanovic, QZanden, EBjune, 
merbst, LawExplorer, Avner, _jensen, rosalieper, Scott_WUaS, FloNight, Xmlizer, 
jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, Mbch331


[Wikidata-bugs] [Maniphest] T319151: Pagination through Wikidata Query Service results

2022-10-02 Thread Audiodude
Audiodude created this task.
Audiodude added projects: Wikidata, Wikidata-Query-Service.
Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
  **Feature summary** (what you would like to be able to do and where):
  
  See also justification in https://phabricator.wikimedia.org/T319150. For the 
Wikipedia On Demand <https://meta.wikimedia.org/wiki/Kiwix/Wikipedia_on_demand> 
service, it is essential that we are able to generate and consume large 
datasets. It doesn't seem that the Wikidata Query Service currently supports 
pagination of returned results.
  
  **Use case(s)** (list the steps that you performed to discover that problem, 
and describe the actual underlying problem which you want to solve. Do not 
describe only a solution):
  
  When making a very large query, it would be difficult for our software to 
transfer a massive result set as a single response. We would prefer to be able 
to paginate through the results so that we can limit the amount of data we keep 
in memory at once.
  
  **Benefits** (why should this be implemented?):
  
  Pagination would benefit all clients making large queries, even if the 
60-second timeout isn't increased. It would allow clients to consume results in 
smaller chunks, saving working memory, and to be more resilient to network 
failures: a failed page request can easily be retried, whereas a failed 
monolithic response must be re-fetched from scratch.
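  For what it's worth, paging can be approximated client-side today with 
SPARQL's LIMIT/OFFSET solution modifiers, though each page re-runs the query 
against the 60-second timeout, and a stable ORDER BY is needed to avoid 
skipped or duplicated rows. A hypothetical sketch (the helper and query are 
illustrative, not an existing API):

```python
def paged_query(base_query: str, page: int, page_size: int = 1000) -> str:
    """Append ORDER BY/LIMIT/OFFSET for the given zero-based page.

    Assumes `base_query` binds ?item and has no trailing modifiers.
    """
    return (
        f"{base_query}\n"
        f"ORDER BY ?item\n"
        f"LIMIT {page_size}\n"
        f"OFFSET {page * page_size}"
    )

# Example: third page (rows 2000-2999) of all humans.
base = "SELECT ?item WHERE { ?item wdt:P31 wd:Q5 . }"
print(paged_query(base, 2))
```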

TASK DETAIL
  https://phabricator.wikimedia.org/T319151

To: Audiodude
Cc: Aklapper, Kelson, Audiodude, Astuthiodit_1, AWesterinen, karapayneWMDE, 
Invadibot, MPhamWMF, maantietaja, CBogen, ItamarWMDE, Akuckartz, Nandana, 
Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, 
EBjune, merbst, LawExplorer, _jensen, rosalieper, Scott_WUaS, Jonas, Xmlizer, 
jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, Mbch331


[Wikidata-bugs] [Maniphest] T319150: Allowlist for longer Wikidata Query Service timeout?

2022-10-02 Thread Audiodude
Audiodude created this task.
Audiodude added projects: Wikidata, Wikidata-Query-Service.
Restricted Application added a subscriber: Aklapper.

TASK DESCRIPTION
  **Feature summary** (what you would like to be able to do and where):
  
  Is it possible to get allowlisted for a longer WDQS timeout? We're currently 
building Wikipedia On Demand 
<https://meta.wikimedia.org/wiki/Kiwix/Wikipedia_on_demand>, a project funded 
through WMF to query WDQS with SPARQL and generate (sometimes massive) article 
lists for Wikipedia projects, that can then be fed to a Zimfarm 
<https://farm.openzim.org/> to generate a custom offline version of Wikipedia. 
We're not sure of the exact timeout value we need, but 60 seconds seems too 
small.
  
  **Use case(s)** (list the steps that you performed to discover that problem, 
and describe the actual underlying problem which you want to solve. Do not 
describe only a solution):
  
  We would like to run massive queries against WDQS, such as "All living 
persons" or "Geographic locations in Africa". Currently, our attempts to run 
these queries have hit the timeout limit. If there is any way to have our User 
Agent allowlisted so that it can have a longer timeout, we would greatly 
appreciate it.
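  In case it helps evaluate the request: per the Wikimedia User-Agent policy 
our client already sends an identifying User-Agent, which an allowlist could 
match on. A sketch of how such a request is built (the UA string and helper 
name are illustrative, not our production code):

```python
import urllib.parse
import urllib.request

WDQS_ENDPOINT = "https://query.wikidata.org/sparql"

def build_request(sparql: str) -> urllib.request.Request:
    """Build a GET request for WDQS with an identifying User-Agent."""
    params = urllib.parse.urlencode({"query": sparql, "format": "json"})
    return urllib.request.Request(
        f"{WDQS_ENDPOINT}?{params}",
        headers={
            # Illustrative UA per the Wikimedia User-Agent policy.
            "User-Agent": "WikipediaOnDemand/0.1 "
                          "(https://meta.wikimedia.org/wiki/Kiwix/Wikipedia_on_demand)",
        },
    )
```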
  
  **Benefits** (why should this be implemented?):
  
  If there currently is an allowlist, it should be easy to add our project, and 
we can work on providing further documentation and explanation. If there isn't 
an allowlist, it might benefit WDQS to add one for use cases like this.

TASK DETAIL
  https://phabricator.wikimedia.org/T319150

To: Audiodude
Cc: Aklapper, Kelson, Audiodude, Astuthiodit_1, AWesterinen, karapayneWMDE, 
Invadibot, MPhamWMF, maantietaja, CBogen, ItamarWMDE, Akuckartz, Nandana, 
Namenlos314, Lahi, Gq86, Lucas_Werkmeister_WMDE, GoranSMilovanovic, QZanden, 
EBjune, merbst, LawExplorer, _jensen, rosalieper, Scott_WUaS, Jonas, Xmlizer, 
jkroll, Wikidata-bugs, Jdouglas, aude, Tobias1984, Manybubbles, Mbch331