Gehel added a comment.

  It is unclear to me what "running from dumps" means. WDQS already runs 
from dumps, in the sense that the dumps are loaded into the Blazegraph instance 
backing WDQS and then updated on the fly. Loading the dumps currently takes 
multiple weeks when we are lucky, and multiple months in cases of repeated 
errors (see T323096 <https://phabricator.wikimedia.org/T323096> or T263110 
<https://phabricator.wikimedia.org/T263110>), so "just loading from dumps" is 
prohibitively expensive in the current technical context. It is also unclear 
how this would help queries not fail.
  
  An approach that might be more successful is to provide an endpoint that 
contains only the subset of data relevant to the Scholia project. Working 
with a smaller graph is more likely to help queries complete in a timely 
manner. This is an approach that we want to investigate as part of T335067 
<https://phabricator.wikimedia.org/T335067>.
  
  I'm closing this for now. Feel free to reopen if you think I'm missing the 
point.

TASK DETAIL
  https://phabricator.wikimedia.org/T334082
