dcausse added a comment.

  Discussed this with Joseph as we believe that having to configure the cleanup 
job in another repo is not ideal.
  It seems that the long-term approach might be to use the data catalog 
(https://datahub.wikimedia.org/) to store retention metadata and have 
generic jobs rely on it to do the cleanups.
  One short-term option could be to copy refinery-drop-mediawiki-snapshots into 
the search airflow code base and use it for our needs.
  It's not ideal but might be acceptable for some time? @EBernhardson would 
that work for you?
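  For context, the core of what a copied drop-snapshots job would need to decide 
is which snapshot partitions fall outside the retention window. A minimal sketch 
of that selection logic (function name, YYYY-MM snapshot naming, and the default 
retention count are illustrative assumptions, not taken from refinery):

```python
def snapshots_to_drop(snapshots, keep=6):
    """Return the snapshot ids outside the retention window (oldest first).

    Assumes snapshot ids are YYYY-MM strings, which sort chronologically
    when sorted lexicographically.
    """
    ordered = sorted(snapshots)
    if keep <= 0:
        # No retention: everything is eligible for cleanup.
        return ordered
    # Keep the newest `keep` snapshots; everything older gets dropped.
    return ordered[:-keep]
```

  For example, `snapshots_to_drop(["2022-03", "2022-01", "2022-02"], keep=2)` 
would return `["2022-01"]`. A generic catalog-driven job would read `keep` (or 
an equivalent retention policy) from DataHub instead of hardcoding it per repo.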

TASK DETAIL
  https://phabricator.wikimedia.org/T303831

_______________________________________________
Wikidata-bugs mailing list -- [email protected]