hoo added a comment.
My suggestion is, once the fix that stops new duplicates from being introduced (https://gerrit.wikimedia.org/r/354550) is in, to list all duplicate row ids in a text file (generated via a query similar to the one I already mentioned). We can then delete these in batches using a maintenance script that reads the term_row_ids to delete from that file.
TASK DETAIL
EMAIL PREFERENCES
To: hoo
Cc: gerritbot, daniel, Smalyshev, jcrespo, aude, Aklapper, hoo, GoranSMilovanovic, Adik2382, Th3d3v1ls, Ramalepe, Liugev6, QZanden, Lewizho99, Maathavan, Izno, Wikidata-bugs, Mbch331
_______________________________________________
Wikidata-bugs mailing list
[email protected]
https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
