| Pchelolo updated the task description. (Show Details) |
CHANGES TO TASK DESCRIPTION
There's another example that is **44 MB in size** serialized. Kafka is capable of handling that, but it does not deal well with very large messages, so we can't increase the cap indefinitely. Maybe there's something we could do on the Wikidata side to reduce the size of these jobs?
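One possible mitigation, rather than raising the broker-side cap (Kafka's `message.max.bytes`), would be to compress the serialized job payload before producing it; repetitive JSON such as a large page list usually compresses very well. A minimal sketch, with a hypothetical job payload standing in for the real Wikidata job:

```python
import gzip
import json

# Hypothetical oversized job payload; the real jobs would carry Wikidata change data.
job = {
    "type": "refreshLinks",
    "pages": [{"id": i, "title": f"Q{i}"} for i in range(100_000)],
}

raw = json.dumps(job).encode("utf-8")
compressed = gzip.compress(raw)

# Repetitive JSON typically shrinks by an order of magnitude or more,
# which could keep a job under the message-size cap without raising it.
print(f"serialized: {len(raw)} bytes, gzipped: {len(compressed)} bytes")
```

The consumer side would need to detect and decompress such payloads, so this only helps if both producer and consumer of the job queue are updated together.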
...
To: Pchelolo
Cc: daniel, GWicke, Aklapper, Pchelolo, GoranSMilovanovic, QZanden, Izno, Eevans, mobrovac, Hardikj, Wikidata-bugs, aude, Mbch331
_______________________________________________ Wikidata-bugs mailing list [email protected] https://lists.wikimedia.org/mailman/listinfo/wikidata-bugs
