Hello,

These days I'm processing Wikipedia dumps. Today I tried the English Wikipedia
one, which is split into 150+ chunks (pages-meta-history*.7z).

I have a bash script that launches the jsub jobs, one job per chunk, so I
queued more than 150 jobs. After that, I saw that 95 of them had started and
were spread across the execution nodes.
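The script is basically this (a simplified sketch; the dump path and
process_chunk.sh are placeholders for my actual paths and processing script):

  #!/bin/bash
  # Submit one jsub job per English Wikipedia history chunk.
  for chunk in /data/dumps/enwiki/pages-meta-history*.7z; do
      # Each chunk becomes its own grid job; the scheduler picks the node.
      jsub ./process_chunk.sh "$chunk"
  done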

I saw the load on some of the nodes reach 250%. Is this normal? I stopped
all of them because I'm not sure whether I should launch small batches, say
10 at a time, or whether it is OK to launch them all and ignore the CPU load
on the execution nodes.
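In case it helps, this is what I mean by small batches (again just a sketch;
process_chunk.sh is the same placeholder as above, and the qstat polling is
only my guess at how to wait for one batch to drain before the next):

  #!/bin/bash
  # Hypothetical batched variant: submit 10 jobs, then wait until my own
  # jobs have finished before submitting the next 10.
  batch=10
  count=0
  for chunk in /data/dumps/enwiki/pages-meta-history*.7z; do
      jsub ./process_chunk.sh "$chunk"
      count=$((count + 1))
      if (( count % batch == 0 )); then
          # Poll until I have no queued or running jobs left.
          while [ -n "$(qstat -u "$USER")" ]; do
              sleep 60
          done
      fi
  done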

Regards