I will soon start the two Wikistats jobs, which run for several weeks each
month.

They might use two cores each: one for unzip, one for Perl. 

How many cores are there anyway?
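For reference, a quick way to answer this yourself on a Linux host such as stat1005 (a sketch; the exact hardware figures would of course come from the admins):

```shell
# Count logical CPU cores and report total memory
# before launching long-running jobs.
nproc                          # number of logical CPU cores
grep MemTotal /proc/meminfo    # total RAM in kB
```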

 

Cheers,

Erik

 

From: Analytics [mailto:analytics-boun...@lists.wikimedia.org] On Behalf Of 
Adrian Bielefeldt
Sent: Saturday, August 12, 2017 19:44
To: analytics@lists.wikimedia.org
Subject: [Analytics] Resources stat1005

 

Hello everyone,

I wanted to ask about resource allocation on stat1005. We 
(<https://meta.wikimedia.org/wiki/Research:Understanding_Wikidata_Queries>) need 
quite a bit since we process every entry in wdqs_extract, and I was wondering 
how many cores and how much memory we can use without conflicting with anyone 
else.

Greetings,

Adrian

_______________________________________________
Analytics mailing list
Analytics@lists.wikimedia.org
https://lists.wikimedia.org/mailman/listinfo/analytics