Hi all,

I am trying to run a Spark program on a server. It is not a cluster, just a
single server. I want to configure my Spark program to use at most 20 CPU
cores, because this machine is also shared by other users.

I know I can set local[K] as the value of the master URL to limit how many
worker threads the program uses. But after I run my program, only one or two
CPUs are actually used, and the program takes a very long time to finish that
way.
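For context, this is a minimal sketch of how I am setting local[K] (the
application class name and jar here are placeholders, not my real job):

```shell
# Cap local-mode Spark at 20 worker threads via the master URL
spark-submit --master "local[20]" --class MyApp my-app.jar
```

The same limit can also be set in code with SparkConf's
setMaster("local[20]") before the SparkContext is created.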

Has anyone met a similar situation, or does anyone have a suggestion?

Thanks.

Xiang
-- 
Xiang Huo
Department of Computer Science
University of Illinois at Chicago(UIC)
Chicago, Illinois
US
Email: [email protected]
           or [email protected]
