Hi Sean,
My system is Windows, 64-bit. I looked in the Resource Monitor: Java is
the only process using CPU, at about 13%; there is no disk activity related
to Java; and only about 6 GB of memory is used out of 56 GB in total.
My system responds very well. I don't think it is a system issue.
Thanks,
David
I think you'd have to say more about "stopped working". Is the GC
thrashing? Does the UI respond? Is the CPU busy or not?
On Mon, Mar 16, 2015 at 4:25 AM, Xi Shen wrote:
> Hi,
>
> I am running k-means using Spark in local mode. My data set is about 30k
> records, and I set the k = 1000.
>
I used "local[*]". The CPU hits about 80% when there are active jobs, then
it drops to about 13% and hangs there for a very long time.
Thanks,
David
On Mon, 16 Mar 2015 17:46 Akhil Das wrote:
> How many threads are you allocating while creating the sparkContext? like
> local[4] will allocate 4 threads
How many threads are you allocating when creating the SparkContext? For
example, local[4] will allocate 4 threads. You can try increasing it to a
higher number, and also try setting the level of parallelism to a higher
number.
Thanks
Best Regards
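For reference, a minimal sketch of how both suggestions might be applied when building the context (the thread count 8 and the parallelism value 16 are just illustrative; "spark.default.parallelism" is the standard Spark config key):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Run locally with 8 worker threads instead of the default single thread;
// local[*] would instead use one thread per available core.
val conf = new SparkConf()
  .setMaster("local[8]")
  .setAppName("kmeans-test")
  // Raise the default number of partitions used for shuffles and
  // distributed operations when no explicit partition count is given.
  .set("spark.default.parallelism", "16")

val sc = new SparkContext(conf)
```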
On Mon, Mar 16, 2015 at 9:55 AM, Xi Shen wrote:
> Hi,
>
Hi,
I am running k-means using Spark in local mode. My data set has about 30k
records, and I set k = 1000.
The algorithm started and finished 13 jobs according to the UI monitor, then
it stopped working.
The last log I saw was:
[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cl