Hi,

I am running k-means using Spark in local mode. My data set is about 30k
records, and I set k = 1000.

The algorithm started and finished 13 jobs according to the UI monitor, then
it stopped working.

The last log line I saw was:

[Spark Context Cleaner] INFO org.apache.spark.ContextCleaner - Cleaned
broadcast 16

There are many similar log lines repeated, but it always seems to stop at the
16th.

If I lower the *k* value, the algorithm terminates normally. So I just want
to know what's wrong with *k = 1000*.
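For reference, my setup is roughly the following (a minimal sketch assuming
Spark MLlib's KMeans; the input path and the iteration count are placeholders,
not my exact values):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.mllib.clustering.KMeans
import org.apache.spark.mllib.linalg.Vectors

// Local-mode context, as described above.
val conf = new SparkConf().setAppName("kmeans-k1000").setMaster("local[*]")
val sc = new SparkContext(conf)

// ~30k records, one whitespace-separated feature vector per line
// ("data.txt" is a placeholder path).
val data = sc.textFile("data.txt")
  .map(line => Vectors.dense(line.split(' ').map(_.toDouble)))
  .cache()

// k = 1000 is where it hangs; smaller k terminates normally.
val model = KMeans.train(data, 1000, 20)
```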


Thanks,
David
