Re: Managed memory leak detected + OutOfMemoryError: Unable to acquire X bytes of memory, got 0

2016-08-03 Thread Rychnovsky, Dusan
OK, thank you. What do you suggest I do to get rid of the error? From: Ted Yu <yuzhih...@gmail.com> Sent: Wednesday, August 3, 2016 6:10 PM To: Rychnovsky, Dusan Cc: user@spark.apache.org Subject: Re: Managed memory leak detected + OutOfMemoryError:

Re: Managed memory leak detected + OutOfMemoryError: Unable to acquire X bytes of memory, got 0

2016-08-03 Thread Rychnovsky, Dusan
I am confused. I tried to find a Spark release that has this issue fixed, i.e. with https://github.com/apache/spark/pull/13027/ merged in, but it looks like the patch has not been merged for 1.6. How do I get a fixed 1.6 version? Thanks, Dusan

Re: Managed memory leak detected + OutOfMemoryError: Unable to acquire X bytes of memory, got 0

2016-08-03 Thread Rychnovsky, Dusan
I have 1.6.0, so the fix should be included, right? Or what else do I need to do? Thanks, Dusan From: Ted Yu <yuzhih...@gmail.com> Sent: Wednesday, August 3, 2016 3:52 PM To: Rychnovsky, Dusan Cc: user@spark.apache.org Subject: Re: Managed memory leak detected

Managed memory leak detected + OutOfMemoryError: Unable to acquire X bytes of memory, got 0

2016-08-03 Thread Rychnovsky, Dusan
The job keeps failing in the same way (I have tried a few times). What could be causing such an error? I have a feeling that I'm not providing enough context to understand the issue. Please ask for any other information you need. Thank you, Dusan

Application not showing in Spark History

2016-08-02 Thread Rychnovsky, Dusan
... the application appears in Spark History correctly. What am I missing? Also, is this a good way to launch a Spark application from within a Java application, or is there a better way? Thanks, Dusan
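For launching a Spark application programmatically, Spark ships a launcher API (`org.apache.spark.launcher.SparkLauncher`, available since 1.4) that is generally preferable to shelling out to `spark-submit` by hand. A minimal sketch follows; the Spark home, jar path, and main class are placeholders, and note that an application only shows up in the History Server if event logging is enabled and writes to the directory the server reads:

```scala
import org.apache.spark.launcher.SparkLauncher

object LaunchJob {
  def main(args: Array[String]): Unit = {
    // All paths and class names below are hypothetical placeholders.
    val process = new SparkLauncher()
      .setSparkHome("/opt/spark")                          // where Spark is installed
      .setAppResource("/path/to/my-app.jar")               // application jar
      .setMainClass("com.example.MyApp")                   // application entry point
      .setMaster("yarn-cluster")
      .setConf("spark.eventLog.enabled", "true")           // needed for Spark History
      .setConf("spark.eventLog.dir", "hdfs:///spark-logs") // must match the history server's log dir
      .launch()                                            // returns a java.lang.Process
    process.waitFor()
  }
}
```

If the application still does not appear in the History Server, the usual suspect is a mismatch between `spark.eventLog.dir` and the directory the history server is configured to scan.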

FullOuterJoin on Spark

2016-06-21 Thread Rychnovsky, Dusan
in this case, though, is to hold just the domain and iterate over all corresponding pages, one at a time. What would be the best way to do this on Spark? Thank you, Dusan Rychnovsky
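One way to get "all pages for a domain" in Spark is to key both datasets by domain and use `cogroup`, which, unlike a plain `join`, also keeps keys that appear on only one side (the full-outer behaviour the thread subject refers to). A sketch under assumed types (the `Page` case class and the sample data are hypothetical):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical record type: each page carries its domain.
case class Page(domain: String, url: String)

object DomainPages {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("domain-pages").setMaster("local[*]"))

    val domains = sc.parallelize(Seq("a.com" -> 1, "b.com" -> 2))
    val pages = sc.parallelize(Seq(Page("a.com", "a.com/x"), Page("a.com", "a.com/y")))
      .map(p => p.domain -> p)

    // cogroup groups both sides by key; each domain is seen once,
    // with its matching pages exposed as an Iterable to walk through.
    domains.cogroup(pages).foreach { case (domain, (ids, ps)) =>
      ps.foreach(p => println(s"$domain -> ${p.url}"))
    }
    sc.stop()
  }
}
```

One caveat: `cogroup` (like `groupByKey`) buffers all values for a key in memory, so if a single domain has a very large number of pages, an alternative such as `repartitionAndSortWithinPartitions` plus a streaming pass over each partition may scale better.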

How to create correct data frame for classification in Spark ML?

2015-06-25 Thread dusan
,masters,female,B 42,40,bachelors,male,B age and hours_per_week are integers, while the other features, including the label salaryRange, are categorical (String). Loading this CSV file (let's call it sample.csv) can be done with the Spark CSV library like this: val data = sqlContext.csvFile(/home/dusan
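Once the CSV is loaded, Spark ML classifiers expect a numeric `label` column and a `features` vector column. A common way to get there is `StringIndexer` for each categorical column plus `VectorAssembler`; a sketch follows, where the column names (`education`, `sex`, etc.) are assumptions since the CSV header is truncated in the message:

```scala
import org.apache.spark.ml.Pipeline
import org.apache.spark.ml.feature.{StringIndexer, VectorAssembler}

// Assumes `data` is the DataFrame loaded from sample.csv, with numeric
// columns age and hours_per_week and (hypothetical) string columns
// education, sex, and the label column salaryRange.
val eduIndexer   = new StringIndexer().setInputCol("education").setOutputCol("educationIdx")
val sexIndexer   = new StringIndexer().setInputCol("sex").setOutputCol("sexIdx")
val labelIndexer = new StringIndexer().setInputCol("salaryRange").setOutputCol("label")

// Combine the numeric and indexed columns into a single feature vector.
val assembler = new VectorAssembler()
  .setInputCols(Array("age", "hours_per_week", "educationIdx", "sexIdx"))
  .setOutputCol("features")

val pipeline = new Pipeline().setStages(Array(eduIndexer, sexIndexer, labelIndexer, assembler))
val prepared = pipeline.fit(data).transform(data)
// `prepared` now has the `label` and `features` columns that classifiers
// such as LogisticRegression or DecisionTreeClassifier expect.
```

Bundling the indexers and assembler into a `Pipeline` keeps the same transformations reusable for scoring new data later.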