Do Spark executors restrict native heap vs JVM heap?

2014-11-02 Thread Paul Wais
Thanks Sean! My novice understanding is that the 'native heap' is the
address space not allocated to the JVM heap, but I wanted to check to see
if I'm missing something.  I found out my issue appeared to be actual
memory pressure on the executor machine.  There was space for the JVM heap
but not much more.
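
For reference, a minimal sketch (assuming a Linux executor host; the
object name MemCheck is just for illustration) of how to tell JVM heap
headroom apart from machine-level memory pressure:

    // JVM heap stats alone (freeMemory) don't show whether the machine
    // itself has memory left over for native allocations.
    object MemCheck {
      def main(args: Array[String]): Unit = {
        val rt = Runtime.getRuntime
        println(s"JVM heap free:  ${rt.freeMemory() >> 20} MB")
        println(s"JVM heap total: ${rt.totalMemory() >> 20} MB")
        println(s"JVM heap max:   ${rt.maxMemory() >> 20} MB")
        // On Linux, /proc/meminfo shows what is left for native code
        // once the JVM heap and other processes are accounted for.
        scala.io.Source.fromFile("/proc/meminfo").getLines()
          .filter(l => l.startsWith("MemTotal") || l.startsWith("MemFree"))
          .foreach(println)
      }
    }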

On Thu, Oct 30, 2014 at 12:49 PM, Sean Owen so...@cloudera.com wrote:
 No, but, the JVM also does not allocate memory for native code on the
 heap. I don't think the heap has any bearing on whether your native
 code can allocate more memory, except that of course the heap is also
 taking memory.

 On Oct 30, 2014 6:43 PM, Paul Wais pw...@yelp.com wrote:

 Dear Spark List,

 I have a Spark app that runs native code inside map functions.  I've
 noticed that the native code sometimes sets errno to ENOMEM, indicating
 a lack of available memory.  However, I've verified that the /JVM/ has
 plenty of heap space available -- Runtime.getRuntime().freeMemory()
 shows gigabytes free and the native code needs only megabytes.  Does
 Spark limit the /native/ heap size somehow?  I'm poking through the
 executor code now but don't see anything obvious.

 Best Regards,
 -Paul Wais





Re: Do Spark executors restrict native heap vs JVM heap?

2014-11-02 Thread Sean Owen
Yes, that's correct to my understanding and the probable explanation of
your issue. There are no additional limits or differences from how the JVM
works here.
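
If the executor host is simply tight on memory, the usual remedy is to
size the executor heap below machine capacity so native code has
headroom. A rough sketch (sizes are illustrative; the memoryOverhead
property applies to YARN deployments, so check the docs for your Spark
version):

    # Leave headroom below the JVM heap for native allocations.
    # Sizes below are illustrative; adjust for your cluster.
    spark-submit \
      --executor-memory 8g \
      --conf spark.yarn.executor.memoryOverhead=2048 \
      my-app.jar   # hypothetical application jar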
On Nov 3, 2014 4:40 AM, Paul Wais pw...@yelp.com wrote:

 Thanks Sean! My novice understanding is that the 'native heap' is the
 address space not allocated to the JVM heap, but I wanted to check to see
 if I'm missing something.  I found out my issue appeared to be actual
 memory pressure on the executor machine.  There was space for the JVM heap
 but not much more.

 On Thu, Oct 30, 2014 at 12:49 PM, Sean Owen so...@cloudera.com wrote:
  No, but, the JVM also does not allocate memory for native code on the
  heap. I don't think the heap has any bearing on whether your native
  code can allocate more memory, except that of course the heap is also
  taking memory.
 
  On Oct 30, 2014 6:43 PM, Paul Wais pw...@yelp.com wrote:
 
  Dear Spark List,
 
  I have a Spark app that runs native code inside map functions.  I've
  noticed that the native code sometimes sets errno to ENOMEM, indicating
  a lack of available memory.  However, I've verified that the /JVM/ has
  plenty of heap space available -- Runtime.getRuntime().freeMemory()
  shows gigabytes free and the native code needs only megabytes.  Does
  Spark limit the /native/ heap size somehow?  I'm poking through the
  executor code now but don't see anything obvious.
 
  Best Regards,
  -Paul Wais
 
 




Re: Do Spark executors restrict native heap vs JVM heap?

2014-10-30 Thread Sean Owen
No, but, the JVM also does not allocate memory for native code on the heap.
I don't think the heap has any bearing on whether your native code can
allocate more memory, except that of course the heap is also taking memory.
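
To make the distinction concrete: native allocations (malloc from JNI
code, or sun.misc.Unsafe in the sketch below) come out of the process
address space, not the JVM heap, so they can fail even while
freeMemory() reports gigabytes free. A minimal sketch, assuming a
JDK 7/8-era JVM where sun.misc.Unsafe is reachable via reflection:

    // Off-heap allocation via sun.misc.Unsafe (backed by malloc).
    // JVM heap headroom says nothing about whether this succeeds.
    val field = classOf[sun.misc.Unsafe].getDeclaredField("theUnsafe")
    field.setAccessible(true)
    val unsafe = field.get(null).asInstanceOf[sun.misc.Unsafe]

    val bytes = 64L * 1024 * 1024              // 64 MB, illustrative
    val addr = unsafe.allocateMemory(bytes)    // throws OutOfMemoryError
                                               // if the OS is out of memory
    try {
      unsafe.setMemory(addr, bytes, 0.toByte)  // touch pages so they commit
    } finally {
      unsafe.freeMemory(addr)                  // always release malloc'd memory
    }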
On Oct 30, 2014 6:43 PM, Paul Wais pw...@yelp.com wrote:

 Dear Spark List,

 I have a Spark app that runs native code inside map functions.  I've
 noticed that the native code sometimes sets errno to ENOMEM, indicating
 a lack of available memory.  However, I've verified that the /JVM/ has
 plenty of heap space available -- Runtime.getRuntime().freeMemory()
 shows gigabytes free and the native code needs only megabytes.  Does
 Spark limit the /native/ heap size somehow?  I'm poking through the
 executor code now but don't see anything obvious.

 Best Regards,
 -Paul Wais

 -
 To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
 For additional commands, e-mail: user-h...@spark.apache.org