Hello all,

    I am trying to profile Spark in order to see which functions are called during the execution of an application. Based on my results, I see that in the SVM benchmark, TaskMemoryManager allocates extra memory using HeapMemoryAllocator, whereas during the Linear Regression application's execution TaskMemoryManager does not need to allocate extra memory.

Can anyone explain to me why this happens? Thanks a lot, and I am looking forward to your reply.

--Jack Kolokasis


---------------------------------------------------------------------
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org