Hi Grega,

This memory is not taken away from the application in any way, so the setting doesn't matter if you don't use caching. You don't need to configure it in any special way.
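(For what it's worth, if you ever do want to lower it, a minimal sketch for a Spark 0.8-style standalone deployment would be to pass the property through SPARK_JAVA_OPTS in conf/spark-env.sh; the value shown here is just an illustration, not a recommendation:)

```shell
# conf/spark-env.sh (hypothetical example)
# Lower the fraction of the heap reserved for the RDD cache.
# Per the reply above, this is effectively a no-op if you never cache RDDs.
SPARK_JAVA_OPTS="-Dspark.storage.memoryFraction=0.1"
export SPARK_JAVA_OPTS
```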
Matei

On Nov 8, 2013, at 8:01 AM, Grega Kešpret <[email protected]> wrote:

> Hi,
>
> The docs say: "Fraction of Java heap to use for Spark's memory cache. This
> should not be larger than the 'old' generation of objects in the JVM, which
> by default is given 2/3 of the heap, but you can increase it if you configure
> your own old generation size."
>
> If we are not caching any RDDs, does it mean that we only have
> 1 - memoryFraction of the heap available for "normal" JVM objects? Would it
> make sense then to set memoryFraction to 0?
>
> Thanks,
>
> Grega
> --
> Grega Kešpret
> Analytics engineer
>
> Celtra — Rich Media Mobile Advertising
> celtra.com | @celtramobile
