Regards,
Raymond Liu
From: 牛兆捷 [mailto:nzjem...@gmail.com]
Sent: Thursday, September 04, 2014 2:57 PM
To: Liu, Raymond
Cc: Patrick Wendell; u...@spark.apache.org; dev@spark.apache.org
Subject: Re: memory size for caching RDD
Oh I see.
I want to implement something like this: sometimes I need to
You don't need to. The memory is not statically allocated to the RDD cache; the setting is just an upper limit.
If the RDD cache doesn't use up the memory, it remains available for other
usage, except for the portions also controlled by other memoryFraction confs, e.g.
spark.shuffle.memoryFraction, which likewise only sets an upper limit.
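As a sketch of the kind of settings being discussed (assuming Spark 1.x-era configuration; these fraction properties were later replaced by unified memory management), a spark-defaults.conf fragment might look like:

```properties
# Hypothetical spark-defaults.conf fragment (Spark 1.x-era settings).
# These fractions are upper limits, not static reservations: memory not
# used by the RDD cache remains available to the rest of the executor.
spark.storage.memoryFraction   0.6   # cap on heap usable for cached RDD blocks
spark.shuffle.memoryFraction   0.2   # cap on heap usable for shuffle aggregation buffers
```

The key point from the reply above is that lowering one fraction does not "give" memory to anything; each fraction only caps what that subsystem may claim.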
So how do I run the check locally?
On the master tree, `sbt mimaReportBinaryIssues` seems to report a lot of
errors. Do we need to modify SparkBuild.scala etc. to run it locally? I could
not figure out from its console output how Jenkins runs the check.
Best Regards,
Raymond Liu