Re: Measure Bytes Read and Peak Memory Usage for Query

2015-03-24 Thread anamika gupta
Yeah, thanks, I can now see the memory usage.

Could you also verify whether Bytes Read == the combined size of all RDDs?

All my RDDs are completely cached in memory, so the combined size of my
RDDs equals the memory used (verified from the Web UI).
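
As a cross-check, the same numbers the Storage tab shows can also be pulled
programmatically from the driver; a small sketch, assuming the SparkContext
is available as sc:

// Print the cached size of every persisted RDD and their combined
// in-memory size, i.e. the figures the Web UI reports under Storage.
val storageInfo = sc.getRDDStorageInfo   // Array[RDDInfo], one per cached RDD
storageInfo.foreach { info =>
  println(s"RDD ${info.id} '${info.name}': memSize=${info.memSize} B, diskSize=${info.diskSize} B")
}
println(s"Combined size in memory: ${storageInfo.map(_.memSize).sum} bytes")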


On Fri, Mar 20, 2015 at 12:07 PM, Akhil Das ak...@sigmoidanalytics.com
wrote:

 You could cache your data and see the memory usage under the Storage tab
 in the driver UI (runs on port 4040).

 Thanks
 Best Regards

 On Fri, Mar 20, 2015 at 12:02 PM, anu anamika.guo...@gmail.com wrote:

 Hi All

 I would like to measure Bytes Read and Peak Memory Usage for a Spark SQL
 Query.

 Please clarify: is Bytes Read equal to the aggregate size of all RDDs?
 All my RDDs are in memory, with 0 B spilled to disk.

 I am also not sure how to measure Peak Memory Usage.








Measure Bytes Read and Peak Memory Usage for Query

2015-03-20 Thread anu
Hi All

I would like to measure Bytes Read and Peak Memory Usage for a Spark SQL
Query. 

Please clarify: is Bytes Read equal to the aggregate size of all RDDs?
All my RDDs are in memory, with 0 B spilled to disk.

I am also not sure how to measure Peak Memory Usage.
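
From what I understand, the Bytes Read figure in the web UI is aggregated
from the per-task input metrics, so one option might be to total those with
a SparkListener. A rough sketch against what I believe is the Spark 1.x API
(inputMetrics is an Option there); please correct me if the field names are
wrong for your version:

import org.apache.spark.scheduler.{SparkListener, SparkListenerTaskEnd}

// Sums bytesRead over all completed tasks. Listener events are delivered
// on a single thread, so the plain += is fine for a sketch.
class BytesReadListener extends SparkListener {
  @volatile var totalBytesRead = 0L
  override def onTaskEnd(taskEnd: SparkListenerTaskEnd): Unit = {
    // taskMetrics can be null for failed tasks; inputMetrics is an Option in 1.x
    for (m <- Option(taskEnd.taskMetrics); in <- m.inputMetrics) {
      totalBytesRead += in.bytesRead
    }
  }
}

val listener = new BytesReadListener
sc.addSparkListener(listener)
sqlContext.sql("<your query here>").collect()   // placeholder query
println(s"Bytes read: ${listener.totalBytesRead}")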



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Measuer-Bytes-READ-and-Peak-Memory-Usage-for-Query-tp22159.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Measure Bytes Read and Peak Memory Usage for Query

2015-03-20 Thread Akhil Das
You could cache your data and see the memory usage under the Storage tab
in the driver UI (runs on port 4040).
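
For example, a minimal sketch, assuming a SQLContext named sqlContext and a
temp table registered as "people" (both names are just placeholders):

// Mark the table for in-memory caching, then run an action so the cache
// actually fills; the cached size then shows up under the Storage tab
// at http://<driver-host>:4040
sqlContext.cacheTable("people")
sqlContext.sql("SELECT COUNT(*) FROM people").collect()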

Thanks
Best Regards

On Fri, Mar 20, 2015 at 12:02 PM, anu anamika.guo...@gmail.com wrote:

 Hi All

 I would like to measure Bytes Read and Peak Memory Usage for a Spark SQL
 Query.

 Please clarify: is Bytes Read equal to the aggregate size of all RDDs?
 All my RDDs are in memory, with 0 B spilled to disk.

 I am also not sure how to measure Peak Memory Usage.


