I would like to monitor CPU/memory usage.

I read the Metrics section at
http://spark.apache.org/docs/1.3.1/monitoring.html.

Here is my $SPARK_HOME/conf/metrics.properties:

# Enable CsvSink for all instances
*.sink.csv.class=org.apache.spark.metrics.sink.CsvSink

# Polling period for CsvSink
*.sink.csv.period=1

*.sink.csv.unit=seconds

# Polling directory for CsvSink
*.sink.csv.directory=/home/spark/Documents/test/
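As an aside, my understanding is that the `*` prefix applies the sink to every instance; a sink can also be scoped to individual instances instead (same CsvSink settings, just narrower, following the conventions in conf/metrics.properties.template):

```
# Enable CsvSink only for the master and worker instances
master.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
worker.sink.csv.class=org.apache.spark.metrics.sink.CsvSink
```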

In my Spark application I set this on the SparkConf before creating the SparkContext:

val conf = new SparkConf()
  .set("spark.metrics.conf",
       "/home/spark/development/spark/conf/metrics.properties")
val sc = new SparkContext(conf)

What I tried:

1/ spark@cv-local:~$ $SPARK_HOME/sbin/stop-all.sh
2/ spark@cv-local:~$ ls ~/Documents/test/           # --> empty
3/ spark@cv-local:~$ $SPARK_HOME/sbin/start-all.sh
4/ spark@cv-local:~$ ls ~/Documents/test/
master.apps.csv         master.workers.csv      worker.coresUsed.csv
master.waitingApps.csv  worker.coresFree.csv    worker.executors.csv
worker.memFree_MB.csv   worker.memUsed_MB.csv

5/ Start my spark application (sbt "run-main ..")
6/ spark@cv-local:~/Documents/test$ ls
local-1444064889008.<driver>.BlockManager.disk.diskSpaceUsed_MB.csv
local-1444064889008.<driver>.BlockManager.memory.maxMem_MB.csv
local-1444064889008.<driver>.BlockManager.memory.memUsed_MB.csv
local-1444064889008.<driver>.BlockManager.memory.remainingMem_MB.csv
local-1444064889008.<driver>.DAGScheduler.job.activeJobs.csv
local-1444064889008.<driver>.DAGScheduler.job.allJobs.csv
local-1444064889008.<driver>.DAGScheduler.stage.failedStages.csv
local-1444064889008.<driver>.DAGScheduler.stage.runningStages.csv
local-1444064889008.<driver>.DAGScheduler.stage.waitingStages.csv
master.apps.csv
master.waitingApps.csv
master.workers.csv
worker.coresFree.csv
worker.coresUsed.csv
worker.executors.csv
worker.memFree_MB.csv
worker.memUsed_MB.csv

I did these 6 steps both locally and on a cluster. The result is the same,
except that on the cluster the worker.*.csv files are missing.
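For what it's worth, the CSVs themselves are easy to inspect from the shell. A minimal sketch (the sample file below is made up for illustration; as far as I can tell, real CsvSink output for a gauge uses the same two-column t,value layout):

```shell
# Fake sample in the CsvSink gauge layout (header "t,value");
# a real file would be e.g. ~/Documents/test/worker.memUsed_MB.csv
cat > /tmp/worker.memUsed_MB.csv <<'EOF'
t,value
1444064889,512
1444064890,640
EOF

# Print the most recent value of the metric
tail -n 1 /tmp/worker.memUsed_MB.csv | cut -d, -f2
```

The last command prints 640, the value from the newest sample.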

When I run my application, only the driver seems to be monitored. Why?

How can I get the memory/cpu usage of the master and slaves?







