Bryce Ageno created SPARK-10410:
-----------------------------------

             Summary: spark 1.4.1 kill command does not work with streaming job
                 Key: SPARK-10410
                 URL: https://issues.apache.org/jira/browse/SPARK-10410
             Project: Spark
          Issue Type: Bug
          Components: Deploy
    Affects Versions: 1.4.1
            Reporter: Bryce Ageno
Our team recently upgraded a cluster from 1.3.1 to 1.4.1 and discovered that running the kill command for a driver (/usr/spark/bin/spark-submit --master spark://$SPARK_MASTER_IP:6066 --kill $SPARK_DRIVER) does not remove the driver from the Spark UI. The job is a streaming job; the kill command "ends" the job, but it neither frees the job's resources nor removes the driver from the Spark master. We are running in cluster mode.

We have also noticed that with 1.4.1, when we issue multiple spark-submits, all of the drivers end up on a single worker.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
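For reference, a sketch of the steps we run to reproduce this, using the standalone cluster's REST submission gateway on port 6066 (SPARK_MASTER_IP, SPARK_DRIVER, and the application jar/class names below are placeholders for our environment, not part of Spark itself):

```shell
# Submit a streaming application in cluster mode via the REST gateway.
# "our-streaming-app.jar" and "com.example.StreamingApp" are hypothetical names.
/usr/spark/bin/spark-submit \
  --master spark://$SPARK_MASTER_IP:6066 \
  --deploy-mode cluster \
  --class com.example.StreamingApp \
  our-streaming-app.jar

# Kill the driver using the submission ID reported above (e.g. driver-2015...).
/usr/spark/bin/spark-submit --master spark://$SPARK_MASTER_IP:6066 --kill $SPARK_DRIVER

# Query the driver's state; after the kill, the master UI still shows the
# driver and its resources remain allocated.
/usr/spark/bin/spark-submit --master spark://$SPARK_MASTER_IP:6066 --status $SPARK_DRIVER
```

The --kill and --status flags here are the ones spark-submit exposes for standalone cluster mode; the streaming job stops processing after the kill, but the driver entry and its resources are not released on the master.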