Ashwin Shankar created SPARK-8170:
-------------------------------------

             Summary: Ctrl-C in pyspark shell doesn't kill running job
                 Key: SPARK-8170
                 URL: https://issues.apache.org/jira/browse/SPARK-8170
             Project: Spark
          Issue Type: Improvement
          Components: PySpark
    Affects Versions: 1.3.1
            Reporter: Ashwin Shankar
Hitting Ctrl-C in spark-sql (and other tools like Presto) cancels any running job and starts a new input line at the prompt. It would be nice if the pyspark shell could do that too. Otherwise, when a user submits a job by mistake and wants to cancel it, they have to exit the shell and log back in to continue working. Re-login can be a pain, especially for Spark on YARN, since it takes a while to allocate the AM container and the initial executors.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
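As a rough illustration of the requested behavior, the shell could install a SIGINT handler that cancels in-flight jobs via SparkContext.cancelAllJobs() (a real PySpark/SparkContext API) and then returns control to the prompt instead of exiting. This is a minimal sketch, not the actual fix; the helper name `install_cancel_handler` is hypothetical.

```python
import signal

def install_cancel_handler(sc):
    """Make Ctrl-C cancel running Spark jobs instead of killing the shell.

    `sc` is assumed to expose cancelAllJobs(), as pyspark's
    SparkContext does. Hypothetical helper for illustration only.
    """
    def handler(signum, frame):
        # Cancel every active job on this context.
        sc.cancelAllJobs()
        # Raising KeyboardInterrupt unwinds any blocked action
        # and drops the user back at a fresh REPL prompt.
        raise KeyboardInterrupt
    signal.signal(signal.SIGINT, handler)
```

In an interactive pyspark session one would call `install_cancel_handler(sc)` once at startup; a long-running `collect()` interrupted with Ctrl-C would then be cancelled on the cluster rather than orphaned while the user re-logs in.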