Hi guys,

I was trying to submit several Spark applications to a standalone cluster at
the same time from a shell script. One issue I ran into is that sometimes an
application gets submitted to the cluster twice unexpectedly (in the web UI I
can see two applications with the same name, created at exactly the same
time). One of the two applications runs to completion, but the other stays in
the RUNNING state forever and never exits or releases its resources.
Has anyone run into the same issue?
The Spark version I am using is 1.1.1.
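Roughly what my submission script does (simplified sketch; the master URL, class name, and jar names are placeholders, not my real values — here SPARK_SUBMIT defaults to `echo` so the loop can be dry-run, while the real script calls the spark-submit binary):

```shell
#!/bin/sh
# Sketch of the submission loop. SPARK_SUBMIT defaults to `echo` for a
# dry run; in the real script it is the spark-submit binary from Spark 1.1.1.
# The master URL, class, and jar names below are placeholders.
SPARK_SUBMIT="${SPARK_SUBMIT:-echo}"
MASTER_URL="spark://master:7077"

for jar in app1.jar app2.jar app3.jar; do
  # Background each submission so all applications start at
  # roughly the same time.
  $SPARK_SUBMIT --master "$MASTER_URL" --class com.example.Main "$jar" &
done
wait  # block until every child process has exited
```

The duplicates show up even though each jar appears only once in the loop.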

Best Regards,
Pengcheng



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/spark-application-was-submitted-twice-unexpectedly-tp22551.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
