Hi All,

I am working on a Spark Streaming job that is supposed to read streaming
data from Kafka. After submitting the job, it fails with

org.apache.spark.SparkException: Job aborted due to stage failure: All
masters are unresponsive!

and the following lines are printed repeatedly:
14/11/20 17:29:10 INFO Client: Application report from ASM:
         application identifier: application_1416402687687_0011
         appId: 11
         clientToAMToken: null
         appDiagnostics:
         appMasterHost: N/A
         appQueue: root.root
         appMasterRpcPort: -1
         appStartTime: 1416484185518
         yarnAppState: ACCEPTED
         distributedFinalState: UNDEFINED
         appTrackingUrl:
http://URL:8088/proxy/application_1416402687687_0011/
         appUser: root

 

I am running the Spark job on a 4-worker YARN cluster.
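For reference, a minimal sketch of how such a job is submitted to YARN; the jar path, main class, and resource settings below are illustrative placeholders, not the exact values used:

```shell
# Submit the streaming job in yarn-cluster mode (Spark 1.x syntax).
# --class and the jar path are placeholders for the actual application.
spark-submit \
  --class com.example.KafkaStreamingJob \
  --master yarn-cluster \
  --num-executors 4 \
  --executor-memory 1g \
  --executor-cores 1 \
  /path/to/streaming-job.jar
```

The report above (yarnAppState: ACCEPTED, appMasterHost: N/A, appMasterRpcPort: -1) suggests the ApplicationMaster never started.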

Any clues on how to solve this issue?

Thanks




--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Error-while-submitting-spark-streaming-job-in-YARN-tp19365.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
