Hi,

We installed Spark 1.2.1 on a single node and are running a job on YARN in
yarn-client mode that loads data into HBase and Elasticsearch.
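For reference, the submission looks roughly like the sketch below (the jar,
class name, and resource sizes are placeholders, not taken from our actual
command; Spark 1.2.1 uses the combined `--master yarn-client` form):

```shell
# Hypothetical spark-submit invocation for the job described above.
# "com.example.LoadJob" and "load-job.jar" are placeholder names.
spark-submit \
  --master yarn-client \
  --num-executors 4 \
  --executor-memory 2g \
  --class com.example.LoadJob \
  load-job.jar
```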

The error we are encountering is:
Exception in thread "main" org.apache.spark.SparkException: Job aborted due
to stage failure: Task 38 in stage 26800.0 failed 4 times, most recent
failure: Lost task 38.3 in stage 26800.0 (TID 4990082, hdprd-c01-r04-03):
java.io.FileNotFoundException:
/opt/mapr/tmp/hadoop-tmp/hadoop-mapr/nm-local-dir/usercache/sparkuser/appcache/application_1463194314221_211370/spark-3cc37dc7-fa3c-4b98-aa60-0acdfc79c725/28/shuffle_8553_38_0.index
(No such file or directory)

Any idea about this error?
-- 
Thanks,
Kishore.