Hi

I suspect the configuration is wrong; try checking the following file:

file:///home/spark/spark-1.5.1-bin-hadoop2.6/local/spark-b9c155cf-4624-4c68-9d0d-d3b9d5748601/__spark_conf__6786864409197988390.zip
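The "Source and destination file systems are the same. Not copying" lines in the log below suggest that fs.defaultFS may be set to the local filesystem (file:///). In that case the Spark client skips uploading the staging files to HDFS, and NodeManagers on other hosts then fail to find them, which would explain the FileNotFoundException. A minimal core-site.xml sketch, assuming your NameNode runs on a host named namenode-host (hypothetical, replace with your actual host and port):

```xml
<!-- core-site.xml: point the default filesystem at HDFS so yarn.Client
     uploads __spark_conf__*.zip and other resources to the shared
     staging directory instead of leaving them on the client's local disk. -->
<property>
  <name>fs.defaultFS</name>
  <!-- namenode-host is a placeholder; use your NameNode's address -->
  <value>hdfs://namenode-host:8020</value>
</property>
```

After changing this, the "Not copying" messages should disappear and the log should show the resources being uploaded to an hdfs:// staging path.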

Regards

Chandrasekaran
Technical Consultant
Big Data & Analytics
Business Solution Group

Jardine OneSolution (2001) Pte Ltd
Mobile: (+65) 8138 4761 | Email: chiranchan...@jos.com.sg
67,Ubi Avenue 1 #02-01, North Wing, Starhub Green, Singapore 408942


From: Zhong,Zhenyu [mailto:edwardzh...@baidu.com]
Sent: Friday, 28 October, 2016 6:58 AM
To: user@hadoop.apache.org
Subject: yarn spark job submit problem

Hi,

I am using spark-1.5.1 and hadoop-2.6.4. After completing the configuration, I
am able to run Spark jobs via YARN.
However, I often hit a problem where YARN cannot find the local zip archive
generated during the preparation stage.
I submitted the job in client mode, so the zip archive was prepared on the
machine that runs the client process.

According to the following log, the zip archive was present on the local
filesystem; however, YARN could not find it and reported "File Not Found".

Has anyone had similar problems? Is there a solution for this?

Thanks in advance.

edward


16/10/27 15:17:46 INFO yarn.Client: Will allocate AM container, with 896 MB 
memory including 384 MB overhead

16/10/27 15:17:46 INFO yarn.Client: Setting up container launch context for our 
AM

16/10/27 15:17:46 INFO yarn.Client: Setting up the launch environment for our 
AM container

16/10/27 15:17:46 INFO yarn.Client: Preparing resources for our AM container

16/10/27 15:17:46 INFO yarn.Client: Source and destination file systems are the 
same. Not copying 
file:/home/spark/spark-1.5.1-bin-hadoop2.6/lib/spark-assembly-1.5.1-hadoop2.6.0.jar

16/10/27 15:17:46 INFO yarn.Client: Source and destination file systems are the 
same. Not copying 
file:/home/spark/spark-1.5.1-bin-hadoop2.6/python/lib/pyspark.zip

16/10/27 15:17:46 INFO yarn.Client: Source and destination file systems are the 
same. Not copying 
file:/home/spark/spark-1.5.1-bin-hadoop2.6/python/lib/py4j-0.8.2.1-src.zip

16/10/27 15:17:46 INFO yarn.Client: Source and destination file systems are the 
same. Not copying 
file:/home/spark/spark-1.5.1-bin-hadoop2.6/local/spark-b9c155cf-4624-4c68-9d0d-d3b9d5748601/__spark_conf__6786864409197988390.zip

16/10/27 15:17:46 INFO spark.SecurityManager: Changing view acls to: spark

16/10/27 15:17:46 INFO spark.SecurityManager: Changing modify acls to: spark

16/10/27 15:17:46 INFO spark.SecurityManager: SecurityManager: authentication 
disabled; ui acls disabled; users with view permissions: Set(spark); users with 
modify permissions: Set(spark)

16/10/27 15:17:46 INFO yarn.Client: Submitting application 2371 to 
ResourceManager

16/10/27 15:17:46 INFO impl.YarnClientImpl: Submitted application 
application_1474479656852_2371

16/10/27 15:17:47 INFO yarn.Client: Application report for 
application_1474479656852_2371 (state: ACCEPTED)

16/10/27 15:17:47 INFO yarn.Client:

client token: N/A

diagnostics: N/A

ApplicationMaster host: N/A

ApplicationMaster RPC port: -1

queue: default

start time: 1477606690545

final status: UNDEFINED

tracking URL: 
http://xlab-node4.usdc.baidu.com:8088/proxy/application_1474479656852_2371/

user: spark

16/10/27 15:17:48 INFO yarn.Client: Application report for 
application_1474479656852_2371 (state: ACCEPTED)

16/10/27 15:17:49 INFO yarn.Client: Application report for 
application_1474479656852_2371 (state: FAILED)

16/10/27 15:17:49 INFO yarn.Client:

client token: N/A

diagnostics: Application application_1474479656852_2371 failed 2 times due to 
AM Container for appattempt_1474479656852_2371_000002 exited with  exitCode: 
-1000

For more detailed output, check the application tracking page:
http://xlab-node4.usdc.baidu.com:8088/proxy/application_1474479656852_2371/
Then, click on links to logs of each attempt.

Diagnostics: File 
file:/home/spark/spark-1.5.1-bin-hadoop2.6/local/spark-b9c155cf-4624-4c68-9d0d-d3b9d5748601/__spark_conf__6786864409197988390.zip
 does not exist

java.io.FileNotFoundException: File 
file:/home/spark/spark-1.5.1-bin-hadoop2.6/local/spark-b9c155cf-4624-4c68-9d0d-d3b9d5748601/__spark_conf__6786864409197988390.zip
 does not exist



______________________________________________________________________
This email has been scanned by the Symantec Email Security.cloud service.
For more information please visit http://www.symanteccloud.com
______________________________________________________________________

