Hi

Can you cross-check by providing the same library path in --jars of 
spark-submit and running it?
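A sketch of that suggestion, not verified against this cluster: --jars expects a comma-separated list of jar files, not a bare directory, so the directory contents need to be joined first. The paths follow the layout from the report below; `your.main.Class` and `your-app.jar` are placeholders.

```shell
# Build a comma-separated list of the Hive jars (--jars does not accept a directory).
HIVE_JARS=$(ls /data0/facai/lib/hive-0.13.1/lib/*.jar | tr '\n' ',')

# Submit to YARN in client mode, shipping the Hive jars explicitly.
# The trailing comma from tr is stripped with ${HIVE_JARS%,}.
spark-submit \
  --master yarn \
  --deploy-mode client \
  --jars "${HIVE_JARS%,}" \
  --class your.main.Class \
  your-app.jar
```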


Sent from Samsung Mobile.

-------- Original message --------
From: "颜发才(Yan Facai)" <yaf...@gmail.com>
Date: 18/08/2016 15:17 (GMT+05:30)
To: "user.spark" <user@spark.apache.org>
Cc:
Subject: [Spark 2.0] ClassNotFoundException is thrown when using Hive

Hi, all.

I copied hdfs-site.xml, core-site.xml and hive-site.xml to $SPARK_HOME/conf. 
spark-submit is used to submit the task to YARN in **client** mode. 
However, a ClassNotFoundException is thrown.

Some details of the logs are listed below:
```
16/08/12 17:07:32 INFO hive.HiveUtils: Initializing HiveMetastoreConnection 
version 0.13.1 using 
file:/data0/facai/lib/hive-0.13.1/lib:file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
16/08/12 17:07:32 ERROR yarn.ApplicationMaster: User class threw exception: 
java.lang.ClassNotFoundException: java.lang.NoClassDefFoundError: 
org/apache/hadoop/hive/ql/session/SessionState when creating Hive client using 
classpath: file:/data0/facai/lib/hive-0.13.1/lib, 
file:/data0/facai/lib/hadoop-2.4.1/share/hadoop
```
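The classpath printed in the log is the one Spark builds for its Hive metastore client. A hedged sketch of the spark-defaults.conf entries that would produce it (the paths are copied from the log; the `/*` wildcards are an assumption, added because the classpath entries must resolve to jar files rather than bare directories):

```
spark.sql.hive.metastore.version  0.13.1
spark.sql.hive.metastore.jars     /data0/facai/lib/hive-0.13.1/lib/*:/data0/facai/lib/hadoop-2.4.1/share/hadoop/*
```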

In fact, all the jars needed by Hive are in the directory:
```bash
[hadoop@h107713699 spark_test]$ ls /data0/facai/lib/hive-0.13.1/lib/ | grep hive
hive-ant-0.13.1.jar
hive-beeline-0.13.1.jar
hive-cli-0.13.1.jar
hive-common-0.13.1.jar
...
```

So, my question is:
why can't Spark find the jars it needs?

Any help will be appreciated, thanks.
