Hi,

 

I want to set Spark as the execution engine for Hive. I deployed a 4-server
Spark cluster in YARN mode. Whenever I try to execute any query, I get the
following error. Please let me know whether I have done something wrong.
The tool versions I am using are listed below.

Hive 2.1.0

HBase 1.2.2

Scala code runner version 2.11.8

Spark version 2.0.0
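
For context, this is how I switched the engine (the standard Hive-on-Spark
properties; the master value here reflects my YARN setup):

```sql
-- in the Hive session (or the equivalent entries in hive-site.xml)
set hive.execution.engine=spark;
set spark.master=yarn;
```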

 

Hive Log:

2016-08-24T14:55:47,883 ERROR [Thread-11]: spark.SparkTask (:()) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'

org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.

                at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:64)

                at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:114)

                at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:136)

                at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:89)

                at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:197)

                at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:100)

                at org.apache.hadoop.hive.ql.exec.TaskRunner.run(TaskRunner.java:79)

Caused by: java.lang.NoClassDefFoundError: scala/collection/GenTraversableOnce$class

 

Thanks & Regards,

Anas A
Trinity Mobility Pvt. Ltd
