Mich: Please use Spark 1.5.0+ to work with Hive 1.2.1. Cheers
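For anyone hitting the same thing, a minimal sketch of what pointing Hive at a newer Spark typically looks like from the Hive CLI. The install path, master URL and table name are illustrative, not taken from this thread, and it assumes Hive finds the Spark jars through SPARK_HOME (or a spark-assembly jar linked into Hive's lib directory):

    # A minimal sketch; path, master URL and query are illustrative.
    export SPARK_HOME=/usr/lib/spark-1.5.2
    hive -e "set hive.execution.engine=spark;
             set spark.master=spark://rhes564:7077;
             select count(*) from sample_table;"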
On Thu, Dec 3, 2015 at 10:32 AM, Mich Talebzadeh <m...@peridale.co.uk> wrote:

Hi,

This is my stack for now:

1. Spark version 1.3
2. Hive version 1.2.1
3. Hadoop version 2.6

So I am using Hive version 1.2.1:

hduser@rhes564::/usr/lib/spark/logs> hive --version
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/hduser/hadoop-2.6.0/share/hadoop/common/lib/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
Hive 1.2.1
Subversion git://localhost.localdomain/home/sush/dev/hive.git -r 243e7c1ac39cb7ac8b65c5bc6988f5cc3162f558
Compiled by sush on Fri Jun 19 02:03:48 PDT 2015
From source with checksum ab480aca41b24a9c3751b8c023338231

Thanks,

From: Furcy Pin [mailto:furcy....@flaminem.com]
Sent: 03 December 2015 18:22
To: u...@hive.apache.org
Cc: user <user@spark.apache.org>
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

The field SPARK_RPC_CLIENT_CONNECT_TIMEOUT seems to have been added to Hive in the 1.1.0 release:

https://github.com/apache/hive/blob/release-1.1.0/common/src/java/org/apache/hadoop/hive/conf/HiveConf.java

Are you using an older version of Hive somewhere?
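A NoSuchFieldError like the one in this thread usually means the calling class was compiled against a newer HiveConf than the one actually loaded at runtime. One thing worth checking, as a minimal sketch, is whether the Spark assembly on Hive's classpath bundles its own, older copy of HiveConf; the assembly path is the one from the SLF4J warning above, while the Hive path is illustrative:

    # Does the Spark assembly carry its own HiveConf? (path from the SLF4J warning above)
    unzip -l /usr/lib/spark/lib/spark-assembly-1.3.0-hadoop2.4.0.jar | grep 'hive/conf/HiveConf'

    # Compare with the HiveConf shipped by Hive 1.2.1 itself (illustrative path)
    unzip -l $HIVE_HOME/lib/hive-common-*.jar | grep 'hive/conf/HiveConf'

If the assembly lists a HiveConf class of its own, the remote driver may be loading that copy instead of the Hive 1.2.1 one, which would explain the missing field.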
On Thu, Dec 3, 2015 at 7:15 PM, Mich Talebzadeh <m...@peridale.co.uk> wrote:

Thanks, I tried all of those :(

I am trying to make Hive use Spark, and apparently Hive can use version 1.3 of Spark as the execution engine. Frankly I don't know why this is not working!

Mich Talebzadeh

Sybase ASE 15 Gold Medal Award 2008
A Winning Strategy: Running the most Critical Financial Data on ASE 15
http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf
Author of the book "A Practitioner’s Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7.
Co-author of "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4
Publications due shortly:
Complex Event Processing in Heterogeneous Environments, ISBN 978-0-9563693-3-8
Oracle and Sybase, Concepts and Contrasts, ISBN 978-0-9563693-1-4, volume one out shortly

http://talebzadehmich.wordpress.com

From: Furcy Pin [mailto:furcy....@flaminem.com]
Sent: 03 December 2015 18:07
To: u...@hive.apache.org
Cc: user@spark.apache.org
Subject: Re: Any clue on this error, Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT

Maybe you compile and run against different versions of Spark?
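A quick sanity check along those lines, as a minimal sketch; it only assumes the hive and spark-submit client scripts are on the PATH:

    # Confirm which client binaries are being picked up and what versions they report
    which hive spark-submit
    hive --version
    spark-submit --version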
On Thu, Dec 3, 2015 at 6:54 PM, Mich Talebzadeh <m...@peridale.co.uk> wrote:

Trying to run Hive on the Spark 1.3 engine, I get:

conf hive.spark.client.channel.log.level=null --conf hive.spark.client.rpc.max.size=52428800 --conf hive.spark.client.rpc.threads=8 --conf hive.spark.client.secret.bits=256
15/12/03 17:53:18 [stderr-redir-1]: INFO client.SparkClientImpl: Spark assembly has been built with Hive, including Datanucleus jars on classpath
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.connect.timeout=1000
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.threads=8
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.rpc.max.size=52428800
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.secret.bits=256
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Warning: Ignoring non-spark config property: hive.spark.client.server.connect.timeout=90000
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: 15/12/03 17:53:19 INFO client.RemoteDriver: Connecting to: rhes564:36577
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl: Exception in thread "main" java.lang.NoSuchFieldError: SPARK_RPC_CLIENT_CONNECT_TIMEOUT
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at org.apache.hive.spark.client.rpc.RpcConfiguration.<clinit>(RpcConfiguration.java:46)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at org.apache.hive.spark.client.RemoteDriver.<init>(RemoteDriver.java:146)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at org.apache.hive.spark.client.RemoteDriver.main(RemoteDriver.java:556)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at java.lang.reflect.Method.invoke(Method.java:606)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:569)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:166)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:189)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:110)
15/12/03 17:53:19 [stderr-redir-1]: INFO client.SparkClientImpl:   at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Any clues?

Mich Talebzadeh
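A closing observation, not a confirmed diagnosis for this thread: the stack trace shows the failure inside RemoteDriver, which runs under spark-submit, so the HiveConf it loads comes from whatever spark-submit puts on the driver classpath (the assembly jar plus any jars Hive passes along) rather than directly from the Hive client's own classpath. If the assembly was built with Hive support, as the "Spark assembly has been built with Hive" line suggests, one commonly suggested remedy is to use a Spark build that does not bundle the Hive classes. A sketch only: the make-distribution.sh script exists in Spark 1.x source trees, but the exact profiles depend on the local Hadoop version and are illustrative here:

    # Build a Spark distribution without the -Phive profile, so Hive 1.2.1's
    # own jars provide HiveConf at runtime (profiles shown are illustrative).
    ./make-distribution.sh --name hadoop2-without-hive --tgz -Pyarn -Phadoop-2.6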