Yes, I sorted out the issue. An older version of Maven was being picked up when I ran the build as hduser (rather than root).
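For anyone hitting the same thing: a quick way to confirm which Maven a given account resolves, and a sketch of the environment-file change (the install paths are the ones from the build transcript below; treat this as illustrative, not a definitive fix):

```shell
# Show which mvn is first on this user's PATH, and its version
command -v mvn
mvn -version

# Environment-file fragment: put the 3.3.3 install ahead of any older one
export PATH=/usr/local/apache-maven/apache-maven-3.3.3/bin:$PATH
```

Run the check as each user (root and hduser) — a per-user environment file is exactly how the two accounts ended up resolving different Maven versions.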
hduser@rhes564::/usr/lib/spark> build/mvn -X -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package > log
Using `mvn` from path: /usr/local/apache-maven/apache-maven-3.3.1/bin/mvn
[WARNING] Rule 0: org.apache.maven.plugins.enforcer.RequireMavenVersion failed with message:
Detected Maven Version: 3.3.1 is not in the allowed range 3.3.3.

I changed the Maven version in the environment file for user hduser to maven-3.3.3, ran the command again, and it worked:

build/mvn -X -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -DskipTests clean package > log
Using `mvn` from path: /usr/local/apache-maven/apache-maven-3.3.3/bin/mvn
……
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] Spark Project Parent POM ........................... SUCCESS [ 39.937 s]
[INFO] Spark Project Launcher ............................. SUCCESS [ 44.718 s]
[INFO] Spark Project Networking ........................... SUCCESS [ 11.294 s]
[INFO] Spark Project Shuffle Streaming Service ............ SUCCESS [  4.720 s]
[INFO] Spark Project Unsafe ............................... SUCCESS [ 10.705 s]
[INFO] Spark Project Core ................................. SUCCESS [02:52 min]
[INFO] Spark Project Bagel ................................ SUCCESS [  5.937 s]
[INFO] Spark Project GraphX ............................... SUCCESS [ 15.977 s]
[INFO] Spark Project Streaming ............................ SUCCESS [ 36.453 s]
[INFO] Spark Project Catalyst ............................. SUCCESS [ 54.381 s]
[INFO] Spark Project SQL .................................. SUCCESS [01:07 min]
[INFO] Spark Project ML Library ........................... SUCCESS [01:22 min]
[INFO] Spark Project Tools ................................ SUCCESS [  2.493 s]
[INFO] Spark Project Hive ................................. SUCCESS [ 58.496 s]
[INFO] Spark Project REPL ................................. SUCCESS [  9.278 s]
[INFO] Spark Project YARN ................................. SUCCESS [ 12.424 s]
[INFO] Spark Project Assembly ............................. SUCCESS [01:51 min]
[INFO] Spark Project External Twitter ..................... SUCCESS [  7.604 s]
[INFO] Spark Project External Flume Sink .................. SUCCESS [  7.580 s]
[INFO] Spark Project External Flume ....................... SUCCESS [  9.526 s]
[INFO] Spark Project External Flume Assembly .............. SUCCESS [  3.163 s]
[INFO] Spark Project External MQTT ........................ SUCCESS [ 31.774 s]
[INFO] Spark Project External MQTT Assembly ............... SUCCESS [  8.698 s]
[INFO] Spark Project External ZeroMQ ...................... SUCCESS [  6.992 s]
[INFO] Spark Project External Kafka ....................... SUCCESS [ 11.487 s]
[INFO] Spark Project Examples ............................. SUCCESS [02:12 min]
[INFO] Spark Project External Kafka Assembly .............. SUCCESS [  9.046 s]
[INFO] Spark Project YARN Shuffle Service ................. SUCCESS [  6.097 s]
[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 16:16 min
[INFO] Finished at: 2015-11-25T23:34:35+00:00
[INFO] Final Memory: 90M/1312M
[INFO] ------------------------------------------------------------------------

Mich Talebzadeh

Sybase ASE 15 Gold Medal Award 2008
A Winning Strategy: Running the most Critical Financial Data on ASE 15
http://login.sybase.com/files/Product_Overviews/ASE-Winning-Strategy-091908.pdf
Author of the books "A Practitioner's Guide to Upgrading to Sybase ASE 15", ISBN 978-0-9563693-0-7.
co-author "Sybase Transact SQL Guidelines Best Practices", ISBN 978-0-9759693-0-4
Publications due shortly:
Complex Event Processing in Heterogeneous Environments, ISBN: 978-0-9563693-3-8
Oracle and Sybase, Concepts and Contrasts, ISBN: 978-0-9563693-1-4, volume one out shortly
http://talebzadehmich.wordpress.com

NOTE: The information in this email is proprietary and confidential. This message is for the designated recipient only; if you are not the intended recipient, you should destroy it immediately. Any information in this message shall not be understood as given or endorsed by Peridale Technology Ltd, its subsidiaries or their employees, unless expressly so stated. It is the responsibility of the recipient to ensure that this email is virus free; therefore neither Peridale Ltd, its subsidiaries nor their employees accept any responsibility.

From: Xuefu Zhang [mailto:xzh...@cloudera.com]
Sent: 25 November 2015 23:33
To: user@hive.apache.org
Subject: Re: hive1.2.1 on spark connection time out

There are usually a few more messages before this, after "spark-submit", in hive.log. Do you have spark.home set?

On Sun, Nov 22, 2015 at 10:17 PM, zhangjp <smart...@hotmail.com> wrote:

I'm using Hive 1.2.1 and want to run Hive on Spark, but there are some issues. I have set spark.master=yarn-client. The Spark version is 1.4.1, and running spark-shell --master yarn-client works without any problem.

Log:

2015-11-23 13:54:56,068 ERROR [main]: spark.SparkTask (SessionState.java:printError(960)) - Failed to execute spark task, with exception 'org.apache.hadoop.hive.ql.metadata.HiveException(Failed to create spark client.)'
org.apache.hadoop.hive.ql.metadata.HiveException: Failed to create spark client.
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:57)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionManagerImpl.getSession(SparkSessionManagerImpl.java:116)
        at org.apache.hadoop.hive.ql.exec.spark.SparkUtilities.getSparkSession(SparkUtilities.java:112)
        at org.apache.hadoop.hive.ql.exec.spark.SparkTask.execute(SparkTask.java:101)
        at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:160)
        at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:88)
        at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:1653)
        at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:1412)
        at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:1195)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1059)
        at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1049)
        at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:213)
        at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:165)
        at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:376)
        at org.apache.hadoop.hive.cli.CliDriver.executeDriver(CliDriver.java:736)
        at org.apache.hadoop.hive.cli.CliDriver.run(CliDriver.java:681)
        at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:621)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
Caused by: java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
        at com.google.common.base.Throwables.propagate(Throwables.java:156)
        at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:109)
        at org.apache.hive.spark.client.SparkClientFactory.createClient(SparkClientFactory.java:80)
        at org.apache.hadoop.hive.ql.exec.spark.RemoteHiveSparkClient.<init>(RemoteHiveSparkClient.java:90)
        at org.apache.hadoop.hive.ql.exec.spark.HiveSparkClientFactory.createHiveSparkClient(HiveSparkClientFactory.java:65)
        at org.apache.hadoop.hive.ql.exec.spark.session.SparkSessionImpl.open(SparkSessionImpl.java:55)
        ... 21 more
Caused by: java.util.concurrent.ExecutionException: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
        at io.netty.util.concurrent.AbstractFuture.get(AbstractFuture.java:37)
        at org.apache.hive.spark.client.SparkClientImpl.<init>(SparkClientImpl.java:99)
        ... 25 more
Caused by: java.util.concurrent.TimeoutException: Timed out waiting for client connection.
        at org.apache.hive.spark.client.rpc.RpcServer$2.run(RpcServer.java:141)
        at io.netty.util.concurrent.PromiseTask$RunnableAdapter.call(PromiseTask.java:38)
        at io.netty.util.concurrent.ScheduledFutureTask.run(ScheduledFutureTask.java:123)
        at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:380)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:116)
        at java.lang.Thread.run(Thread.java:745)
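Since the root cause above is the Hive Spark client timing out while waiting for the remote driver to connect back, one avenue worth trying (picking up Xuefu's spark.home question) is to point Hive explicitly at the Spark installation and widen the client timeouts. A sketch, with assumptions: the /usr/lib/spark path comes from the build transcript earlier in this thread, and the timeout values are purely illustrative, not recommendations:

```shell
# Launch the Hive CLI with Hive-on-Spark settings passed as --hiveconf overrides.
# The same properties can instead go in hive-site.xml or be issued with
# `set ...;` at the hive prompt. Timeout values here are illustrative only.
hive --hiveconf hive.execution.engine=spark \
     --hiveconf spark.home=/usr/lib/spark \
     --hiveconf spark.master=yarn-client \
     --hiveconf hive.spark.client.connect.timeout=30000ms \
     --hiveconf hive.spark.client.server.connect.timeout=300000ms
```

If the driver still fails to connect within the widened window, the extra messages Xuefu mentions around "spark-submit" in hive.log are the next place to look, since they usually show why the child Spark process died before it could call back.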