Re: How to make ./bin/spark-sql work with hive?

2014-10-06 Thread Li HM
After disabling the client-side authorization and with nothing in
SPARK_CLASSPATH, I am still getting a class-not-found error.

<property>
  <name>hive.security.authorization.enabled</name>
  <value>false</value>
  <description>Perform authorization checks on the client</description>
</property>

Am I hitting a dead end? Please help.
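
One thing that may be worth checking for the ClassNotFoundException below (a
sketch, not something from this thread): whether the failover provider class
named in the HDFS HA configuration is actually present in the Hadoop jars that
Spark picks up. HADOOP_HOME and the jar layout are assumptions; adjust them to
the local installation.

# Hypothetical check: scan the local HDFS jars for the failover provider
# class referenced by dfs.client.failover.proxy.provider.<nameservice>.
for jar in "$HADOOP_HOME"/share/hadoop/hdfs/*.jar; do
  if unzip -l "$jar" 2>/dev/null | grep -q 'IPFailoverProxyProvider.class'; then
    echo "found in $jar"
  fi
done

If no jar contains the class, the hdfs-site.xml in use likely refers to a
provider from a different (possibly vendor-specific) Hadoop build than the one
on the classpath.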

spark-sql> use mydb;
OK
Time taken: 4.567 seconds
spark-sql> select * from test;
java.lang.RuntimeException: java.lang.RuntimeException:
java.lang.ClassNotFoundException: Class
org.apache.hadoop.hdfs.server.namenode.ha.IPFailoverProxyProvider not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1927)
at
org.apache.hadoop.hdfs.NameNodeProxies.getFailoverProxyProviderClass(NameNodeProxies.java:409)
at
org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:139)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:579)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:524)
at
org.apache.hadoop.hdfs.DistributedFileSystem.initialize(DistributedFileSystem.java:146)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2397)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:89)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2431)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2413)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:368)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:167)
at org.apache.hadoop.mapred.JobConf.getWorkingDirectory(JobConf.java:653)
at
org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:427)
at
org.apache.hadoop.mapred.FileInputFormat.setInputPaths(FileInputFormat.java:400)
at
org.apache.spark.sql.hive.HadoopTableReader$.initializeLocalJobConfFunc(TableReader.scala:250)
at
org.apache.spark.sql.hive.HadoopTableReader$$anonfun$8.apply(TableReader.scala:228)
at
org.apache.spark.sql.hive.HadoopTableReader$$anonfun$8.apply(TableReader.scala:228)
at
org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$1.apply(HadoopRDD.scala:149)
at
org.apache.spark.rdd.HadoopRDD$$anonfun$getJobConf$1.apply(HadoopRDD.scala:149)
at scala.Option.map(Option.scala:145)
at org.apache.spark.rdd.HadoopRDD.getJobConf(HadoopRDD.scala:149)
at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:172)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
at
org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:32)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
at org.apache.spark.rdd.MappedRDD.getPartitions(MappedRDD.scala:28)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:204)
at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:202)
at scala.Option.getOrElse(Option.scala:120)
at org.apache.spark.rdd.RDD.partitions(RDD.scala:202)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1135)
at org.apache.spark.rdd.RDD.collect(RDD.scala:774)
at
org.apache.spark.sql.hive.HiveContext$QueryExecution.stringResult(HiveContext.scala:415)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:59)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:226)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.RuntimeException: java.lang.ClassNotFoundException:
Class org.apache.hadoop.hdfs.server.namenode.ha.IPFailoverProxyProvider not
found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1895)
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1919)
... 56 more
Caused by: java.lang.ClassNotFoundException: Class

Re: How to make ./bin/spark-sql work with hive?

2014-10-03 Thread Michael Armbrust
Often java.lang.NoSuchMethodError means that you have more than one version
of a library on your classpath; in this case it looks like Hive.
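
A quick way to check for that (an illustrative sketch; the SPARK_HOME and
SPARK_CLASSPATH locations are assumptions) is to list every Hive jar the CLI
can see and compare versions:

# Look for more than one Hive version visible to the Spark SQL CLI.
echo "$SPARK_CLASSPATH" | tr ':' '\n' | grep -i 'hive'
find "$SPARK_HOME" -name 'hive-*.jar' 2>/dev/null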

On Thu, Oct 2, 2014 at 8:44 PM, Li HM hmx...@gmail.com wrote:

 I have rebuilt the package with -Phive
 Copied hive-site.xml to conf (I am using hive-0.12)

 When I run ./bin/spark-sql, I get java.lang.NoSuchMethodError for every
 command.

 What am I missing here?

 Could somebody share what would be the right procedure to make it work?

 java.lang.NoSuchMethodError:
 org.apache.hadoop.hive.ql.Driver.getResults(Ljava/util/ArrayList;)Z
 at
 org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
 at
 org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
 at
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
 at
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
 at
 org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
 at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
 at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
 at
 org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:226)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:601)
 at
 org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 spark-sql> use mydb;
 OK
 java.lang.NoSuchMethodError:
 org.apache.hadoop.hive.ql.Driver.getResults(Ljava/util/ArrayList;)Z
 at
 org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
 at
 org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
 at
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
 at
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
 at
 org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
 at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
 at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
 at
 org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:226)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:601)
 at
 org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

 spark-sql> select count(*) from test;
 java.lang.NoSuchMethodError:
 com.google.common.hash.HashFunction.hashInt(I)Lcom/google/common/hash/HashCode;
 at org.apache.spark.util.collection.OpenHashSet.org
 $apache$spark$util$collection$OpenHashSet$$hashcode(OpenHashSet.scala:261)
 at
 

Re: How to make ./bin/spark-sql work with hive?

2014-10-03 Thread Li HM
This is my SPARK_CLASSPATH after cleanup
SPARK_CLASSPATH=/home/test/lib/hcatalog-core.jar:$SPARK_CLASSPATH

now use mydb works.

but show tables and select * from test still give exceptions:

spark-sql> show tables;
OK
java.io.IOException: java.io.IOException: Cannot create an instance of
InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in
mapredWork!
at
org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:551)
at
org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:489)
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:136)
at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1471)
at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
at
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:226)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.io.IOException: Cannot create an instance of InputFormat
class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
at
org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:223)
at
org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:379)
at
org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:515)
... 25 more
Caused by: java.lang.RuntimeException: Error in configuring object
at
org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
at
org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:219)
... 27 more
Caused by: java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at
org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
... 30 more
Caused by: java.lang.IllegalArgumentException: Compression codec
com.hadoop.compression.lzo.LzoCodec not found.
at
org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:135)
at
org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:175)
at
org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)
... 35 more
Caused by: java.lang.ClassNotFoundException: Class
com.hadoop.compression.lzo.LzoCodec not found
at
org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1801)
at
org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:128)
... 37 more
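
For the LzoCodec failure above, one possible workaround (a sketch, assuming the
cluster's data really uses LZO and a hadoop-lzo build is available; the paths
are placeholders) is to put the LZO jar and its native libraries on the CLI's
path before launching:

# Placeholder paths; use the actual hadoop-lzo jar and native directory.
export SPARK_CLASSPATH=/path/to/hadoop-lzo.jar:$SPARK_CLASSPATH
# If this Spark version no longer honors SPARK_LIBRARY_PATH, use
# spark.driver.extraLibraryPath instead.
export SPARK_LIBRARY_PATH=/path/to/hadoop/lib/native:$SPARK_LIBRARY_PATH
./bin/spark-sql

Alternatively, if LZO is not actually needed, removing
com.hadoop.compression.lzo.LzoCodec from io.compression.codecs in
core-site.xml avoids the lookup altogether.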

spark-sql> select * from test;
java.lang.RuntimeException: java.lang.RuntimeException:
java.lang.ClassNotFoundException: Class
org.apache.hadoop.hdfs.server.namenode.ha.IPFailoverProxyProvider not found
at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1927)
at
org.apache.hadoop.hdfs.NameNodeProxies.getFailoverProxyProviderClass(NameNodeProxies.java:409)
at
org.apache.hadoop.hdfs.NameNodeProxies.createProxy(NameNodeProxies.java:139)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:579)
at org.apache.hadoop.hdfs.DFSClient.<init>(DFSClient.java:524)
at

Re: How to make ./bin/spark-sql work with hive?

2014-10-03 Thread Michael Armbrust
Why are you including hcatalog-core.jar?  That is probably causing the
issues.
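
A sketch of what that cleanup might look like (the jar path is the one from
the quoted message; this is illustrative, not a tested recipe):

# Remove hcatalog-core from SPARK_CLASSPATH and retry with only the
# Hive classes bundled by the -Phive build.
unset SPARK_CLASSPATH
./bin/spark-sql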

On Fri, Oct 3, 2014 at 3:03 PM, Li HM hmx...@gmail.com wrote:

 This is my SPARK_CLASSPATH after cleanup
 SPARK_CLASSPATH=/home/test/lib/hcatalog-core.jar:$SPARK_CLASSPATH

 now use mydb works.

 but show tables and select * from test still give exceptions:

 spark-sql> show tables;
 OK
 java.io.IOException: java.io.IOException: Cannot create an instance of
 InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in
 mapredWork!
 at
 org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:551)
 at
 org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:489)
 at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:136)
 at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1471)
 at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
 at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
 at
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
 at
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
 at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
 at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
 at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
 at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:226)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:601)
 at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 Caused by: java.io.IOException: Cannot create an instance of InputFormat
 class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
 at
 org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:223)
 at
 org.apache.hadoop.hive.ql.exec.FetchOperator.getRecordReader(FetchOperator.java:379)
 at
 org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:515)
 ... 25 more
 Caused by: java.lang.RuntimeException: Error in configuring object
 at
 org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:109)
 at org.apache.hadoop.util.ReflectionUtils.setConf(ReflectionUtils.java:75)
 at
 org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:133)
 at
 org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:219)
 ... 27 more
 Caused by: java.lang.reflect.InvocationTargetException
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:601)
 at
 org.apache.hadoop.util.ReflectionUtils.setJobConf(ReflectionUtils.java:106)
 ... 30 more
 Caused by: java.lang.IllegalArgumentException: Compression codec
 com.hadoop.compression.lzo.LzoCodec not found.
 at
 org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:135)
 at
 org.apache.hadoop.io.compress.CompressionCodecFactory.<init>(CompressionCodecFactory.java:175)
 at
 org.apache.hadoop.mapred.TextInputFormat.configure(TextInputFormat.java:45)
 ... 35 more
 Caused by: java.lang.ClassNotFoundException: Class
 com.hadoop.compression.lzo.LzoCodec not found
 at
 org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:1801)
 at
 org.apache.hadoop.io.compress.CompressionCodecFactory.getCodecClasses(CompressionCodecFactory.java:128)
 ... 37 more

 spark-sql> select * from test;
 java.lang.RuntimeException: java.lang.RuntimeException:
 java.lang.ClassNotFoundException: Class
 org.apache.hadoop.hdfs.server.namenode.ha.IPFailoverProxyProvider not found
 at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:1927)
 at
 

Re: How to make ./bin/spark-sql work with hive?

2014-10-03 Thread Li HM
If I don't have that jar, I am getting the following error:

Exception in thread main java.lang.RuntimeException:
org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.ClassNotFoundException:
org.apache.hcatalog.security.HdfsAuthorizationProvider
at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:286)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:116)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.ClassNotFoundException:
org.apache.hcatalog.security.HdfsAuthorizationProvider
at
org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:342)
at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:280)
... 9 more
Caused by: java.lang.ClassNotFoundException:
org.apache.hcatalog.security.HdfsAuthorizationProvider
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
at java.lang.Class.forName0(Native Method)
at java.lang.Class.forName(Class.java:266)
at
org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:335)
... 10 more

On Fri, Oct 3, 2014 at 3:27 PM, Michael Armbrust mich...@databricks.com
wrote:

 Why are you including hcatalog-core.jar?  That is probably causing the
 issues.

 On Fri, Oct 3, 2014 at 3:03 PM, Li HM hmx...@gmail.com wrote:

 This is my SPARK_CLASSPATH after cleanup
 SPARK_CLASSPATH=/home/test/lib/hcatalog-core.jar:$SPARK_CLASSPATH

 now use mydb works.

 but show tables and select * from test still give exceptions:

 spark-sql> show tables;
 OK
 java.io.IOException: java.io.IOException: Cannot create an instance of
 InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in
 mapredWork!
 at
 org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:551)
 at
 org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:489)
 at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:136)
 at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1471)
 at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
 at org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
 at
 org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
 at
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
 at
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
 at org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
 at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
 at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
 at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:226)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:601)
 at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 Caused by: java.io.IOException: Cannot create an instance of InputFormat
 class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!
 at
 org.apache.hadoop.hive.ql.exec.FetchOperator.getInputFormatFromCache(FetchOperator.java:223)
 at
 

Re: How to make ./bin/spark-sql work with hive?

2014-10-03 Thread Hmxxyy
No, it is hive 0.12.4.

Let me try your suggestion. It is an existing hive db. I am using the original 
hive-site.xml as is.

Sent from my iPhone

 On Oct 3, 2014, at 5:02 PM, Edwin Chiu edwin.c...@manage.com wrote:
 
 Are you using hive 0.13?
 
 Switching back to HadoopDefaultAuthenticator in your hive-site.xml is worth a
 shot
 
 <property>
   <name>hive.security.authenticator.manager</name>
   <!--<value>org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator</value>-->
   <value>org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator</value>
 </property>
 
 
 
 - Edwin
 
 On Fri, Oct 3, 2014 at 4:25 PM, Li HM hmx...@gmail.com wrote:
 If I don't have that jar, I am getting the following error:
 
 Exception in thread main java.lang.RuntimeException:
 org.apache.hadoop.hive.ql.metadata.HiveException: 
 java.lang.ClassNotFoundException: 
 org.apache.hcatalog.security.HdfsAuthorizationProvider
  at 
 org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:286)
  at 
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:116)
  at 
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
  at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
  at 
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
  at 
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
  at java.lang.reflect.Method.invoke(Method.java:601)
  at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
  at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
  at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: 
 java.lang.ClassNotFoundException: 
 org.apache.hcatalog.security.HdfsAuthorizationProvider
  at 
 org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:342)
  at 
 org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:280)
  ... 9 more
 Caused by: java.lang.ClassNotFoundException: 
 org.apache.hcatalog.security.HdfsAuthorizationProvider
  at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
  at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
  at java.security.AccessController.doPrivileged(Native Method)
  at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
  at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
  at java.lang.Class.forName0(Native Method)
  at java.lang.Class.forName(Class.java:266)
  at 
 org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:335)
  ... 10 more
 
 On Fri, Oct 3, 2014 at 3:27 PM, Michael Armbrust mich...@databricks.com 
 wrote:
 Why are you including hcatalog-core.jar?  That is probably causing the 
 issues.
 
 On Fri, Oct 3, 2014 at 3:03 PM, Li HM hmx...@gmail.com wrote:
 This is my SPARK_CLASSPATH after cleanup
 SPARK_CLASSPATH=/home/test/lib/hcatalog-core.jar:$SPARK_CLASSPATH
 
 now use mydb works.

 but show tables and select * from test still give exceptions:
 
 spark-sql> show tables;
 OK
 java.io.IOException: java.io.IOException: Cannot create an instance of 
 InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in 
 mapredWork!
at 
 org.apache.hadoop.hive.ql.exec.FetchOperator.getNextRow(FetchOperator.java:551)
at 
 org.apache.hadoop.hive.ql.exec.FetchOperator.pushRow(FetchOperator.java:489)
at org.apache.hadoop.hive.ql.exec.FetchTask.fetch(FetchTask.java:136)
at org.apache.hadoop.hive.ql.Driver.getResults(Driver.java:1471)
at org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
at 
 org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
at 
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
at 
 org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
at 
 org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
at 
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
at 
 org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
at 
 org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
 at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
at 
 org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
at 
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:413)
at 
 

Re: How to make ./bin/spark-sql work with hive?

2014-10-03 Thread Li HM
It won't work with
<value>org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator</value>.

Just wonder how and why it works with you guys.

Here is the new error:
Exception in thread main java.lang.RuntimeException:
org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.ClassCastException:
org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator cannot be
cast to
org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider
at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:286)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:116)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.ClassCastException:
org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator cannot be
cast to
org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider
at
org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:342)
at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:280)
... 9 more
Caused by: java.lang.ClassCastException:
org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator cannot be
cast to
org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider
at
org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:339)
... 10 more
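
The cast error above points at a mixed-up setting rather than a missing jar:
hive.security.authenticator.manager expects an authenticator class such as
HadoopDefaultAuthenticator, while hive.security.authorization.manager expects a
HiveAuthorizationProvider implementation, and the exception indicates the
authenticator class is being loaded where the authorization provider is
expected. A minimal check (the hive-site.xml path below is a placeholder):

# Print the authenticator/authorization-related settings currently in effect.
grep -A 2 'hive.security.auth' /path/to/conf/hive-site.xml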

On Fri, Oct 3, 2014 at 5:06 PM, Hmxxyy hmx...@gmail.com wrote:

 No, it is hive 0.12.4.

 Let me try your suggestion. It is an existing hive db. I am using the
 original hive-site.xml as is.

 Sent from my iPhone

 On Oct 3, 2014, at 5:02 PM, Edwin Chiu edwin.c...@manage.com wrote:

 Are you using hive 0.13?

 Switching back to HadoopDefaultAuthenticator in your hive-site.xml is worth a
 shot

 <property>
   <name>hive.security.authenticator.manager</name>
   <!--<value>org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator</value>-->
   <value>org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator</value>
 </property>


 - Edwin

 On Fri, Oct 3, 2014 at 4:25 PM, Li HM hmx...@gmail.com wrote:

 If I don't have that jar, I am getting the following error:

 Exception in thread main java.lang.RuntimeException:
 org.apache.hadoop.hive.ql.metadata.HiveException:
 java.lang.ClassNotFoundException:
 org.apache.hcatalog.security.HdfsAuthorizationProvider
 at
 org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:286)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:116)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:601)
 at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 Caused by: org.apache.hadoop.hive.ql.metadata.HiveException:
 java.lang.ClassNotFoundException:
 org.apache.hcatalog.security.HdfsAuthorizationProvider
 at
 org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:342)
 at
 org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:280)
 ... 9 more
 Caused by: java.lang.ClassNotFoundException:
 org.apache.hcatalog.security.HdfsAuthorizationProvider
 at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
 at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
 at java.security.AccessController.doPrivileged(Native Method)
 at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:423)
 at java.lang.ClassLoader.loadClass(ClassLoader.java:356)
 at java.lang.Class.forName0(Native Method)
 at java.lang.Class.forName(Class.java:266)
 at
 org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:335)
 ... 10 more

 On Fri, Oct 3, 2014 at 3:27 PM, Michael Armbrust mich...@databricks.com
 wrote:

 Why are you including hcatalog-core.jar?  That is probably causing the
 issues.

 On Fri, Oct 3, 2014 at 3:03 PM, Li HM hmx...@gmail.com wrote:

 This is my SPARK_CLASSPATH after cleanup
 

Re: How to make ./bin/spark-sql work with hive?

2014-10-03 Thread Li HM
If I change it to
<value>org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider</value>

The error becomes:
Exception in thread main java.lang.RuntimeException:
org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.RuntimeException: java.lang.NoSuchMethodException:
org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider.<init>()
at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:286)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:116)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException:
java.lang.RuntimeException: java.lang.NoSuchMethodException:
org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider.<init>()
at
org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:342)
at
org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:280)
... 9 more
Caused by: java.lang.RuntimeException: java.lang.NoSuchMethodException:
org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider.<init>()
at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
at
org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:339)
... 10 more
Caused by: java.lang.NoSuchMethodException:
org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider.<init>()
at java.lang.Class.getConstructor0(Class.java:2730)
at java.lang.Class.getDeclaredConstructor(Class.java:2004)
at
org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:125)
... 11 more

On Fri, Oct 3, 2014 at 10:05 PM, Li HM hmx...@gmail.com wrote:

 It won't work with
 <value>org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator</value>.

 Just wonder how and why it works with you guys.

 Here is the new error:
 Exception in thread main java.lang.RuntimeException:
 org.apache.hadoop.hive.ql.metadata.HiveException:
 java.lang.ClassCastException:
 org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator cannot be
 cast to
 org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider
 at
 org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:286)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:116)
 at
 org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at
 sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at
 sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:601)
 at org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
 at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
 at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
 Caused by: org.apache.hadoop.hive.ql.metadata.HiveException:
 java.lang.ClassCastException:
 org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator cannot be
 cast to
 org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider
 at
 org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:342)
 at
 org.apache.hadoop.hive.ql.session.SessionState.start(SessionState.java:280)
 ... 9 more
 Caused by: java.lang.ClassCastException:
 org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator cannot be
 cast to
 org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider
 at
 org.apache.hadoop.hive.ql.metadata.HiveUtils.getAuthorizeProviderManager(HiveUtils.java:339)
 ... 10 more

 On Fri, Oct 3, 2014 at 5:06 PM, Hmxxyy hmx...@gmail.com wrote:

 No, it is hive 0.12.4.

 Let me try your suggestion. It is an existing hive db. I am using the
 original hive-site.xml as is.

 Sent from my iPhone

 On Oct 3, 2014, at 5:02 PM, Edwin Chiu edwin.c...@manage.com wrote:

 Are you using hive 0.13?

 Switching back to HadoopDefaultAuthenticator in your hive-site.xml is worth
 a shot

 <property>
   <name>hive.security.authenticator.manager</name>
   <!--<value>org.apache.hadoop.hive.ql.security.ProxyUserAuthenticator</value>-->
   <value>org.apache.hadoop.hive.ql.security.HadoopDefaultAuthenticator</value>
 </property>


 - Edwin

 On Fri, Oct 3, 2014 at 4:25 PM, Li HM hmx...@gmail.com wrote:

 If I don't have that jar, I am getting the following 

How to make ./bin/spark-sql work with hive?

2014-10-02 Thread Li HM
I have rebuilt the package with -Phive
Copied hive-site.xml to conf (I am using hive-0.12)
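
(The build and setup step, roughly; the Hadoop profile/version and the
hive-site.xml source path below are assumptions and should match the cluster:)

mvn -Phive -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package
cp /path/to/hive/conf/hive-site.xml conf/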

When I run ./bin/spark-sql, I get java.lang.NoSuchMethodError for every
command.

What am I missing here?

Could somebody share what would be the right procedure to make it work?

java.lang.NoSuchMethodError:
org.apache.hadoop.hive.ql.Driver.getResults(Ljava/util/ArrayList;)Z
at
org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
at
org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
at
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
at
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
at
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:226)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at
org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

spark-sql> use mydb;
OK
java.lang.NoSuchMethodError:
org.apache.hadoop.hive.ql.Driver.getResults(Ljava/util/ArrayList;)Z
at
org.apache.spark.sql.hive.HiveContext.runHive(HiveContext.scala:305)
at
org.apache.spark.sql.hive.HiveContext.runSqlHive(HiveContext.scala:272)
at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult$lzycompute(NativeCommand.scala:35)
at
org.apache.spark.sql.hive.execution.NativeCommand.sideEffectResult(NativeCommand.scala:35)
at
org.apache.spark.sql.hive.execution.NativeCommand.execute(NativeCommand.scala:38)
at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd$lzycompute(HiveContext.scala:360)
at
org.apache.spark.sql.hive.HiveContext$QueryExecution.toRdd(HiveContext.scala:360)
at
org.apache.spark.sql.SchemaRDDLike$class.$init$(SchemaRDDLike.scala:58)
at org.apache.spark.sql.SchemaRDD.<init>(SchemaRDD.scala:103)
at org.apache.spark.sql.hive.HiveContext.sql(HiveContext.scala:98)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLDriver.run(SparkSQLDriver.scala:58)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.processCmd(SparkSQLCLIDriver.scala:291)
at
org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:422)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver$.main(SparkSQLCLIDriver.scala:226)
at
org.apache.spark.sql.hive.thriftserver.SparkSQLCLIDriver.main(SparkSQLCLIDriver.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:601)
at
org.apache.spark.deploy.SparkSubmit$.launch(SparkSubmit.scala:328)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

spark-sql> select count(*) from test;
java.lang.NoSuchMethodError:
com.google.common.hash.HashFunction.hashInt(I)Lcom/google/common/hash/HashCode;
at org.apache.spark.util.collection.OpenHashSet.org
$apache$spark$util$collection$OpenHashSet$$hashcode(OpenHashSet.scala:261)
at
org.apache.spark.util.collection.OpenHashSet$mcI$sp.getPos$mcI$sp(OpenHashSet.scala:165)
at
org.apache.spark.util.collection.OpenHashSet$mcI$sp.contains$mcI$sp(OpenHashSet.scala:102)
at
org.apache.spark.util.SizeEstimator$$anonfun$visitArray$2.apply$mcVI$sp(SizeEstimator.scala:214)
at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)