[ https://issues.apache.org/jira/browse/SPARK-2706?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14183452#comment-14183452 ]

Zhan Zhang commented on SPARK-2706:
-----------------------------------

I just checked trunk, and the change is already there, so your code appears
to be an older version. I cloned the latest code, and the build has no
problem with hive-0.13.1:

sbt/sbt -Phive -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 assembly
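
(If you build with Maven instead of sbt, the equivalent invocation should
look something like the line below; the exact profile/flag set is my
assumption, carried over from the sbt flags above.)

mvn -Phive -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -DskipTests clean package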

Trunk code:

val proc: CommandProcessor = HiveShim.getCommandProcessor(Array(tokens(0)), hiveconf)

Your code looks like the line below, which is the old version:

val proc: CommandProcessor = HiveShim.getCommandProcessor(tokens(0), hiveconf)
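
For context, here is a minimal sketch of how the HiveShim indirection
absorbs this signature change. The two snippets below are illustrative
stand-ins for the per-Hive-version shim sources that the build selects by
profile; the object layout is my assumption for illustration, while the two
CommandProcessorFactory.get signatures are the real Hive 0.12 and 0.13 ones
(the type-mismatch error quoted below confirms the 0.13 one).

// Shim source compiled only against Hive 0.12, where
// CommandProcessorFactory.get(String, HiveConf) takes a single token.
import org.apache.hadoop.hive.conf.HiveConf
import org.apache.hadoop.hive.ql.processors.{CommandProcessor, CommandProcessorFactory}

object HiveShim {
  def getCommandProcessor(cmd: Array[String], conf: HiveConf): CommandProcessor =
    CommandProcessorFactory.get(cmd(0), conf)  // old API: first token only
}

// Shim source compiled only against Hive 0.13.1, where
// CommandProcessorFactory.get(String[], HiveConf) takes the whole array.
import org.apache.hadoop.hive.conf.HiveConf
import org.apache.hadoop.hive.ql.processors.{CommandProcessor, CommandProcessorFactory}

object HiveShim {
  def getCommandProcessor(cmd: Array[String], conf: HiveConf): CommandProcessor =
    CommandProcessorFactory.get(cmd, conf)  // new API: array of tokens
}

Callers then compile against the single Array[String] signature no matter
which Hive version is on the classpath, which is why the trunk line passes
Array(tokens(0)) rather than tokens(0).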

Thanks.

Zhan Zhang




> Enable Spark to support Hive 0.13
> ---------------------------------
>
>                 Key: SPARK-2706
>                 URL: https://issues.apache.org/jira/browse/SPARK-2706
>             Project: Spark
>          Issue Type: Dependency upgrade
>          Components: SQL
>    Affects Versions: 1.0.1
>            Reporter: Chunjun Xiao
>            Assignee: Zhan Zhang
>             Fix For: 1.2.0
>
>         Attachments: hive.diff, spark-2706-v1.txt, spark-2706-v2.txt, 
> spark-hive.err, v1.0.2.diff
>
>
> It seems Spark does not work well with Hive 0.13.
> When I compiled Spark against Hive 0.13.1, I got some error messages, as
> attached below.
> So, when can Spark be enabled to support Hive 0.13?
> Compile error:
> {quote}
> [ERROR] 
> /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveContext.scala:180:
>  type mismatch;
>  found   : String
>  required: Array[String]
> [ERROR]       val proc: CommandProcessor = 
> CommandProcessorFactory.get(tokens(0), hiveconf)
> [ERROR]                                                                      ^
> [ERROR] 
> /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveMetastoreCatalog.scala:264:
>  overloaded method constructor TableDesc with alternatives:
>   (x$1: Class[_ <: org.apache.hadoop.mapred.InputFormat[_, _]],x$2: 
> Class[_],x$3: java.util.Properties)org.apache.hadoop.hive.ql.plan.TableDesc 
> <and>
>   ()org.apache.hadoop.hive.ql.plan.TableDesc
>  cannot be applied to (Class[org.apache.hadoop.hive.serde2.Deserializer], 
> Class[(some other)?0(in value tableDesc)(in value tableDesc)], Class[?0(in 
> value tableDesc)(in value tableDesc)], java.util.Properties)
> [ERROR]   val tableDesc = new TableDesc(
> [ERROR]                   ^
> [ERROR] 
> /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/TableReader.scala:140:
>  value getPartitionPath is not a member of 
> org.apache.hadoop.hive.ql.metadata.Partition
> [ERROR]       val partPath = partition.getPartitionPath
> [ERROR]                                ^
> [ERROR] 
> /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/HiveTableScan.scala:132:
>  value appendReadColumnNames is not a member of object 
> org.apache.hadoop.hive.serde2.ColumnProjectionUtils
> [ERROR]     ColumnProjectionUtils.appendReadColumnNames(hiveConf, 
> attributes.map(_.name))
> [ERROR]                           ^
> [ERROR] 
> /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala:79:
>  org.apache.hadoop.hive.common.type.HiveDecimal does not have a constructor
> [ERROR]       new HiveDecimal(bd.underlying())
> [ERROR]       ^
> [ERROR] 
> /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala:132:
>  type mismatch;
>  found   : org.apache.hadoop.fs.Path
>  required: String
> [ERROR]       
> SparkHiveHadoopWriter.createPathFromString(fileSinkConf.getDirName, conf))
> [ERROR]                                                               ^
> [ERROR] 
> /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/execution/InsertIntoHiveTable.scala:179:
>  value getExternalTmpFileURI is not a member of 
> org.apache.hadoop.hive.ql.Context
> [ERROR]     val tmpLocation = hiveContext.getExternalTmpFileURI(tableLocation)
> [ERROR]                                   ^
> [ERROR] 
> /ws/spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveUdfs.scala:209:
>  org.apache.hadoop.hive.common.type.HiveDecimal does not have a constructor
> [ERROR]           case bd: BigDecimal => new HiveDecimal(bd.underlying())
> [ERROR]                                  ^
> [ERROR] 8 errors found
> [DEBUG] Compilation failed (CompilerInterface)
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Reactor Summary:
> [INFO] 
> [INFO] Spark Project Parent POM .......................... SUCCESS [2.579s]
> [INFO] Spark Project Core ................................ SUCCESS [2:39.805s]
> [INFO] Spark Project Bagel ............................... SUCCESS [21.148s]
> [INFO] Spark Project GraphX .............................. SUCCESS [59.950s]
> [INFO] Spark Project ML Library .......................... SUCCESS [1:08.771s]
> [INFO] Spark Project Streaming ........................... SUCCESS [1:17.759s]
> [INFO] Spark Project Tools ............................... SUCCESS [15.405s]
> [INFO] Spark Project Catalyst ............................ SUCCESS [1:17.405s]
> [INFO] Spark Project SQL ................................. SUCCESS [1:11.094s]
> [INFO] Spark Project Hive ................................ FAILURE [11.121s]
> [INFO] Spark Project REPL ................................ SKIPPED
> [INFO] Spark Project YARN Parent POM ..................... SKIPPED
> [INFO] Spark Project YARN Stable API ..................... SKIPPED
> [INFO] Spark Project Assembly ............................ SKIPPED
> [INFO] Spark Project External Twitter .................... SKIPPED
> [INFO] Spark Project External Kafka ...................... SKIPPED
> [INFO] Spark Project External Flume ...................... SKIPPED
> [INFO] Spark Project External ZeroMQ ..................... SKIPPED
> [INFO] Spark Project External MQTT ....................... SKIPPED
> [INFO] Spark Project Examples ............................ SKIPPED
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] BUILD FAILURE
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] Total time: 9:25.609s
> [INFO] Finished at: Wed Jul 23 05:22:06 EDT 2014
> [INFO] Final Memory: 52M/873M
> [INFO] 
> ------------------------------------------------------------------------
> [ERROR] Failed to execute goal 
> net.alchim31.maven:scala-maven-plugin:3.1.6:compile (scala-compile-first) on 
> project spark-hive_2.10: Execution scala-compile-first of goal 
> net.alchim31.maven:scala-maven-plugin:3.1.6:compile failed. CompileFailed -> 
> [Help 1]
> org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute 
> goal net.alchim31.maven:scala-maven-plugin:3.1.6:compile 
> (scala-compile-first) on project spark-hive_2.10: Execution 
> scala-compile-first of goal 
> net.alchim31.maven:scala-maven-plugin:3.1.6:compile failed.
>       at 
> org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:225)
>       at 
> org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:153)
>       at 
> org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:145)
>       at 
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:84)
>       at 
> org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject(LifecycleModuleBuilder.java:59)
>       at 
> org.apache.maven.lifecycle.internal.LifecycleStarter.singleThreadedBuild(LifecycleStarter.java:183)
>       at 
> org.apache.maven.lifecycle.internal.LifecycleStarter.execute(LifecycleStarter.java:161)
>       at org.apache.maven.DefaultMaven.doExecute(DefaultMaven.java:320)
>       at org.apache.maven.DefaultMaven.execute(DefaultMaven.java:156)
>       at org.apache.maven.cli.MavenCli.execute(MavenCli.java:537)
>       at org.apache.maven.cli.MavenCli.doMain(MavenCli.java:196)
>       at org.apache.maven.cli.MavenCli.main(MavenCli.java:141)
>       at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>       at 
> sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
>       at 
> sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>       at java.lang.reflect.Method.invoke(Method.java:606)
>       at 
> org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced(Launcher.java:290)
>       at 
> org.codehaus.plexus.classworlds.launcher.Launcher.launch(Launcher.java:230)
>       at 
> org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode(Launcher.java:409)
>       at 
> org.codehaus.plexus.classworlds.launcher.Launcher.main(Launcher.java:352)
> Caused by: org.apache.maven.plugin.PluginExecutionException: Execution 
> scala-compile-first of goal 
> net.alchim31.maven:scala-maven-plugin:3.1.6:compile failed.
>       at 
> org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:110)
>       at 
> org.apache.maven.lifecycle.internal.MojoExecutor.execute(MojoExecutor.java:209)
>       ... 19 more
> Caused by: Compilation failed
>       at sbt.compiler.AnalyzingCompiler.call(AnalyzingCompiler.scala:76)
>       at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:35)
>       at sbt.compiler.AnalyzingCompiler.compile(AnalyzingCompiler.scala:29)
>       at 
> sbt.compiler.AggressiveCompile$$anonfun$4$$anonfun$compileScala$1$1.apply$mcV$sp(AggressiveCompile.scala:71)
>       at 
> sbt.compiler.AggressiveCompile$$anonfun$4$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:71)
>       at 
> sbt.compiler.AggressiveCompile$$anonfun$4$$anonfun$compileScala$1$1.apply(AggressiveCompile.scala:71)
>       at 
> sbt.compiler.AggressiveCompile.sbt$compiler$AggressiveCompile$$timed(AggressiveCompile.scala:101)
>       at 
> sbt.compiler.AggressiveCompile$$anonfun$4.compileScala$1(AggressiveCompile.scala:70)
>       at 
> sbt.compiler.AggressiveCompile$$anonfun$4.apply(AggressiveCompile.scala:88)
>       at 
> sbt.compiler.AggressiveCompile$$anonfun$4.apply(AggressiveCompile.scala:60)
>       at 
> sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:24)
>       at 
> sbt.inc.IncrementalCompile$$anonfun$doCompile$1.apply(Compile.scala:22)
>       at sbt.inc.Incremental$.cycle(Incremental.scala:52)
>       at sbt.inc.Incremental$.compile(Incremental.scala:29)
>       at sbt.inc.IncrementalCompile$.apply(Compile.scala:20)
>       at sbt.compiler.AggressiveCompile.compile2(AggressiveCompile.scala:96)
>       at sbt.compiler.AggressiveCompile.compile1(AggressiveCompile.scala:44)
>       at com.typesafe.zinc.Compiler.compile(Compiler.scala:158)
>       at com.typesafe.zinc.Compiler.compile(Compiler.scala:142)
>       at 
> sbt_inc.SbtIncrementalCompiler.compile(SbtIncrementalCompiler.java:77)
>       at 
> scala_maven.ScalaCompilerSupport.incrementalCompile(ScalaCompilerSupport.java:308)
>       at 
> scala_maven.ScalaCompilerSupport.compile(ScalaCompilerSupport.java:124)
>       at 
> scala_maven.ScalaCompilerSupport.doExecute(ScalaCompilerSupport.java:104)
>       at scala_maven.ScalaMojoSupport.execute(ScalaMojoSupport.java:482)
>       at 
> org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo(DefaultBuildPluginManager.java:101)
>       ... 20 more
> [ERROR] 
> {quote}


