hi!

I am following this tutorial to read from and write to Hive:
<http://hortonworks.com/hadoop-tutorial/using-hive-with-orc-from-apache-spark/>

But I am facing the following exception when I run the code.
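For reference, the code I am running is essentially the tutorial's example; a minimal sketch (the object and table names here are my placeholders, not the tutorial's exact code):

```scala
import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.hive.HiveContext

object OrcHiveTest {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("OrcHiveTest")
    val sc = new SparkContext(conf)

    // Per the stack trace below, the VerifyError is thrown here,
    // while the HiveContext itself is being constructed
    val hiveContext = new HiveContext(sc)
    import hiveContext.implicits._

    // Write: save a DataFrame as an ORC-backed Hive table
    val df = sc.parallelize(Seq((1, "a"), (2, "b"))).toDF("id", "value")
    df.write.format("orc").saveAsTable("test_table")

    // Read: query the table back through Hive
    hiveContext.sql("SELECT * FROM test_table").show()

    sc.stop()
  }
}
```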

15/10/12 14:57:36 INFO storage.BlockManagerMaster: Registered BlockManager
15/10/12 14:57:38 INFO scheduler.EventLoggingListener: Logging events to
hdfs://host:9000/spark/logs/local-1444676256555
Exception in thread "main" java.lang.VerifyError: Bad return type
Exception Details:
  Location:
    org/apache/spark/sql/catalyst/expressions/Pmod.inputType()Lorg/apache/spark/sql/types/AbstractDataType;
@3: areturn
  Reason:
    Type 'org/apache/spark/sql/types/NumericType$' (current frame, stack[0]) is not assignable to 'org/apache/spark/sql/types/AbstractDataType' (from method signature)
  Current Frame:
    bci: @3
    flags: { }
    locals: { 'org/apache/spark/sql/catalyst/expressions/Pmod' }
    stack: { 'org/apache/spark/sql/types/NumericType$' }
  Bytecode:
    0000000: b200 63b0

        at java.lang.Class.getDeclaredConstructors0(Native Method)
        at java.lang.Class.privateGetDeclaredConstructors(Class.java:2595)
        at java.lang.Class.getConstructor0(Class.java:2895)
        at java.lang.Class.getDeclaredConstructor(Class.java:2066)
        at org.apache.spark.sql.catalyst.analysis.FunctionRegistry$$anonfun$4.apply(FunctionRegistry.scala:267)
        at org.apache.spark.sql.catalyst.analysis.FunctionRegistry$$anonfun$4.apply(FunctionRegistry.scala:267)
        at scala.util.Try$.apply(Try.scala:161)
        at org.apache.spark.sql.catalyst.analysis.FunctionRegistry$.expression(FunctionRegistry.scala:267)
        at org.apache.spark.sql.catalyst.analysis.FunctionRegistry$.<init>(FunctionRegistry.scala:148)
        at org.apache.spark.sql.catalyst.analysis.FunctionRegistry$.<clinit>(FunctionRegistry.scala)
        at org.apache.spark.sql.hive.HiveContext.functionRegistry$lzycompute(HiveContext.scala:414)
        at org.apache.spark.sql.hive.HiveContext.functionRegistry(HiveContext.scala:413)
        at org.apache.spark.sql.UDFRegistration.<init>(UDFRegistration.scala:39)
        at org.apache.spark.sql.SQLContext.<init>(SQLContext.scala:203)
        at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:72)


Are there any suggestions on how to read from and write to Hive?

thanks



--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/writing-to-hive-tp25046.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
