[ https://issues.apache.org/jira/browse/SPARK-18009?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15607052#comment-15607052 ]

Jerryjung edited comment on SPARK-18009 at 10/26/16 1:44 AM:
-------------------------------------------------------------

Yes!
In my case, it's a necessary option for integration with BI tools.
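
For context, a minimal sketch of how such an option can be passed when starting the Thrift Server, assuming the option in question is spark.sql.thriftServer.incrementalCollect (which streams result partitions to the client instead of collecting the whole result on the driver, something BI tools that pull large result sets typically need); the option name and values are assumptions, not taken from this ticket:

{quote}
# Assumption: the option under discussion is spark.sql.thriftServer.incrementalCollect.
# It can be passed like any other Spark conf when launching the Thrift Server on YARN.
sbin/start-thriftserver.sh \
  --master yarn \
  --conf spark.sql.thriftServer.incrementalCollect=true
{quote}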


was (Author: jerryjung):
Yes!
But in my case, it's a necessary option for integration with BI tools.

> Spark 2.0.1 SQL Thrift Error
> ----------------------------
>
>                 Key: SPARK-18009
>                 URL: https://issues.apache.org/jira/browse/SPARK-18009
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.1
>         Environment: apache hadoop 2.6.2 
> spark 2.0.1
>            Reporter: Jerryjung
>            Priority: Critical
>              Labels: thrift
>
> After deploying the Spark Thrift Server on YARN, I tried to execute the following command from Beeline:
> > show databases;
> I got this error message:
> {quote}
> beeline> !connect jdbc:hive2://localhost:10000 a a
> Connecting to jdbc:hive2://localhost:10000
> 16/10/19 22:50:18 INFO Utils: Supplied authorities: localhost:10000
> 16/10/19 22:50:18 INFO Utils: Resolved authority: localhost:10000
> 16/10/19 22:50:18 INFO HiveConnection: Will try to open client transport with JDBC Uri: jdbc:hive2://localhost:10000
> Connected to: Spark SQL (version 2.0.1)
> Driver: Hive JDBC (version 1.2.1.spark2)
> Transaction isolation: TRANSACTION_REPEATABLE_READ
> 0: jdbc:hive2://localhost:10000> show databases;
> java.lang.IllegalStateException: Can't overwrite cause with java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericInternalRow cannot be cast to org.apache.spark.sql.catalyst.expressions.UnsafeRow
>       at java.lang.Throwable.initCause(Throwable.java:456)
>       at org.apache.hive.service.cli.HiveSQLException.toStackTrace(HiveSQLException.java:236)
>       at org.apache.hive.service.cli.HiveSQLException.toStackTrace(HiveSQLException.java:236)
>       at org.apache.hive.service.cli.HiveSQLException.toCause(HiveSQLException.java:197)
>       at org.apache.hive.service.cli.HiveSQLException.<init>(HiveSQLException.java:108)
>       at org.apache.hive.jdbc.Utils.verifySuccess(Utils.java:256)
>       at org.apache.hive.jdbc.Utils.verifySuccessWithInfo(Utils.java:242)
>       at org.apache.hive.jdbc.HiveQueryResultSet.next(HiveQueryResultSet.java:365)
>       at org.apache.hive.beeline.BufferedRows.<init>(BufferedRows.java:42)
>       at org.apache.hive.beeline.BeeLine.print(BeeLine.java:1794)
>       at org.apache.hive.beeline.Commands.execute(Commands.java:860)
>       at org.apache.hive.beeline.Commands.sql(Commands.java:713)
>       at org.apache.hive.beeline.BeeLine.dispatch(BeeLine.java:973)
>       at org.apache.hive.beeline.BeeLine.execute(BeeLine.java:813)
>       at org.apache.hive.beeline.BeeLine.begin(BeeLine.java:771)
>       at org.apache.hive.beeline.BeeLine.mainWithInputRedirection(BeeLine.java:484)
>       at org.apache.hive.beeline.BeeLine.main(BeeLine.java:467)
> Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 669.0 failed 4 times, most recent failure: Lost task 0.3 in stage 669.0 (TID 3519, edw-014-22): java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericInternalRow cannot be cast to org.apache.spark.sql.catalyst.expressions.UnsafeRow
>       at org.apache.spark.sql.execution.SparkPlan$$anonfun$4.apply(SparkPlan.scala:247)
>       at org.apache.spark.sql.execution.SparkPlan$$anonfun$4.apply(SparkPlan.scala:240)
>       at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:803)
>       at org.apache.spark.rdd.RDD$$anonfun$mapPartitionsInternal$1$$anonfun$apply$24.apply(RDD.scala:803)
>       at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
>       at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:319)
>       at org.apache.spark.rdd.RDD.iterator(RDD.scala:283)
>       at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:70)
>       at org.apache.spark.scheduler.Task.run(Task.scala:86)
>       at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:274)
>       at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>       at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>       at java.lang.Thread.run(Thread.java:745)
> Driver stacktrace:
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
>       at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
>       at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
>       at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
>       at org.apache.hive.service.cli.HiveSQLException.newInstance(HiveSQLException.java:244)
>       at org.apache.hive.service.cli.HiveSQLException.toStackTrace(HiveSQLException.java:210)
>       ... 15 more
> Error: Error retrieving next row (state=,code=0)
> {quote}
> "add jar" command also same error occurred.
> {quote}
> add jar ~/udf.jar
> java.lang.IllegalStateException: Can't overwrite cause with java.lang.ClassCastException: org.apache.spark.sql.catalyst.expressions.GenericInternalRow cannot be cast to org.apache.spark.sql.catalyst.expressions.UnsafeRow
>       at java.lang.Throwable.initCause(Throwable.java:456)
> …
> {quote}
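
A minimal sketch of the reproduction steps described in the report above, assuming the stock sbin/start-thriftserver.sh launcher, the default Thrift port 10000, and the Beeline client bundled with Spark (the exact deployment flags are assumptions, not taken from the report):

{quote}
# 1. Start the Thrift Server on YARN (deployment details are assumptions).
sbin/start-thriftserver.sh --master yarn

# 2. Connect with Beeline and run a simple metadata command; this is the point
#    at which the ClassCastException above is reported.
bin/beeline -u jdbc:hive2://localhost:10000 -n a -p a -e "show databases;"
{quote}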



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
