[ https://issues.apache.org/jira/browse/SPARK-12533?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15075451#comment-15075451 ]

Thomas Sebastian commented on SPARK-12533:
------------------------------------------

Submitted a pull request.

> hiveContext.table() throws the wrong exception
> ----------------------------------------------
>
>                 Key: SPARK-12533
>                 URL: https://issues.apache.org/jira/browse/SPARK-12533
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.6.0
>            Reporter: Michael Armbrust
>
> This should throw an {{AnalysisException}} that includes the table name 
> instead of the following:
> {code}
> org.apache.spark.sql.catalyst.analysis.NoSuchTableException
>       at org.apache.spark.sql.hive.client.ClientInterface$$anonfun$getTable$1.apply(ClientInterface.scala:122)
>       at org.apache.spark.sql.hive.client.ClientInterface$$anonfun$getTable$1.apply(ClientInterface.scala:122)
>       at scala.Option.getOrElse(Option.scala:120)
>       at org.apache.spark.sql.hive.client.ClientInterface$class.getTable(ClientInterface.scala:122)
>       at org.apache.spark.sql.hive.client.ClientWrapper.getTable(ClientWrapper.scala:60)
>       at org.apache.spark.sql.hive.HiveMetastoreCatalog.lookupRelation(HiveMetastoreCatalog.scala:384)
>       at org.apache.spark.sql.hive.HiveContext$$anon$2.org$apache$spark$sql$catalyst$analysis$OverrideCatalog$$super$lookupRelation(HiveContext.scala:458)
>       at org.apache.spark.sql.catalyst.analysis.OverrideCatalog$class.lookupRelation(Catalog.scala:161)
>       at org.apache.spark.sql.hive.HiveContext$$anon$2.lookupRelation(HiveContext.scala:458)
>       at org.apache.spark.sql.SQLContext.table(SQLContext.scala:830)
>       at org.apache.spark.sql.SQLContext.table(SQLContext.scala:826)
> {code}
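The shape of the requested fix is to catch the internal {{NoSuchTableException}} at the lookup boundary and rethrow it as an {{AnalysisException}} whose message names the missing table. A minimal sketch of that pattern, using simplified stand-in classes rather than Spark's actual {{ClientInterface}}/{{Catalog}} types (names here are hypothetical, not the PR's actual code):

```scala
// Simplified stand-ins for the Spark exception classes involved.
class NoSuchTableException extends Exception
class AnalysisException(message: String) extends Exception(message)

object TableLookup {
  // Toy catalog standing in for the Hive metastore.
  private val catalog = Map("users" -> "users_relation")

  // Mimics ClientInterface.getTable: fails without mentioning the table name.
  private def getTable(name: String): String =
    catalog.getOrElse(name, throw new NoSuchTableException)

  // The pattern the issue asks for: translate the low-level failure into an
  // AnalysisException that carries the table name the user asked for.
  def table(name: String): String =
    try getTable(name)
    catch {
      case _: NoSuchTableException =>
        throw new AnalysisException(s"Table not found: $name")
    }
}
```

With this translation in place, a call like {{TableLookup.table("missing")}} fails with a message identifying the table, instead of a bare internal exception.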



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
