[ https://issues.apache.org/jira/browse/SPARK-11778?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15009274#comment-15009274 ]

Huaxin Gao edited comment on SPARK-11778 at 11/17/15 7:05 PM:
--------------------------------------------------------------

Will simply add the table-name parsing in the following method:

  def table(tableName: String): DataFrame = {
    DataFrame(sqlContext, sqlContext.catalog.lookupRelation(TableIdentifier(tableName)))
  }

so it will become:

  def table(tableName: String): DataFrame = {
    DataFrame(sqlContext, sqlContext.catalog.lookupRelation(SqlParser.parseTableIdentifier(tableName)))
  }

I will open a PR soon.
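
For illustration, here is a minimal sketch (assuming the Spark 1.5/1.6-era catalyst APIs, so exact package paths and visibility may differ by branch) of why the parsing step matters: SqlParser.parseTableIdentifier honors the "db_name.table" syntax, while the plain TableIdentifier constructor treats the whole string as an unqualified table name.

  import org.apache.spark.sql.catalyst.{SqlParser, TableIdentifier}

  // Parses the qualified name, so the database part is resolved.
  SqlParser.parseTableIdentifier("db_name.table")
  // => TableIdentifier("table", Some("db_name"))

  // Treats the whole string as a single table name, so the catalog
  // lookup fails with NoSuchTableException.
  TableIdentifier("db_name.table")
  // => TableIdentifier("db_name.table", None)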


> HiveContext.read.table does not support user-specified database names
> ---------------------------------------------------------------------
>
>                 Key: SPARK-11778
>                 URL: https://issues.apache.org/jira/browse/SPARK-11778
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.5.1
>            Reporter: Stanislav Hadjiiski
>            Priority: Minor
>
> If we have defined a HiveContext instance
> {code}
> val hiveContext = new org.apache.spark.sql.hive.HiveContext(sparkContext)
> {code}
> then
> {code}
> hiveContext.table("db_name.table")
> {code}
> works, but
> {code}
> hiveContext.read.table("db_name.table")
> {code}
> throws an {{org.apache.spark.sql.catalyst.analysis.NoSuchTableException}}.
> However,
> {code}
> hiveContext.sql("use db_name")
> hiveContext.read.table("table")
> {code}
> works as expected.


