[ 
https://issues.apache.org/jira/browse/SPARK-19935?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15907089#comment-15907089
 ] 

Xiaochen Ouyang commented on SPARK-19935:
-----------------------------------------

Spark throws an operationNotAllowed exception when we run this command;
the parser method is the following:
override def visitCreateFileFormat(
      ctx: CreateFileFormatContext): CatalogStorageFormat = withOrigin(ctx) {
    (ctx.fileFormat, ctx.storageHandler) match {
      // Expected format: INPUTFORMAT input_format OUTPUTFORMAT output_format
      case (c: TableFileFormatContext, null) =>
        visitTableFileFormat(c)
      // Expected format: SEQUENCEFILE | TEXTFILE | RCFILE | ORC | PARQUET | AVRO
      case (c: GenericFileFormatContext, null) =>
        visitGenericFileFormat(c)
      case (null, storageHandler) =>
        operationNotAllowed("STORED BY", ctx)
      case _ =>
        throw new ParseException("Expected either STORED AS or STORED BY, not both", ctx)
    }
  }

Is there a plan to support this command later? Spark SQL supported it
before Spark 2.0. Thanks!
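As a possible workaround (only a sketch, not something the issue confirms), the HBase-backed table from the report could be created through the Hive CLI, which still accepts STORED BY; Spark SQL can then see the table through the shared Hive metastore. This assumes the hive-hbase-handler jars and HBase configuration are on the classpath of both Hive and Spark:

```sql
-- Run in the Hive CLI (not Spark SQL), which accepts STORED BY.
CREATE TABLE spark_test(key int, value string)
STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'
WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")
TBLPROPERTIES ("hbase.table.name" = "xyz");

-- Once the table exists in the metastore, query it from Spark SQL.
SELECT * FROM spark_test;
```

Whether the SELECT actually works from Spark depends on the storage handler being loadable at runtime; only the CREATE TABLE restriction in the parser is what this issue documents.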


> SparkSQL does not support creating a Hive table mapped to an HBase table
> ------------------------------------------------------------------------
>
>                 Key: SPARK-19935
>                 URL: https://issues.apache.org/jira/browse/SPARK-19935
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0
>         Environment: spark2.0.2
>            Reporter: Xiaochen Ouyang
>
> SparkSQL does not support the following command:
>  CREATE TABLE spark_test(key int, value string)   
> STORED BY 'org.apache.hadoop.hive.hbase.HBaseStorageHandler'   
> WITH SERDEPROPERTIES ("hbase.columns.mapping" = ":key,cf1:val")   
> TBLPROPERTIES ("hbase.table.name" = "xyz");



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
