[ https://issues.apache.org/jira/browse/SPARK-17047?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Dongjoon Hyun resolved SPARK-17047.
-----------------------------------
       Resolution: Fixed
    Fix Version/s: 2.3.0

This is resolved by SPARK-17729.

{code}
scala> sql("""
     | CREATE TABLE t2(
     | ID INT
     | , CLUSTERED INT
     | , SCATTERED INT
     | , RANDOMISED INT
     | , RANDOM_STRING VARCHAR(50)
     | , SMALL_VC VARCHAR(10)
     | , PADDING VARCHAR(10)
     | )
     | CLUSTERED BY (ID) INTO 256 BUCKETS
     | STORED AS ORC
     | TBLPROPERTIES ( "orc.compress"="SNAPPY",
     | "orc.create.index"="true",
     | "orc.bloom.filter.columns"="ID",
     | "orc.bloom.filter.fpp"="0.05",
     | "orc.stripe.size"="268435456",
     | "orc.row.index.stride"="10000" )
     | """)

scala> spark.version
res3: String = 2.3.0-SNAPSHOT
{code}
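For reference, on Spark versions prior to 2.3.0, where the SQL syntax above is rejected, bucketed ORC tables can still be written through the DataFrameWriter API. A minimal sketch, assuming an active SparkSession {{spark}} and an existing DataFrame {{df}} with the same columns; the table name {{t2_bucketed}} is illustrative:

{code}
// Sketch: write a bucketed, Snappy-compressed ORC table via DataFrameWriter,
// as a workaround for versions that reject CREATE TABLE ... CLUSTERED BY.
// Assumes `df` holds the data; `t2_bucketed` is a hypothetical table name.
df.write
  .format("orc")
  .option("compression", "snappy")
  .bucketBy(256, "ID")
  .sortBy("ID")
  .saveAsTable("t2_bucketed")
{code}

Note that Spark records its own bucketing metadata, which is not identical to Hive's, so a table written this way may not be recognized as a bucketed table by Hive.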

> Spark 2 cannot create table when CLUSTERED.
> -------------------------------------------
>
>                 Key: SPARK-17047
>                 URL: https://issues.apache.org/jira/browse/SPARK-17047
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 2.0.0, 2.1.1, 2.2.0
>            Reporter: Dr Mich Talebzadeh
>             Fix For: 2.3.0
>
>
> This does not work with the CLUSTERED BY clause in Spark 2:
> CREATE TABLE test.dummy2
>  (
>      ID INT
>    , CLUSTERED INT
>    , SCATTERED INT
>    , RANDOMISED INT
>    , RANDOM_STRING VARCHAR(50)
>    , SMALL_VC VARCHAR(10)
>    , PADDING  VARCHAR(10)
> )
> CLUSTERED BY (ID) INTO 256 BUCKETS
> STORED AS ORC
> TBLPROPERTIES ( "orc.compress"="SNAPPY",
> "orc.create.index"="true",
> "orc.bloom.filter.columns"="ID",
> "orc.bloom.filter.fpp"="0.05",
> "orc.stripe.size"="268435456",
> "orc.row.index.stride"="10000" )
> scala> HiveContext.sql(sqltext)
> org.apache.spark.sql.catalyst.parser.ParseException:
> Operation not allowed: CREATE TABLE ... CLUSTERED BY(line 2, pos 0)



--
This message was sent by Atlassian JIRA
(v6.4.14#64029)