[ https://issues.apache.org/jira/browse/SPARK-31001?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17597941#comment-17597941 ]

Kevin Appel commented on SPARK-31001:
-------------------------------------

[~nchammas] 

I ran into this recently while trying to create an external table that is 
partitioned. I found a bunch of suggestions, but nothing worked easily without 
manually extracting the DDL and removing the partition column.  In 
[https://github.com/delta-io/delta/issues/31] there is an example posted by 
vprus that passes the partition columns through the table options.

I tried that, and combined with the second step below it is working for me to 
create the external table in Spark over the existing parquet data.  Since the 
table is then registered in the metastore, I am also able to view it in Trino.

spark.catalog.createTable("kevin.ktest1", "/user/kevin/ktest1", **{"__partition_columns": "['id']"})
spark.sql("alter table kevin.ktest1 recover partitions")
 
Can you see if this is working for you as well?

If it is, perhaps we can get the Spark documentation updated to describe this 
approach.

> Add ability to create a partitioned table via catalog.createTable()
> -------------------------------------------------------------------
>
>                 Key: SPARK-31001
>                 URL: https://issues.apache.org/jira/browse/SPARK-31001
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.1.0
>            Reporter: Nicholas Chammas
>            Priority: Minor
>
> There doesn't appear to be a way to create a partitioned table using the 
> Catalog interface.
> In SQL, however, you can do this via CREATE TABLE ... PARTITIONED BY.
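
For comparison with the quoted description, the SQL route it mentions would 
look roughly like this with the same placeholder table name and path; the 
LOCATION clause is what makes the table external:

spark.sql("""
    create table kevin.ktest1 (value bigint, id bigint)
    using parquet
    partitioned by (id)
    location '/user/kevin/ktest1'
""")
spark.sql("alter table kevin.ktest1 recover partitions")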


