GitHub user timvw opened a pull request:

    https://github.com/apache/spark/pull/19796

    [SPARK-22581][SQL] Catalog api does not allow to specify partitioning 
columns with create(external)table

    ## What changes were proposed in this pull request?
    
    Enhance the Catalog API so that partition columns can be specified on the 
createTable method.
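    
    As a rough sketch of how the enhanced call could look from user code (the 
trailing partition-column argument and its name are assumptions about this 
patch, and the table name, schema and path are made up for the example):
    
        import org.apache.spark.sql.SparkSession
        import org.apache.spark.sql.types._
        
        val spark = SparkSession.builder()
          .appName("catalog-createTable-partitioning")
          .enableHiveSupport()
          .getOrCreate()
        
        val schema = StructType(Seq(
          StructField("id", LongType),
          StructField("value", StringType),
          StructField("dt", StringType)        // intended partition column
        ))
        
        // Hypothetical overload added by this change; the existing parameters
        // match the current Catalog.createTable, the last one is the proposal.
        spark.catalog.createTable(
          tableName = "events",                // example table name
          source = "parquet",
          schema = schema,
          options = Map("path" -> "/tmp/events"),
          partitionColumnNames = Seq("dt"))    // assumed name of the new argument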
    
    ## How was this patch tested?
    
    Added a test to verify that the created table indeed has the specified 
columns registered as partition columns.
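    
    A check along these lines could be used to confirm the partitioning (a 
sketch only; the table and column names follow the example above rather than 
the actual test in the PR):
    
        // Use the public Catalog API to list the columns of the new table and
        // keep only those flagged as partition columns.
        val partCols = spark.catalog.listColumns("events")
          .collect()
          .filter(_.isPartition)
          .map(_.name)
        assert(partCols.sameElements(Array("dt")))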


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/timvw/spark master

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/19796.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #19796
    
----
commit 8596be6f6266bcd21b69e19dad5243cac702ac8a
Author: Tim Van Wassenhove <[email protected]>
Date:   2017-11-22T10:32:04Z

    extended the API and implementation so that users can specify the 
partitioning columns as well

commit 9982fa81ed2ad2a0caf1152762aab91a6c8e0a3d
Author: Tim Van Wassenhove <[email protected]>
Date:   2017-11-22T12:33:33Z

    added a test to verify that the column is correctly registered as a 
partition column

commit 5d5b3a852bb7cfc707125eda3ecf1fa5638a46c9
Author: Tim Van Wassenhove <[email protected]>
Date:   2017-11-22T12:36:07Z

    corrected signature (removed the leftover bucketSpec argument)

commit ff423a36dc31f49eeda5e191ce902b2e951006f0
Author: Tim Van Wassenhove <[email protected]>
Date:   2017-11-22T13:10:22Z

    corrected test. no need to create data or to assert on its existence. all 
we want to know is whether the columns are correctly recognized as partition 
columns

----

