GitHub user dongjoon-hyun opened a pull request:

    https://github.com/apache/spark/pull/15546

    [SPARK-17892][SQL] SQLBuilder should wrap the generated SQL with parenthesis for LIMIT

    ## What changes were proposed in this pull request?
    
    Currently, `SQLBuilder` handles `LIMIT` by simply appending `LIMIT` to the end 
of the generated sub-SQL, which causes `RuntimeException`s like the one below. This 
PR wraps the generated SQL in parentheses to prevent that.
    
    **Before**
    ```scala
    scala> sql("CREATE TABLE tbl(id INT)")
    scala> sql("CREATE VIEW v1(id2) AS SELECT id FROM tbl LIMIT 2")
    java.lang.RuntimeException: Failed to analyze the canonicalized SQL: ...
    ```
    
    **After**
    ```scala
    scala> sql("CREATE TABLE tbl(id INT)")
    scala> sql("CREATE VIEW v1(id2) AS SELECT id FROM tbl LIMIT 2")
    scala> sql("SELECT id2 FROM v1")
    res4: org.apache.spark.sql.DataFrame = [id2: int]
    ```
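    The idea behind the fix can be sketched as follows. This is a minimal illustration, not Spark's actual `SQLBuilder` code; `naive` and `wrapped` are hypothetical helpers. Appending `LIMIT` to a sub-SQL string produces text that cannot be nested inside an outer query, while parenthesizing the limited query keeps it usable as a subquery:
    
    ```scala
    // Hypothetical sketch of the parenthesization fix (not Spark's real API).
    object LimitSqlSketch {
      // Naive: append LIMIT to the generated sub-SQL. Nesting this inside an
      // outer SELECT yields invalid SQL, because LIMIT may not appear in an
      // unparenthesized subquery position.
      def naive(subSql: String, limit: Int): String = s"$subSql LIMIT $limit"
    
      // Fixed: wrap the limited query in parentheses so the result can be
      // embedded as a subquery, e.g. inside a view definition.
      def wrapped(subSql: String, limit: Int): String = s"($subSql LIMIT $limit)"
    
      def main(args: Array[String]): Unit = {
        val sub = "SELECT id FROM tbl"
        println(naive(sub, 2))   // SELECT id FROM tbl LIMIT 2
        println(wrapped(sub, 2)) // (SELECT id FROM tbl LIMIT 2)
      }
    }
    ```
    
    With the wrapped form, the view's canonicalized SQL stays analyzable when it is re-parsed as part of an outer query.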
    
    ## How was this patch tested?
    
    Pass the Jenkins test with a newly added test case.

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/dongjoon-hyun/spark SPARK-17982

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/15546.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #15546
    
----
commit 85c0686a90395f3a7b56f22d78654e83e7ede7a6
Author: Dongjoon Hyun <dongj...@apache.org>
Date:   2016-10-19T03:15:48Z

    [SPARK-17892][SQL] SQLBuilder should wrap the generated SQL with parenthesis for LIMIT

----

