GitHub user xwu0226 opened a pull request:

    https://github.com/apache/spark/pull/13088

    [SPARK-15236][SQL][SPARK SHELL] Add spark-defaults property to switch to use InMemoryCatalog

    ## What changes were proposed in this pull request?
    1. Add the property `spark.user.hive.catalog` to `spark-defaults.conf.template`.
    2. Change REPL/Main to check this property to decide whether `enableHiveSupport` 
should be called (see the sketch below).
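
    As a rough sketch (not the actual diff in this PR), the REPL-side check 
could look like the following. The property name is taken from item 1 above, 
the `SparkSession` builder calls are standard Spark APIs, and the object and 
method names are invented for illustration:

        import org.apache.spark.SparkConf
        import org.apache.spark.sql.SparkSession

        // Hypothetical helper (names invented for illustration): choose the
        // catalog implementation for the REPL session from a spark-defaults
        // property.
        object ReplCatalogSketch {
          def createSession(conf: SparkConf): SparkSession = {
            val builder = SparkSession.builder().config(conf)
            // Default to the InMemoryCatalog; only enable the Hive catalog
            // when the property from spark-defaults.conf opts in.
            if (conf.getBoolean("spark.user.hive.catalog", defaultValue = false)) {
              builder.enableHiveSupport().getOrCreate()
            } else {
              builder.getOrCreate()
            }
          }
        }

    The matching entry in `spark-defaults.conf.template` would presumably be 
a single commented-out line such as `# spark.user.hive.catalog  false`.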
    
    ## How was this patch tested?
    Ran the REPL component test.


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/xwu0226/spark SPARK-15236

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/13088.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #13088
    
----
commit ef5944949f362a5f4846b218f18d5ceab7853349
Author: xin Wu <[email protected]>
Date:   2016-05-08T07:06:36Z

    spark-15206 add testcases for distinct aggregate in having clause following 
up PR12974

commit 5f55a1a54099668c82b5404c2929b36ac88cd34b
Author: xin Wu <[email protected]>
Date:   2016-05-08T07:09:44Z

    Revert "spark-15206 add testcases for distinct aggregate in having clause 
following up PR12974"
    
    This reverts commit 98a1f804d7343ba77731f9aa400c00f1a26c03fe.

commit 74c65dbeb60cda3b2aade493542eed4470a6143c
Author: xin Wu <[email protected]>
Date:   2016-05-13T01:40:37Z

    SPARK-15236: add a spark-defaults.conf property to control whether REPL 
will use Hive catalog or not

----

