GitHub user rxin opened a pull request:
https://github.com/apache/spark/pull/10982
[SPARK-13078][SQL] Internal catalog API - WIP
This pull request creates an internal catalog API, a first step towards
consolidating SQLContext and HiveContext. I envision two different
implementations in Spark 2.0: (1) a simple in-memory implementation, and
(2) an implementation based on the current HiveClient (ClientWrapper).
I took a look at what Hive's internal metastore implementation, and then
created a simplified version of it that is more consistent with respect to
naming.
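For illustration only, here is a minimal Scala sketch of what an internal
catalog API along these lines might look like. The trait and member names
(ExternalCatalog, CatalogTable, InMemoryCatalog, and so on) are assumptions
for the sketch, not the actual interfaces in this patch:

    // Simplified metadata for a table tracked by the catalog (hypothetical shape).
    case class CatalogTable(database: String, name: String, schema: Seq[(String, String)])

    // The catalog abstraction: one trait, multiple backends.
    trait ExternalCatalog {
      def createDatabase(db: String): Unit
      def createTable(table: CatalogTable): Unit
      def getTable(db: String, name: String): Option[CatalogTable]
      def listTables(db: String): Seq[String]
    }

    // (1) A simple in-memory implementation, suitable for tests and the non-Hive path.
    class InMemoryCatalog extends ExternalCatalog {
      import scala.collection.mutable
      private val databases = mutable.Map.empty[String, mutable.Map[String, CatalogTable]]

      override def createDatabase(db: String): Unit =
        databases.getOrElseUpdate(db, mutable.Map.empty)

      override def createTable(table: CatalogTable): Unit =
        databases.getOrElseUpdate(table.database, mutable.Map.empty)(table.name) = table

      override def getTable(db: String, name: String): Option[CatalogTable] =
        databases.get(db).flatMap(_.get(name))

      override def listTables(db: String): Seq[String] =
        databases.get(db).map(_.keys.toSeq.sorted).getOrElse(Seq.empty)
    }

    // (2) A Hive-backed implementation would implement the same trait by delegating
    // to the existing HiveClient (ClientWrapper); omitted here since it depends on Hive.

Keeping the trait free of Hive dependencies is what would let the in-memory
and Hive-backed backends sit behind a single interface, which is the
consolidation step described above.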
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/rxin/spark SPARK-13078
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/10982.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #10982
----
commit 16c9395625b8c28d71d35e891fd4decabf9a0fa5
Author: Reynold Xin <[email protected]>
Date: 2016-01-29T10:02:51Z
[SPARK-13078][SQL] Internal catalog API
----