GitHub user andrewor14 opened a pull request:
https://github.com/apache/spark/pull/12198
[SPARK-14410][SQL] Push functions existence check into catalog
## What changes were proposed in this pull request?
This is a follow-up to #12117 and addresses some of the TODOs introduced
there. In particular, the resolution of the database is now pushed into the
session catalog, which knows about the current database. Further, the logic
for checking whether a function exists is pushed into the external catalog.
No change in functionality is expected.
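For readers unfamiliar with the catalog layering, the sketch below shows the intended shape of the change. It is a minimal, hypothetical example rather than the actual Spark code: the names `ExternalCatalog`, `SessionCatalog`, and `FunctionIdentifier` mirror the real catalog API, but the bodies here are simplified for illustration.

```scala
// Illustrative sketch only, not the actual Spark implementation.

case class FunctionIdentifier(funcName: String, database: Option[String] = None)

// The external catalog exposes the existence check directly, so callers
// no longer need to fetch function metadata just to see whether it exists.
trait ExternalCatalog {
  def functionExists(db: String, funcName: String): Boolean
}

// The session catalog resolves the current database before delegating,
// which is the "push database resolution into the catalog" part of the change.
class SessionCatalog(externalCatalog: ExternalCatalog) {
  private var currentDb: String = "default"

  def setCurrentDatabase(db: String): Unit = { currentDb = db }

  def functionExists(name: FunctionIdentifier): Boolean = {
    val db = name.database.getOrElse(currentDb)
    externalCatalog.functionExists(db, name.funcName)
  }
}
```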
## How was this patch tested?
`SessionCatalogSuite`, `DDLSuite`
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/andrewor14/spark function-exists
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/12198.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #12198
----
commit 8b13b7eb5710542c8c393722a3b3b29d20a8c9f2
Author: Andrew Or <[email protected]>
Date: 2016-04-06T06:01:20Z
Push resolve database from RunnableCommand to catalog
commit 6e6c689e62efd69009cb5c860291884cfcb1d05e
Author: Andrew Or <[email protected]>
Date: 2016-04-06T06:12:08Z
Push function exists check into external catalog
commit a3601fb14fa39c1dd7b620abc2ff39d21a9a221c
Author: Andrew Or <[email protected]>
Date: 2016-04-06T06:12:54Z
Rename: getX -> getXMetadata
----