GitHub user c-sahuja opened a pull request:
https://github.com/apache/spark/pull/16185
Update Spark documentation to provide information on how to create an External
Table
## What changes were proposed in this pull request?
Although saveAsTable does not currently provide an API for saving a DataFrame
as an external table, the same result can be achieved by setting an option on
the DataFrameWriter: use the String key "path" with the location of the
external table as the value, before calling saveAsTable.
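
For reference, a minimal sketch of the approach described above (the table name, output path, and input file are illustrative assumptions, not part of this PR):

```scala
import org.apache.spark.sql.SparkSession

// Names and paths below are assumptions for illustration only.
val spark = SparkSession.builder()
  .appName("ExternalTableExample")
  .enableHiveSupport()
  .getOrCreate()

val df = spark.read.json("examples/src/main/resources/people.json")

// Supplying the "path" option before saveAsTable makes the resulting table
// external: its data is stored at the given location instead of the default
// warehouse directory.
df.write
  .option("path", "/tmp/external/people")
  .saveAsTable("people_external")
```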
## How was this patch tested?
The documentation was reviewed for formatting and content after pushing to the
branch.

You can merge this pull request into a Git repository by running:
$ git pull https://github.com/c-sahuja/spark createExternalTable
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/16185.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #16185
----
commit ff5f20f3c959586595c311e135540c35801dd9d9
Author: c-sahuja <[email protected]>
Date: 2016-12-06T14:19:19Z
Initial commit
commit 807569c976d922d007e6afc10dde4d16d2744009
Author: c-sahuja <[email protected]>
Date: 2016-12-07T02:50:30Z
Updated the documentation on how to create an external table for review
----