GitHub user dwmclary opened a pull request:
https://github.com/apache/spark/pull/4421
SPARK-2789: Apply names to RDD to create DataFrame
This seemed like a reasonably useful function to add to SparkSQL. However,
unlike the [JIRA](https://issues.apache.org/jira/browse/SPARK-2789), this
implementation does not parse type characters (e.g. brackets and braces). This
method creates a DataFrame with column names that map to the existing types in
the RDD. In general, this seems far more useful, as users likely wish to
quickly apply names to existing collections.
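The core idea — pairing a list of column names with the fields of each record while keeping the values' existing types, rather than parsing a type string — can be sketched without Spark. This is an illustrative, Spark-free approximation of the behavior the PR describes, not the actual patch; `apply_names` is a hypothetical helper name:

```python
# Sketch of the naming behavior described above: zip column names onto each
# tuple-like row. Types are whatever the existing values already are; nothing
# is parsed from a type-character string.

def apply_names(rows, names):
    """Attach column names to each row, preserving the values' types."""
    return [dict(zip(names, row)) for row in rows]

rows = [("Alice", 34), ("Bob", 45)]
named = apply_names(rows, ["name", "age"])
# named[0] -> {"name": "Alice", "age": 34}
```

In current PySpark, a comparable effect is available through the public API by passing column names directly, e.g. `spark.createDataFrame(rdd, ["name", "age"])`, which applies the names and infers the column types from the data.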
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/dwmclary/spark SPARK-2789
Alternatively you can review and apply these changes as the patch at:
https://github.com/apache/spark/pull/4421.patch
To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:
This closes #4421
----
commit df8b01528519ebe0c480daedcc5099306e690a5e
Author: Dan McClary <[email protected]>
Date: 2015-02-05T18:56:14Z
basic apply names functionality
commit 15eb351e2a1c43191193bca768607cc56ce3aede
Author: Dan McClary <[email protected]>
Date: 2015-02-05T23:31:04Z
working for map type
commit aa38d7618a9cd069f73cf8673bfdef4ecc0fe339
Author: Dan McClary <[email protected]>
Date: 2015-02-06T02:43:30Z
added array and list types, struct types don't seem relevant
commit 29d8ffa58b6faa9f20b9c36b5afe649d523e2eb8
Author: Dan McClary <[email protected]>
Date: 2015-02-06T05:14:34Z
added applyNames to pyspark
commit 8c773b372c122c4b90f375933e83816ec99ace1d
Author: Dan McClary <[email protected]>
Date: 2015-02-06T07:41:24Z
added pyspark method and tests
----
---
If your project is set up for it, you can reply to this email and have your
reply appear on GitHub as well. If your project does not have this feature
enabled and you would like it to be, or if the feature is enabled but not
working, please contact infrastructure at [email protected] or file
a JIRA ticket with INFRA.
---
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]