Github user kul commented on the pull request:
https://github.com/apache/spark/pull/4243#issuecomment-72786838
@marmbrus Thanks for the review!
Rebased against master and squashed into a new commit, renaming
`schemaRDDOperations` to the more aptly named `dataFrameRDDOperations`.
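For context, a minimal sketch (not taken from the PR itself) of what an implicit helper in the spirit of `dataFrameRDDOperations` might look like; the object, class, and method names here are hypothetical:

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{DataFrame, Row}

// Hypothetical sketch only: an implicit wrapper exposing RDD-style
// operations on a DataFrame by delegating to the underlying RDD[Row].
object DataFrameRDDOperations {
  implicit class RichDataFrame(val df: DataFrame) extends AnyVal {
    def filterRows(p: Row => Boolean): RDD[Row] = df.rdd.filter(p)
    def mapRows[T: scala.reflect.ClassTag](f: Row => T): RDD[T] = df.rdd.map(f)
  }
}
```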
Github user kul commented on the pull request:
https://github.com/apache/spark/pull/4327#issuecomment-72607021
Since `SchemaRDD` in 1.3.0 was Java-compatible until a few days back, would it
make sense to alias `JavaSchemaRDD` as well?
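For illustration, a rough sketch of what such an alias could look like; the `CompatAliases` object below is hypothetical, and the real `SchemaRDD = DataFrame` alias lives in the `org.apache.spark.sql` package object:

```scala
import org.apache.spark.sql.DataFrame

// Hypothetical sketch only: a deprecated alias in the same spirit as the
// existing SchemaRDD alias, kept so older code continues to compile.
object CompatAliases {
  @deprecated("Use DataFrame instead.", "1.3.0")
  type JavaSchemaRDD = DataFrame
}
```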
Github user kul commented on the pull request:
https://github.com/apache/spark/pull/4327#issuecomment-72607251
Got it! Thanks.
GitHub user kul opened a pull request:
https://github.com/apache/spark/pull/4243
[SPARK-5426][SQL] Add SparkSQL Java API helper methods.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com/kul/spark master
Github user kul commented on the pull request:
https://github.com/apache/spark/pull/4243#issuecomment-71807950
Looking into it further, it seems that even in Scala one will have to use
`.rdd` for normal Spark operations, since functions like `filter` are being
overridden.
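For illustration, a minimal sketch of the difference (assuming a DataFrame `df` whose first column is an integer `age`): `DataFrame.filter` takes a `Column` or a SQL expression string, so a plain row predicate is only available after dropping to the underlying RDD via `.rdd`.

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{DataFrame, Row}

// Sketch: relational filter on the DataFrame vs. predicate filter on its RDD.
def filterExamples(df: DataFrame): Unit = {
  val relational: DataFrame = df.filter("age > 21")            // SQL-style expression
  val rddStyle: RDD[Row]    = df.rdd.filter(_.getInt(0) > 21)  // assumes "age" is column 0
}
```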