GitHub user holdenk opened a pull request:

    https://github.com/apache/spark/pull/831

    SPARK-1857: improve the error message when trying to perform a Spark 
operation inside another Spark operation

    This is a quick little PR that should improve the error message when trying 
to perform a Spark operation inside another Spark operation. It's implemented by 
adding a getPartitioner function that checks whether the partitioner is null, and 
by adding the same check to the existing sparkContext function on RDDs.
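
The failure mode being guarded against: transient fields such as `sc` and the 
partitioner are not serialized when an RDD closure is shipped to an executor, so 
touching them from inside another operation yields a bare NullPointerException. 
Below is a minimal sketch of the null-check pattern in plain Python (not actual 
Spark code; the class, field names, and error text are illustrative assumptions):

```python
import pickle


class FakeRDD:
    """Hypothetical stand-in for an RDD, used only to illustrate the pattern:
    accessors check the 'transient' fields and raise a descriptive error
    instead of a bare NullPointerException."""

    def __init__(self, sc):
        self._sc = sc            # "transient": dropped when serialized
        self._partitioner = None

    def __getstate__(self):
        # Simulate serialization of transient fields: they are not shipped
        # to the executor, so they come back as None after deserialization.
        return {"_sc": None, "_partitioner": None}

    def _check_on_driver(self):
        if self._sc is None:
            # Descriptive message instead of a NullPointerException.
            raise RuntimeError(
                "RDD operations can only be invoked on the driver, "
                "not inside another RDD operation")

    @property
    def sparkContext(self):
        self._check_on_driver()
        return self._sc

    def getPartitioner(self):
        # Same check applied to partitioner access, mirroring the PR's
        # getPartitioner addition.
        self._check_on_driver()
        return self._partitioner
```

Used on the "driver", the accessors work; after a serialize/deserialize round 
trip (standing in for shipping the RDD into a task), they fail with the clear 
message rather than a null dereference:

```python
rdd = FakeRDD(sc="driver-context")
print(rdd.sparkContext)                     # works on the driver
shipped = pickle.loads(pickle.dumps(rdd))   # simulate shipping to a task
shipped.sparkContext                        # raises RuntimeError
```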

You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/holdenk/spark 
spark-1857-improve-error-message-when-trying-perform-a-spark-operation-inside-another-spark-operation

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/831.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #831
    
----
commit a88ac121deef213c96dbf2f1968dfef328e26e9e
Author: Holden Karau <[email protected]>
Date:   2014-05-19T22:44:41Z

    A quick pass at improving our error messages when trying to perform an RDD 
operation inside of another RDD operation. This refactors access to the 
transients sc and partitioner so that they can be checked. We don't change 
partitioner itself since it is meant to be overridden.

commit fa0ec8b571b97892a55fd40cdfb2b4199de146a3
Author: Holden Karau <[email protected]>
Date:   2014-05-19T23:50:36Z

    Call getPartitioner in some other cases where we were using partitioner

----

