Github user mwws commented on a diff in the pull request:

    https://github.com/apache/spark/pull/11595#discussion_r55616064
  
    --- Diff: core/src/main/scala/org/apache/spark/rdd/RDD.scala ---
    @@ -85,10 +85,13 @@ abstract class RDD[T: ClassTag](
       private def sc: SparkContext = {
         if (_sc == null) {
           throw new SparkException(
    -        "RDD transformations and actions can only be invoked by the driver, not inside of other " +
    -        "transformations; for example, rdd1.map(x => rdd2.values.count() * x) is invalid because " +
    -        "the values transformation and count action cannot be performed inside of the rdd1.map " +
    -        "transformation. For more information, see SPARK-5063.")
    +        "This RDD refers to empty SparkContext. It could happen in following cases: \n(1) RDD " +
    --- End diff --
    
    Shame... I will fix it. Thanks for the comment.
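    For context, the message being reworded here guards against nested RDD usage (SPARK-5063): transformations and actions must run on the driver, because the `SparkContext` is null inside executor-side closures. A minimal sketch of the invalid and valid patterns (hypothetical `rdd1`/`rdd2` pair-RDDs, assuming a live `SparkContext`):

    ```scala
    // Invalid: rdd2 is referenced inside a transformation of rdd1, so the
    // closure would have to invoke RDD operations on an executor, where no
    // SparkContext exists. Spark fails with the SparkException shown above.
    val broken = rdd1.map(x => rdd2.values.count() * x)

    // Valid alternative: run the action on the driver first, then close
    // over the resulting plain value instead of the RDD.
    val factor = rdd2.values.count()
    val scaled = rdd1.map(x => factor * x)
    ```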


