Github user goungoun commented on the issue:

    https://github.com/apache/spark/pull/20800
  
    Regarding the additional check I mentioned: the following code shows that Spark users do not need to add `take(1)` themselves. `ds.rdd.take(1).isEmpty` is redundant, because `isEmpty` already calls `take(1)` internally.
    
    
[RDD.scala](https://github.com/apache/spark/blob/master/core/src/main/scala/org/apache/spark/rdd/RDD.scala)
 
    ```scala
    def isEmpty(): Boolean = withScope {
      partitions.length == 0 || take(1).length == 0
    }
    ```
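
    To illustrate the short-circuit above outside of Spark, here is a minimal plain-Scala sketch. `isEmptyLike` and its `partitions` parameter are hypothetical stand-ins (each inner `Seq` plays the role of one partition's data); this is not Spark code, just the same logic.

    ```scala
    // Sketch of RDD.isEmpty's logic: either there are no partitions at all,
    // or fetching a single element ("take(1)") comes back empty.
    def isEmptyLike[T](partitions: Seq[Seq[T]]): Boolean =
      partitions.length == 0 || partitions.flatten.take(1).length == 0
    ```

    Since the emptiness check already performs the `take(1)`, wrapping it again as `take(1).isEmpty` only repeats work that `isEmpty` does anyway.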



---
