Github user ceedubs commented on a diff in the pull request:
https://github.com/apache/spark/pull/21971#discussion_r207246593
--- Diff: core/src/main/scala/org/apache/spark/rdd/AsyncRDDActions.scala ---
@@ -61,6 +62,36 @@ class AsyncRDDActions[T: ClassTag](self: RDD[T]) extends Serializable with Logging {
(index, data) => results(index) = data, results.flatten.toSeq)
}
+
+ /**
+ * Returns a future of an aggregation across the RDD.
+ *
+ * @see [[RDD.aggregate]] which is the synchronous version of this method.
+ */
+ def aggregateAsync[U](zeroValue: U)(seqOp: (U, T) => U, combOp: (U, U) => U): FutureAction[U] =
+ self.withScope {
+ val cleanSeqOp = self.context.clean(seqOp)
+ val cleanCombOp = self.context.clean(combOp)
+ val combBinOp = new BinaryOperator[U] {
--- End diff ---
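A minimal usage sketch, assuming `aggregateAsync` lands with the signature shown in the diff above (the object name, values, and local-mode setup here are illustrative, not part of the patch; the implicit `RDD` -> `AsyncRDDActions` conversion from the `RDD` companion object is assumed to apply as it does for `countAsync` and friends):

```scala
import org.apache.spark.{SparkConf, SparkContext}

import scala.concurrent.Await
import scala.concurrent.duration._

object AggregateAsyncSketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("aggregateAsync-sketch").setMaster("local[*]"))
    val rdd = sc.parallelize(1 to 100)

    // Same (zeroValue)(seqOp, combOp) shape as the synchronous RDD.aggregate,
    // but a FutureAction[Int] is returned instead of blocking on the result.
    val future = rdd.aggregateAsync(0)(_ + _, _ + _)

    // FutureAction extends scala.concurrent.Future, so it can be awaited,
    // transformed, or cancelled like any other future.
    val sum = Await.result(future, 1.minute)
    println(s"sum = $sum")

    sc.stop()
  }
}
```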
Is there a cleaner way to integrate with `BinaryOperator` before Scala 2.12?
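For reference, a minimal sketch of the two styles (the `asBinaryOperator` helper is purely illustrative, not something in this patch): before 2.12 a Scala function has to be wrapped in an explicit anonymous class, while 2.12's SAM conversion lets a lambda satisfy `java.util.function.BinaryOperator` directly.

```scala
import java.util.function.BinaryOperator

object BinaryOperatorSketch {
  // Pre-2.12 style: Scala lambdas do not convert to Java SAM interfaces,
  // so the wrapping must spell out an anonymous class, as the diff does.
  def asBinaryOperator[U](f: (U, U) => U): BinaryOperator[U] =
    new BinaryOperator[U] {
      override def apply(a: U, b: U): U = f(a, b)
    }

  val viaWrapper: BinaryOperator[Int] = asBinaryOperator[Int](_ + _)

  // Scala 2.12+ style: SAM conversion allows a lambda directly, e.g.
  //   val viaSam: BinaryOperator[Int] = (a, b) => a + b
}
```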
---