GitHub user mtustin-handy commented on a diff in the pull request:
https://github.com/apache/spark/pull/11982#discussion_r57529308
--- Diff: core/src/main/scala/org/apache/spark/partial/SumEvaluator.scala ---
@@ -56,9 +56,12 @@ private[spark] class SumEvaluator(totalOutputs: Int, confidence: Double)
     val confFactor = {
       if (counter.count > 100) {
         new NormalDistribution().inverseCumulativeProbability(1 - (1 - confidence) / 2)
-      } else {
+      } else if (counter.count > 1) {
         val degreesOfFreedom = (counter.count - 1).toInt
         new TDistribution(degreesOfFreedom).inverseCumulativeProbability(1 - (1 - confidence) / 2)
+      } else {
+        // No way to meaningfully estimate confidence, so we signal no particular confidence interval
+        Double.PositiveInfinity
--- End diff ---
I'm not sure I follow you. If `sumStdev` is `NaN`, it really won't matter what this value is, since `NaN` propagates through the multiplication. If it's any other value, multiplying by `Infinity` results in `Infinity`, which I believe is what is desired (the REPL session below demonstrates this).

`sumStdev` will only be `NaN` if `sumVar` is negative, since `math.sqrt` of a negative value is `NaN`. By the construction of `sumVar`, I don't think that's possible.
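For reference, here's roughly how that factor feeds into the interval bounds; this is a sketch with hypothetical values, not the actual SumEvaluator source:
```
// Hypothetical values, purely for illustration (not taken from the PR).
val sumEstimate = 42.0
val sumStdev = 3.5
val confFactor = Double.PositiveInfinity

// The estimate is widened by confFactor * sumStdev on each side, so an
// infinite factor yields the bounds (-Infinity, Infinity): no particular
// confidence interval, which is the intent of the new branch.
val low = sumEstimate - confFactor * sumStdev   // -Infinity
val high = sumEstimate + confFactor * sumStdev  // Infinity
```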
```
scala> Double.PositiveInfinity * 2
res1: Double = Infinity
scala> Double.PositiveInfinity * 2.0
res2: Double = Infinity
scala> Double.PositiveInfinity * Double.PositiveInfinity
res3: Double = Infinity
scala> Double.PositiveInfinity * Double.NegativeInfinity
res4: Double = -Infinity
```
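The `NaN` cases behave the same way under IEEE 754 arithmetic (added for completeness; the `res` numbering continues the session above):
```
scala> Double.PositiveInfinity * Double.NaN
res5: Double = NaN
scala> math.sqrt(-1.0)
res6: Double = NaN
```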