srowen commented on PR #44690:
URL: https://github.com/apache/spark/pull/44690#issuecomment-1968149149

   So, this _doesn't_ work:
   
   ```scala
   scala> val ONE_ENTIRE_RESOURCE: Long = 10000000000000000L
        | val taskAmount = 1.0 / 11.0
        | var total: Double = ONE_ENTIRE_RESOURCE
        | for (i <- 1 to 11) {
        |   if (total >= taskAmount * ONE_ENTIRE_RESOURCE) {
        |     total -= taskAmount * ONE_ENTIRE_RESOURCE
        |     println(s"assign $taskAmount for task $i, total left: ${total / ONE_ENTIRE_RESOURCE}")
        |   } else {
        |     println(s"ERROR Can't assign $taskAmount for task $i, total left: ${total / ONE_ENTIRE_RESOURCE}")
        |   }
        | }
   ```
   
   But I think what you're doing is converting `taskAmount * ONE_ENTIRE_RESOURCE` to a `Long`, thus rounding down and making the number a little smaller by throwing away a tiny bit.
   
   If that's the explanation, then I think that's actually fine. But the comments suggest that you somehow avoid precision loss; in fact you're introducing _more_ precision loss, just erring in a direction that works, because you are under-allocating by a tiny amount that won't matter in any real-world scenario.
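
   To illustrate the point (this is a minimal sketch, not the PR's actual code; the names `perTask` and `failures` are mine): flooring `taskAmount * ONE_ENTIRE_RESOURCE` to a `Long` discards the fractional part once, so eleven integer subtractions all succeed, with one unit of the resource left unassigned, whereas the `Double` version above runs out before the last task.
   
   ```scala
   val ONE_ENTIRE_RESOURCE: Long = 10000000000000000L
   val taskAmount = 1.0 / 11.0
   
   // Flooring the per-task share to a Long throws away the fractional part,
   // slightly under-allocating each of the 11 tasks.
   val perTask: Long = (taskAmount * ONE_ENTIRE_RESOURCE).toLong
   
   var total: Long = ONE_ENTIRE_RESOURCE
   var failures = 0
   for (i <- 1 to 11) {
     if (total >= perTask) total -= perTask
     else failures += 1
   }
   
   // All 11 assignments fit; exactly 1 unit of the resource is left over.
   println(s"perTask = $perTask, leftover = $total, failures = $failures")
   ```
   
   So the arithmetic is exact from the floor onward, and the only "loss" is the one leftover unit out of 10^16.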


-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: [email protected]

For queries about this service, please contact Infrastructure at:
[email protected]

