wbo4958 commented on PR #44690: URL: https://github.com/apache/spark/pull/44690#issuecomment-1968217911
I know that directly converting a float to an integer can round down, but since we first multiply by a big number, I don't think any meaningful round-down remains. Please correct me if I'm wrong by giving a concrete example. Thx.

Just like I said before, why should we care about the real value of (1.0/11.0)? The value comes from the user end. If it's a little bigger than the real (1.0/11.0), all Spark can do is allow 10 tasks to run at the same time instead of 11. Spark can't do anything about this (1.0/11.0) value passed in by the user. It's a precision error on the user end, not in Spark itself.
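To make the argument concrete, here is a minimal Scala sketch of the scaling idea being discussed: multiply the user-supplied fraction by a large constant, truncate to a `Long` once, and do all further bookkeeping in exact integer arithmetic. The object name, the `toInternal` helper, and the `10000` factor are assumptions for illustration, not the PR's actual code.

```scala
// Minimal sketch (assumed names, not the PR's code): scale the user-supplied
// fraction by a large constant, truncate to Long once, then keep all
// bookkeeping in exact integer arithmetic so no further rounding can occur.
object FractionScalingSketch {
  // Assumed scaling factor; the real constant in the PR may differ.
  val Factor = 10000L

  def toInternal(amount: Double): Long = (amount * Factor).toLong

  def main(args: Array[String]): Unit = {
    val perTask  = 1.0 / 11.0            // user-supplied fraction
    val internal = toInternal(perTask)   // 909 (909.0909... truncated once)
    val capacity = toInternal(1.0)       // 10000 (one whole resource)

    // Exact integer division: 10000 / 909 = 11, so 11 tasks fit as intended.
    println(s"tasks = ${capacity / internal}")

    // If the user passes a value a bit bigger than the true 1/11, only 10
    // tasks fit -- that is the user-side precision error, not Spark's.
    println(s"tasks = ${capacity / toInternal(0.092)}")  // 10
  }
}
```

Under these assumptions the only truncation happens once at conversion time; after that, allocation and release are pure `Long` arithmetic, which is the point made above.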
