HeartSaVioR edited a comment on pull request #28412:
URL: https://github.com/apache/spark/pull/28412#issuecomment-656158596
Yeah, I’m OK either way. I don’t think either approach would bring major issues in
practice.
This is an
HeartSaVioR edited a comment on pull request #28412:
URL: https://github.com/apache/spark/pull/28412#issuecomment-655125113
@baohe-zhang
Thanks for the update. This is really helpful. So the small event log file
shows there's a chance the ratio can exceed 1/2. (There's information
HeartSaVioR edited a comment on pull request #28412:
URL: https://github.com/apache/spark/pull/28412#issuecomment-654853105
So if I understand correctly, what we want to confirm is whether the
(size + compression):memory ratio scales linearly with the number of tasks.
Like short running