HeartSaVioR edited a comment on pull request #30139:
URL: https://github.com/apache/spark/pull/30139#issuecomment-718410305
I somehow revisited the related configuration
`spark.shuffle.maxChunksBeingTransferred` and realized the default value is set
to `Long.MAX_VALUE` (effectively off).
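For context, `spark.shuffle.maxChunksBeingTransferred` can be overridden at submit time; the sketch below shows one hedged way to do it (the cap of `5000` is an arbitrary illustration, not a recommended value):

```shell
# Override the effectively-off default (Long.MAX_VALUE) with an explicit cap.
# The value 5000 is purely illustrative; tune it for your cluster.
spark-submit \
  --conf spark.shuffle.maxChunksBeingTransferred=5000 \
  --class org.example.MyApp \
  my-app.jar
```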
HeartSaVioR edited a comment on pull request #30139:
URL: https://github.com/apache/spark/pull/30139#issuecomment-717022097
Sorry, but your latest change doesn't actually lock properly. `Long` is
immutable, so you always replace the object when you do the calculation and
assign to the
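The locking pitfall described above can be sketched as follows. This is a minimal illustration, not code from the PR: `brokenIncrement` synchronizes on a `Long` field that is rebound to a new boxed object on every update, so concurrent threads may lock different objects; a dedicated in-place counter such as `AtomicLong` avoids the problem.

```java
import java.util.concurrent.atomic.AtomicLong;

public class CounterDemo {
    // Broken pattern: the lock target is replaced on every update.
    // Long is immutable, so `brokenCount + 1` boxes a brand-new Long and
    // the field rebinds to it; the next caller synchronizes on a
    // different object, and mutual exclusion is lost.
    private Long brokenCount = 0L;

    void brokenIncrement() {
        synchronized (brokenCount) {       // locks the *current* Long object
            brokenCount = brokenCount + 1; // rebinds the field to a new Long
        }
    }

    // Safe pattern: AtomicLong mutates its value in place, so no external
    // lock is needed and no object is ever replaced.
    private final AtomicLong count = new AtomicLong();

    long increment() {
        return count.incrementAndGet();
    }
}
```

An equivalent fix that keeps `synchronized` would be to lock on a separate `private final Object lock = new Object();` that is never reassigned.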