gerashegalov opened a new pull request #28720:
URL: https://github.com/apache/spark/pull/28720


   An example of a command that fails in client mode but would not in cluster mode, or if the same value were used for `spark.executor.memory`:
   ```
   > $SPARK_HOME/bin/spark-shell --conf spark.driver.memory=" 4g "
   Invalid maximum heap size: -Xmx 4g
   Error: Could not create the Java Virtual Machine.
   Error: A fatal exception has occurred. Program will exit.
   ```
   Also note that the same message is produced by the JVM regardless of where the bad value came from, which makes it hard to identify which of the configs (SPARK_DAEMON_MEMORY, SPARK_DRIVER_MEMORY, etc.) was responsible.
   
   ### Why are the changes needed?
   Easier operations and diagnostics, and more uniform behavior across deploy modes.
   
   
   ### Does this PR introduce _any_ user-facing change?
   No, this is a fix-only change.
   
   
   ### How was this patch tested?
   - unit tests
   - manual testing
   

