Github user Pashugan commented on the issue:
https://github.com/apache/spark/pull/16913
JAVA_TOOL_OPTIONS doesn't take precedence over command-line options (unlike
_JAVA_OPTIONS). Thus, the JVM tries to start with the initial heap size set
via JAVA_TOOL_OPTIONS and a max heap size overridden by the spark-class
script, which is currently 128m. If I set a system-wide initial heap size
larger than 128m, it conflicts with the max heap size hardcoded in the Spark
code, so this looks to me like a Spark problem.
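The conflict is easy to reproduce outside Spark. A minimal sketch, assuming a JDK on the PATH; the `-Xmx128m` stands in for the max heap the spark-class launcher passes, and the 256m initial heap stands in for a system-wide JAVA_TOOL_OPTIONS setting:

```shell
# JAVA_TOOL_OPTIONS supplies -Xms; the explicit -Xmx on the command line
# is NOT overridden by it, so the JVM sees Xms (256m) > Xmx (128m)
# and refuses to start.
JAVA_TOOL_OPTIONS="-Xms256m" java -Xmx128m -version
echo "exit status: $?"   # non-zero: initial heap exceeds max heap
```

By contrast, putting the same `-Xms` into _JAVA_OPTIONS would win over command-line flags, which is why only JAVA_TOOL_OPTIONS triggers this failure mode.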