Github user vanzin commented on the issue:
https://github.com/apache/spark/pull/14231
I think the change is a little confusing and doesn't really explain what's
actually going on. I ran some tests (`ulimit -v` is your friend), and the patch
below covers what's wrong here.
```
diff --git a/bin/spark-class b/bin/spark-class
index 658e076..cd19e30 100755
--- a/bin/spark-class
+++ b/bin/spark-class
@@ -80,8 +80,17 @@ done < <(build_command "$@")
COUNT=${#CMD[@]}
LAST=$((COUNT - 1))
LAUNCHER_EXIT_CODE=${CMD[$LAST]}
-if [ $LAUNCHER_EXIT_CODE != 0 ]; then
- exit $LAUNCHER_EXIT_CODE
+
+# Certain JVM failures result in error being printed to stdout (instead of stderr), which causes
+# the code that parses the output of the launcher to get confused. In those cases, we check if
+# the exit code is an integer, and if it's not, handle it as a special error case.
+if [[ $LAUNCHER_EXIT_CODE =~ ^[0-9]+$ ]]; then
+ if [ "$LAUNCHER_EXIT_CODE" != 0 ]; then
+ exit "$LAUNCHER_EXIT_CODE"
+ fi
+else
+ echo "${CMD[@]}" | head -n-1 1>&2
+ exit 1
fi
CMD=("${CMD[@]:0:$LAST}")
```
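The core of the patch is the `=~ ^[0-9]+$` test on the last token that
`spark-class` reads from the launcher's output. A minimal standalone sketch of
that logic (the `check_exit_code` helper is hypothetical, for illustration
only, not part of the actual script):

```shell
#!/usr/bin/env bash
# Sketch of the exit-code validation: the launcher's last output token should
# be a numeric exit code, but if the JVM printed an error to stdout, that slot
# holds arbitrary text instead.

check_exit_code() {
  local code="$1"
  if [[ $code =~ ^[0-9]+$ ]]; then
    # Numeric: propagate non-zero codes, succeed on zero.
    [ "$code" != 0 ] && return "$code"
    return 0
  else
    # Non-numeric: the JVM wrote an error message where the exit code should
    # be, so treat the whole launch as a failure.
    return 1
  fi
}

check_exit_code 0;       echo "0       -> $?"   # 0       -> 0
check_exit_code 5;       echo "5       -> $?"   # 5       -> 5
check_exit_code object;  echo "object  -> $?"   # object  -> 1
```

Without the regex guard, `exit $LAUNCHER_EXIT_CODE` would be handed a word
like `object` and bash would itself error out, masking the real JVM message;
the `head -n-1` in the `else` branch echoes everything except that bogus last
token to stderr so the user sees the original error.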
Here's the error output:
```
$ ./bin/spark-submit
Error occurred during initialization of VM
Unable to load native library: /apps/jdk1.7.0_80/jre/lib/amd64/libjava.so:
failed to map segment from shared object
```
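For reference, this failure mode can typically be reproduced by capping
virtual memory before launching; the limit value below is purely illustrative
and machine-dependent:

```shell
# Run the command under a virtual-memory cap, inside a subshell so the limit
# doesn't stick to the interactive shell. 100000 KB (~100 MB) is only an
# illustrative value; lower it until the JVM fails to initialize.
reproduce() {
  (ulimit -v 100000 && "$@")
}

# Usage (assumes a Spark checkout): reproduce ./bin/spark-submit
```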