Github user Jiri-Kremser commented on the issue:
https://github.com/apache/spark/pull/19802
To explain the intentions better: it doesn't try to solve or somehow
provide a compatibility layer between old and new versions of Spark; all it
does is slightly improve the UX, because people are hitting the issue all the
time (including me).
A couple of instances of this issue:
https://stackoverflow.com/questions/32241485/apache-spark-error-local-class-incompatible-when-initiating-a-sparkcontext-clas
https://stackoverflow.com/questions/38559597/failed-to-connect-to-spark-masterinvalidclassexception-org-apache-spark-rpc-rp
https://issues.apache.org/jira/browse/SPARK-13956
https://community.cloudera.com/t5/Advanced-Analytics-Apache-Spark/Spark-Standalone-error-local-class-incompatible-stream-classdesc/td-p/25909
https://github.com/USCDataScience/sparkler/issues/56
https://groups.google.com/a/lists.datastax.com/forum/#!topic/spark-connector-user/Z-4qSGbqhYc
https://sparkr.atlassian.net/browse/SPARKR-72
more here:
https://www.google.com/search?ei=BxMYWt3xLJDdwQKLwr9w&q=java.io.InvalidClassException%3A+org.apache.spark.rpc.RpcEndpointRef%3B+local+class+incompatible&oq=java.io.InvalidClassException%3A+org.apache.spark.rpc.RpcEndpointRef%3B+local+class+incompatible
btw. it's not only an issue with `spark-submit`; a `spark-shell` connecting to
a remote master has the same flaw, and with this change it would get a message
that something is wrong. Currently it just hangs without any
additional error.
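For context, the failure mode in the links above is a `java.io.InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible`, which surfaces (or silently hangs the client) when the driver and the master were built from different Spark versions. The sketch below is not the actual change in this PR; it is only a hypothetical illustration (the `FriendlyDeserializer` name and wording are made up) of how such a deserialization failure could be turned into an explicit version-mismatch hint instead of a bare exception:

```scala
import java.io._

// Hypothetical helper, not the code in this PR: wrap object deserialization
// so that a "local class incompatible" failure is reported as a likely
// Spark version mismatch rather than a cryptic InvalidClassException.
object FriendlyDeserializer {
  def deserialize(bytes: Array[Byte]): AnyRef = {
    val in = new ObjectInputStream(new ByteArrayInputStream(bytes))
    try {
      in.readObject()
    } catch {
      case e: InvalidClassException =>
        // e.classname is the offending class, e.g. org.apache.spark.rpc.RpcEndpointRef
        throw new IOException(
          s"Failed to deserialize ${e.classname}; this usually means the " +
            "client and the master are running different Spark versions.", e)
    } finally {
      in.close()
    }
  }
}
```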
