Github user Jiri-Kremser commented on the issue:

    https://github.com/apache/spark/pull/19802
  
    > Can you please explain more, and how to reproduce this issue?
    
    Sure,
    1. Start a master and a worker (version 2.2.0, for instance):
    ```bash
    cat ./python/pyspark/version.py
    __version__='2.2.0'
    ```
    ```bash
    ./bin/spark-class org.apache.spark.deploy.worker.Worker spark://`hostname -I | cut -d' ' -f1`:7077 -c 1 -m 1G &> /dev/null &
    ./bin/spark-class org.apache.spark.deploy.master.Master
    ```
    2. Then, in another terminal, run `spark-submit` from a Spark build in a different version (e.g. `2.3.0-SNAPSHOT`):
    ```bash
    cat ./python/pyspark/version.py
    ...
    __version__ = "2.3.0.dev0"
    ```
    ```bash
    ./bin/spark-submit --class org.apache.spark.examples.SparkPi \
                       --master spark://`hostname -I | cut -d' ' -f1`:7077 \
                       --executor-memory 512M \
                       $PWD/examples/target/scala-2.11/jars/spark-examples_2.11-2.3.0-SNAPSHOT.jar \
                       10
    ```
    
    
    Result: the Spark master log contains the following exception:
    ```
    java.io.InvalidClassException: org.apache.spark.rpc.RpcEndpointRef; local class incompatible: stream classdesc serialVersionUID = -1329125091869941550, local class serialVersionUID = 1835832137613908542
    ```
    On the `spark-submit` side, however, there is no hint that a version mismatch might be the cause; it just retries the connection a couple of times and then fails with `17/11/24 13:13:58 ERROR StandaloneSchedulerBackend: Application has been killed. Reason: All masters are unresponsive! Giving up.`. Here is the complete log: https://pastebin.com/Wzs8vjBd
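
    For reference (this check is not part of the PR, just something I used to confirm the mismatch), the `serialVersionUID` that Java serialization computes for `RpcEndpointRef` can be inspected from the `spark-shell` of each build; `Class.forName` is used because I believe the class is `private[spark]`:
    ```scala
    // Print the serialVersionUID computed for RpcEndpointRef in this build.
    // Run in the spark-shell of each build and compare the two values.
    val cls = Class.forName("org.apache.spark.rpc.RpcEndpointRef")
    println(java.io.ObjectStreamClass.lookup(cls).getSerialVersionUID)
    ```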
    
    > Spark's RPC is not designed for version compatible.
    
    I hear you. On the other hand, this PR doesn't even try to make it compatible; all it does is translate a cryptic error into a more understandable one, as sketched below. I think it may be quite common to run an older `spark-submit` against an updated Spark master, or at least I've hit the issue a couple of times. And I had to google the exception and then figure out on Stack Overflow or elsewhere that it actually means there was a version discrepancy.
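
    A minimal sketch of what I mean (the helper and message below are made up for illustration, not the exact code in this PR):
    ```scala
    import java.io.InvalidClassException

    // Hypothetical helper: wrap a deserialization call site and translate the
    // cryptic serialVersionUID mismatch into an actionable hint.
    def deserializeWithHint[T](deserialize: => T): T =
      try {
        deserialize
      } catch {
        case e: InvalidClassException =>
          throw new RuntimeException(
            "Incompatible Spark versions between this node and the remote endpoint " +
              "(serialVersionUID mismatch); make sure the master, workers and " +
              "spark-submit all run the same Spark version.", e)
      }
    ```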


---
