Github user hustfxj commented on the issue:
https://github.com/apache/spark/pull/16450
If we submit an application through the Spark REST API, we pass the configuration via `sparkProperties`, like this:
```
curl -X POST http://spark-cluster-ip:6066/v1/submissions/create \
  --header "Content-Type:application/json;charset=UTF-8" \
  --data '{
  "action" : "CreateSubmissionRequest",
  "appArgs" : [ "myAppArgument1" ],
  "appResource" : "file:/myfilepath/spark-job-1.0.jar",
  "clientSparkVersion" : "2.1.0",
  "environmentVariables" : {
    "SPARK_ENV_LOADED" : "1"
  },
  "mainClass" : "com.mycompany.MyJob",
  "sparkProperties" : {
    "spark.jars" : "file:/myfilepath/spark-job-1.0.jar",
    "spark.driver.supervise" : "true",
    "spark.app.name" : "MyJob",
    "spark.eventLog.enabled" : "true",
    "spark.submit.deployMode" : "cluster",
    "spark.master" : "spark://10.20.23.22:7077,10.20.23.21:7077"
  }
}'
```
We hope "spark.master" of the driver 's configure should be
"spark://10.20.23.22:7077,10.20.23.21:7077", but in fact it maybe
spark://10.20.23.22:7077 due to the spark core's codeï¼
```
val conf = new SparkConf(false)
  .setAll(sparkProperties)
  .set("spark.master", masterUrl)
```
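Because `.set("spark.master", masterUrl)` runs after `.setAll(sparkProperties)`, the single master the REST server was contacted on overwrites the HA list supplied in `sparkProperties`. One possible adjustment, just a sketch and not necessarily the change made in this PR, would be to keep a user-supplied `spark.master` and only fall back to the contacted master:
```
// Sketch only: prefer the spark.master from sparkProperties (which may list
// several masters for HA) and use the REST server's own masterUrl purely as
// a fallback. setIfMissing is an existing SparkConf method.
val conf = new SparkConf(false)
  .setAll(sparkProperties)
  .setIfMissing("spark.master", masterUrl)
```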
If we kill the master (`spark://10.20.23.22:7077`), then `spark://10.20.23.21:7077` becomes the active master. If we then kill the driver, Spark cannot restart it automatically, because the driver only knows the old master's address, `spark://10.20.23.22:7077`.
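A quick way to observe this from inside the application (a hypothetical check, reusing the `com.mycompany.MyJob` main class from the example above) is to print the master URL the driver actually received; when the rewrite happens it shows only a single master instead of the comma-separated HA list:
```
import org.apache.spark.{SparkConf, SparkContext}

object MyJob {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf())
    // Prints e.g. "spark://10.20.23.22:7077" rather than the
    // comma-separated HA list that was sent in sparkProperties.
    println(s"effective spark.master = ${sc.getConf.get("spark.master")}")
    sc.stop()
  }
}
```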