[
https://issues.apache.org/jira/browse/SPARK-18353?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Jason Pan updated SPARK-18353:
------------------------------
Comment: was deleted
(was: Hi Sean.
--conf also didn't make it work.
Thanks.)
> spark.rpc.askTimeout default value is not 120s
> ----------------------------------------------
>
> Key: SPARK-18353
> URL: https://issues.apache.org/jira/browse/SPARK-18353
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Affects Versions: 1.6.1, 2.0.1
> Environment: Linux zzz 3.10.0-327.el7.x86_64 #1 SMP Thu Oct 29
> 17:29:29 EDT 2015 x86_64 x86_64 x86_64 GNU/Linux
> Reporter: Jason Pan
> Priority: Critical
>
> In http://spark.apache.org/docs/latest/configuration.html,
> spark.rpc.askTimeout is listed as 120s ("Duration for an RPC ask
> operation to wait before timing out"), so the default value is 120s
> as documented.
> However, when I run "spark-submit" in standalone cluster mode, the
> launch command is:
> Launch Command: "/opt/jdk1.8.0_102/bin/java" "-cp"
> "/opt/spark-2.0.1-bin-hadoop2.7/conf/:/opt/spark-2.0.1-bin-hadoop2.7/jars/*"
> "-Xmx1024M" "-Dspark.eventLog.enabled=true"
> "-Dspark.master=spark://9.111.159.127:7101" "-Dspark.driver.supervise=false"
> "-Dspark.app.name=org.apache.spark.examples.SparkPi"
> "-Dspark.submit.deployMode=cluster"
> "-Dspark.jars=file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-examples-1.6.1-hadoop2.6.0.jar"
> "-Dspark.history.ui.port=18087" "-Dspark.rpc.askTimeout=10"
> "-Dspark.history.fs.logDirectory=file:/opt/tmp/spark-event"
> "-Dspark.eventLog.dir=file:///opt/tmp/spark-event"
> "org.apache.spark.deploy.worker.DriverWrapper"
> "spark://[email protected]:7103"
> "/opt/spark-2.0.1-bin-hadoop2.7/work/driver-20161109031939-0002/spark-examples-1.6.1-hadoop2.6.0.jar"
> "org.apache.spark.examples.SparkPi" "1000"
> -Dspark.rpc.askTimeout=10
> The value is 10, which does not match the documented default.
> Note: when I submit via the REST URL, this issue does not occur.
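For reference, a minimal sketch of passing the timeout explicitly at submit time (master URL, jar path, and class are copied from the launch command above; the reporter's deleted comment indicates --conf also did not take effect in this case):

```shell
# Sketch: set spark.rpc.askTimeout explicitly via --conf when submitting
# in standalone cluster mode. Values below are taken from the report.
/opt/spark-2.0.1-bin-hadoop2.7/bin/spark-submit \
  --master spark://9.111.159.127:7101 \
  --deploy-mode cluster \
  --conf spark.rpc.askTimeout=120s \
  --class org.apache.spark.examples.SparkPi \
  file:/opt/spark-1.6.1-bin-hadoop2.6/lib/spark-examples-1.6.1-hadoop2.6.0.jar 1000
```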
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]