[ https://issues.apache.org/jira/browse/SPARK-1609?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=13979576#comment-13979576 ]
Sean Owen commented on SPARK-1609:
----------------------------------
Hi witgo, on several issues you have opened, I have had trouble understanding
what is being reported and what the proposed solution is. Here I had to look
for a while to see that the only difference is "-server". Are you proposing
not setting that option?
It's a JVM option. Does it prevent the JVM from starting? You're running Java 8,
but I think -server is still a valid (if redundant) flag in that version.
Your log shows that SPARK_JAVA_OPTS is deprecated anyway. What about using the
invocation it suggests?
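For example, something like this (just a sketch; the master URL, class and jar
are placeholders, and only a few of your properties are shown):
{code}
# Spark properties move to --conf (or conf/spark-defaults.conf);
# plain JVM flags like -server go to spark.executor.extraJavaOptions
# for executors and --driver-java-options for the driver.
./bin/spark-submit \
  --master spark://your-master:7077 \
  --conf spark.ui.killEnabled=false \
  --conf spark.akka.askTimeout=120 \
  --conf spark.executor.extraJavaOptions="-server" \
  --driver-java-options "-server" \
  --class your.main.Class your-app.jar
{code}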
The errors are not obviously related to -server. It looks like the workers are
simply failing to join the master. What do their logs show?
> Executor fails to start when using spark-submit
> -----------------------------------------------
>
> Key: SPARK-1609
> URL: https://issues.apache.org/jira/browse/SPARK-1609
> Project: Spark
> Issue Type: Bug
> Components: Spark Core
> Reporter: witgo
> Priority: Blocker
> Attachments: spark.log
>
>
> {code}
> export SPARK_JAVA_OPTS="-server -Dspark.ui.killEnabled=false
> -Dspark.akka.askTimeout=120 -Dspark.akka.timeout=120
> -Dspark.locality.wait=10000
> -Dspark.storage.blockManagerTimeoutIntervalMs=6000000
> -Dspark.storage.memoryFraction=0.7
> -Dspark.broadcast.factory=org.apache.spark.broadcast.TorrentBroadcastFactory"
> {code}
> With this setting, the executor fails to start.
> {code}
> export SPARK_JAVA_OPTS="-Dspark.ui.killEnabled=false
> -Dspark.akka.askTimeout=120 -Dspark.akka.timeout=120
> -Dspark.locality.wait=10000
> -Dspark.storage.blockManagerTimeoutIntervalMs=6000000
> -Dspark.storage.memoryFraction=0.7
> -Dspark.broadcast.factory=org.apache.spark.broadcast.TorrentBroadcastFactory"
> {code}
> With "-server" removed, the executor starts and works.