[
https://issues.apache.org/jira/browse/SPARK-15203?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15289009#comment-15289009
]
Apache Spark commented on SPARK-15203:
--------------------------------------
User 'WeichenXu123' has created a pull request for this issue:
https://github.com/apache/spark/pull/13172
> The Spark daemon shell script reports an error: the daemon process starts
> successfully, but the script prints a failure message.
> ------------------------------------------------------------------------------------------------------
>
> Key: SPARK-15203
> URL: https://issues.apache.org/jira/browse/SPARK-15203
> Project: Spark
> Issue Type: Bug
> Components: Deploy
> Affects Versions: 1.6.1, 1.6.2, 2.0.0, 2.1.0
> Reporter: Weichen Xu
> Priority: Minor
> Labels: patch
> Original Estimate: 24h
> Remaining Estimate: 24h
>
> When using sbin/start-master.sh to start the Spark master daemon, the
> daemon sometimes starts successfully, but the shell script still prints an
> error message such as:
> failed to launch org.apache.spark.deploy.master.Master...
> which is confusing.
> The cause is that sbin/spark-daemon.sh launches the daemon via
> bin/spark-class, sleeps 2 seconds, and then checks whether the daemon
> process exists with a test like the following:
> if [[ ! $(ps -p "$newpid" -o comm=) =~ "java" ]]
> The problem is that a machine under heavy load may take longer than 2
> seconds to start the daemon, yet still start it successfully. In that case
> the check ! $(ps -p "$newpid" -o comm=) =~ "java" reports failure, because
> at that moment $newpid still refers to the launcher shell process; only
> once the daemon has actually started does it become a java process.
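The race described above can be avoided by polling the process's command name for a while instead of probing it exactly once after a fixed 2-second sleep. The sketch below illustrates that idea only; the function name `wait_for_comm`, the timeout, and the `sleep`-based demo child are illustrative assumptions, not the actual patch in the linked pull request.

```shell
#!/usr/bin/env bash
# Hypothetical helper (illustrative, not the actual fix): poll the pid's
# command name until it matches the expected name or the attempts run out.
wait_for_comm() {
  local pid="$1" expect="$2" tries="${3:-10}" i
  for ((i = 0; i < tries; i++)); do
    # "comm=" prints only the command name for the pid (empty if it exited)
    if [[ $(ps -p "$pid" -o comm=) == *"$expect"* ]]; then
      return 0
    fi
    sleep 1
  done
  return 1
}

# Demo of the race: the child is a shell for 2 seconds before it exec's into
# its real command, just as $newpid is a shell before it becomes java. A
# single probe at 2s could still see the shell; polling does not.
sh -c 'sleep 2; exec sleep 30' &
child=$!
wait_for_comm "$child" sleep 10 && echo "daemon started" || echo "failed to launch"
kill "$child" 2>/dev/null
```

With a single fixed-delay probe the check races against daemon startup; the loop keeps the fast-path cost the same (it returns as soon as the name matches) while tolerating slow machines.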
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]