Github user srowen commented on the pull request:

    https://github.com/apache/spark/pull/4611#issuecomment-74414416
  
    I don't think we want to take out this check entirely. This was changed 
recently so that it doesn't just test whether the process can be signaled (with 
`kill -0`), but also examines the command line to verify it's actually a Spark 
process. It's unfortunate if that stricter check fails on long command lines.
    
    What about a weaker check, like just grepping for `spark` in the command 
line? The point here is to avoid killing a foreign process that happens to 
have taken the PID named in a stale Spark PID file.
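
    Roughly, the weaker check could look like the sketch below. This is an 
illustration of the idea, not Spark's actual script; the PID-file path and the 
`looks_like_spark` helper name are hypothetical.

    ```shell
    #!/usr/bin/env bash
    # Hypothetical sketch: before killing the process named in the PID file,
    # verify its command line at least mentions "spark", rather than matching
    # the full (possibly very long) command string.

    # Returns 0 if the given command line plausibly belongs to a Spark process.
    looks_like_spark() {
      printf '%s' "$1" | grep -qi spark
    }

    pid_file="/tmp/spark-daemon.pid"   # hypothetical PID-file location

    if [ -f "$pid_file" ]; then
      target_pid="$(cat "$pid_file")"
      # kill -0 only checks that the process exists and we may signal it
      if kill -0 "$target_pid" 2>/dev/null; then
        if looks_like_spark "$(ps -p "$target_pid" -o args=)"; then
          echo "stopping process $target_pid"
          # kill "$target_pid"
        else
          echo "PID $target_pid is not a Spark process; leaving it alone"
        fi
      else
        echo "no process found; removing stale PID file"
      fi
    fi
    ```

    This still won't kill an unrelated process that merely mentions "spark" 
somewhere safe, but it avoids the false negatives a full command-line match 
can produce when the command is truncated or very long.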

