Github user felixcheung commented on the pull request:

    https://github.com/apache/spark/pull/10652#issuecomment-173110574
  
    I don't know if there is a way to distinguish that.
    It could be `spark-submit`, or Oozie invoking the `SparkSubmit` class, 
running the job in YARN client mode, in which case the driver is actually 
running on a worker node, possibly the same worker that is running executors.
    
    I guess we could explicitly bypass this if the cluster manager is `LOCAL`?
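    
    A minimal sketch of that bypass, assuming we key off the `spark.master` URL 
(the helper name and where it would be wired in are hypothetical, not part of 
this PR):
    
    ```scala
    import org.apache.spark.SparkConf
    
    object LocalModeCheck {
      // Hypothetical helper: true when the cluster manager is local, i.e. the
      // master URL is "local" or of the form "local[N]" / "local[N,M]".
      def isLocalMaster(conf: SparkConf): Boolean =
        conf.get("spark.master", "").startsWith("local")
    
      // Proposed bypass: only apply the check when not in local mode, since in
      // YARN client mode the driver may share a host with executors and the
      // two cases cannot reliably be told apart.
      def shouldApplyCheck(conf: SparkConf): Boolean = !isLocalMaster(conf)
    }
    ```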
    


