Github user mridulm commented on the pull request:

    https://github.com/apache/spark/pull/157#issuecomment-37758977
  
    This can be done with SPARK_JAVA_OPTS set to Java debug options;
    those options are passed to the master and the executors.

    Practically, particularly in multi-tenant deployments, this may not work
    due to port conflicts.
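
    A minimal sketch of what I mean (not part of this PR; the debug port 5005
    and putting it in conf/spark-env.sh are just assumptions), using the
    standard JDWP agent options:

        # Hypothetical example: pass JDWP agent options through SPARK_JAVA_OPTS
        # (for instance from conf/spark-env.sh) so the JVMs Spark launches
        # listen for a remote debugger. suspend=n means the JVM does not block
        # waiting for the debugger to attach; address=5005 is an assumed port.
        export SPARK_JAVA_OPTS="-agentlib:jdwp=transport=dt_socket,server=y,suspend=n,address=5005"

    With a fixed address like this, every executor JVM on a host tries to bind
    the same debug port, which is why it breaks down once several applications
    share the same workers.
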
     On Mar 16, 2014 7:45 AM, "baishuo(白硕)" <notificati...@github.com> wrote:
    
    > Enable the user to do remote debugging on the ExecutorRunner process. We
    > need one flag to enable this function, spark.excutor.debug, and another
    > flag, spark.excutor.debug.port, to set the port. Normally the function
    > is off. If we want to do remote debugging, we should use
    > -Dspark.excutor.debug=1 as a JVM argument when we start the worker
    > process.
    > ------------------------------
    > You can merge this Pull Request by running
    >
    >   git pull https://github.com/baishuo/spark master
    >
    > Or view, comment on, or merge it at:
    >
    >   https://github.com/apache/spark/pull/157
    > Commit Summary
    >
    >    - Update CommandUtils.scala
    >
    > File Changes
    >
    >    - *M* core/src/main/scala/org/apache/spark/deploy/worker/CommandUtils.scala
    >      <https://github.com/apache/spark/pull/157/files#diff-0> (9)
    >
    > Patch Links:
    >
    >    - https://github.com/apache/spark/pull/157.patch
    >    - https://github.com/apache/spark/pull/157.diff
    >
    > —
    > Reply to this email directly or view it on GitHub:
    > <https://github.com/apache/spark/pull/157>
    >

