Github user wulei-bj-cn commented on the pull request:

    https://github.com/apache/spark/pull/8533#issuecomment-138051802
  
    Hi Sean,
    As you suggested, I gave up modifying Utils.scala and instead tried to 
resolve unspecified host names to IP addresses in 
org.apache.spark.scheduler.TaskSetManager. With this patch, I tested both 
scenarios, with and without setting SPARK_LOCAL_HOSTNAME manually, and the 
results were satisfactory: the 'ANY' locality level from HadoopRDD is 
eliminated, and we always get 'NODE_LOCAL' where it is expected. Would you 
please help review my patch and see whether it is suitable to be merged? Thanks!
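    The core idea, normalizing host names to IP addresses so that the block
locations HDFS reports (often IPs) can match the hosts executors register
with (often names), can be sketched roughly as below. This is an illustrative
Java sketch under my own assumptions, not the actual patch; the class and the
helper name `resolveToIp` are hypothetical:

    ```java
    import java.net.InetAddress;
    import java.net.UnknownHostException;

    public class ResolveHost {
        // Hypothetical helper mirroring the idea described above: resolve a
        // scheduler-visible host name to its IP address so locality matching
        // compares like with like.
        static String resolveToIp(String host) {
            try {
                return InetAddress.getByName(host).getHostAddress();
            } catch (UnknownHostException e) {
                // Fall back to the original name if resolution fails, so the
                // scheduler still has a usable (if unresolved) location hint.
                return host;
            }
        }

        public static void main(String[] args) {
            // An IP literal resolves to itself.
            System.out.println(resolveToIp("127.0.0.1")); // "127.0.0.1"
        }
    }
    ```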

