Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-151339424
Dear all,
@andrewor14 @sryza @cmccabe would you please help take a look at this issue?
Thanks so much!
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-149078255
Furthermore, this issue of the locality level always being 'ANY' in Spark will
cause Tachyon to keep re-caching memory blocks over the network from remote
Tachyon workers
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-149077583
Dear all,
I know it's been more than a month since I submitted this PR, and probably
it's been rejected silently. After a recent test with Spark +
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-139485003
@CodingCat Good point! I have just updated it to 'val'. Please see if there are
any further comments/suggestions.
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-138486424
@CodingCat Thanks so much for your suggestion! It does sound fair/smart to
make the patch as clean as possible. As you suggested, I moved the fixes into
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-138171655
@CodingCat It's quite smart to set SPARK_LOCAL_HOSTNAME=`hostname` to save
the effort and energy I mentioned in earlier comments when it comes to
deployment
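The workaround above relies on Spark preferring an explicitly configured hostname over whatever the JVM reports. A minimal sketch of that preference, assuming a hypothetical helper (`localHostNameSketch` is illustrative; the real logic lives in `org.apache.spark.util.Utils` and reads the `SPARK_LOCAL_HOSTNAME` environment variable):

```scala
// Sketch: prefer SPARK_LOCAL_HOSTNAME when set, otherwise fall back to the
// JVM-reported hostname. `localHostNameSketch` is a hypothetical stand-in
// for Spark's internal logic, not its actual API.
object HostnameSketch {
  def localHostNameSketch(env: Map[String, String]): String =
    env.getOrElse("SPARK_LOCAL_HOSTNAME",
      java.net.InetAddress.getLocalHost.getHostName)
}
```

With this preference in place, setting SPARK_LOCAL_HOSTNAME=`hostname` on each node makes every daemon advertise the same host string, sidestepping the hostname/IP mismatch discussed in this thread.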
Github user wulei-bj-cn commented on a diff in the pull request:
https://github.com/apache/spark/pull/8533#discussion_r38831196
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -322,7 +333,12 @@ private[spark] class TaskSetManager
Github user wulei-bj-cn commented on a diff in the pull request:
https://github.com/apache/spark/pull/8533#discussion_r38819729
--- Diff:
core/src/main/scala/org/apache/spark/scheduler/TaskSetManager.scala ---
@@ -190,11 +197,15 @@ private[spark] class TaskSetManager
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-138051802
Hi Sean,
As you suggested, I gave up modifying Utils.scala, and tried to resolve
unspecified host names to IP addresses in
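Resolving a host name to an IP address in Scala can be done with `java.net.InetAddress`. A minimal illustration of that normalization step, under the assumption that host strings are compared after resolution (this is a sketch, not the code from this PR):

```scala
import java.net.InetAddress

// Illustration only: normalize a host string to its IP address so that a
// hostname on one side and an IP on the other still compare equal after
// resolution. Not the actual patch from this PR.
object ResolveHost {
  def toIp(host: String): String =
    InetAddress.getByName(host).getHostAddress
}
```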
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-137006608
Thanks Sean for your advice. As you suggested, I tried to translate
hostnames to IP addresses in org.apache.spark.scheduler.TaskSetManager, and it
turned out
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-136648219
Basically, yes. Indeed this locality level being "ANY" is directly caused
by org.apache.spark.scheduler.TaskSetManager:
// Check for node-l
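The failure mode described in this thread can be shown with a toy model: TaskSetManager keeps pending tasks keyed by host string, so if the executor reports an IP while the map is keyed by hostname (or vice versa), the node-local lookup misses and the task falls back to ANY. All names below are illustrative, not Spark's internals:

```scala
// Toy model of the node-local check: a lookup keyed by host string.
// A hostname/IP mismatch makes the lookup miss, degrading locality to ANY.
object LocalityMismatch {
  def localityFor(pendingTasksForHost: Map[String, Seq[Int]],
                  executorHost: String): String =
    if (pendingTasksForHost.getOrElse(executorHost, Nil).nonEmpty) "NODE_LOCAL"
    else "ANY"
}
```

For example, with pending tasks keyed by "worker-1.example.com", an executor reporting "10.0.0.5" would be scheduled at ANY even if it is the same machine.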
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-136570529
Dear Owen, thanks for checking my updates. I'm not saying this issue of the
locality level being ANY all the time is caused by your code. Actually, it lies in
co
Github user wulei-bj-cn commented on the pull request:
https://github.com/apache/spark/pull/8533#issuecomment-136305824
@srowen Thanks for your quick response! To make my patch impact existing
code less, I made a small adjustment. Would you please help take a look
again? Thanks
GitHub user wulei-bj-cn opened a pull request:
https://github.com/apache/spark/pull/8533
Function localHostName() is trying to fetch the hostname for each of …
…the hosts, yet when "SPARK_LOCAL_HOSTNAME" is not set, i.e.
customHostname is null, this function will tr