[GitHub] spark pull request: [SPARK-4098][YARN]use appUIAddress instead of ...

2014-10-27 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2958#issuecomment-60567168 The same replacement happened in https://github.com/apache/spark/pull/2276; the same change in `runExecutorLauncher` is mentioned in that PR but nothing was done

[GitHub] spark pull request: [SPARK-4096][YARN]Update executor memory descr...

2014-10-27 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2955#issuecomment-60587013 You mean we make `ApplicationMasterArguments` accept the memory parameter in two kinds of format: one is the 2g style and the other is just a number in megabytes

[GitHub] spark pull request: [SPARK-4096][YARN]Update executor memory descr...

2014-10-27 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2955#issuecomment-60588263 There is a better idea: we use `MemoryParam` to accept only the 2g-style parameter in `ApplicationMaster` and let the memory string passed by `ClientBase` be appended
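The 2g-style versus plain-megabytes distinction under discussion can be sketched in shell. This is a hypothetical `to_megabytes` helper for illustration, not the actual `MemoryParam` implementation:

```shell
# Hypothetical helper: normalize either a "2g"/"512m"-style string or a
# bare (integer) number of megabytes to an integer megabyte count.
to_megabytes() {
  case "$1" in
    *g|*G) echo $(( ${1%?} * 1024 )) ;;  # strip trailing g/G, scale to MB
    *m|*M) echo "${1%?}" ;;              # strip trailing m/M
    *)     echo "$1" ;;                  # already plain megabytes
  esac
}

to_megabytes 2g    # 2048
to_megabytes 512m  # 512
to_megabytes 1024  # 1024
```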

[GitHub] spark pull request: [SPARK-4096][YARN]Update executor memory descr...

2014-10-27 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2955#issuecomment-60589510 Code updated. How about it? --- If your project is set up for it, you can reply to this email and have your reply appear on GitHub as well. If your project does

[GitHub] spark pull request: [SPARK-4096][YARN]Update executor memory descr...

2014-10-27 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2955#issuecomment-60590793 At the beginning I misunderstood your point. Shame

[GitHub] spark pull request: [SPARK-4098][YARN]use appUIAddress instead of ...

2014-10-27 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2958#issuecomment-60706448 @vanzin @benoyantony Could you two help to check?

[GitHub] spark pull request: [SPARK-3890][Docs]remove redundant spark.execu...

2014-10-13 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2745#issuecomment-58916414 @srowen I checked and found the line already contains more than 100 characters, so I kept the wrapping. The period is also deleted.

[GitHub] spark pull request: [SPARK-3890][Docs]remove redundant spark.execu...

2014-10-10 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2745#issuecomment-58735051 Actually I don't understand what the `Runtime Environment` category means.

[GitHub] spark pull request: [SPARK-3696]Do not override the user-difined c...

2014-10-03 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2541#issuecomment-57803237 @JoshRosen I have tested and it worked fine. You can also simply have a try.

[GitHub] spark pull request: [SPARK-3718] FsHistoryProvider should consider...

2014-09-30 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2573#issuecomment-57285525 Actually the HistoryServer can read application logs generated by Spark apps on another node. The `spark.eventLog.dir` could differ between the two nodes. So

[GitHub] spark pull request: [SPARK-3718] FsHistoryProvider should consider...

2014-09-29 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2573#issuecomment-57131370 Looks like `spark.history.fs.logDirectory` and `spark.eventLog.dir` are the same configuration item on different sides (driver side and HistoryServer side). I think
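The two sides being compared can be illustrated with a hedged config sketch; the HDFS path below is made up for illustration:

```
# Application (driver) side, e.g. in conf/spark-defaults.conf —
# where apps write their event logs:
spark.eventLog.enabled          true
spark.eventLog.dir              hdfs://namenode:9000/spark-events

# HistoryServer side — where FsHistoryProvider reads from:
spark.history.fs.logDirectory   hdfs://namenode:9000/spark-events
```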

[GitHub] spark pull request: [SPARK-3722][Docs]minor improvement and fix in...

2014-09-29 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/2579 [SPARK-3722][Docs]minor improvement and fix in docs https://issues.apache.org/jira/browse/SPARK-3722 You can merge this pull request into a Git repository by running: $ git pull https

[GitHub] spark pull request: [SPARK-3658][SQL]Take thrift server as a daemo...

2014-09-28 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2509#issuecomment-57082932 @liancheng I tried `export` and it worked. Thanks for the suggestion. Also modified permission of `stop-thriftserver.sh`.

[GitHub] spark pull request: [SPARK-3658][SQL]Take thrift server as a daemo...

2014-09-28 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2509#issuecomment-57097696 eh...I cloned the repository on another laptop and found it's executable, as shown in top-left corner of https://github.com/WangTaoTheTonic/spark/blob

[GitHub] spark pull request: [SPARK-3715][Docs]minor typo

2014-09-28 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/2567 [SPARK-3715][Docs]minor typo https://issues.apache.org/jira/browse/SPARK-3715

[GitHub] spark pull request: [SPARK-3658][SQL]Take thrift server as a daemo...

2014-09-27 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on a diff in the pull request: https://github.com/apache/spark/pull/2509#discussion_r18123634 --- Diff: sbin/spark-daemon.sh --- @@ -142,8 +142,12 @@ case $startStop in spark_rotate_log $log echo starting $command

[GitHub] spark pull request: [SPARK-3696]Do not override the user-difined c...

2014-09-26 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/2541 [SPARK-3696]Do not override the user-difined conf_dir https://issues.apache.org/jira/browse/SPARK-3696 We check whether SPARK_CONF_DIR is already defined before assignment.
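The only-if-undefined assignment described here can be sketched with a standard shell parameter expansion; the `$SPARK_HOME/conf` fallback path is an assumption for illustration:

```shell
# Keep a user-defined SPARK_CONF_DIR; fall back to $SPARK_HOME/conf only
# when the variable is unset or empty.
export SPARK_CONF_DIR="${SPARK_CONF_DIR:-${SPARK_HOME:-/opt/spark}/conf}"
echo "$SPARK_CONF_DIR"
```

A user who exported `SPARK_CONF_DIR=/etc/spark` beforehand keeps that value; everyone else gets the default.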

[GitHub] spark pull request: [SPARK-3658][SQL]Take thrift server as a daemo...

2014-09-25 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2509#issuecomment-56818354 @liancheng As you said, I put spark-submit as an option to achieve generalization and use source (dot) instead of `exec` to make `SUBMISSION_OPTS
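The source-versus-exec point can be sketched as follows; the helper script is hypothetical, and `SUBMISSION_OPTS` is the variable named in the comment:

```shell
# Write a tiny helper script that sets a variable.
helper=$(mktemp)
echo 'SUBMISSION_OPTS="--master yarn"' > "$helper"

# Sourcing (the "dot" command) runs the script in the current shell, so
# SUBMISSION_OPTS stays visible here afterwards; `exec "$helper"` would
# instead replace this shell process, and nothing after it would run.
. "$helper"
echo "SUBMISSION_OPTS is: $SUBMISSION_OPTS"

rm -f "$helper"
```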

[GitHub] spark pull request: [SPARK-3658]Take thrift server as a daemon

2014-09-23 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/2509 [SPARK-3658]Take thrift server as a daemon https://issues.apache.org/jira/browse/SPARK-3658 And keep the `CLASS_NOT_FOUND_EXIT_STATUS` and exit message in `SparkSubmit.scala`.

[GitHub] spark pull request: [SPARK-3599]Avoid loading properties file freq...

2014-09-20 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2454#issuecomment-56276648 I think it is better to use a lazy val for readability (putting all elements of defaultSparkProperties into the value properties is more comfortable than the converse

[GitHub] spark pull request: [SPARK-3599]Avoid loading properties file freq...

2014-09-19 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2454#issuecomment-56252307 @vanzin Sorry for that. Fixed.

[GitHub] spark pull request: [SPARK-3547]Using a special exit code instead ...

2014-09-18 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2421#issuecomment-56020425 @sarutak Thanks, I see; I thought only committers could do it this way.

[GitHub] spark pull request: [SPARK-3589][Minor]remove redundant code

2014-09-18 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/2445 [SPARK-3589][Minor]remove redundant code

[GitHub] spark pull request: [SPARK-3589][Minor]remove redundant code

2014-09-18 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on a diff in the pull request: https://github.com/apache/spark/pull/2445#discussion_r17765294 --- Diff: bin/spark-class --- @@ -169,7 +169,6 @@ if [ -n $SPARK_SUBMIT_BOOTSTRAP_DRIVER ]; then # This is used only if the properties file

[GitHub] spark pull request: [SPARK-3589][Minor]remove redundant code

2014-09-18 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on a diff in the pull request: https://github.com/apache/spark/pull/2445#discussion_r17766944 --- Diff: bin/spark-class --- @@ -169,7 +169,6 @@ if [ -n $SPARK_SUBMIT_BOOTSTRAP_DRIVER ]; then # This is used only if the properties file

[GitHub] spark pull request: [SPARK-3599]Avoid loading properties file frequ...

2014-09-18 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/2454 [SPARK-3599]Avoid loading properties file frequently https://issues.apache.org/jira/browse/SPARK-3599

[GitHub] spark pull request: [SPARK-3565]Fix configuration item not consist...

2014-09-17 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/2427 [SPARK-3565]Fix configuration item not consistent with document https://issues.apache.org/jira/browse/SPARK-3565 spark.ports.maxRetries should be spark.port.maxRetries. Make

[GitHub] spark pull request: [SPARK-3565]Fix configuration item not consist...

2014-09-17 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2427#issuecomment-55920064 I am not clear why there is a test failure in org.apache.spark.broadcast.BroadcastSuite, but it seems to have nothing to do with this commit. Could we

[GitHub] spark pull request: [SPARK-3547]Using a special exit code instead ...

2014-09-17 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2421#issuecomment-55984128 Use 127 instead; it is the largest prime number less than 128. How about it, guys?

[GitHub] spark pull request: [SPARK-3565]Fix configuration item not consist...

2014-09-17 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2427#issuecomment-55984199 @andrewor14 Looks like Jenkins is not triggered?

[GitHub] spark pull request: [SPARK-3547]Using a special exit code instead ...

2014-09-17 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2421#issuecomment-55986443 Sorry for not noticing: "If a command is not found, the child process created to execute it returns a status of 127. If a command is found but is not executable
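The shell convention being quoted is easy to check directly: 127 for a missing command, 126 for a file that exists but is not executable, which is why a hand-picked application exit code should avoid both:

```shell
# A nonexistent command yields exit status 127.
sh -c 'no_such_command_xyz' 2>/dev/null || status=$?
echo "command not found -> $status"      # 127

# An existing but non-executable file yields 126 (mktemp creates the
# file without execute permission).
tmp=$(mktemp)
sh -c "$tmp" 2>/dev/null || status=$?
echo "not executable -> $status"         # 126
rm -f "$tmp"
```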

[GitHub] spark pull request: [SPARK-3565]Fix configuration item not consist...

2014-09-17 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2427#issuecomment-55986575 He might be tired. -_-

[GitHub] spark pull request: [SPARK-3547]Using a special exit code instead ...

2014-09-17 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2421#issuecomment-55991903 Gosh, the test failed. I looked at the block generator throttling in NetworkReceiverSuite.scala but couldn't see why.

[GitHub] spark pull request: Using a special exit code instead of 1 to repr...

2014-09-16 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/2421 Using a special exit code instead of 1 to represent ClassNotFoundException As an improvement on https://github.com/apache/spark/pull/1944, we should use a more specific exit code

[GitHub] spark pull request: [SPARK-3547]Using a special exit code instead ...

2014-09-16 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/2421#issuecomment-55843315 @liancheng What do you think?

[GitHub] spark pull request: [SPARK-2750] support https in spark web ui

2014-09-11 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1980#issuecomment-55356016 I have not studied SecurityManager much, but it only does authentication, not encryption of communication. I am not sure

[GitHub] spark pull request: Optimize the schedule procedure in Master

2014-09-05 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1106#issuecomment-54588914 The JIRA is: https://issues.apache.org/jira/browse/SPARK-3411. Because the filter will create a copy of the workers, I changed the way of filtering. The shuffle

[GitHub] spark pull request: [SPARK-3411] Improve load-balancing of concurr...

2014-09-05 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1106#issuecomment-54700647 To @JoshRosen: the PR title is already modified, and so is the JIRA. @andrewor14 I think keeping track of the workers' resources is too complex. So I chose worker

[GitHub] spark pull request: [SPARK-3344]Reformat code: add blank lines

2014-09-02 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/2234 [SPARK-3344]Reformat code: add blank lines Add blank lines between test cases. You can merge this pull request into a Git repository by running: $ git pull https://github.com

[GitHub] spark pull request: [SPARK-3344]Reformat code: add blank lines

2014-09-02 Thread WangTaoTheTonic
Github user WangTaoTheTonic closed the pull request at: https://github.com/apache/spark/pull/2234

[GitHub] spark pull request: Typo in script

2014-08-25 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1926#issuecomment-53378027 All done. The issue is https://issues.apache.org/jira/browse/SPARK-3225. Please check.

[GitHub] spark pull request: [SPARK-2750][Web UI]Add Https support for Web ...

2014-08-15 Thread WangTaoTheTonic
Github user WangTaoTheTonic closed the pull request at: https://github.com/apache/spark/pull/1714

[GitHub] spark pull request: Typo in script

2014-08-13 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/1926 Typo in script use_conf_dir = user_conf_dir in load-spark-env.sh.

[GitHub] spark pull request: [Web UI]Make decision order of Worker's WebUI ...

2014-08-08 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1838#issuecomment-51675161 Could someone merge this? Thanks. @rxin

[GitHub] spark pull request: [Web UI]Make decision order of Worker's WebUI ...

2014-08-07 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/1838 [Web UI]Make decision order of Worker's WebUI port consistent with Master's The decision order of the Worker's WebUI port is --webui-port, SPARK_WORKER_WEBUI_PORT, 8081 (default
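The precedence described (command-line flag, then environment variable, then default) can be sketched as a small shell helper; the function name is hypothetical:

```shell
# Resolve the Worker WebUI port: an explicit --webui-port value wins,
# then the SPARK_WORKER_WEBUI_PORT environment variable, then 8081.
resolve_webui_port() {
  flag_port="${1:-}"   # value of --webui-port, may be empty
  if [ -n "$flag_port" ]; then
    echo "$flag_port"
  elif [ -n "${SPARK_WORKER_WEBUI_PORT:-}" ]; then
    echo "$SPARK_WORKER_WEBUI_PORT"
  else
    echo 8081
  fi
}

unset SPARK_WORKER_WEBUI_PORT
resolve_webui_port           # prints 8081
SPARK_WORKER_WEBUI_PORT=9090
resolve_webui_port           # prints 9090
resolve_webui_port 8082      # prints 8082
```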

[GitHub] spark pull request: [Web UI]Make decision order of Worker's WebUI ...

2014-08-07 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1838#issuecomment-51499869 Sorry for my carelessness. Now I fixed it.

[GitHub] spark pull request: [SPARK-2750]Add Https support for Web UI

2014-08-05 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1714#issuecomment-51287555 Could anyone verify this?

[GitHub] spark pull request: [SPARK-2750]Add Https support for Web UI

2014-08-01 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/1714 [SPARK-2750]Add Https support for Web UI https://issues.apache.org/jira/browse/SPARK-2750 Already tested on a 1-master, 3-worker cluster.

[GitHub] spark pull request: Optimize the schedule procedure in Master

2014-06-19 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1106#issuecomment-46536007 In short, the commit balances the load better when a lot of drivers arrive, while not hurting performance when drivers are few

[GitHub] spark pull request: Optimize the schedule procedure in Master

2014-06-18 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1106#issuecomment-46401368 Another situation is that the worker list changes frequently, which will make drivers relaunch a lot.

[GitHub] spark pull request: Minor fix

2014-06-18 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1105#issuecomment-46437289 Updated.

[GitHub] spark pull request: Minor fix

2014-06-17 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/1105 Minor fix The value env is never used in SparkContext.scala. Add a detailed comment for the method setDelaySeconds in MetadataCleaner.scala instead of the unsure one.

[GitHub] spark pull request: Optimize the schedule procedure in Master

2014-06-17 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/1106 Optimize the schedule procedure in Master If the waiting driver array is too big, the drivers in it will be dispatched to the first worker we get (if it has enough resources

[GitHub] spark pull request: Optimize the schedule procedure in Master

2014-06-17 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/1106#issuecomment-46390574 You mean the increased shuffles may lead to bad performance?

[GitHub] spark pull request: Handle the vals that never used

2014-04-27 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/565#issuecomment-41492225 Updated, please check.

[GitHub] spark pull request: Handle the vals that never used

2014-04-26 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/565 Handle the vals that never used In XORShiftRandom.scala, use a val million instead of the constant 1e6.toInt. Delete vals that are never used in other files.

[GitHub] spark pull request: Handle the vals that never used

2014-04-26 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/565#issuecomment-41473587 Hi Owen, thanks for your suggestion. I inspected the unused assignments, method parameters and symbols using IntelliJ IDEA; here are the results, excluding test

[GitHub] spark pull request: Handle the vals that never used

2014-04-26 Thread WangTaoTheTonic
Github user WangTaoTheTonic commented on the pull request: https://github.com/apache/spark/pull/565#issuecomment-41484541 Thanks for that, I already fixed it.

[GitHub] spark pull request: Delete the val that never used

2014-04-25 Thread WangTaoTheTonic
GitHub user WangTaoTheTonic opened a pull request: https://github.com/apache/spark/pull/553 Delete the val that never used It seems that the vals startTime and endTime are never used, so delete them.
