[GitHub] spark pull request: cycle of deleting history log

2014-09-14 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/2391 cycle of deleting history log You can merge this pull request into a Git repository by running: $ git pull https://github.com/viper-kun/spark xk2

[GitHub] spark pull request: cycle of deleting history log

2014-09-15 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2391#issuecomment-55688517 @srowen If we run Spark applications frequently, many event logs are written into spark.eventLog.dir. After a long time, there will be many event logs there.
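For context, a periodic event-log cleaner along these lines eventually shipped in the Spark history server. A minimal spark-defaults.conf sketch (the values are illustrative; check your Spark version's monitoring docs for the exact option names and defaults):

```
# Enable periodic cleanup of event logs by the history server (illustrative values).
spark.history.fs.cleaner.enabled   true
# How often the cleaner checks for expired event logs.
spark.history.fs.cleaner.interval  1d
# Event log files older than this are deleted.
spark.history.fs.cleaner.maxAge    7d
```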

[GitHub] spark pull request: Update configuration.md

2014-09-16 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/2406 Update configuration.md change the value of spark.files.fetchTimeout You can merge this pull request into a Git repository by running: $ git pull https://github.com/viper-kun/spark master

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-09-18 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/2391#discussion_r17766911 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -100,6 +125,12 @@ private[history] class FsHistoryProvider

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-09-18 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2391#issuecomment-56129613 @andrewor14 I have checked Hadoop's JobHistoryServer; it is the JobHistoryServer's responsibility to delete the application logs.

[GitHub] spark pull request: Periodic cleanup event logs

2014-09-20 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/2471 Periodic cleanup event logs You can merge this pull request into a Git repository by running: $ git pull https://github.com/viper-kun/spark deletelog2

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-09-20 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/2391#discussion_r17818744 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -214,6 +245,27 @@ private[history] class FsHistoryProvider

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-09-20 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2391#issuecomment-56261135 @vanzin, @andrewor14, thanks for your opinions. Because I have deleted the source branch, I cannot change it in this pull request; I have submitted another one, #2471.

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-09-28 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2471#issuecomment-57083376 Thanks for your opinions, @vanzin @andrewor14. I have changed the code according to your comments.

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-10 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2471#issuecomment-58624839 @mattf @vanzin Is this OK to go?

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-11 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/2471#discussion_r18739911 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -195,22 +241,68 @@ private[history] class FsHistoryProvider

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-11 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/2471#discussion_r18740124 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -195,22 +241,68 @@ private[history] class FsHistoryProvider

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-11 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/2471#discussion_r18740169 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -195,22 +241,68 @@ private[history] class FsHistoryProvider

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-11 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/2471#discussion_r18741084 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -195,22 +241,68 @@ private[history] class FsHistoryProvider

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-13 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/2471#discussion_r18763243 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -214,6 +224,43 @@ private[history] class FsHistoryProvider

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-14 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2471#issuecomment-59038886 In my opinion, Spark creates the event log data, so Spark should delete it. In Hadoop, event logs are deleted by the JobHistoryServer, not by the file system.

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-15 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2471#issuecomment-59164199 @vanzin, is it OK to go?

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-19 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2471#issuecomment-59648648 @vanzin, is it OK to go?

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-21 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2471#issuecomment-60027733 @vanzin @andrewor14, is it OK to go?

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2014-10-26 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2471#issuecomment-60541215 @vanzin @andrewor14 @srowen, is it OK to go?

[GitHub] spark pull request: invalid variable

2014-11-07 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/3154 invalid variable You can merge this pull request into a Git repository by running: $ git pull https://github.com/viper-kun/spark master

[GitHub] spark pull request: [SPARK-3266] [WIP] Remove implementations from...

2014-11-19 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2951#issuecomment-63768467 Are you still working on this? I am working on it.

[GitHub] spark pull request: [SPARK-5526][SQL] fix issue about cast to date

2015-02-03 Thread viper-kun
Github user viper-kun closed the pull request at: https://github.com/apache/spark/pull/4307

[GitHub] spark pull request: [SPARK-5526][SQL] fix issue about cast to date

2015-02-03 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/4307#issuecomment-72777528 Thanks, the issue is fixed by #4325. I will close this PR.

[GitHub] spark pull request: [SPARK-5526][SQL] fix issue about cast to date

2015-02-02 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/4307 [SPARK-5526][SQL] fix issue about cast to date You can merge this pull request into a Git repository by running: $ git pull https://github.com/viper-kun/spark fixcastissue

[GitHub] spark pull request: Remove unused function

2015-02-05 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/4418#issuecomment-73190743 @srowen OK. If it is useful later, we should change it like this: def hasShutdownDeleteTachyonDir(file: TachyonFile): Boolean = { val absolutePath ...

[GitHub] spark pull request: remove unused function

2015-02-05 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/4418 remove unused function hasShutdownDeleteTachyonDir(file: TachyonFile) should use shutdownDeleteTachyonPaths (not shutdownDeletePaths) to determine whether it contains the file. To solve it, delete the two ...
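The bug pattern under discussion is a membership check consulting the wrong registry. A self-contained sketch (the object and method names are hypothetical stand-ins, not the actual Spark Utils code, which tracked local and Tachyon shutdown-delete paths in two separate sets):

```scala
import scala.collection.mutable.HashSet

object ShutdownRegistry {
  // Hypothetical stand-ins for the two separate shutdown-delete registries.
  private val shutdownDeletePaths = new HashSet[String]()
  private val shutdownDeleteTachyonPaths = new HashSet[String]()

  def registerShutdownDeleteDir(path: String): Unit =
    shutdownDeletePaths.synchronized { shutdownDeletePaths += path }

  def registerShutdownDeleteTachyonDir(path: String): Unit =
    shutdownDeleteTachyonPaths.synchronized { shutdownDeleteTachyonPaths += path }

  // Buggy version: consults the local-filesystem set, so Tachyon dirs are never found.
  def hasShutdownDeleteTachyonDirBuggy(path: String): Boolean =
    shutdownDeletePaths.synchronized { shutdownDeletePaths.contains(path) }

  // Fixed version: consults the Tachyon set.
  def hasShutdownDeleteTachyonDir(path: String): Boolean =
    shutdownDeleteTachyonPaths.synchronized { shutdownDeleteTachyonPaths.contains(path) }
}
```

A Tachyon directory registered via `registerShutdownDeleteTachyonDir` is found by the fixed check but missed by the buggy one.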

[GitHub] spark pull request: Remove unused function

2015-02-05 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/4418#issuecomment-73179289 cc @haoyuan

[GitHub] spark pull request: [SPARK-5661]function hasShutdownDeleteTachyonD...

2015-02-08 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/4418#issuecomment-73341954 OK. I will create a JIRA.

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2015-01-26 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/4214 [SPARK-3562]Periodic cleanup event logs You can merge this pull request into a Git repository by running: $ git pull https://github.com/viper-kun/spark cleaneventlog

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2015-01-26 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/2471#issuecomment-71575458 I have filed a new PR, #4214.

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2015-01-28 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/4214#discussion_r23750085 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -163,9 +179,6 @@ private[history] class FsHistoryProvider(conf

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2015-01-28 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/4214#discussion_r23750759 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -113,12 +129,12 @@ private[history] class FsHistoryProvider

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2015-01-28 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/4214#discussion_r23750788 --- Diff: core/src/main/scala/org/apache/spark/deploy/history/FsHistoryProvider.scala --- @@ -53,8 +79,10 @@ private[history] class FsHistoryProvider(conf

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2015-01-28 Thread viper-kun
Github user viper-kun closed the pull request at: https://github.com/apache/spark/pull/2471

[GitHub] spark pull request: [SPARK-5661]function hasShutdownDeleteTachyonD...

2015-02-08 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/4418#issuecomment-73452955 @srowen It is a static function and unused now. I think we had better leave it in the Utils class for now.

[GitHub] spark pull request: [SPARK-6521][Core]executors in the same node r...

2015-03-24 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/5178 [SPARK-6521][Core]executors in the same node read local shuffle file Previously, an executor read other executors' shuffle files on the same node over the network. This PR makes executors on the same node ...
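The core idea can be sketched as partitioning fetch requests by executor locality. The names below are hypothetical illustrations, not Spark's actual ShuffleBlockFetcherIterator internals:

```scala
// Hypothetical sketch: split shuffle blocks into same-node reads and network fetches.
case class BlockLocation(blockId: String, host: String)

def splitByLocality(
    blocks: Seq[BlockLocation],
    localHost: String): (Seq[String], Seq[String]) = {
  // Blocks hosted on this node can be read directly from the local disk;
  // only the rest need to go through the network transfer service.
  val (local, remote) = blocks.partition(_.host == localHost)
  (local.map(_.blockId), remote.map(_.blockId))
}
```

With this split, same-node blocks bypass the network path entirely, which is the saving the PR is after.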

[GitHub] spark pull request: [SPARK-6521][Core]executors in the same node r...

2015-03-31 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/5178#issuecomment-88064150 @andrewor14 Please review it.

[GitHub] spark pull request: [SPARK-6521][Core]executors in the same node r...

2015-03-27 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/5178#issuecomment-86849484 Hi @andrewor14, please retest it; the test build timed out.

[GitHub] spark pull request: [SPARK-3562]Periodic cleanup event logs

2015-02-25 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/4214#issuecomment-76109919 @andrewor14 Thanks for your check. Please retest it; I cannot get the test log.

[GitHub] spark pull request: [SPARK-6521][Core]executors in the same node r...

2015-03-26 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/5178#discussion_r27193887 --- Diff: core/src/main/scala/org/apache/spark/storage/ShuffleBlockFetcherIterator.scala --- @@ -181,68 +181,100 @@ final class

[GitHub] spark pull request: [SPARK-6879][HistoryServer]check if app is com...

2015-04-13 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/5491#issuecomment-92578144 I think it is OK. Users must call sc.stop(); if they do not, some event logs are simply not deleted.

[GitHub] spark pull request: [Spark-6924]Fix driver hangs in yarn-client mo...

2015-04-23 Thread viper-kun
Github user viper-kun closed the pull request at: https://github.com/apache/spark/pull/5523

[GitHub] spark pull request: [Spark-6924]Fix driver hangs in yarn-client mo...

2015-04-23 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/5523#issuecomment-95764614 OK. I will close it.

[GitHub] spark pull request: [SPARK-6521][Core]executors in the same node r...

2015-04-19 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/5178#issuecomment-94250246 @maropu I will update it.

[GitHub] spark pull request: [SPARK-6521][Core]executors in the same node r...

2015-04-19 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/5178#discussion_r28650737 --- Diff: core/src/main/scala/org/apache/spark/shuffle/FileShuffleBlockManager.scala --- @@ -180,7 +180,8 @@ class FileShuffleBlockManager(conf: SparkConf

[GitHub] spark pull request: [Spark-6924]Fix driver hangs in yarn-client mo...

2015-04-15 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/5523#issuecomment-93615543 @srowen I updated the JIRA. Please review it. Thanks.

[GitHub] spark pull request: [Spark-6924]Fix driver hangs in yarn-client mo...

2015-04-16 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/5523#discussion_r28502410 --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala --- @@ -75,6 +75,8 @@ import org.apache.spark.util._ */ class SparkContext

[GitHub] spark pull request: [Spark-6924]Fix driver hangs in yarn-client mo...

2015-04-16 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/5523#issuecomment-93728497 During construction, it is normal that the driver does not hear from executors. Only after construction have executors connected and sent heartbeats to the driver. We can indicate ...

[GitHub] spark pull request: [Spark-6924]Fix driver hangs in yarn-client mo...

2015-04-16 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/5523#discussion_r28501942 --- Diff: core/src/main/scala/org/apache/spark/SparkContext.scala --- @@ -75,6 +75,8 @@ import org.apache.spark.util._ */ class SparkContext

[GitHub] spark pull request: [Spark-6924]Fix client hangs in yarn-client mo...

2015-04-15 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/5523 [Spark-6924]Fix client hangs in yarn-client mode when net is broken https://issues.apache.org/jira/browse/SPARK-6924 You can merge this pull request into a Git repository by running:

[GitHub] spark pull request: [SPARK-6479][Block Manager]Create off-heap blo...

2015-04-08 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/5430#discussion_r28030832 --- Diff: core/src/main/scala/org/apache/spark/storage/OffHeapStore.scala --- @@ -0,0 +1,142 @@ +/* + * Licensed to the Apache Software Foundation

[GitHub] spark pull request: [SPARK-6479][Block Manager]Create off-heap blo...

2015-04-08 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/5430#discussion_r28030751 --- Diff: core/src/main/scala/org/apache/spark/storage/OffHeapBlockManager.scala --- @@ -0,0 +1,110 @@ +/* + * Licensed to the Apache Software

[GitHub] spark pull request: correct buffer size

2015-08-14 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/8189 correct buffer size No need to multiply by columnType.defaultSize here; we have already done it in the ColumnBuilder class: buffer = ByteBuffer.allocate(4 + size * columnType.defaultSize)
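The over-allocation being fixed is a double multiplication: if the size has already been converted from an element count to bytes, multiplying by the element width again allocates defaultSize times too much. A plain-numbers sketch (illustrative, not the actual ColumnBuilder code):

```scala
import java.nio.ByteBuffer

// Hypothetical illustration of the double-multiplication bug.
val elementCount = 8
val defaultSize = 4 // e.g. bytes per Int

// The caller has already converted the element count to bytes:
val sizeInBytes = elementCount * defaultSize // 32

// Buggy: multiplies by the element width a second time -> 4 + 128 bytes.
val buggy = ByteBuffer.allocate(4 + sizeInBytes * defaultSize)

// Correct: the size is already in bytes -> 4 + 32 bytes.
val fixed = ByteBuffer.allocate(4 + sizeInBytes)
```

The buggy buffer is roughly defaultSize times larger than needed, which is wasted heap for every column.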

[GitHub] spark pull request: correct buffer size

2015-08-14 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/8189#issuecomment-130988766 @liancheng @scwf Is it OK?

[GitHub] spark pull request: [SPARK-11225]Prevent generate empty file

2015-10-22 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/9191#issuecomment-150176502 Hi @davies, sorry, I do not understand Python. Can you help me fix it?

[GitHub] spark pull request: [SPARK-11225]Prevent generate empty file

2015-10-27 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/9191#issuecomment-151385025 @JoshRosen Is it OK? If it doesn't work, I will close this PR.

[GitHub] spark pull request: [SPARK-11225]Prevent generate empty file

2015-10-23 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/9191#issuecomment-150747910 @davies This test is flaky; please re-test it.

[GitHub] spark pull request: [SPARK-11225]Prevent generate empty file

2015-10-21 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/9191#issuecomment-150076597 @JoshRosen In my test environment, I do not see this error. Please retest it. Thanks!

[GitHub] spark pull request: [SPARK-11225]Prevent generate empty file

2015-11-05 Thread viper-kun
Github user viper-kun closed the pull request at: https://github.com/apache/spark/pull/9191

[GitHub] spark pull request: [SPARK-11225]Prevent generate empty file

2015-11-05 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/9191#issuecomment-154050289 OK, I will close it.

[GitHub] spark pull request: [SPARK-11225]Prevent generate empty file

2015-10-20 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/9191 [SPARK-11225]Prevent generate empty file If no data is written into a bucket, an empty file is generated. So open() should be called on the first write(key, value).
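The lazy-open pattern described here defers file creation until the first record arrives, so buckets that receive no records produce no empty files. A self-contained sketch (the class name is hypothetical, not the actual Spark disk writer):

```scala
import java.io.{File, FileWriter}

// Hypothetical sketch: defer file creation until the first write(key, value).
class LazyBucketWriter(path: String) {
  private var writer: FileWriter = null

  def write(key: String, value: String): Unit = {
    // Open on the first record only; empty buckets never touch the disk.
    if (writer == null) writer = new FileWriter(path)
    writer.write(s"$key\t$value\n")
  }

  def close(): Unit = if (writer != null) writer.close()
}
```

Closing a writer that never received a record leaves no file behind, while a writer that saw at least one record produces one as usual.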

[GitHub] spark pull request: [SPARK-11225]Prevent generate empty file

2015-10-21 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/9191#issuecomment-149825404 Thanks @JoshRosen. Sorry, I did not do a performance test. As far as I know, it will reduce the number of open files. When there are too many empty files, it will bring some benefit.

[GitHub] spark pull request: remove unnecessary copy

2015-11-30 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/10040#issuecomment-160824966 OK, I will close it.

[GitHub] spark pull request: remove unnecessary copy

2015-11-30 Thread viper-kun
Github user viper-kun closed the pull request at: https://github.com/apache/spark/pull/10040

[GitHub] spark pull request: remove unnecessary copy

2015-11-30 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/10040 remove unnecessary copy You can merge this pull request into a Git repository by running: $ git pull https://github.com/viper-kun/spark patch-1

[GitHub] spark issue #9113: [SPARK-11100][SQL]HiveThriftServer HA issue,HiveThriftSer...

2016-06-05 Thread viper-kun
Github user viper-kun commented on the issue: https://github.com/apache/spark/pull/9113 @rxin Is there any design for the replacement?

[GitHub] spark pull request: [SPARK-11100][SQL]HiveThriftServer HA issue,Hi...

2016-05-31 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/9113#issuecomment-222619116 @xiaowangyu 1. Can you tell me how it works? The Thrift server easily goes down when there are too many Beeline connections. If Beeline can choose the less-loaded Thrift server ...

[GitHub] spark issue #9113: [SPARK-11100][SQL]HiveThriftServer HA issue,HiveThriftSer...

2016-06-01 Thread viper-kun
Github user viper-kun commented on the issue: https://github.com/apache/spark/pull/9113 @xiaowangyu What I mean is how to choose the active Thrift server: does it choose randomly, or by Thrift server load?

[GitHub] spark pull request: [SPARK-13112]CoarseGrainedExecutorBackend register t...

2016-03-30 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/12078 [SPARK-13112]CoarseGrainedExecutorBackend register to driver should wait until the Executor is ready ## What changes were proposed in this pull request? When CoarseGrainedExecutorBackend receives ...

[GitHub] spark pull request: Increase probability of using cached serialize...

2016-03-22 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/11886 Increase probability of using cached serialized status You can merge this pull request into a Git repository by running:

[GitHub] spark pull request: [SPARK-14065]Increase probability of using cac...

2016-03-22 Thread viper-kun
Github user viper-kun commented on a diff in the pull request: https://github.com/apache/spark/pull/11886#discussion_r57102821 --- Diff: core/src/main/scala/org/apache/spark/MapOutputTracker.scala --- @@ -443,13 +443,12 @@ private[spark] class MapOutputTrackerMaster(conf

[GitHub] spark pull request: [SPARK-13652][Core]Copy ByteBuffer in sendRpcS...

2016-03-05 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/11499#issuecomment-192824479 @zsxwing It is OK.

[GitHub] spark pull request: [SPARK-14065]Increase probability of using cac...

2016-04-01 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/11886#issuecomment-204319818 @srowen The logic looks the same to me. If necessary, I will move serializeMapStatus into the match ... None branch.

[GitHub] spark pull request: [SPARK-13112]CoarseGrainedExecutorBackend register t...

2016-04-01 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/12078#issuecomment-204322877 I don't think so. The order is wrong: it registers with the driver first, and only then starts the new Executor.

[GitHub] spark pull request: [SPARK-4105][CORE] regenerate the shuffle file...

2016-04-26 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/12700#issuecomment-214948007 @jerryshao @srowen We met this problem in Spark 1.4, 1.5, and 1.6, and only know that the shuffle file is broken. We can reproduce this problem by modifying ...

[GitHub] spark pull request: [SPARK-14065]Increase probability of using cac...

2016-05-10 Thread viper-kun
Github user viper-kun closed the pull request at: https://github.com/apache/spark/pull/11886

[GitHub] spark pull request: [SPARK-14065]Increase probability of using cac...

2016-05-10 Thread viper-kun
Github user viper-kun commented on the pull request: https://github.com/apache/spark/pull/11886#issuecomment-218140398 @tgravescs OK, I will close it.

[GitHub] spark pull request #16632: [SPARK-19273]shuffle stage should retry when fetc...

2017-01-22 Thread viper-kun
Github user viper-kun closed the pull request at: https://github.com/apache/spark/pull/16632

[GitHub] spark issue #16632: [SPARK-19273]shuffle stage should retry when fetch shuff...

2017-01-22 Thread viper-kun
Github user viper-kun commented on the issue: https://github.com/apache/spark/pull/16632 OK, I will close it.

[GitHub] spark pull request #16632: [SPARK-19273]shuffle stage should retry when fetc...

2017-01-18 Thread viper-kun
GitHub user viper-kun opened a pull request: https://github.com/apache/spark/pull/16632 [SPARK-19273]shuffle stage should retry when fetch shuffle fail ## What changes were proposed in this pull request? A shuffle stage should retry when fetching shuffle data fails.
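The behaviour argued for can be sketched generically: on a shuffle fetch failure, re-run the stage a bounded number of times instead of failing the job outright. This is a hypothetical helper, not Spark's actual scheduler code:

```scala
// Hypothetical sketch: retry a stage whose shuffle fetch fails.
class FetchFailedException(msg: String) extends Exception(msg)

def runStageWithRetry[T](maxRetries: Int)(stage: () => T): T = {
  var attempt = 0
  while (true) {
    try {
      return stage() // success: return the stage's result
    } catch {
      case e: FetchFailedException =>
        // Fetch failures are retriable (e.g. the map output can be regenerated);
        // other exceptions propagate immediately.
        attempt += 1
        if (attempt > maxRetries) throw e
    }
  }
  throw new IllegalStateException("unreachable")
}
```

A stage that fails its first two fetches but succeeds on the third completes normally under `runStageWithRetry(3)`.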

[GitHub] spark issue #16632: [SPARK-19273]shuffle stage should retry when fetch shuff...

2017-01-18 Thread viper-kun
Github user viper-kun commented on the issue: https://github.com/apache/spark/pull/16632 @srowen I have not tested on the master version. I will do it later.