[GitHub] spark pull request: [SPARK-4314][Streaming] Exception throws when ...

2014-12-23 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/3203#issuecomment-68023838
  
Refer to the JIRA, got it.





[GitHub] spark pull request: [SPARK-4314][Streaming] Exception throws when ...

2014-12-23 Thread maji2014
Github user maji2014 closed the pull request at:

https://github.com/apache/spark/pull/3203





[GitHub] spark pull request: [SPARK-4691][Minor] Rewrite a few lines in shu...

2014-12-06 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/3553#issuecomment-65904987
  
NP, the title change is done.





[GitHub] spark pull request: [spark-4691][shuffle]Code improvement for aggr...

2014-12-05 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/3553#issuecomment-65761712
  
@pwendell any idea about this title?





[GitHub] spark pull request: [spark-4691][shuffle]code optimization for jud...

2014-12-02 Thread maji2014
GitHub user maji2014 opened a pull request:

https://github.com/apache/spark/pull/3553

[spark-4691][shuffle]code optimization for judgement

In HashShuffleReader.scala and HashShuffleWriter.scala, there is no need to check 
dep.aggregator.isEmpty again, as this is already covered by the dep.aggregator.isDefined check.

In SortShuffleWriter.scala, wouldn't dep.aggregator.isEmpty be better than 
!dep.aggregator.isDefined?
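
To make the point concrete, here is a minimal, hypothetical Scala sketch of such a branch 
(not the actual Spark source; the aggregator is simply assumed to be an Option):

// Hypothetical sketch: `aggregator` is an Option, so inside the `else` of an
// `if (aggregator.isDefined)` check, `aggregator.isEmpty` is always true and
// re-testing it adds nothing.
def process(aggregator: Option[String], mapSideCombine: Boolean): String = {
  if (aggregator.isDefined) {
    s"combine with ${aggregator.get}"
  } else if (mapSideCombine) {            // was: aggregator.isEmpty && mapSideCombine
    throw new IllegalStateException("Aggregator is empty for map-side combine!")
  } else {
    "no aggregation"
  }
}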



You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark spark-4691

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/3553.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3553


commit d8f52dc7dc34bc6c1c368d790b7bdfe30c4eb529
Author: maji2014 ma...@asiainfo.com
Date:   2014-12-02T09:54:33Z

code optimization for judgement







[GitHub] spark pull request: [spark-4691][shuffle]code optimization for jud...

2014-12-02 Thread maji2014
Github user maji2014 commented on a diff in the pull request:

https://github.com/apache/spark/pull/3553#discussion_r21207386
  
--- Diff: 
core/src/main/scala/org/apache/spark/shuffle/hash/HashShuffleReader.scala ---
@@ -45,7 +45,7 @@ private[spark] class HashShuffleReader[K, C](
   } else {
 new InterruptibleIterator(context, 
dep.aggregator.get.combineValuesByKey(iter, context))
   }
-} else if (dep.aggregator.isEmpty && dep.mapSideCombine) {
+} else if (dep.mapSideCombine) {
--- End diff --

Having if (dep.aggregator.isDefined) followed by else if (dep.aggregator.isEmpty) seems 
redundant, since isEmpty == !isDefined; there is no need for one more check of 
dep.aggregator.isEmpty.





[GitHub] spark pull request: [spark-4691][shuffle]code optimization for jud...

2014-12-02 Thread maji2014
Github user maji2014 commented on a diff in the pull request:

https://github.com/apache/spark/pull/3553#discussion_r21213166
  
--- Diff: 
core/src/main/scala/org/apache/spark/shuffle/hash/HashShuffleReader.scala ---
@@ -45,7 +45,7 @@ private[spark] class HashShuffleReader[K, C](
   } else {
 new InterruptibleIterator(context, 
dep.aggregator.get.combineValuesByKey(iter, context))
   }
-} else if (dep.aggregator.isEmpty && dep.mapSideCombine) {
+} else if (dep.mapSideCombine) {
--- End diff --

Yes, this seems simple and elegant





[GitHub] spark pull request: [spark-4691][shuffle]code optimization for jud...

2014-12-02 Thread maji2014
Github user maji2014 commented on a diff in the pull request:

https://github.com/apache/spark/pull/3553#discussion_r21213167
  
--- Diff: 
core/src/main/scala/org/apache/spark/shuffle/hash/HashShuffleReader.scala ---
@@ -45,7 +45,7 @@ private[spark] class HashShuffleReader[K, C](
   } else {
 new InterruptibleIterator(context, 
dep.aggregator.get.combineValuesByKey(iter, context))
   }
-} else if (dep.aggregator.isEmpty && dep.mapSideCombine) {
+} else if (dep.mapSideCombine) {
--- End diff --

Yes, this seems simple and elegant





[GitHub] spark pull request: [SPARK-4619][Storage]delete redundant time suf...

2014-11-26 Thread maji2014
GitHub user maji2014 opened a pull request:

https://github.com/apache/spark/pull/3475

[SPARK-4619][Storage]delete redundant time suffix

The time suffix already exists in Utils.getUsedTimeMs(startTime), so there is no need to 
append it again; delete it.
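
As a rough illustration of the redundancy, here is a hedged sketch that assumes 
getUsedTimeMs already formats its result with an " ms" suffix, which is the premise of this PR:

// Sketch under that assumption: a helper that already appends the " ms" unit.
def getUsedTimeMs(startTimeMs: Long): String =
  (System.currentTimeMillis - startTimeMs) + " ms"

val startTime = System.currentTimeMillis
println("Redundant: " + getUsedTimeMs(startTime) + " ms")  // yields "... ms ms"
println("Fixed:     " + getUsedTimeMs(startTime))          // suffix comes from the helper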


You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark SPARK-4619

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/3475.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3475


commit df0da4edec947b7aab4fe3331878641a4ab65d37
Author: maji2014 ma...@asiainfo.com
Date:   2014-11-26T08:29:26Z

delete redundant time suffix







[GitHub] spark pull request: [SPARK-4314][Streaming] Exception when textFil...

2014-11-13 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/3203#issuecomment-62904632
  
Is there any other place that should be changed?





[GitHub] spark pull request: [SPARK-4314][Streaming] Exception when textFil...

2014-11-12 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/3203#issuecomment-62714198
  
Yes, no other HDFS-related cases need to be modified from the current situation.





[GitHub] spark pull request: [SPARK-4314][Streaming] Exception when textFil...

2014-11-11 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/3203#issuecomment-62551435
  
I know that this form is not simple and elegant. This issue was found in our project. 
The reason I defined the prefix and suffix variables is that I am not sure how many 
suffixes will need to be filtered later. That's OK, I can change this to your more 
compact form.







[GitHub] spark pull request: [SPARK-4314][Streaming] Exception when textFil...

2014-11-11 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/3203#issuecomment-62661477
  
About other places where an incomplete file might be read: from my point of view, an HDFS 
file can be read by Streaming (such as HdfsWordCount) or by Spark core (such as HdfsTest). 
In Streaming the HDFS file is read dynamically, which is handled by this patch; in Spark 
core the file isn't read dynamically.
Do you have other suggestions, or can this patch be merged now?





[GitHub] spark pull request: [SPARK-4295][External]Fix exception in SparkSi...

2014-11-10 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/3177#issuecomment-62363954
  
Yes, the test cases all pass, although an exception is thrown before each test case.
You can run SparkSinkSuite.scala directly and see output like:
14/11/10 02:01:22 ERROR MonitoredCounterGroup: Failed to register monitored counter group 
for type: CHANNEL, name: null, javax.management.InstanceAlreadyExistsException: 
org.apache.flume.channel:type=null





[GitHub] spark pull request: [SPARK-4295][External]Fix exception in SparkSi...

2014-11-10 Thread maji2014
Github user maji2014 commented on a diff in the pull request:

https://github.com/apache/spark/pull/3177#discussion_r20130942
  
--- Diff: 
external/flume-sink/src/test/scala/org/apache/spark/streaming/flume/sink/SparkSinkSuite.scala
 ---
@@ -159,6 +159,7 @@ class SparkSinkSuite extends FunSuite {
 channelContext.put("transactionCapacity", 1000.toString)
 channelContext.put("keep-alive", 0.toString)
 channelContext.putAll(overrides)
+channel.setName(getClass.getDeclaredMethods.toString)
--- End diff --

As I have no good idea of how to get a random string name, such as each test case's name, 
I used getClass.getDeclaredMethods.toString, which gives a different string like 
[Ljava.lang.reflect.Method;@5f3ef269.
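
For illustration only, one alternative way to give each test channel a unique name would 
be a random UUID; this is a hypothetical sketch, not what the patch actually does:

import java.util.UUID

// Hypothetical alternative: a random UUID guarantees a unique channel name per
// test case, so JMX never sees the same name registered twice.
val channelName = "spark-sink-test-" + UUID.randomUUID().toString
// channel.setName(channelName)   // would replace the getDeclaredMethods-based name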





[GitHub] spark pull request: [SPARK-4295][External]Fix exception in SparkSi...

2014-11-10 Thread maji2014
Github user maji2014 commented on a diff in the pull request:

https://github.com/apache/spark/pull/3177#discussion_r20132263
  
--- Diff: 
external/flume-sink/src/test/scala/org/apache/spark/streaming/flume/sink/SparkSinkSuite.scala
 ---
@@ -159,6 +159,7 @@ class SparkSinkSuite extends FunSuite {
 channelContext.put("transactionCapacity", 1000.toString)
 channelContext.put("keep-alive", 0.toString)
 channelContext.putAll(overrides)
+channel.setName(getClass.getDeclaredMethods.toString)
--- End diff --

That's ok





[GitHub] spark pull request: [SPARK-4314][Streaming] Exception when textFil...

2014-11-10 Thread maji2014
GitHub user maji2014 opened a pull request:

https://github.com/apache/spark/pull/3203

[SPARK-4314][Streaming] Exception when textFileStream attempts to read 
deleted _COPYING_ file

The ephemeral file (_COPYING_) is picked up by the FileInputDStream interface.
On one hand, the file can be counted twice (the file itself and file._COPYING_) if the 
_COPYING_ file has not been deleted by the time Spark handles it.
On the other hand, an exception appears when textFileStream attempts to read the deleted 
_COPYING_ file if it has already been deleted by the time Spark handles it.
Therefore, add a filter for the intermediate, partly written state.
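
A minimal sketch of the kind of filter described above (hypothetical helper name; the real 
change lives in the FileInputDStream file filter):

// Hypothetical sketch: skip HDFS copy-in-progress files, whose names carry the
// "_COPYING_" suffix while "hdfs dfs -put" is still writing them.
def isCompleteFile(fileName: String): Boolean =
  !fileName.endsWith("_COPYING_")

val candidates = Seq("data/part-00000", "data/part-00001._COPYING_")
println(candidates.filter(isCompleteFile))   // keeps only the fully written file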

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark spark-4314

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/3203.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3203


commit e298b8905c654c3d072d70f6590c3ea923e26e91
Author: maji2014 ma...@asiainfo.com
Date:   2014-11-11T06:22:47Z

fix exception for spark-4314







[GitHub] spark pull request: [SPARK-4295][External]Fix exception in SparkSi...

2014-11-09 Thread maji2014
GitHub user maji2014 opened a pull request:

https://github.com/apache/spark/pull/3177

[SPARK-4295][External]Fix exception in SparkSinkSuite

Handle the exception in SparkSinkSuite; please refer to [SPARK-4295].

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark spark-4295

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/3177.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3177


commit c807bf66a8d945708af0f620576255cc133ffe46
Author: maji2014 ma...@asiainfo.com
Date:   2014-11-09T15:58:50Z

Fix exception in SparkSinkSuite







[GitHub] spark pull request: Sink2 and channel2 should be closed in Flume

2014-11-05 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/3037#issuecomment-61782535
  
I submitted this request 5 days ago, but most of the code was changed 2 days ago, 
including sink2.stop and channel2.stop, so I am closing this request.





[GitHub] spark pull request: Sink2 and channel2 should be closed in Flume

2014-11-05 Thread maji2014
Github user maji2014 closed the pull request at:

https://github.com/apache/spark/pull/3037





[GitHub] spark pull request: Sink2 and channel2 should be closed in Flume

2014-11-01 Thread maji2014
Github user maji2014 commented on a diff in the pull request:

https://github.com/apache/spark/pull/3037#discussion_r19704384
  
--- Diff: 
external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala 
---
@@ -184,7 +184,7 @@ object FlumeUtils {
   hostname: String,
   port: Int
 ): JavaReceiverInputDStream[SparkFlumeEvent] = {
-createPollingStream(jssc, hostname, port, 
StorageLevel.MEMORY_AND_DISK_SER_2)
+createPollingStream(jssc.ssc, hostname, port, 
StorageLevel.MEMORY_AND_DISK_SER_2)
--- End diff --

In order to keep the same style for all Java-related methods, such as createStream and 
createPollingStream, several jssc references have been changed.
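
A rough sketch of the delegation pattern being discussed, with hypothetical class and 
method signatures (not the real FlumeUtils API):

// Hypothetical sketch: the Java-friendly overload takes a JavaStreamingContext
// and delegates to the Scala method via its underlying `ssc`, so every Java
// entry point routes through the same Scala implementation.
class StreamingContext
class JavaStreamingContext(val ssc: StreamingContext)

object ExampleUtils {
  // Scala API
  def createPollingStream(ssc: StreamingContext, hostname: String, port: Int): String =
    s"stream($hostname:$port)"

  // Java API delegating through jssc.ssc
  def createPollingStream(jssc: JavaStreamingContext, hostname: String, port: Int): String =
    createPollingStream(jssc.ssc, hostname, port)
}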





[GitHub] spark pull request: sink2 and channel2 should be closed

2014-10-31 Thread maji2014
GitHub user maji2014 opened a pull request:

https://github.com/apache/spark/pull/3037

sink2 and channel2 should be closed

as title

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/3037.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #3037


commit ec8be81dce106f5af2bbd5762683489dcce8f1ad
Author: derek ma ma...@asiainfo-linkage.com
Date:   2014-10-31T13:35:11Z

sink2 and channel2 should be closed

as title







[GitHub] spark pull request: Sink2 and channel2 should be closed in Flume

2014-10-31 Thread maji2014
Github user maji2014 commented on a diff in the pull request:

https://github.com/apache/spark/pull/3037#discussion_r19700506
  
--- Diff: 
external/flume/src/main/scala/org/apache/spark/streaming/flume/FlumeUtils.scala 
---
@@ -184,7 +184,7 @@ object FlumeUtils {
   hostname: String,
   port: Int
 ): JavaReceiverInputDStream[SparkFlumeEvent] = {
-createPollingStream(jssc, hostname, port, 
StorageLevel.MEMORY_AND_DISK_SER_2)
+createPollingStream(jssc.ssc, hostname, port, 
StorageLevel.MEMORY_AND_DISK_SER_2)
--- End diff --

There is only one .count.map call in the main method in all examples; I thought that was 
the additional call.
Please keep the original version if that's not the additional call.
 





[GitHub] spark pull request: Required AM memory is amMem, not args.amMem...

2014-07-19 Thread maji2014
Github user maji2014 closed the pull request at:

https://github.com/apache/spark/pull/1457




[GitHub] spark pull request: Required AM memory is amMem, not args.amMem...

2014-07-19 Thread maji2014
GitHub user maji2014 opened a pull request:

https://github.com/apache/spark/pull/1494

Required AM memory is amMem, not args.amMemory

"ERROR yarn.Client: Required AM memory (1024) is above the max threshold (1048) of this 
cluster" appears if this code is not changed. Obviously, 1024 is less than 1048, so 
change this.
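
A simplified, hypothetical sketch of the check in question (variable names follow the 
report; the values are made up for illustration):

// The computed AM requirement `amMem` includes the memory overhead, so the error
// message should report `amMem`, not the raw `args.amMemory` -- otherwise the log
// can claim 1024 is "above" a 1048 threshold.
val amMemory = 1024            // stands in for args.amMemory
val memoryOverhead = 384       // stands in for the AM memory overhead
val maxMem = 1048              // max container memory reported by YARN
val amMem = amMemory + memoryOverhead

if (amMem > maxMem) {
  println(s"ERROR yarn.Client: Required AM memory ($amMem) is above the max threshold ($maxMem) of this cluster")
}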

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/1494.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1494


commit b0f66400990befead3a6aaa1172112bd090272e8
Author: derek ma ma...@asiainfo-linkage.com
Date:   2014-07-19T10:53:08Z

Required AM memory is amMem, not args.amMemory

"ERROR yarn.Client: Required AM memory (1024) is above the max threshold (1048) of this 
cluster" appears if this code is not changed. Obviously, 1024 is less than 1048, so 
change this.






[GitHub] spark pull request: Required AM memory is amMem, not args.amMem...

2014-07-17 Thread maji2014
GitHub user maji2014 opened a pull request:

https://github.com/apache/spark/pull/1457

Required AM memory is amMem, not args.amMemory

"ERROR yarn.Client: Required AM memory (1024) is above the max threshold (1048) of this 
cluster" appears if this code is not changed. Obviously, 1024 is less than 1048, so 
change this.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/1457.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1457


commit 2cc1af6e69fa1dc1a76383e44d9b5389d4e133af
Author: maji2014 ma...@asiainfo-linkage.com
Date:   2014-06-08T11:19:01Z

Update run-example

The old code can only be run under SPARK_HOME using bin/run-example.
The error "./run-example: line 55: ./bin/spark-submit: No such file or directory" appears 
when running it from another location. So change this.

commit 9975f13235c4e4f325df8419878b0977db533e5f
Author: derek ma ma...@asiainfo-linkage.com
Date:   2014-07-17T06:35:30Z

Required AM memory is amMem, not args.amMemory

"ERROR yarn.Client: Required AM memory (1024) is above the max threshold (1048) of this 
cluster" appears if this code is not changed. Obviously, 1024 is less than 1048, so 
change this.






[GitHub] spark pull request: Required AM memory is amMem, not args.amMem...

2014-07-17 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/1457#issuecomment-49264699
  
Please focus on the second issue, as the first issue is an old patch from June.




[GitHub] spark pull request: Required AM memory is amMem, not args.amMem...

2014-07-17 Thread maji2014
Github user maji2014 closed the pull request at:

https://github.com/apache/spark/pull/1457




[GitHub] spark pull request: Required AM memory is amMem, not args.amMem...

2014-07-17 Thread maji2014
GitHub user maji2014 reopened a pull request:

https://github.com/apache/spark/pull/1457

Required AM memory is amMem, not args.amMemory

"ERROR yarn.Client: Required AM memory (1024) is above the max threshold (1048) of this 
cluster" appears if this code is not changed. Obviously, 1024 is less than 1048, so 
change this.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/1457.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1457


commit 2cc1af6e69fa1dc1a76383e44d9b5389d4e133af
Author: maji2014 ma...@asiainfo-linkage.com
Date:   2014-06-08T11:19:01Z

Update run-example

The old code can only be run under SPARK_HOME using bin/run-example.
The error "./run-example: line 55: ./bin/spark-submit: No such file or directory" appears 
when running it from another location. So change this.

commit 9975f13235c4e4f325df8419878b0977db533e5f
Author: derek ma ma...@asiainfo-linkage.com
Date:   2014-07-17T06:35:30Z

Required AM memory is amMem, not args.amMemory

"ERROR yarn.Client: Required AM memory (1024) is above the max threshold (1048) of this 
cluster" appears if this code is not changed. Obviously, 1024 is less than 1048, so 
change this.

commit ce83892fc275eb31fb6b85248fbd1f8a76d984a7
Author: maji2014 ma...@asiainfo-linkage.com
Date:   2014-07-18T02:43:49Z

Revert "Update run-example"

This reverts commit 2cc1af6e69fa1dc1a76383e44d9b5389d4e133af.

commit e462d3ac85b3e9a5d3db637432c5feecb4661603
Author: maji2014 ma...@asiainfo-linkage.com
Date:   2014-07-18T02:44:07Z

Revert "Revert "Update run-example""

This reverts commit ce83892fc275eb31fb6b85248fbd1f8a76d984a7.






[GitHub] spark pull request: Required AM memory is amMem, not args.amMem...

2014-07-17 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/1457#issuecomment-49391500
  
Please focus on the second issue, as in the title; the first one, "Update run-example", 
is an old patch.




[GitHub] spark pull request: Update run-example

2014-06-08 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/988#issuecomment-45434259
  
OK, I agree with that. Please merge this patch.




[GitHub] spark pull request: Update run-example

2014-06-08 Thread maji2014
GitHub user maji2014 opened a pull request:

https://github.com/apache/spark/pull/1011

Update run-example

The old code can only be run under SPARK_HOME using bin/run-example.
The error "./run-example: line 55: ./bin/spark-submit: No such file or directory" appears 
when running it from another location. So change this.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark master

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/1011.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #1011


commit 2cc1af6e69fa1dc1a76383e44d9b5389d4e133af
Author: maji2014 ma...@asiainfo-linkage.com
Date:   2014-06-08T11:19:01Z

Update run-example

The old code can only be run under SPARK_HOME using bin/run-example.
The error "./run-example: line 55: ./bin/spark-submit: No such file or directory" appears 
when running it from another location. So change this.






[GitHub] spark pull request: Update run-example

2014-06-08 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/988#issuecomment-45434458
  
I agree with that, and patch #1011 has been opened against the master branch. You can 
merge it and back-port it into 1.0.1.




[GitHub] spark pull request: Update run-example

2014-06-06 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/988#issuecomment-45315985
  
OK, SPARK-2057 has been opened for tracking this issue.





[GitHub] spark pull request: Update run-example

2014-06-05 Thread maji2014
GitHub user maji2014 opened a pull request:

https://github.com/apache/spark/pull/988

Update run-example

The old code can only be run under SPARK_HOME using bin/run-example.
The error "./run-example: line 55: ./bin/spark-submit: No such file or directory" appears 
when running it from another location. So change this.

You can merge this pull request into a Git repository by running:

$ git pull https://github.com/maji2014/spark branch-1.0

Alternatively you can review and apply these changes as the patch at:

https://github.com/apache/spark/pull/988.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

This closes #988


commit 5ac507ac59fb5ea6441ff19f922bec3b52216f65
Author: maji2014 ma...@asiainfo-linkage.com
Date:   2014-06-06T01:43:19Z

Update run-example

The old code can only be run under SPARK_HOME using bin/run-example.
The error "./run-example: line 55: ./bin/spark-submit: No such file or directory" appears 
when running it from another location. So change this.






[GitHub] spark pull request: Update run-example

2014-06-05 Thread maji2014
Github user maji2014 commented on the pull request:

https://github.com/apache/spark/pull/988#issuecomment-45296549
  
Maybe it's better to change it to $FWDIR/bin/spark-submit and commit it into apache:master.

