Re: Review Request 47676: Spark History Server heap size is not exposed (History Server crashed with OOM)

2016-05-25 Thread Srimanth Gunturi

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/47676/#review134867
---


Ship it!




Ship It!

- Srimanth Gunturi


On May 25, 2016, 8:06 p.m., Weiqing Yang wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/47676/
> ---
> 
> (Updated May 25, 2016, 8:06 p.m.)
> 
> 
> Review request for Ambari, Sumit Mohanty and Srimanth Gunturi.
> 
> 
> Bugs: AMBARI-16757
> https://issues.apache.org/jira/browse/AMBARI-16757
> 
> 
> Repository: ambari
> 
> 
> Description
> ---
> 
> Ambari does not expose the heap size parameter for the Spark History Server.
> The workaround is to modify spark-env and add, for example,
> "SPARK_DAEMON_MEMORY=2g".
> Newer versions of Spark default this to 1g, but older versions appear to
> default to 512m, which was causing OOM.
> This patch adds "SPARK_DAEMON_MEMORY=1G" to the spark-env template
> (default: 1G).
> 
> 
> Diffs
> -
> 
>   ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-env.xml 8a5117a
> 
> Diff: https://reviews.apache.org/r/47676/diff/
> 
> 
> Testing
> ---
> 
> Tested successfully with all versions of Spark on an Ambari trunk build.
> 
> 
> Thanks,
> 
> Weiqing Yang
> 
>



Re: Review Request 47676: Spark History Server heap size is not exposed (History Server crashed with OOM)

2016-05-25 Thread Weiqing Yang

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/47676/
---

(Updated May 25, 2016, 8:06 p.m.)


Review request for Ambari, Sumit Mohanty and Srimanth Gunturi.


Bugs: AMBARI-16757
https://issues.apache.org/jira/browse/AMBARI-16757


Repository: ambari


Description
---

Ambari does not expose the heap size parameter for the Spark History Server.
The workaround is to modify spark-env and add, for example,
"SPARK_DAEMON_MEMORY=2g".
Newer versions of Spark default this to 1g, but older versions appear to
default to 512m, which was causing OOM.
This patch adds "SPARK_DAEMON_MEMORY=1G" to the spark-env template
(default: 1G).
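For reference, the manual workaround amounts to adding a line like the following to spark-env via Ambari (a sketch only; the 2g value is an example, and the exact content of the spark-env.xml template may differ):

```shell
# Sketch of the manual workaround: raise the History Server daemon heap
# by setting SPARK_DAEMON_MEMORY in spark-env. The 2g value is an example.
export SPARK_DAEMON_MEMORY=2g
echo "$SPARK_DAEMON_MEMORY"
```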


Diffs (updated)
-

  ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-env.xml 8a5117a

Diff: https://reviews.apache.org/r/47676/diff/


Testing
---

Tested successfully with all versions of Spark on an Ambari trunk build.


Thanks,

Weiqing Yang



Re: Review Request 47676: Spark History Server heap size is not exposed (History Server crashed with OOM)

2016-05-24 Thread Srimanth Gunturi

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/47676/#review134671
---




ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-env.xml
 (line 69)


I would recommend using the unit 'M' and the value 1024. This gives users the 
ability to choose smaller and in-between values, instead of having to provide 
fractions (1.5G or 1.75G, etc.) - which is not possible with the unit 'G'.
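To illustrate the suggestion (the sizes below are examples, not values from the patch): expressing the setting in megabytes keeps in-between sizes as whole numbers.

```shell
# Illustrative arithmetic only: megabyte units can express fractional-gigabyte
# sizes as whole numbers, which an integer value with unit 'G' cannot.
GB_TO_MB=1024
echo "1G    = ${GB_TO_MB}m"
echo "1.5G  = $((GB_TO_MB + GB_TO_MB / 2))m"
echo "1.75G = $((GB_TO_MB + GB_TO_MB * 3 / 4))m"
```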


- Srimanth Gunturi


On May 24, 2016, 9:23 p.m., Weiqing Yang wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/47676/
> ---
> 
> (Updated May 24, 2016, 9:23 p.m.)
> 
> 
> Review request for Ambari, Sumit Mohanty and Srimanth Gunturi.
> 
> 
> Bugs: AMBARI-16757
> https://issues.apache.org/jira/browse/AMBARI-16757
> 
> 
> Repository: ambari
> 
> 
> Description
> ---
> 
> Ambari does not expose the heap size parameter for the Spark History Server.
> The workaround is to modify spark-env and add, for example,
> "SPARK_DAEMON_MEMORY=2g".
> Newer versions of Spark default this to 1g, but older versions appear to
> default to 512m, which was causing OOM.
> This patch adds "SPARK_DAEMON_MEMORY=1G" to the spark-env template
> (default: 1G).
> 
> 
> Diffs
> -
> 
>   ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-env.xml 8a5117a
> 
> Diff: https://reviews.apache.org/r/47676/diff/
> 
> 
> Testing
> ---
> 
> Tested successfully with all versions of Spark on an Ambari trunk build.
> 
> 
> Thanks,
> 
> Weiqing Yang
> 
>



Re: Review Request 47676: Spark History Server heap size is not exposed (History Server crashed with OOM)

2016-05-24 Thread Weiqing Yang

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/47676/
---

(Updated May 24, 2016, 9:23 p.m.)


Review request for Ambari, Sumit Mohanty and Srimanth Gunturi.


Bugs: AMBARI-16757
https://issues.apache.org/jira/browse/AMBARI-16757


Repository: ambari


Description
---

Ambari does not expose the heap size parameter for the Spark History Server.
The workaround is to modify spark-env and add, for example,
"SPARK_DAEMON_MEMORY=2g".
Newer versions of Spark default this to 1g, but older versions appear to
default to 512m, which was causing OOM.
This patch adds "SPARK_DAEMON_MEMORY=1G" to the spark-env template
(default: 1G).


Diffs (updated)
-

  ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-env.xml 8a5117a

Diff: https://reviews.apache.org/r/47676/diff/


Testing (updated)
---

Tested successfully with all versions of Spark on an Ambari trunk build.


Thanks,

Weiqing Yang



Re: Review Request 47676: Spark History Server heap size is not exposed (History Server crashed with OOM)

2016-05-20 Thread Weiqing Yang

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/47676/
---

(Updated May 20, 2016, 10:24 p.m.)


Review request for Ambari and Sumit Mohanty.


Bugs: AMBARI-16757
https://issues.apache.org/jira/browse/AMBARI-16757


Repository: ambari


Description
---

Ambari does not expose the heap size parameter for the Spark History Server.
The workaround is to modify spark-env and add, for example,
"SPARK_DAEMON_MEMORY=2g".
Newer versions of Spark default this to 1g, but older versions appear to
default to 512m, which was causing OOM.
This patch adds "SPARK_DAEMON_MEMORY=1G" to the spark-env template
(default: 1G).


Diffs (updated)
-

  ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-env.xml 8a5117a

Diff: https://reviews.apache.org/r/47676/diff/


Testing
---

N/A


Thanks,

Weiqing Yang



Review Request 47676: Spark History Server heap size is not exposed (History Server crashed with OOM)

2016-05-20 Thread Weiqing Yang

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/47676/
---

Review request for Ambari and Sumit Mohanty.


Bugs: AMBARI-16757
https://issues.apache.org/jira/browse/AMBARI-16757


Repository: ambari


Description
---

Ambari does not expose the heap size parameter for the Spark History Server.
The workaround is to modify spark-env and add, for example,
"SPARK_DAEMON_MEMORY=2g".
Newer versions of Spark default this to 1g, but older versions appear to
default to 512m, which was causing OOM.
This patch adds "SPARK_DAEMON_MEMORY=1G" to the spark-env template
(default: 1G).


Diffs
-

  ambari-server/src/main/resources/common-services/SPARK/1.2.1/configuration/spark-env.xml 8a5117a

Diff: https://reviews.apache.org/r/47676/diff/


Testing
---

N/A


Thanks,

Weiqing Yang