Re: Review Request 53188: AMBARI-18700 Add HDFS resources for HBase, Spark, Spark2, Zeppelin to AmbariPreupload script

2016-11-02 Thread Sebastian Toader

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/53188/#review154532
---


Ship it!




Ship It!

- Sebastian Toader


On Oct. 29, 2016, 6:36 p.m., Attila Doroszlai wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/53188/
> ---
> 
> (Updated Oct. 29, 2016, 6:36 p.m.)
> 
> 
> Review request for Ambari, Alejandro Fernandez, Jayush Luniya, Laszlo Puskas, 
> Sandor Magyari, and Sebastian Toader.
> 
> 
> Bugs: AMBARI-18700
> https://issues.apache.org/jira/browse/AMBARI-18700
> 
> 
> Repository: ambari
> 
> 
> Description
> ---
> 
> 1. Create more directories in `Ambaripreupload.py`:
> * HBase: `/hbase`, `/apps/hbase/staging` and `/user/hbase`
> * Spark: `/user/spark`, `/user/livy`
> * Spark2: `/hdp/spark2-events`
> * Zeppelin: `/user/zeppelin`, `/user/zeppelin/test`, `/apps/zeppelin`
> 2. Copy the `zeppelin-spark-dependencies` jar to HDFS in `Ambaripreupload.py`
> 3. Skip the `make_tarfile` call in `spark_service.py` for sysprepped hosts
> 
> 
> Diffs
> -
> 
>   
> ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_service.py
>  1cbca8b 
>   ambari-server/src/main/resources/scripts/Ambaripreupload.py 1082b5e 
> 
> Diff: https://reviews.apache.org/r/53188/diff/
> 
> 
> Testing
> ---
> 
> Manual testing:
> 1. sysprepped case
>  * create sysprepped cluster
>  * submit blueprint with HBASE, SPARK, SPARK2, ZEPPELIN services and 
> `sysprep_skip_copy_tarballs_hdfs: true`
>  * submit `START_ONLY` cluster creation request
> 2. same blueprint in non-sysprepped cluster, with `INSTALL_AND_START` cluster 
> creation request
> 
> 
> Thanks,
> 
> Attila Doroszlai
> 
>



Re: Review Request 53188: AMBARI-18700 Add HDFS resources for HBase, Spark, Spark2, Zeppelin to AmbariPreupload script

2016-10-31 Thread Jayush Luniya

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/53188/#review154304
---


Ship it!




Ship It!

- Jayush Luniya


On Oct. 29, 2016, 4:36 p.m., Attila Doroszlai wrote:
> 
> ---
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/53188/
> ---
> 
> (Updated Oct. 29, 2016, 4:36 p.m.)
> 
> 
> Review request for Ambari, Alejandro Fernandez, Jayush Luniya, Laszlo Puskas, 
> Sandor Magyari, and Sebastian Toader.
> 
> 
> Bugs: AMBARI-18700
> https://issues.apache.org/jira/browse/AMBARI-18700
> 
> 
> Repository: ambari
> 
> 
> Description
> ---
> 
> 1. Create more directories in `Ambaripreupload.py`:
> * HBase: `/hbase`, `/apps/hbase/staging` and `/user/hbase`
> * Spark: `/user/spark`, `/user/livy`
> * Spark2: `/hdp/spark2-events`
> * Zeppelin: `/user/zeppelin`, `/user/zeppelin/test`, `/apps/zeppelin`
> 2. Copy the `zeppelin-spark-dependencies` jar to HDFS in `Ambaripreupload.py`
> 3. Skip the `make_tarfile` call in `spark_service.py` for sysprepped hosts
> 
> 
> Diffs
> -
> 
>   
> ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_service.py
>  1cbca8b 
>   ambari-server/src/main/resources/scripts/Ambaripreupload.py 1082b5e 
> 
> Diff: https://reviews.apache.org/r/53188/diff/
> 
> 
> Testing
> ---
> 
> Manual testing:
> 1. sysprepped case
>  * create sysprepped cluster
>  * submit blueprint with HBASE, SPARK, SPARK2, ZEPPELIN services and 
> `sysprep_skip_copy_tarballs_hdfs: true`
>  * submit `START_ONLY` cluster creation request
> 2. same blueprint in non-sysprepped cluster, with `INSTALL_AND_START` cluster 
> creation request
> 
> 
> Thanks,
> 
> Attila Doroszlai
> 
>



Re: Review Request 53188: AMBARI-18700 Add HDFS resources for HBase, Spark, Spark2, Zeppelin to AmbariPreupload script

2016-10-29 Thread Attila Doroszlai

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/53188/
---

(Updated Oct. 29, 2016, 6:36 p.m.)


Review request for Ambari, Alejandro Fernandez, Jayush Luniya, Laszlo Puskas, 
Sandor Magyari, and Sebastian Toader.


Bugs: AMBARI-18700
https://issues.apache.org/jira/browse/AMBARI-18700


Repository: ambari


Description
---

1. Create more directories in `Ambaripreupload.py` (see the sketch after this list):
* HBase: `/hbase`, `/apps/hbase/staging` and `/user/hbase`
* Spark: `/user/spark`, `/user/livy`
* Spark2: `/hdp/spark2-events`
* Zeppelin: `/user/zeppelin`, `/user/zeppelin/test`, `/apps/zeppelin`
2. Copy the `zeppelin-spark-dependencies` jar to HDFS in `Ambaripreupload.py`
3. Skip the `make_tarfile` call in `spark_service.py` for sysprepped hosts
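
For reference, a minimal sketch of what items 1 and 2 might look like as `HdfsResource` declarations. It assumes the `params.HdfsResource` helper that `Ambaripreupload.py` already sets up; the owners, modes, HDFS target path and local jar path are illustrative assumptions, not values taken from the patch.

```python
# Sketch only: runs inside Ambaripreupload.py, where params.HdfsResource is
# already defined as a pre-configured partial of HdfsResource.

# Item 1: declare the extra directories (HBase shown; the Spark, Spark2 and
# Zeppelin directories follow the same pattern).
params.HdfsResource("/hbase",
                    type="directory",
                    action="create_on_execute",
                    owner="hbase",
                    mode=0o755)
params.HdfsResource("/apps/hbase/staging",
                    type="directory",
                    action="create_on_execute",
                    owner="hbase",
                    mode=0o711)
params.HdfsResource("/user/hbase",
                    type="directory",
                    action="create_on_execute",
                    owner="hbase",
                    mode=0o755)

# Item 2: upload the zeppelin-spark-dependencies jar; the local and HDFS
# paths below are assumptions for illustration.
params.HdfsResource("/apps/zeppelin/zeppelin-spark-dependencies.jar",
                    type="file",
                    action="create_on_execute",
                    source="/usr/hdp/current/zeppelin-server/interpreter/spark/dep/zeppelin-spark-dependencies.jar",
                    owner="zeppelin")

# Deferred declarations are flushed by the final execute call that the
# script already issues.
params.HdfsResource(None, action="execute")
```

Because these are plain declarations applied in one batch by the final execute call, they can run unconditionally in the preupload script.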


Diffs
-

  
ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_service.py
 1cbca8b 
  ambari-server/src/main/resources/scripts/Ambaripreupload.py 1082b5e 

Diff: https://reviews.apache.org/r/53188/diff/


Testing (updated)
---

Manual testing:
1. sysprepped case
 * create sysprepped cluster
 * submit blueprint with HBASE, SPARK, SPARK2, ZEPPELIN services and 
`sysprep_skip_copy_tarballs_hdfs: true`
 * submit `START_ONLY` cluster creation request (a request sketch follows this list)
2. same blueprint in non-sysprepped cluster, with `INSTALL_AND_START` cluster 
creation request
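
For reference, a sketch of how the sysprepped test case could be driven against the Ambari REST API. The blueprint and host group names, credentials, server address, and the placement of `sysprep_skip_copy_tarballs_hdfs` under `cluster-env` are assumptions for illustration.

```python
# Sketch only: submits a START_ONLY cluster creation request.
# Blueprint/host names, credentials and config placement are assumptions.
import json
import requests

AMBARI = "http://ambari-server.example.com:8080/api/v1"
AUTH = ("admin", "admin")
HEADERS = {"X-Requested-By": "ambari"}  # header required by the Ambari REST API

cluster_template = {
    "blueprint": "sysprep-bp",          # blueprint with HBASE, SPARK, SPARK2, ZEPPELIN
    "default_password": "secret",
    "provision_action": "START_ONLY",   # start services on the sysprepped hosts
    "configurations": [
        {"cluster-env": {"sysprep_skip_copy_tarballs_hdfs": "true"}}
    ],
    "host_groups": [
        {"name": "host_group_1", "hosts": [{"fqdn": "node1.example.com"}]}
    ],
}

# Submit the cluster creation request
response = requests.post(AMBARI + "/clusters/sysprep-test",
                         auth=AUTH,
                         headers=HEADERS,
                         data=json.dumps(cluster_template))
response.raise_for_status()
print(response.json())  # returns a request resource that can be polled for progress
```

The non-sysprepped case differs only in using `INSTALL_AND_START` as the provision action and omitting the skip-tarballs setting.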


Thanks,

Attila Doroszlai



Re: Review Request 53188: AMBARI-18700 Add HDFS resources for HBase, Spark, Spark2, Zeppelin to AmbariPreupload script

2016-10-29 Thread Attila Doroszlai

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/53188/
---

(Updated Oct. 29, 2016, 6:33 p.m.)


Review request for Ambari, Alejandro Fernandez, Jayush Luniya, Laszlo Puskas, 
Sandor Magyari, and Sebastian Toader.


Changes
---

1. Include more services
2. HdfsResource calls do not need to be guarded by the sysprepped condition


Summary (updated)
-

AMBARI-18700 Add HDFS resources for HBase, Spark, Spark2, Zeppelin to 
AmbariPreupload script


Bugs: AMBARI-18700
https://issues.apache.org/jira/browse/AMBARI-18700


Repository: ambari


Description (updated)
---

1. Create more directories in `Ambaripreupload.py`:
* HBase: `/hbase`, `/apps/hbase/staging` and `/user/hbase`
* Spark: `/user/spark`, `/user/livy`
* Spark2: `/hdp/spark2-events`
* Zeppelin: `/user/zeppelin`, `/user/zeppelin/test`, `/apps/zeppelin`
2. Copy the `zeppelin-spark-dependencies` jar to HDFS in `Ambaripreupload.py`
3. Skip the `make_tarfile` call in `spark_service.py` for sysprepped hosts (see the sketch after this list)
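
A minimal sketch of item 3, assuming a sysprep flag such as `params.sysprep_skip_copy_tarballs_hdfs` and the local variable names shown; the exact flag and parameter names are assumptions, not taken from the patch.

```python
# Sketch only: guard around the make_tarfile call in spark_service.py.
# The flag name and the tmp_archive_file / source_dir variables are
# assumptions for illustration.
if not params.sysprep_skip_copy_tarballs_hdfs:
    # Non-sysprepped host: build the spark2 jars tarball locally before it
    # is uploaded to HDFS.
    make_tarfile(tmp_archive_file, source_dir)
# On a sysprepped host the tarball is expected to be present already,
# so building it again is skipped.
```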


Diffs (updated)
-

  
ambari-server/src/main/resources/common-services/SPARK2/2.0.0/package/scripts/spark_service.py
 1cbca8b 
  ambari-server/src/main/resources/scripts/Ambaripreupload.py 1082b5e 

Diff: https://reviews.apache.org/r/53188/diff/


Testing (updated)
---

Manual testing:
 * create sysprepped cluster
 * submit blueprint with HBASE, SPARK, SPARK2, ZEPPELIN services and 
`sysprep_skip_copy_tarballs_hdfs: true`
 * submit `START_ONLY` cluster creation request


Thanks,

Attila Doroszlai