Re: Review Request 50512: Spark and Spark2 should use different keytab files to avoid ACL issues

2016-07-27 Thread Robert Levas

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/50512/
---

(Updated July 27, 2016, 6:10 p.m.)


Review request for Ambari, bikassaha, Saisai Shao, and Sumit Mohanty.


Bugs: AMBARI-17921
https://issues.apache.org/jira/browse/AMBARI-17921


Repository: ambari


Description
---

If both Spark and Spark2 are installed and each runs as a different user, then 
the ACLs on the _shared_ keytab files may block components in either service 
from accessing the keytab files they need. 

For example, if Spark is set to run as the user `spark` and Spark2 is set to 
run as the user `spark2`:
```
spark-env/spark_user = spark
spark2-env/spark_user = spark2
```

Then the keytab file for the shared headless principal, spark.headless.keytab, 
will have an ACL set so that either the spark user or the spark2 user can read 
it (depending on the order in which the keytab file is written). 

In this case, the following error is encountered:

```
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/spark_thrift_server.py", line 87, in <module>
    SparkThriftServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/spark_thrift_server.py", line 54, in start
    spark_service('sparkthriftserver', upgrade_type=upgrade_type, action='start')
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/spark_service.py", line 57, in spark_service
    Execute(spark_kinit_cmd, user=params.spark_user)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/kinit -kt /etc/security/keytabs/spark.headless.keytab spark2rndygi0zfoo3ftqildwn5...@hwqe.hortonworks.com; ' returned 1.
Hortonworks #
This is MOTD message, added for testing in qe infra
kinit: Generic preauthentication failure while getting initial credentials
```

"kinit: Generic preauthentication failure while getting initial credentials" 
indicates, in this case, that the user running the Spark service does not have 
read access to the specified keytab file.
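The kinit failure is ordinary POSIX file permissions on the keytab. A minimal 
illustrative sketch (the user and group names are hypothetical, and ownership 
is simulated rather than read from the OS, so this is not Ambari's actual 
code):

```python
import os
import stat
import tempfile

def can_read(path, file_owner, file_group, user, user_groups):
    """Simulate the POSIX read check on a keytab file for a given user.

    file_owner/file_group describe the file's ownership; user/user_groups
    describe who is trying to read it. (Ownership is passed in rather than
    read from the OS so the example runs without root.)
    """
    mode = stat.S_IMODE(os.stat(path).st_mode)
    if user == file_owner:
        return bool(mode & stat.S_IRUSR)
    if file_group in user_groups:
        return bool(mode & stat.S_IRGRP)
    return bool(mode & stat.S_IROTH)

# A keytab written for the spark user with a typical restrictive 0440 mode
keytab = tempfile.NamedTemporaryFile(delete=False)
keytab.close()
os.chmod(keytab.name, 0o440)

print(can_read(keytab.name, "spark", "hadoop", "spark", []))           # True: owner
print(can_read(keytab.name, "spark", "hadoop", "spark2", []))          # False: blocked
print(can_read(keytab.name, "spark", "hadoop", "spark2", ["hadoop"]))  # True: via group
os.unlink(keytab.name)
```

Whichever service's user ends up owning the shared file, kinit run as the 
other user fails with exactly this kind of error.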

To ensure this does not happen, the keytab files for the two services should 
have different file names.
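The change is to SPARK2's Kerberos descriptor, so that Spark2's identity points 
at its own keytab file rather than the shared one. As an illustrative sketch of 
the kind of identity entry involved (the field values here are assumptions for 
illustration, not the actual patch contents):

```json
{
  "name": "spark2user",
  "principal": {
    "value": "${spark2-env/spark_user}${principal_suffix}@${realm}",
    "type": "user"
  },
  "keytab": {
    "file": "${keytab_dir}/spark2.headless.keytab",
    "owner": {
      "name": "${spark2-env/spark_user}",
      "access": "r"
    },
    "group": {
      "name": "${cluster-env/user_group}",
      "access": ""
    }
  }
}
```

With distinct file names, each keytab's owner ACL can be set to its own 
service user without racing the other service.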


Diffs (updated)
-

  ambari-server/src/main/resources/common-services/SPARK2/2.0.0/kerberos.json 
967adb0 

Diff: https://reviews.apache.org/r/50512/diff/


Testing
---

Manually tested


Thanks,

Robert Levas



Re: Review Request 50512: Spark and Spark2 should use different keytab files to avoid ACL issues

2016-07-27 Thread bikas

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/50512/#review143801
---




ambari-server/src/main/resources/common-services/SPARK2/2.0.0/kerberos.json 
(line 15)


Should this also be `spark2-env/spark_user`, like it is in the next section? 
What if Spark is not installed, so there is no spark-env?


- bikassaha





Re: Review Request 50512: Spark and Spark2 should use different keytab files to avoid ACL issues

2016-07-27 Thread Robert Levas


> On July 27, 2016, 2:35 p.m., bikassaha wrote:
> > I think the Spark and Spark2 design assumes the same keytab and user for 
> > Spark and Spark2 so that the transition between the two and concurrent use 
> > of both is seamless. So HDFS files, e.g for ATS data from both apps should 
> > be owned by the same user.

Even though the keytab files are different, the principal embedded in each is 
the same. Therefore the authenticated user is the same, and both services 
behave as the same user.


- Robert


---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/50512/#review143771
---




Re: Review Request 50512: Spark and Spark2 should use different keytab files to avoid ACL issues

2016-07-27 Thread bikas

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/50512/#review143771
---



I think the Spark and Spark2 design assumes the same keytab and user for Spark 
and Spark2, so that the transition between the two and concurrent use of both 
are seamless. So HDFS files, e.g. ATS data from both apps, should be owned by 
the same user.

- bikassaha





Re: Review Request 50512: Spark and Spark2 should use different keytab files to avoid ACL issues

2016-07-27 Thread Jayush Luniya

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/50512/#review143765
---


Ship it!

- Jayush Luniya





Re: Review Request 50512: Spark and Spark2 should use different keytab files to avoid ACL issues

2016-07-27 Thread Sumit Mohanty

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/50512/#review143758
---


Ship it!

- Sumit Mohanty





Review Request 50512: Spark and Spark2 should use different keytab files to avoid ACL issues

2016-07-27 Thread Robert Levas

---
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/50512/
---

Review request for Ambari, bikassaha, Saisai Shao, and Sumit Mohanty.


Bugs: AMBARI-17921
https://issues.apache.org/jira/browse/AMBARI-17921


Repository: ambari


Description
---

If both Spark and Spark2 are installed and each runs as a different user, then 
the ACLs on the _shared_ keytab files may block components in either service 
from accessing the keytab files they need. 

For example, if Spark is set to run as the user `spark` and Spark2 is set to 
run as the user `spark2`:
```
spark-env/spark_user = spark
spark2-env/spark_user = spark2
```

Then the keytab file for the shared headless principal, spark.headless.keytab, 
will have an ACL set so that either the spark user or the spark2 user can read 
it (depending on the order in which the keytab file is written). 

In this case, the following error is encountered:

```
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/spark_thrift_server.py", line 87, in <module>
    SparkThriftServer().execute()
  File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
    method(env)
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/spark_thrift_server.py", line 54, in start
    spark_service('sparkthriftserver', upgrade_type=upgrade_type, action='start')
  File "/var/lib/ambari-agent/cache/common-services/SPARK/1.2.1/package/scripts/spark_service.py", line 57, in spark_service
    Execute(spark_kinit_cmd, user=params.spark_user)
  File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 155, in __init__
    self.env.run()
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
    self.run_action(resource, action)
  File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
    provider_action()
  File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 273, in action_run
    tries=self.resource.tries, try_sleep=self.resource.try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 71, in inner
    result = function(command, **kwargs)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 93, in checked_call
    tries=tries, try_sleep=try_sleep)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 141, in _call_wrapper
    result = _call(command, **kwargs_copy)
  File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 294, in _call
    raise Fail(err_msg)
resource_management.core.exceptions.Fail: Execution of '/usr/bin/kinit -kt /etc/security/keytabs/spark.headless.keytab spark2rndygi0zfoo3ftqildwn5...@hwqe.hortonworks.com; ' returned 1.
Hortonworks #
This is MOTD message, added for testing in qe infra
kinit: Generic preauthentication failure while getting initial credentials
```

"kinit: Generic preauthentication failure while getting initial credentials" 
indicates, in this case, that the user running the Spark service does not have 
read access to the specified keytab file.

To ensure this does not happen, the keytab files for the two services should 
have different file names.


Diffs
-

  ambari-server/src/main/resources/common-services/SPARK2/2.0.0/kerberos.json 
967adb0 

Diff: https://reviews.apache.org/r/50512/diff/


Testing
---

Manually tested


Thanks,

Robert Levas