-----------------------------------------------------------
This is an automatically generated e-mail. To reply, visit:
https://reviews.apache.org/r/64498/#review193390
-----------------------------------------------------------


Ship it!




Ship It!

- Eugene Chekanskiy


On Dec. 11, 2017, 10:39 a.m., Andrew Onischuk wrote:
> 
> -----------------------------------------------------------
> This is an automatically generated e-mail. To reply, visit:
> https://reviews.apache.org/r/64498/
> -----------------------------------------------------------
> 
> (Updated Dec. 11, 2017, 10:39 a.m.)
> 
> 
> Review request for Ambari, Sid Wagle and Vitalyi Brodetskyi.
> 
> 
> Bugs: AMBARI-22622
>     https://issues.apache.org/jira/browse/AMBARI-22622
> 
> 
> Repository: ambari
> 
> 
> Description
> -------
> 
>     Traceback (most recent call last):
>       File "/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/hook.py", line 43, in <module>
>         BeforeStartHook().execute()
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 368, in execute
>         method(env)
>       File "/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/hook.py", line 34, in hook
>         setup_hadoop()
>       File "/var/lib/ambari-agent/cache/stack-hooks/before-START/scripts/shared_initialization.py", line 45, in setup_hadoop
>         cd_access='a',
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 185, in action_create
>         sudo.makedirs(path, self.resource.mode or 0755)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/sudo.py", line 102, in makedirs
>         os.makedirs(path, mode)
>       File "/usr/lib64/python2.7/os.py", line 157, in makedirs
>         mkdir(name, mode)
>     OSError: [Errno 17] File exists: '/grid/0/log/hdfs'
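> 
> The OSError above is a race on directory creation: os.makedirs() fails with EEXIST because '/grid/0/log/hdfs' is already there. A minimal sketch of an EEXIST-tolerant variant (hypothetical helper name, not the actual Ambari change):
> 
>     import errno
>     import os
> 
>     def makedirs_safe(path, mode=0o755):
>         # Create the directory tree, but tolerate the path already existing
>         # (e.g. created by an earlier run or a concurrent hook) instead of
>         # failing with OSError(EEXIST).
>         try:
>             os.makedirs(path, mode)
>         except OSError as e:
>             if e.errno != errno.EEXIST or not os.path.isdir(path):
>                 raise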
>     Traceback (most recent call last):
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py", line 88, in <module>
>         NFSGateway().execute()
>       File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 368, in execute
>         method(env)
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/nfsgateway.py", line 53, in start
>         nfsgateway(action="start")
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/hdfs_nfsgateway.py", line 74, in nfsgateway
>         create_log_dir=True
>       File "/var/lib/ambari-agent/cache/common-services/HDFS/3.0.0.3.0/package/scripts/utils.py", line 273, in service
>         Execute(daemon_cmd, not_if=process_id_exists_command, environment=hadoop_env_exports)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/base.py", line 166, in __init__
>         self.env.run()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 160, in run
>         self.run_action(resource, action)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/environment.py", line 124, in run_action
>         provider_action()
>       File "/usr/lib/python2.6/site-packages/resource_management/core/providers/system.py", line 262, in action_run
>         tries=self.resource.tries, try_sleep=self.resource.try_sleep)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 72, in inner
>         result = function(command, **kwargs)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 102, in checked_call
>         tries=tries, try_sleep=try_sleep, timeout_kill_strategy=timeout_kill_strategy)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 150, in _call_wrapper
>         result = _call(command, **kwargs_copy)
>       File "/usr/lib/python2.6/site-packages/resource_management/core/shell.py", line 303, in _call
>         raise ExecutionFailed(err_msg, code, out, err)
>     resource_management.core.exceptions.ExecutionFailed: Execution of 'ambari-sudo.sh su hdfs -l -s /bin/bash -c 'ulimit -c unlimited ;  /usr/hdp/3.0.0.0-555/hadoop/bin/hdfs --config /usr/hdp/3.0.0.0-555/hadoop/conf --daemon start nfs3'' returned 1. WARNING: HADOOP_PRIVILEGED_NFS_USER has been replaced by HDFS_NFS3_SECURE_USER. Using value of HADOOP_PRIVILEGED_NFS_USER.
>     WARNING: HADOOP_NFS3_OPTS has been replaced by HDFS_NFS3_OPTS. Using value of HADOOP_NFS3_OPTS.
>     ERROR: You must be a privileged user in order to run a secure service.
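> 
> The ExecutionFailed above is a separate issue: Hadoop aborts with "You must be a privileged user in order to run a secure service" when a secure (privileged-port) NFS gateway is started by a non-root account. A hedged sketch of the idea only, with hypothetical names (start_nfs_gateway, is_secure), not the actual patch: launch the daemon through ambari-sudo.sh as root when the gateway is secured, so Hadoop can drop to the configured HDFS_NFS3_SECURE_USER itself, and su to the hdfs user otherwise.
> 
>     import subprocess
> 
>     def start_nfs_gateway(hdfs_user, hadoop_home, is_secure=False):
>         # Same daemon command that appears in the failing Execute above.
>         hdfs_cmd = "{0}/bin/hdfs --config {0}/conf --daemon start nfs3".format(hadoop_home)
>         if is_secure:
>             # Secure mode: run as root (via ambari-sudo.sh, assumed here to
>             # accept an arbitrary command) so privileged ports can be bound.
>             cmd = ["ambari-sudo.sh", "/bin/bash", "-c", hdfs_cmd]
>         else:
>             # Non-secure mode: run as the regular hdfs service user.
>             cmd = ["ambari-sudo.sh", "su", hdfs_user, "-l", "-s", "/bin/bash", "-c", hdfs_cmd]
>         subprocess.check_call(cmd)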
> 
> 
> Diffs
> -----
> 
>   ambari-server/src/main/resources/common-services/HDFS/3.0.0.3.0/package/scripts/status_params.py 153f9a6ca6 
> 
> 
> Diff: https://reviews.apache.org/r/64498/diff/1/
> 
> 
> Testing
> -------
> 
> mvn clean test
> 
> 
> Thanks,
> 
> Andrew Onischuk
> 
>
