[
https://issues.apache.org/jira/browse/AMBARI-8477?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Robert Levas updated AMBARI-8477:
---------------------------------
Description:
The HDFS service components should indicate security state when queried by the
Ambari Agent via STATUS_COMMAND. Each component should determine its state as
follows:
h2. NAMENODE
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.namenode.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.namenode.kerberos.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
  if kinit(namenode principal) succeeds
state = SECURED_KERBEROS
else
state = ERROR
else
state = UNSECURED
{code}
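As a rough illustration, the indicator validation and state selection above could be sketched in Python (the language of the Ambari Agent). The property names come from the indicator lists; the helper names ({{load_properties}}, {{indicators_validate}}, {{namenode_security_state}}) and the injected {{kinit_succeeds}} callable are hypothetical, not actual Ambari Agent APIs:

```python
# Hypothetical sketch of the NAMENODE security-state check described above.
# Helper names are illustrative only; property names come from the indicators.
import xml.etree.ElementTree as ET

def load_properties(xml_text):
    """Parse a Hadoop *-site.xml document into a {name: value} dict."""
    props = {}
    for prop in ET.fromstring(xml_text).iter('property'):
        name = prop.findtext('name')
        if name:
            props[name] = prop.findtext('value') or ''
    return props

# Expected value per required property; None means "required, just not empty".
CORE_SITE_RULES = {
    'hadoop.security.authentication': 'kerberos',
    'hadoop.security.authorization': 'true',
    'hadoop.rpc.protection': 'authentication',
    'hadoop.security.auth_to_local': None,
}

def indicators_validate(props, rules):
    """True when every required property is present and matches its rule."""
    for name, expected in rules.items():
        value = props.get(name, '')
        if expected is None:
            if not value:
                return False
        elif value != expected:
            return False
    return True

def namenode_security_state(security_enabled, core_props, kinit_succeeds):
    """kinit_succeeds is an injected callable standing in for a real kinit."""
    if security_enabled and indicators_validate(core_props, CORE_SITE_RULES):
        return 'SECURED_KERBEROS' if kinit_succeeds() else 'ERROR'
    return 'UNSECURED'
```

A full implementation would also apply the hdfs-site.xml rules (keytab path exists and is readable, principal not empty) before attempting kinit.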
h2. DATANODE
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.datanode.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.datanode.kerberos.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
  if kinit(datanode principal) succeeds
state = SECURED_KERBEROS
else
state = ERROR
else
state = UNSECURED
{code}
h2. SECONDARY_NAMENODE
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.secondary.namenode.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.secondary.namenode.kerberos.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
  if kinit(secondary namenode principal) succeeds
state = SECURED_KERBEROS
else
state = ERROR
else
state = UNSECURED
{code}
h2. HDFS_CLIENT
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.web.authentication.kerberos.keytab
*** not empty
*** path exists and is readable
*** required
** dfs.web.authentication.kerberos.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
if kinit(hdfs web principal) succeeds
state = SECURED_KERBEROS
else
state = ERROR
else
state = UNSECURED
{code}
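The kinit steps above amount to acquiring a ticket non-interactively from the component's keytab. A minimal sketch of such a check, assuming MIT Kerberos's {{kinit -kt}} is on the path (the helper name is illustrative, not an Ambari Agent API):

```python
# Minimal sketch: report success when `kinit -kt <keytab> <principal>` exits 0.
# The helper name is illustrative; it is not an actual Ambari Agent API.
import subprocess

def kinit_succeeds(keytab_path, principal):
    """Try to acquire a Kerberos ticket non-interactively from a keytab."""
    try:
        return subprocess.call(
            ['kinit', '-kt', keytab_path, principal],
            stdout=subprocess.DEVNULL,
            stderr=subprocess.DEVNULL) == 0
    except OSError:
        return False  # kinit binary is missing entirely
```

A real implementation would likely also point kinit at a private credential cache (its {{-c}} option) so that status checks do not clobber tickets in use by the services themselves.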
h2. JOURNALNODE
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.journalnode.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.journalnode.kerberos.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
state = SECURED_KERBEROS
else
state = UNSECURED
{code}
h2. ZKFC
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
state = SECURED_KERBEROS
else
state = UNSECURED
{code}
_*Note*_: Due to the _cost_ of calling {{kinit}}, results should be cached for a
period of time before retrying. Depending on the frequency of the heartbeat
timeout, this caching may be an issue.
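The caching suggested in the note could be sketched as a small TTL wrapper around the expensive check. Class and parameter names here are hypothetical, and the 300-second TTL is an arbitrary placeholder, not a value taken from Ambari:

```python
# Hypothetical sketch of the caching suggested above: remember the last kinit
# outcome per principal for a TTL so repeated STATUS_COMMAND handling does not
# shell out on every heartbeat. Names and the default TTL are placeholders.
import time

class CachedKinit(object):
    def __init__(self, kinit_fn, ttl_seconds=300, clock=time.time):
        self._kinit_fn = kinit_fn   # the real check, e.g. a kinit subprocess
        self._ttl = ttl_seconds
        self._clock = clock         # injectable for testing
        self._cache = {}            # principal -> (timestamp, result)

    def check(self, principal):
        now = self._clock()
        cached = self._cache.get(principal)
        if cached is not None and now - cached[0] < self._ttl:
            return cached[1]        # still fresh: skip the expensive call
        result = self._kinit_fn(principal)
        self._cache[principal] = (now, result)
        return result
```

The TTL should be chosen relative to the heartbeat interval: much longer than one heartbeat to amortize the kinit cost, but short enough that a broken keytab is noticed promptly.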
was:
The HDFS service components should indicate security state when queried by
Ambari Agent via STATUS_COMMAND. Each component should determine its state as
follows:
h2. NAMENODE
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.namenode.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.namenode.kerberos.principal
*** not empty
*** required
** dfs.namenode.kerberos.https.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
if kinit(namenode principal) && kinit(https principal) succeeds
state = SECURED_KERBEROS
else
state = ERROR
else
state = UNSECURED
{code}
h2. DATANODE
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.datanode.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.datanode.kerberos.principal
*** not empty
*** required
** dfs.datanode.kerberos.https.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
if kinit(datanode principal) && kinit(https principal) succeeds
state = SECURED_KERBEROS
else
state = ERROR
else
state = UNSECURED
{code}
h2. SECONDARY_NAMENODE
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.namenode.secondary.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.namenode.secondary.kerberos.principal
*** not empty
*** required
** dfs.namenode.secondary.kerberos.https.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
if kinit(namenode principal) && kinit(https principal) succeeds
state = SECURED_KERBEROS
else
state = ERROR
else
state = UNSECURED
{code}
h2. HDFS_CLIENT
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.web.authentication.kerberos.keytab
*** not empty
*** path exists and is readable
*** required
** dfs.web.authentication.kerberos.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
if kinit(hdfs web principal) succeeds
state = SECURED_KERBEROS
else
state = ERROR
else
state = UNSECURED
{code}
h2. JOURNALNODE
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
* Configuration File: params.hadoop_conf_dir + '/hdfs-site.xml'
** dfs.journalnode.keytab.file
*** not empty
*** path exists and is readable
*** required
** dfs.journalnode.kerberos.principal
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
state = SECURED_KERBEROS
else
state = UNSECURED
{code}
h2. ZKFC
h3. Indicators
* Command JSON
** config\['configurations']\['cluster-env']\['security_enabled']
*** = “true”
* Configuration File: params.hadoop_conf_dir + '/core-site.xml'
** hadoop.security.authentication
*** = “kerberos”
*** required
** hadoop.security.authorization
*** = “true”
*** required
** hadoop.rpc.protection
*** = “authentication”
*** required
** hadoop.security.auth_to_local
*** not empty
*** required
h3. Pseudocode
{code}
if indicators imply security is on and validate
state = SECURED_KERBEROS
else
state = UNSECURED
{code}
_*Note*_: Due to the _cost_ of calling {{kinit}} results should be cached for a
period of time before retrying. This may be an issue depending on the
frequency of the heartbeat timeout.
> HDFS service components should indicate security state
> ------------------------------------------------------
>
> Key: AMBARI-8477
> URL: https://issues.apache.org/jira/browse/AMBARI-8477
> Project: Ambari
> Issue Type: Improvement
> Components: ambari-server, stacks
> Affects Versions: 2.0.0
> Reporter: Robert Levas
> Assignee: Robert Levas
> Labels: agent, kerberos, lifecycle, security
> Fix For: 2.0.0
>
> Attachments: AMBARI-8477_01.patch, AMBARI-8477_01.patch,
> AMBARI-8477_01.patch, AMBARI-8477_02.patch
>
>
--
This message was sent by Atlassian JIRA
(v6.3.4#6332)