[ https://issues.apache.org/jira/browse/HDFS-15859?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
hamado dene updated HDFS-15859:
-------------------------------
Description:
I am trying to install a Hadoop HA cluster, but the DataNode does not start properly; I get this error:
{code:java}
2021-02-23 17:13:26,934 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
java.io.IOException: java.security.GeneralSecurityException: The property 'ssl.server.keystore.location' has not been set in the ssl configuration file.
        at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:199)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:905)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1303)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:481)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2609)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2497)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2544)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2729)
        at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2753)
Caused by: java.security.GeneralSecurityException: The property 'ssl.server.keystore.location' has not been set in the ssl configuration file.
        at org.apache.hadoop.security.ssl.FileBasedKeyStoresFactory.init(FileBasedKeyStoresFactory.java:152)
        at org.apache.hadoop.security.ssl.SSLFactory.init(SSLFactory.java:148)
        at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:197)
        ... 8 more
{code}
But in my ssl-server.xml I have correctly set this property:
{code:xml}
<property>
  <name>ssl.server.keystore.location</name>
  <value>/data/hadoop/server.jks</value>
  <description>Keystore to be used by clients like distcp. Must be specified.</description>
</property>
<property>
  <name>ssl.server.keystore.password</name>
  <value>xxxx</value>
  <description>Optional. Default value is "".</description>
</property>
<property>
  <name>ssl.server.keystore.keypassword</name>
  <value>xxxxx</value>
  <description>Optional. Default value is "".</description>
</property>
<property>
  <name>ssl.server.keystore.type</name>
  <value>jks</value>
  <description>Optional. The keystore file format, default value is "jks".</description>
</property>
{code}
Do you have any suggestions to solve this problem?
My Hadoop version is 2.8.5, Java version is 8, OS: CentOS 7.
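For context on how I expect the file to be picked up: as far as I understand, FileBasedKeyStoresFactory loads ssl-server.xml as a classpath resource, with the resource name taken from the hadoop.ssl.server.conf property (default "ssl-server.xml"), so the file has to sit in a directory that is on the DataNode's classpath (normally HADOOP_CONF_DIR). Below is only a sketch of what I assume the surrounding configuration should look like; the property names are the stock Hadoop ones, and the values and paths are just examples from my setup, not a confirmed fix:
{code:xml}
<!-- core-site.xml: name of the classpath resource that FileBasedKeyStoresFactory
     reads for the server-side SSL settings; this is the default value, written
     out explicitly here. -->
<property>
  <name>hadoop.ssl.server.conf</name>
  <value>ssl-server.xml</value>
</property>

<!-- hdfs-site.xml: switch the HDFS web endpoints to HTTPS so that the
     SSL server configuration is actually used by the DataNode. -->
<property>
  <name>dfs.http.policy</name>
  <value>HTTPS_ONLY</value>
</property>
{code}
Under that assumption, ssl-server.xml would have to live next to core-site.xml (for example /etc/hadoop/conf in my install) rather than in an arbitrary directory, since only HADOOP_CONF_DIR is on the daemon's classpath.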
> property 'ssl.server.keystore.location' has not been set in the ssl configuration file
> --------------------------------------------------------------------------------------
>
> Key: HDFS-15859
> URL: https://issues.apache.org/jira/browse/HDFS-15859
> Project: Hadoop HDFS
> Issue Type: Bug
> Components: datanode
> Affects Versions: 2.8.5
>                 Environment: I am trying to install a Hadoop HA cluster, but the DataNode does not start properly; I get this error:
> {code:java}
> 2021-02-23 17:13:26,934 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
> java.io.IOException: java.security.GeneralSecurityException: The property 'ssl.server.keystore.location' has not been set in the ssl configuration file.
>         at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:199)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:905)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1303)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:481)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2609)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2497)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2544)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2729)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2753)
> Caused by: java.security.GeneralSecurityException: The property 'ssl.server.keystore.location' has not been set in the ssl configuration file.
>         at org.apache.hadoop.security.ssl.FileBasedKeyStoresFactory.init(FileBasedKeyStoresFactory.java:152)
>         at org.apache.hadoop.security.ssl.SSLFactory.init(SSLFactory.java:148)
>         at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:197)
>         ... 8 more
> {code}
> But in my ssl-server.xml I have correctly set this property:
> {code:xml}
> <property>
>   <name>ssl.server.keystore.location</name>
>   <value>/data/hadoop/server.jks</value>
>   <description>Keystore to be used by clients like distcp. Must be specified.</description>
> </property>
> <property>
>   <name>ssl.server.keystore.password</name>
>   <value>xxxx</value>
>   <description>Optional. Default value is "".</description>
> </property>
> <property>
>   <name>ssl.server.keystore.keypassword</name>
>   <value>xxxxx</value>
>   <description>Optional. Default value is "".</description>
> </property>
> <property>
>   <name>ssl.server.keystore.type</name>
>   <value>jks</value>
>   <description>Optional. The keystore file format, default value is "jks".</description>
> </property>
> {code}
> Do you have any suggestions to solve this problem?
> My Hadoop version is 2.8.5, Java version is 8, OS: CentOS 7.
> Reporter: hamado dene
> Priority: Major
>
> I am trying to install a Hadoop HA cluster, but the DataNode does not start properly; I get this error:
> {code:java}
> 2021-02-23 17:13:26,934 ERROR org.apache.hadoop.hdfs.server.datanode.DataNode: Exception in secureMain
> java.io.IOException: java.security.GeneralSecurityException: The property 'ssl.server.keystore.location' has not been set in the ssl configuration file.
>         at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:199)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.startInfoServer(DataNode.java:905)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.startDataNode(DataNode.java:1303)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.<init>(DataNode.java:481)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.makeInstance(DataNode.java:2609)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.instantiateDataNode(DataNode.java:2497)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.createDataNode(DataNode.java:2544)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.secureMain(DataNode.java:2729)
>         at org.apache.hadoop.hdfs.server.datanode.DataNode.main(DataNode.java:2753)
> Caused by: java.security.GeneralSecurityException: The property 'ssl.server.keystore.location' has not been set in the ssl configuration file.
>         at org.apache.hadoop.security.ssl.FileBasedKeyStoresFactory.init(FileBasedKeyStoresFactory.java:152)
>         at org.apache.hadoop.security.ssl.SSLFactory.init(SSLFactory.java:148)
>         at org.apache.hadoop.hdfs.server.datanode.web.DatanodeHttpServer.<init>(DatanodeHttpServer.java:197)
>         ... 8 more
> {code}
> But in my ssl-server.xml I have correctly set this property:
> {code:xml}
> <property>
>   <name>ssl.server.keystore.location</name>
>   <value>/data/hadoop/server.jks</value>
>   <description>Keystore to be used by clients like distcp. Must be specified.</description>
> </property>
> <property>
>   <name>ssl.server.keystore.password</name>
>   <value>xxxx</value>
>   <description>Optional. Default value is "".</description>
> </property>
> <property>
>   <name>ssl.server.keystore.keypassword</name>
>   <value>xxxxx</value>
>   <description>Optional. Default value is "".</description>
> </property>
> <property>
>   <name>ssl.server.keystore.type</name>
>   <value>jks</value>
>   <description>Optional. The keystore file format, default value is "jks".</description>
> </property>
> {code}
> Do you have any suggestions to solve this problem?
> My Hadoop version is 2.8.5, Java version is 8, OS: CentOS 7.
--
This message was sent by Atlassian Jira
(v8.3.4#803005)