Copilot commented on code in PR #690:
URL: https://github.com/apache/ranger/pull/690#discussion_r2481852571
##########
hdfs-agent/src/test/java/org/apache/ranger/services/hdfs/client/HdfsClientTest.java:
##########
@@ -178,4 +208,326 @@ public void testValidHaConfig() throws IllegalArgumentException {
HdfsClient.validateConnectionConfigs(configs);
}
+
+    // ===== JUnit 5 additional tests appended (preserving existing code above) =====
+
+    @Test
+    public void test_validate_valid_multi_nn_transforms_config() {
+        Map<String, String> configs = new HashMap<>();
+        configs.put("username", "hdfsuser");
+        configs.put("password", "hdfsuser");
+        configs.put("hadoop.security.authentication", "simple");
+        configs.put("fs.default.name", "node-1.example.com:8020,node-2.example.com:8020");
+        HdfsClient.validateConnectionConfigs(configs);
+        Assertions.assertEquals("hdfscluster", configs.get("dfs.nameservices"));
+        Assertions.assertEquals("hdfs://hdfscluster", configs.get("fs.default.name"));
+        Assertions.assertEquals("org.apache.hadoop.hdfs.server.namenode.ha.ConfiguredFailoverProxyProvider", configs.get("dfs.client.failover.proxy.provider.hdfscluster"));
+        Assertions.assertEquals("namenode1,namenode2", configs.get("dfs.ha.namenodes.hdfscluster"));
+        Assertions.assertEquals("node-1.example.com:8020", configs.get("dfs.namenode.rpc-address.hdfscluster.namenode1"));
+        Assertions.assertEquals("node-2.example.com:8020", configs.get("dfs.namenode.rpc-address.hdfscluster.namenode2"));
+    }
Review Comment:
This test method duplicates test10_validate_valid_multi_nn_transforms_config() at line 330. Consider removing this copy to avoid redundant test logic.
```suggestion
```
##########
hdfs-agent/src/test/java/org/apache/ranger/services/hdfs/HDFSRangerTest.java:
##########
@@ -259,9 +269,9 @@ public void readTestUsingTagPolicy() throws Exception {
IOUtils.copy(in, output);
- String content = new String(output.toByteArray());
+ String content = output.toString();
Review Comment:
ByteArrayOutputStream.toString() without a charset argument decodes the bytes using the platform default charset, which can cause encoding issues on JVMs not defaulting to UTF-8. Consider output.toString(StandardCharsets.UTF_8) (available since Java 10) for explicit charset handling.
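A minimal, self-contained sketch of the suggested fix (the readToString helper name and class are hypothetical, not from the PR): copy the stream into a buffer, then decode with an explicit charset so the result does not depend on the JVM's file.encoding. Note that ByteArrayOutputStream.toString(Charset) requires Java 10+; on older JVMs, output.toString(StandardCharsets.UTF_8.name()) achieves the same effect.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.io.InputStream;
import java.nio.charset.StandardCharsets;

public class CharsetSafeRead {
    // Mirrors the test's read pattern: drain the stream into a buffer,
    // then decode with an explicit charset instead of the platform default.
    static String readToString(InputStream in) throws IOException {
        ByteArrayOutputStream output = new ByteArrayOutputStream();
        byte[] buf = new byte[4096];
        int n;
        while ((n = in.read(buf)) != -1) {
            output.write(buf, 0, n);
        }
        // Explicit charset: same result regardless of the JVM's file.encoding.
        return output.toString(StandardCharsets.UTF_8);
    }

    public static void main(String[] args) throws IOException {
        byte[] data = "déjà vu".getBytes(StandardCharsets.UTF_8);
        String content = readToString(new ByteArrayInputStream(data));
        System.out.println(content.equals("déjà vu")); // prints "true"
    }
}
```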
##########
hdfs-agent/src/test/java/org/apache/ranger/services/hdfs/HDFSRangerTest.java:
##########
@@ -386,9 +396,9 @@ void hdfsReadTest(String fileName) throws Exception {
IOUtils.copy(in, output);
- String content = new String(output.toByteArray());
+ String content = output.toString();
Review Comment:
ByteArrayOutputStream.toString() without a charset argument decodes the bytes using the platform default charset, which can cause encoding issues on JVMs not defaulting to UTF-8. Consider output.toString(StandardCharsets.UTF_8) (available since Java 10) for explicit charset handling.
##########
hdfs-agent/src/test/java/org/apache/ranger/services/hdfs/HDFSRangerTest.java:
##########
@@ -282,9 +292,9 @@ public void readTestUsingTagPolicy() throws Exception {
IOUtils.copy(in, output);
- String content = new String(output.toByteArray());
+ String content = output.toString();
Review Comment:
ByteArrayOutputStream.toString() without a charset argument decodes the bytes using the platform default charset, which can cause encoding issues on JVMs not defaulting to UTF-8. Consider output.toString(StandardCharsets.UTF_8) (available since Java 10) for explicit charset handling.
##########
hdfs-agent/src/test/java/org/apache/ranger/services/hdfs/HDFSRangerTest.java:
##########
@@ -409,9 +419,9 @@ void hdfsReadTest(String fileName) throws Exception {
IOUtils.copy(in, output);
- String content = new String(output.toByteArray());
+ String content = output.toString();
Review Comment:
ByteArrayOutputStream.toString() without a charset argument decodes the bytes using the platform default charset, which can cause encoding issues on JVMs not defaulting to UTF-8. Consider output.toString(StandardCharsets.UTF_8) (available since Java 10) for explicit charset handling.
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]