[ 
https://issues.apache.org/jira/browse/HDFS-16564?focusedWorklogId=763719&page=com.atlassian.jira.plugin.system.issuetabpanels:worklog-tabpanel#worklog-763719
 ]

ASF GitHub Bot logged work on HDFS-16564:
-----------------------------------------

                Author: ASF GitHub Bot
            Created on: 28/Apr/22 17:07
            Start Date: 28/Apr/22 17:07
    Worklog Time Spent: 10m 
      Work Description: GauthamBanasandra opened a new pull request, #4245:
URL: https://github.com/apache/hadoop/pull/4245

   <!--
     Thanks for sending a pull request!
       1. If this is your first time, please read our contributor guidelines: 
https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
       2. Make sure your PR title starts with JIRA issue id, e.g., 
'HADOOP-17799. Your PR title ...'.
   -->
   
   ### Description of PR
   `hdfs_find` uses `u_int32_t` type for storing the value for the `max-depth` 
command line argument - 
https://github.com/apache/hadoop/blob/a631f45a99c7abf8c9a2dcfb10afb668c8ff6b09/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tools/hdfs-find/hdfs-find.cc#L43.
The type `u_int32_t` isn't standard and isn't available on Windows, which 
breaks cross-platform compatibility. We need to replace it with `uint32_t`, 
which is part of the C++ standard (provided by `<cstdint>`) and thus 
available on all platforms.
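   As a quick sketch of the idea (a hypothetical snippet, not the actual 
`hdfs-find.cc` code), the fix amounts to declaring the `max-depth` value with 
the standard `uint32_t` from `<cstdint>` instead of the POSIX-only `u_int32_t`:

   ```cpp
   #include <cstdint>   // defines uint32_t per the C++ standard
   #include <cstdlib>   // std::strtoul
   #include <iostream>

   int main() {
     // Hypothetical stand-in for the parsed --max-depth option value.
     const char *max_depth_arg = "5";

     // Portable: uint32_t is guaranteed by <cstdint> on every platform,
     // whereas the POSIX-style u_int32_t is missing on Windows (MSVC).
     uint32_t max_depth =
         static_cast<uint32_t>(std::strtoul(max_depth_arg, nullptr, 10));

     std::cout << max_depth << '\n';
     return 0;
   }
   ```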
   
   ### How was this patch tested?
   The existing unit tests exercise this PR sufficiently.
   
   ### For code changes:
   
   - [x] Does the title of this PR start with the corresponding JIRA issue id 
(e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   




Issue Time Tracking
-------------------

            Worklog Id:     (was: 763719)
    Remaining Estimate: 0h
            Time Spent: 10m

> Use uint32_t for hdfs_find
> --------------------------
>
>                 Key: HDFS-16564
>                 URL: https://issues.apache.org/jira/browse/HDFS-16564
>             Project: Hadoop HDFS
>          Issue Type: Bug
>          Components: libhdfs++
>    Affects Versions: 3.4.0
>            Reporter: Gautham Banasandra
>            Assignee: Gautham Banasandra
>            Priority: Major
>              Labels: libhdfscpp
>          Time Spent: 10m
>  Remaining Estimate: 0h
>
> *hdfs_find* uses *u_int32_t* type for storing the value for the *max-depth* 
> command line argument - 
> https://github.com/apache/hadoop/blob/a631f45a99c7abf8c9a2dcfb10afb668c8ff6b09/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/libhdfspp/tools/hdfs-find/hdfs-find.cc#L43.
> The type u_int32_t isn't standard and isn't available on Windows, which 
> breaks cross-platform compatibility. We need to replace it with *uint32_t*, 
> which is available on all platforms.



--
This message was sent by Atlassian Jira
(v8.20.7#820007)
