chaohengstudent opened a new pull request, #5449:
URL: https://github.com/apache/hadoop/pull/5449

   <!--
     Thanks for sending a pull request!
       1. If this is your first time, please read our contributor guidelines: 
https://cwiki.apache.org/confluence/display/HADOOP/How+To+Contribute
       2. Make sure your PR title starts with JIRA issue id, e.g., 
'HADOOP-17799. Your PR title ...'.
   -->
   
   ### Description of PR
    - Update the library path handling to use `LIBHDFS_PATH`.
    - Add the JVM library directory to `LD_LIBRARY_PATH` so the script is compatible with Java 11.
    - Edit how `CLASSPATH` is built.
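    The snippet below is only a minimal sketch of the kind of wrapper-script changes listed above, assuming a standard `HADOOP_HOME`/`JAVA_HOME` layout; the exact variable names and paths in the PR may differ.
    ```bash
    # Hypothetical sketch, not the exact diff in this PR.

    # Let LIBHDFS_PATH be overridden by the caller, falling back to the
    # native-library directory of the Hadoop installation (assumed location).
    export LIBHDFS_PATH="${LIBHDFS_PATH:-$HADOOP_HOME/lib/native}"

    # Java 11 ships libjvm.so under $JAVA_HOME/lib/server, while Java 8 used
    # $JAVA_HOME/jre/lib/amd64/server, so add both candidates to the search path.
    export LD_LIBRARY_PATH="$LIBHDFS_PATH:$JAVA_HOME/lib/server:$JAVA_HOME/jre/lib/amd64/server:$LD_LIBRARY_PATH"

    # Build the CLASSPATH from the Hadoop distribution's jars; '--glob' expands
    # wildcard entries, which libhdfs does not expand on its own.
    export CLASSPATH="$(hadoop classpath --glob)"

    # Finally invoke the fuse_dfs binary with the caller's arguments.
    exec fuse_dfs "$@"
    ```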
   
   ### How was this patch tested?
    Manually tested by running the wrapper script to mount HDFS:
   ```
    root@6b76602de66c:/opt/hadoop# ./fuse_dfs_wrapper.sh -d hdfs://192.168.103.44:14370 /mnt/hdfs
    INFO /home/chaoheng/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/fuse-dfs/fuse_options.c:115 Ignoring option -d
    INFO /home/chaoheng/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/fuse-dfs/fuse_options.c:164 Adding FUSE arg /mnt/hdfs
   FUSE library version: 2.9.9
   nullpath_ok: 0
   nopath: 0
   utime_omit_ok: 0
   unique: 2, opcode: INIT (26), nodeid: 0, insize: 104, pid: 0
   INIT: 7.36
   flags=0x73fffffb
   max_readahead=0x00020000
    INFO /home/chaoheng/hadoop/hadoop-hdfs-project/hadoop-hdfs-native-client/src/main/native/fuse-dfs/fuse_init.c:98 Mounting with options: [ protected=(NULL), nn_uri=hdfs://192.168.103.44:14370, nn_port=0, debug=0, read_only=0, initchecks=0, no_permissions=0, usetrash=0, entry_timeout=60, attribute_timeout=60, rdbuffer_size=10485760, direct_io=0 ]
   fuseConnectInit: initialized with timer period 5, expiry period 300
      INIT: 7.19
      flags=0x00000039
      max_readahead=0x00020000
      max_write=0x00020000
      max_background=0
      congestion_threshold=0
      unique: 2, success, outsize: 40
   
   
   ```
   
   ### For code changes:
   
    - [x] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
    - [ ] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
    - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
    - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
   
   

