[ https://issues.apache.org/jira/browse/HADOOP-19541?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17946943#comment-17946943 ]

ASF GitHub Bot commented on HADOOP-19541:
-----------------------------------------

fuchaohong commented on code in PR #7611:
URL: https://github.com/apache/hadoop/pull/7611#discussion_r2057602437


##########
hadoop-tools/hadoop-archives/src/test/java/org/apache/hadoop/tools/TestHadoopArchives.java:
##########
@@ -804,5 +804,31 @@ public void testCopyToLocal() throws Exception {
       localFs.delete(tmpPath, true);      
     }
   }
-  
+
+  @Test
+  public void testBlockSize() throws Exception {
+    conf.set(HadoopArchives.HAR_BLOCKSIZE_LABEL, "1m");
+
+    final String inputPathStr = inputPath.toUri().getPath();
+    System.out.println("inputPathStr = " + inputPathStr);

Review Comment:
   @pan3793 I have removed it; please review again.





> Make HadoopArchives support human-friendly units about blocksize and partsize.
> ------------------------------------------------------------------------------
>
>                 Key: HADOOP-19541
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19541
>             Project: Hadoop Common
>          Issue Type: Improvement
>            Reporter: fuchaohong
>            Priority: Major
>              Labels: pull-request-available
>
> You can use the following suffixes (case-insensitive): k (kilo), m (mega), 
> g (giga), t (tera), p (peta), e (exa) to specify the size (such as 128k, 
> 512m, 1g), or provide the complete size in bytes (such as 134217728 for 128 MB).
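The suffix handling described in the issue can be sketched in plain Java. This is a minimal illustration only: `parseSize` is a hypothetical helper, not the actual HadoopArchives implementation, which would likely delegate to Hadoop's own size-string utilities (e.g. `StringUtils.TraditionalBinaryPrefix`). Binary multipliers are assumed, matching the 134217728-bytes-for-128-MB example above.

```java
// Hypothetical sketch of suffix-based size parsing (k/m/g/t/p/e, binary
// multipliers). Not the real HadoopArchives code.
public class SizeParser {

  public static long parseSize(String s) {
    String t = s.trim().toLowerCase();
    char last = t.charAt(t.length() - 1);
    int shift;
    switch (last) {
      case 'k': shift = 10; break; // kilo  = 2^10
      case 'm': shift = 20; break; // mega  = 2^20
      case 'g': shift = 30; break; // giga  = 2^30
      case 't': shift = 40; break; // tera  = 2^40
      case 'p': shift = 50; break; // peta  = 2^50
      case 'e': shift = 60; break; // exa   = 2^60
      default:  shift = 0;  break; // plain byte count, no suffix
    }
    // Strip the suffix character (if any) and scale the numeric part.
    String digits = (shift == 0) ? t : t.substring(0, t.length() - 1);
    return Long.parseLong(digits) << shift;
  }

  public static void main(String[] args) {
    System.out.println(parseSize("1m"));   // 1048576
    System.out.println(parseSize("128m")); // 134217728
    System.out.println(parseSize("1024")); // 1024
  }
}
```

With such a helper, a value like "1m" set for `HadoopArchives.HAR_BLOCKSIZE_LABEL` (as in the test above) would resolve to 1048576 bytes.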



--
This message was sent by Atlassian Jira
(v8.20.10#820010)

---------------------------------------------------------------------
To unsubscribe, e-mail: common-issues-unsubscr...@hadoop.apache.org
For additional commands, e-mail: common-issues-h...@hadoop.apache.org
