[ https://issues.apache.org/jira/browse/HADOOP-13498?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15434041#comment-15434041 ]

Genmao Yu commented on HADOOP-13498:
------------------------------------

[~mingfei] The new patch is available and the unit test results are:

{code}
[INFO] Scanning for projects...
[INFO]                                                                         
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Hadoop Aliyun OSS support 3.0.0-alpha2-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO] 
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ hadoop-aliyun ---
[INFO] Deleting /home/yugm/apps/hadoop/hadoop-tools/hadoop-aliyun/target
[INFO] 
[INFO] --- maven-antrun-plugin:1.7:run (create-testdirs) @ hadoop-aliyun ---
[INFO] Executing tasks

main:
    [mkdir] Created dir: /home/yugm/apps/hadoop/hadoop-tools/hadoop-aliyun/target/test-dir
[INFO] Executed tasks
[INFO] 
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ hadoop-aliyun ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ hadoop-aliyun ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /home/yugm/apps/hadoop/hadoop-tools/hadoop-aliyun/src/main/resources
[INFO] Copying 2 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ hadoop-aliyun ---
[INFO] Compiling 7 source files to /home/yugm/apps/hadoop/hadoop-tools/hadoop-aliyun/target/classes
[INFO] 
[INFO] --- maven-dependency-plugin:2.2:list (deplist) @ hadoop-aliyun ---
[INFO] 
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ hadoop-aliyun ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 5 resources
[INFO] Copying 2 resources
[INFO] 
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ hadoop-aliyun ---
[INFO] Compiling 12 source files to /home/yugm/apps/hadoop/hadoop-tools/hadoop-aliyun/target/test-classes
[INFO] 
[INFO] --- maven-surefire-plugin:2.17:test (default-test) @ hadoop-aliyun ---
[INFO] Surefire report directory: /home/yugm/apps/hadoop/hadoop-tools/hadoop-aliyun/target/surefire-reports

-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.apache.hadoop.fs.aliyun.oss.TestOSSTemporaryCredentials
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 2.433 sec - in org.apache.hadoop.fs.aliyun.oss.TestOSSTemporaryCredentials
Running org.apache.hadoop.fs.aliyun.oss.TestOSSOutputStream
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.912 sec - in org.apache.hadoop.fs.aliyun.oss.TestOSSOutputStream
Running org.apache.hadoop.fs.aliyun.oss.TestOSSFileSystemContract
Tests run: 46, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 26.263 sec - in org.apache.hadoop.fs.aliyun.oss.TestOSSFileSystemContract
Running org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractRename
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.221 sec - in org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractRename
Running org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractMkdir
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 4.503 sec - in org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractMkdir
Running org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractDelete
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.154 sec - in org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractDelete
Running org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractOpen
Tests run: 6, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 3.777 sec - in org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractOpen
Running org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractCreate
Tests run: 6, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 4.031 sec - in org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractCreate
Running org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractSeek
Tests run: 18, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 9.896 sec - in org.apache.hadoop.fs.aliyun.oss.contract.TestOSSContractSeek
Running org.apache.hadoop.fs.aliyun.oss.TestOSSInputStream
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 7.736 sec - in org.apache.hadoop.fs.aliyun.oss.TestOSSInputStream

Results :

Tests run: 101, Failures: 0, Errors: 0, Skipped: 1

[INFO] ------------------------------------------------------------------------
[INFO] BUILD SUCCESS
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:34 min
[INFO] Finished at: 2016-08-24T09:44:40+08:00
[INFO] Final Memory: 34M/440M
[INFO] ------------------------------------------------------------------------
{code}

> The number of multi-part upload parts should not be bigger than 10000
> ---------------------------------------------------------------------
>
>                 Key: HADOOP-13498
>                 URL: https://issues.apache.org/jira/browse/HADOOP-13498
>             Project: Hadoop Common
>          Issue Type: Sub-task
>          Components: fs
>    Affects Versions: HADOOP-12756
>            Reporter: Genmao Yu
>            Assignee: Genmao Yu
>             Fix For: HADOOP-12756
>
>         Attachments: HADOOP-13498-HADOOP-12756.001.patch, 
> HADOOP-13498-HADOOP-12756.002.patch, HADOOP-13498-HADOOP-12756.003.patch, 
> HADOOP-13498-HADOOP-12756.004.patch
>
>
> Rather than just throwing an exception when the 10000-part limit of a 
> multi-part upload is exceeded, we should guarantee that any object can be 
> uploaded, no matter how big it is. 
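
For illustration only (this is a sketch of the general approach described above, not the attached patch; the class, method, and constant names are hypothetical): the part size can be derived from the object length so that an upload of any size stays within the 10000-part limit, instead of failing once the limit is hit.

{code}
// Minimal sketch, not the attached patch: choose a multi-part part size so
// that even a very large object needs at most MAX_PARTS parts.
public final class PartSizeSketch {
  // OSS multi-part uploads allow at most 10000 parts per object.
  private static final long MAX_PARTS = 10000L;

  /**
   * @param contentLength      total size of the object in bytes
   * @param configuredPartSize part size requested by configuration
   * @return a part size that keeps the part count at or below MAX_PARTS
   */
  static long pickPartSize(long contentLength, long configuredPartSize) {
    // Smallest part size that still fits the whole object into MAX_PARTS
    // parts (ceiling division).
    long minRequired = (contentLength + MAX_PARTS - 1) / MAX_PARTS;
    return Math.max(configuredPartSize, minRequired);
  }
}
{code}

With a calculation like this, the configured part size is only overridden for objects larger than 10000 times that size, and the upload itself never fails on the part-count limit.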


