wojiaodoubao opened a new pull request, #7194:
URL: https://github.com/apache/hadoop/pull/7194

   
   ### Description of PR
   Volcano Engine is a fast-growing cloud vendor launched by ByteDance, and TOS is its object storage service. A common pattern is to store data in TOS and run Hadoop/Spark/Flink applications against it. However, Hadoop has no native support for TOS, so it is not easy for users to build their big data systems on top of it.
    
   This work aims to integrate TOS with Hadoop so that users can run their applications on TOS. With only some simple configuration, applications can read from and write to TOS without any code change, as sketched below. This work is similar to the existing Hadoop connectors for AWS S3, Azure Blob Storage, Aliyun OSS, Tencent COS and Huawei Cloud Object Storage.
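
   As a rough sketch of what that configuration can look like (the `fs.tos.*` property names, the `tos://` scheme and the command below are illustrative assumptions, not the authoritative keys of this connector; please refer to the documentation in this PR), an existing deployment could be pointed at TOS roughly like this:

   ```bash
   # Minimal sketch, assuming fs.tos.* style property names and the tos:// scheme.
   # Both are assumptions for illustration; see the connector docs for the real keys.
   # The values reuse the same credentials/endpoint/bucket as the test setup below.
   hadoop fs \
     -Dfs.tos.endpoint="${TOS_ENDPOINT}" \
     -Dfs.tos.access-key-id="${TOS_ACCESS_KEY_ID}" \
     -Dfs.tos.secret-access-key="${TOS_SECRET_ACCESS_KEY}" \
     -ls "tos://${TOS_BUCKET}/"
   ```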
   
   Please see the JIRA issue for more details: https://issues.apache.org/jira/browse/HADOOP-19236
   
   ### How was this patch tested?
   
   The unit tests need to connect to a live TOS service. Set the six environment variables below before running them.
   
   ```bash
   export TOS_ACCESS_KEY_ID={YOUR_ACCESS_KEY}
   export TOS_SECRET_ACCESS_KEY={YOUR_SECRET_ACCESS_KEY}
   export TOS_ENDPOINT={TOS_SERVICE_ENDPOINT}
   export FILE_STORAGE_ROOT=/tmp/local_dev/
   export TOS_BUCKET={YOUR_BUCKET_NAME}
   export TOS_UNIT_TEST_ENABLED=true
   ```
   
   Then change to the Hadoop project root directory and run the test command below.
   
   ```bash
   mvn -Dtest=org.apache.hadoop.fs.tosfs.** test -pl org.apache.hadoop:hadoop-tos-core
   ```
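
   While iterating on a single area, the same Maven Surefire `-Dtest` pattern can be narrowed; the subpackage below is purely illustrative and may not match the final package layout:

   ```bash
   # Run only the tests under one (illustrative) subpackage of the connector.
   mvn -Dtest='org.apache.hadoop.fs.tosfs.object.**' test -pl org.apache.hadoop:hadoop-tos-core
   ```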
   
   ### For code changes:
   
   - [ ] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [ ] Object storage: have the integration tests been executed and the 
endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies 
licensed in a way that is compatible for inclusion under [ASF 
2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, 
`NOTICE-binary` files?
   
   

