[ https://issues.apache.org/jira/browse/HADOOP-19343?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=18017116#comment-18017116 ]

ASF GitHub Bot commented on HADOOP-19343:
-----------------------------------------

cnauroth opened a new pull request, #7921:
URL: https://github.com/apache/hadoop/pull/7921

   ### Description of PR
   
   Generate hadoop-gcp shell profile.
   
   ### How was this patch tested?
   
   Build the distro:
   
   ```
   mvn -Pdist -Dtar clean package -DskipTests
   ```
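
   The tarball ends up under `hadoop-dist/target/` (path and version assumed from the standard dist layout, not copied from the build output above):
   
   ```
   tar xzf hadoop-dist/target/hadoop-3.5.0-SNAPSHOT.tar.gz -C /tmp
   cd /tmp/hadoop-3.5.0-SNAPSHOT
   ```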
   
   Then, inside the untarred distro, verify the presence of the new shell profile:
   
   ```
   > cat libexec/shellprofile.d/hadoop-gcp.sh
   #!/usr/bin/env bash
   #
   # Licensed to the Apache Software Foundation (ASF) under one or more
   # contributor license agreements.  See the NOTICE file distributed with
   # this work for additional information regarding copyright ownership.
   # The ASF licenses this file to You under the Apache License, Version 2.0
   # (the "License"); you may not use this file except in compliance with
   # the License.  You may obtain a copy of the License at
   #
   #     http://www.apache.org/licenses/LICENSE-2.0
   #
   # Unless required by applicable law or agreed to in writing, software
   # distributed under the License is distributed on an "AS IS" BASIS,
   # WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   # See the License for the specific language governing permissions and
   # limitations under the License.
   #
   #
   #
   # IMPORTANT: This file is automatically generated by hadoop-dist at
   #            -Pdist time.
   #
   #
   if hadoop_verify_entry HADOOP_TOOLS_OPTIONS "hadoop-gcp"; then
     hadoop_add_profile "hadoop-gcp"
   fi
   
   function _hadoop-gcp_hadoop_classpath
   {
     hadoop_add_classpath "${HADOOP_TOOLS_HOME}/${HADOOP_TOOLS_LIB_JARS_DIR}/hadoop-gcp-3.5.0-SNAPSHOT.jar"
   }
   ```
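
   For anyone not familiar with the shell profile machinery: `hadoop_verify_entry` checks whether the tool was listed in `HADOOP_OPTIONAL_TOOLS` (which, as I understand it, hadoop-config.sh copies into `HADOOP_TOOLS_OPTIONS`), and `hadoop_add_profile` registers the `_hadoop-gcp_hadoop_classpath` hook that appends the jar. A quick way to see the effect on the classpath once the tool is enabled (a sketch, not part of the testing below):
   
   ```
   bin/hadoop classpath | tr ':' '\n' | grep hadoop-gcp
   ```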
   
   By default, hadoop-gcp is not on the classpath:
   
   ```
   > bin/hadoop fs -ls gs://cnauroth-hive-metastore-proxy-dist
   -ls: Fatal internal error
   java.lang.RuntimeException: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.gs.GoogleHadoopFileSystem not found
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2724)
        at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:3569)
        at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:3612)
        at org.apache.hadoop.fs.FileSystem.access$300(FileSystem.java:172)
        at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:3716)
        at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:3667)
        at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:557)
        at org.apache.hadoop.fs.Path.getFileSystem(Path.java:373)
        at org.apache.hadoop.fs.shell.PathData.expandAsGlob(PathData.java:347)
        at org.apache.hadoop.fs.shell.Command.expandArgument(Command.java:265)
        at org.apache.hadoop.fs.shell.Command.expandArguments(Command.java:248)
        at org.apache.hadoop.fs.shell.FsCommand.processRawArguments(FsCommand.java:105)
        at org.apache.hadoop.fs.shell.Command.run(Command.java:192)
        at org.apache.hadoop.fs.FsShell.run(FsShell.java:327)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:82)
        at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:97)
        at org.apache.hadoop.fs.FsShell.main(FsShell.java:390)
   Caused by: java.lang.ClassNotFoundException: Class org.apache.hadoop.fs.gs.GoogleHadoopFileSystem not found
        at org.apache.hadoop.conf.Configuration.getClassByName(Configuration.java:2628)
        at org.apache.hadoop.conf.Configuration.getClass(Configuration.java:2722)
        ... 16 more
   ```
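
   The jar itself is present in the distro, just under the tools lib directory that isn't added to the classpath by default (directory name assumed from the default `HADOOP_TOOLS_LIB_JARS_DIR`):
   
   ```
   ls share/hadoop/tools/lib/ | grep hadoop-gcp
   ```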
   
   Add this line to `etc/hadoop/hadoop-env.sh`:
   
   ```
   export HADOOP_OPTIONAL_TOOLS=hadoop-gcp
   ```
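
   For anyone enabling several optional connectors at once, the variable takes a comma-separated list, e.g.:
   
   ```
   export HADOOP_OPTIONAL_TOOLS="hadoop-aws,hadoop-gcp"
   ```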
   
   Now it's on the classpath:
   
   ```
   bin/hadoop fs -ls gs://cnauroth-bucket
   
   Found 80 items
   ...
   ```
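
   Alternatively, instead of editing `hadoop-env.sh`, exporting the variable in the calling environment for a one-off invocation should also work, since the launcher scripts read it as an ordinary environment variable (untested sketch):
   
   ```
   HADOOP_OPTIONAL_TOOLS=hadoop-gcp bin/hadoop fs -ls gs://cnauroth-bucket
   ```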
   
   ### For code changes:
   
   - [X] Does the title of this PR start with the corresponding JIRA issue id (e.g. 'HADOOP-17799. Your PR title ...')?
   - [X] Object storage: have the integration tests been executed and the endpoint declared according to the connector-specific documentation?
   - [ ] If adding new dependencies to the code, are these dependencies licensed in a way that is compatible for inclusion under [ASF 2.0](http://www.apache.org/legal/resolved.html#category-a)?
   - [ ] If applicable, have you updated the `LICENSE`, `LICENSE-binary`, `NOTICE-binary` files?
   
   




> Add native support for GCS connector
> ------------------------------------
>
>                 Key: HADOOP-19343
>                 URL: https://issues.apache.org/jira/browse/HADOOP-19343
>             Project: Hadoop Common
>          Issue Type: Improvement
>          Components: fs
>    Affects Versions: 3.5.0
>            Reporter: Abhishek Modi
>            Assignee: Arunkumar Chacko
>            Priority: Major
>              Labels: pull-request-available
>         Attachments: GCS connector for Hadoop.pdf, Google Cloud Storage connector for Hadoop-1.pdf, Google Cloud Storage connector for Hadoop.pdf, Google Cloud Storage connector for Hadoop.v1.pdf, Google Cloud Storage connector for Hadoop_v1.pdf
>
>




--
This message was sent by Atlassian Jira
(v8.20.10#820010)
