[ https://issues.apache.org/jira/browse/HADOOP-18090?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17481853#comment-17481853 ]

Sean Busbey commented on HADOOP-18090:
--------------------------------------

I'm not sure what kind of validation you're looking for. Generally, just about 
anyone can file something as a bug against Hadoop (as you have done here), and I 
believe you have filed it against the correct component.

If you mean "can someone fix this", that work is essentially done by 
volunteers. If no one picks up this issue, you might try discussing it on the 
dev list; it will get much more attention if you attempt to fix things yourself.

To me, this looks like a mismatch in expectations around SFTPFileSystem and our 
classpath-isolated client libraries. The two questions I would work out are:

1. _Should_ SFTPFileSystem be included in the client libraries, or is it 
misplaced? This feels akin to the issue we had with s3a in HADOOP-16080, and 
ideally it would be solved akin to HADOOP-15387.
2. Presuming it should stay where it is, fixing it means changing the set of 
included relocated classes. Why isn't this class already present if it's 
reachable from a class we include? How extensive a change is correcting that?
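As a quick way to check the second question, a small probe like the following (an illustrative sketch, not Hadoop code; the relocated class name is taken from the stack trace in this issue) reports whether a given class name is actually loadable on the current classpath:

```java
public class RelocationProbe {
    // Returns true if the named class can be loaded on the current classpath.
    static boolean isLoadable(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        // Relocated name from the reported stack trace; with only
        // hadoop-client-runtime on the classpath this prints "missing".
        String relocated = "org.apache.hadoop.shaded.com.jcraft.jsch.SftpException";
        System.out.println((isLoadable(relocated) ? "present: " : "missing: ") + relocated);
    }
}
```

Running this against hadoop-client-runtime would show whether the relocated jsch classes were ever included in the shaded jar, or whether only the references were rewritten.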

> Exclude com/jcraft/jsch classes from being shaded/relocated
> -----------------------------------------------------------
>
>                 Key: HADOOP-18090
>                 URL: https://issues.apache.org/jira/browse/HADOOP-18090
>             Project: Hadoop Common
>          Issue Type: Bug
>          Components: build
>    Affects Versions: 3.3.1
>            Reporter: mkv
>            Priority: Major
>
> Spark 3.2.0 transitively introduces hadoop-client-api and 
> hadoop-client-runtime dependencies.
> When we create a SFTPFileSystem instance 
> (org.apache.hadoop.fs.sftp.SFTPFileSystem) it tries to load the relocated 
> classes from _com.jcraft.jsch_ package.
> The filesystem instance creation fails with error:
> {code:java}
> java.lang.ClassNotFoundException: org.apache.hadoop.shaded.com.jcraft.jsch.SftpException
>     at java.net.URLClassLoader.findClass(URLClassLoader.java:382)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>     at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> {code}
> Excluding the client jars from Spark's transitive dependencies and directly 
> using hadoop-common/hadoop-client is the way it's working for us.
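The workaround the reporter describes might look like the following in a Maven POM (a sketch only; the Spark artifact name and the versions are assumptions based on the versions mentioned in this issue):

```xml
<!-- Exclude the shaded client jars pulled in transitively by Spark 3.2.0 -->
<dependency>
  <groupId>org.apache.spark</groupId>
  <artifactId>spark-core_2.12</artifactId>
  <version>3.2.0</version>
  <exclusions>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client-api</artifactId>
    </exclusion>
    <exclusion>
      <groupId>org.apache.hadoop</groupId>
      <artifactId>hadoop-client-runtime</artifactId>
    </exclusion>
  </exclusions>
</dependency>

<!-- Depend on the unshaded client directly instead -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>3.3.1</version>
</dependency>
```

Note this trades away the classpath isolation the shaded jars provide, so it is a workaround rather than a fix for the missing relocated classes.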



--
This message was sent by Atlassian Jira
(v8.20.1#820001)
