Hi Brian,

Thanks for the reply. 

Is there a way to compile NiFi using the Hadoop 2.8.0 libraries?

It's of course unfortunate, but the libraries you mentioned before only work 
at those exact versions. Once you use a newer version (such as 
azure-storage-2.2.0), things seem to break.
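
For anyone else chasing this: a quick way to tell which side of the 2.7/2.8 
line a set of jars falls on is a small reflection probe like the one below 
(just a sketch; the class and method names are the ones from the stacktrace 
in this thread). Run it with the jars from your Additional Resources 
directory on the classpath:

```java
// Sketch: probe whether the hadoop-common on the classpath has
// ProviderUtils.excludeIncompatibleCredentialProviders (added in Hadoop 2.8.0).
public class ProviderUtilsProbe {

    static String probe() {
        try {
            Class<?> c = Class.forName("org.apache.hadoop.security.ProviderUtils");
            for (java.lang.reflect.Method m : c.getMethods()) {
                if (m.getName().equals("excludeIncompatibleCredentialProviders")) {
                    return "hadoop-common 2.8.0+";
                }
            }
            // Class is present but the 2.8.0 method is not.
            return "hadoop-common pre-2.8.0";
        } catch (ClassNotFoundException e) {
            return "hadoop-common not on classpath";
        }
    }

    public static void main(String[] args) {
        System.out.println(probe());
    }
}
```

When the probe reports a pre-2.8.0 hadoop-common, that matches the 
NoSuchMethodError NiFi throws.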

Maybe this jira [^1] could be reopened then? 😊

Cheers,

Giovanni

[^1]: https://issues.apache.org/jira/browse/NIFI-1922

> -----Original Message-----
> From: Bryan Bende [mailto:[email protected]]
> Sent: Tuesday, April 4, 2017 3:59 PM
> To: [email protected]
> Subject: Re: GetHDFS from Azure Blob
> 
> Giovanni,
> 
> I'm not that familiar with using a key provider, but NiFi currently bundles
> the Hadoop 2.7.3 client, and looking at ProviderUtils from 2.7.3, there
> doesn't appear to be a method "excludeIncompatibleCredentialProviders":
> 
> https://github.com/apache/hadoop/blob/release-2.7.3-RC2/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/ProviderUtils.java
> 
> It looks like it is introduced in 2.8.0:
> 
> https://github.com/apache/hadoop/blob/release-2.8.0-RC3/hadoop-common-project/hadoop-common/src/main/java/org/apache/hadoop/security/ProviderUtils.java
> 
> Most likely some code that is present in one of the JARs specified through
> Additional Resources is dependent on Hadoop 2.8.0, and since NiFi is bundling
> 2.7.3, there are some things not lining up.
> 
> -Bryan
> 
> 
> On Tue, Apr 4, 2017 at 9:50 AM, Giovanni Lanzani
> <[email protected]> wrote:
> > Bryan,
> >
> > Allow me to chime in (to ask for help).
> >
> > What about when I'm using an encrypted key?
> >
> > In my case I have (in core-site.xml)
> >
> >     <property>
> >       <name>fs.azure.account.keyprovider.nsanalyticsstorage.blob.core.windows.net</name>
> >       <value>org.apache.hadoop.fs.azure.ShellDecryptionKeyProvider</value>
> >     </property>
> >
> > Everything works from the command line (hdfs dfs).
> >
> > But NiFi complains with:
> >
> > java.lang.NoSuchMethodError:
> > org.apache.hadoop.security.ProviderUtils.excludeIncompatibleCredentialProviders
> >
> > Any ideas? I've already linked hadoop-common.jar as well (besides what you
> > suggested below).
> >
> > Cheers,
> >
> > Giovanni
> >
> >
> >> -----Original Message-----
> >> From: Bryan Bende [mailto:[email protected]]
> >> Sent: Tuesday, March 28, 2017 7:41 PM
> >> To: [email protected]
> >> Subject: Re: GetHDFS from Azure Blob
> >>
> >> Austin,
> >>
> >> Can you provide the full error message and stacktrace for the
> >> IllegalArgumentException from nifi-app.log?
> >>
> >> When you start the processor it creates a FileSystem instance based
> >> on the config files provided to the processor, which in turn causes
> >> all of the corresponding classes to load.
> >>
> >> I'm not that familiar with Azure, but if "Azure blob store" is WASB,
> >> then I have successfully done the following...
> >>
> >> In core-site.xml:
> >>
> >> <configuration>
> >>
> >>     <property>
> >>       <name>fs.defaultFS</name>
> >>       <value>wasb://YOUR_USER@YOUR_HOST/</value>
> >>     </property>
> >>
> >>     <property>
> >>       <name>fs.azure.account.key.nifi.blob.core.windows.net</name>
> >>       <value>YOUR_KEY</value>
> >>     </property>
> >>
> >>     <property>
> >>       <name>fs.AbstractFileSystem.wasb.impl</name>
> >>       <value>org.apache.hadoop.fs.azure.Wasb</value>
> >>     </property>
> >>
> >>     <property>
> >>       <name>fs.wasb.impl</name>
> >>       <value>org.apache.hadoop.fs.azure.NativeAzureFileSystem</value>
> >>     </property>
> >>
> >>     <property>
> >>       <name>fs.azure.skip.metrics</name>
> >>       <value>true</value>
> >>     </property>
> >>
> >> </configuration>
> >>
> >> In Additional Resources property of an HDFS processor, point to a
> >> directory
> >> with:
> >>
> >> azure-storage-2.0.0.jar
> >> commons-codec-1.6.jar
> >> commons-lang3-3.3.2.jar
> >> commons-logging-1.1.1.jar
> >> guava-11.0.2.jar
> >> hadoop-azure-2.7.3.jar
> >> httpclient-4.2.5.jar
> >> httpcore-4.2.4.jar
> >> jackson-core-2.2.3.jar
> >> jsr305-1.3.9.jar
> >> slf4j-api-1.7.5.jar
> >>
> >>
> >> Thanks,
> >>
> >> Bryan
> >>
> >>
> >> On Tue, Mar 28, 2017 at 1:15 PM, Austin Heyne <[email protected]> wrote:
> >> > Hi all,
> >> >
> >> > Thanks for all the help you've given me so far. Today I'm trying to
> >> > pull files from an Azure blob store. From previous tickets [1] and
> >> > guides [2], the recommended approach seems to be placing the jars
> >> > required for the HDFS Azure protocol in 'Additional Classpath
> >> > Resources' and pointing 'Hadoop Configuration Resources' at the
> >> > Hadoop core-site and hdfs-site configs. My local HDFS is properly
> >> > configured to access wasb URLs; I can ls, copy to and from, etc.
> >> > without problem.
> >> > Using the same HDFS config files, and trying both all the jars in my
> >> > hadoop-client/lib directory (HDP) and the jars recommended in [1],
> >> > I'm still seeing the "java.lang.IllegalArgumentException: Wrong FS: "
> >> > error in my NiFi logs and am unable to pull files from Azure blob
> >> > storage.
> >> >
> >> > Interestingly, it seems the processor is spinning up way too fast;
> >> > the errors appear in the log as soon as I start the processor. I'm
> >> > not sure how it could be loading all of those jars that quickly.
> >> >
> >> > Does anyone have any experience with this or recommendations to try?
> >> >
> >> > Thanks,
> >> > Austin
> >> >
> >> > [1] https://issues.apache.org/jira/browse/NIFI-1922
> >> > [2] https://community.hortonworks.com/articles/71916/connecting-to-azure-data-lake-from-a-nifi-dataflow.html
> >> >
> >> >
