I'm trying to set up a single NiFi server that can connect to two HDFS
clusters, each with its own Kerberos realm.
According to the NiFi docs:
"At this time, only a single krb5 file is allowed to be specified per NiFi
instance"
Is there a workaround that would allow me to connect to both clusters?
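One possible workaround (my assumption, not something the NiFi docs state): the restriction is one krb5 *file*, but a single krb5.conf can define multiple realms, so both clusters may be reachable from the same file. A sketch, with placeholder realm, KDC, and domain names:

```
[libdefaults]
  default_realm = REALM.ONE.EXAMPLE.COM

[realms]
  # one entry per cluster's realm
  REALM.ONE.EXAMPLE.COM = {
    kdc = kdc1.one.example.com
  }
  REALM.TWO.EXAMPLE.COM = {
    kdc = kdc2.two.example.com
  }

[domain_realm]
  # map each cluster's hosts to its realm
  .one.example.com = REALM.ONE.EXAMPLE.COM
  .two.example.com = REALM.TWO.EXAMPLE.COM
```

Each HDFS processor would then supply its own principal and keytab for the appropriate realm.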
checking without
having to involve a cache.
On Fri, Jun 24, 2016 at 1:07 PM, Michael Dyer <michael.d...@trapezoid.com>
wrote:
> I'm looking for assistance in how to configure a set of processors so
> that I only retrieve 'new' files:
>
> - A GetSFTP processor that executes
I'm looking for assistance in how to configure a set of processors so
that I only retrieve 'new' files:
- A GetSFTP processor that executes on a daily basis.
- The GetSFTP processor has read-only access to the remote site
- Large (Multi-GB) files are added to the remote site daily.
- Naming of
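Since GetSFTP alone has no memory of what it already pulled, the usual approach is listing plus state. A minimal sketch of that idea in Python (the file and function names are hypothetical; in an actual flow this role would be played by NiFi processors, not a script):

```python
import json
import os

# Stands in for processor-managed state.
STATE_FILE = "seen_files.json"


def load_seen():
    """Load the set of filenames retrieved on previous runs."""
    if os.path.exists(STATE_FILE):
        with open(STATE_FILE) as f:
            return set(json.load(f))
    return set()


def save_seen(seen):
    """Persist the updated set of retrieved filenames."""
    with open(STATE_FILE, "w") as f:
        json.dump(sorted(seen), f)


def new_files(remote_listing):
    """Return only the files not retrieved on a previous run."""
    seen = load_seen()
    fresh = [name for name in remote_listing if name not in seen]
    save_seen(seen | set(fresh))
    return fresh
```

The same filter-against-remembered-state idea works regardless of how the large files are named, as long as names are stable between runs.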
Brian,
Thank you - that was the problem.
I updated my XML files from the cluster in question and all is working fine
now.
Michael
I'm looking for any additional steps that I could take to troubleshoot the
problem below.
I have two ListHDFS processors, both using Kerberos, each pointing to a
different HDFS system.
My first processor works perfectly, but the second is throwing the
following error:
15:53:41 UTC ERROR
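One quick check worth doing (assuming, as is typical, that each ListHDFS processor's Hadoop Configuration Resources points at its own copies of core-site.xml/hdfs-site.xml): confirm that the fs.defaultFS in each file actually names the cluster you expect, since stale site files are a common cause of one-of-two failures. A small sketch:

```python
import xml.etree.ElementTree as ET


def get_property(site_xml_path, name="fs.defaultFS"):
    """Return the value of a named property from a Hadoop site XML file."""
    root = ET.parse(site_xml_path).getroot()
    for prop in root.findall("property"):
        if prop.findtext("name") == name:
            return prop.findtext("value")
    return None
```

Running this against both processors' core-site.xml files makes it obvious whether the failing processor is pointed at the wrong namenode.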
> > best we can do for Kafka is what’s described here
> > https://issues.apache.org/jira/browse/NIFI-1629 and will go into the
> > upcoming 0.6 release. Hopefully that will bring some relief.
> >
> > Cheers
> > Oleg
> >
> > On Mar 15, 2016, at 1:21 PM, Michael Dyer
I am attempting to configure a GetSFTP processor to retrieve a single file
for 'yesterday'. I do not have any control over what is located in the
remote file system and I definitely don't want to pull everything.
I have not been able to find a way to target the single file using a
dynamically
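If the remote filenames embed a date, the usual trick is to build yesterday's name or pattern dynamically. A sketch of the date arithmetic in Python (the filename template is a placeholder; within NiFi itself this is typically attempted with Expression Language such as `${now():toNumber():minus(86400000):format('yyyyMMdd')}`, assuming the property in question supports Expression Language):

```python
from datetime import date, timedelta


def yesterdays_filename(template="data_{d}.csv.gz"):
    """Build the expected filename for yesterday's file.

    The template is hypothetical; substitute whatever naming
    convention the remote site actually uses.
    """
    d = date.today() - timedelta(days=1)
    return template.format(d=d.strftime("%Y%m%d"))
```

The resulting string can serve as an exact-match file filter so only the single target file is pulled.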