This sounds great. Thanks.
Joe Witt wrote on Mon., Feb. 19, 2018 at 23:07:
> Georg
>
> Expanding on where I think Andrew was going...
>
> With the NiFi 1.5.0 release and the associated NiFi Registry release
> you can now do this in a really powerful way. You can now create
The issue with RouteOnAttribute is that the regular expression must be known
when the NiFi workflow is built. There are hundreds of unique patterns, and each
match must go to a different location in the target system. The mapping also
needs to be updatable without changing the NiFi workflow.
Thanks
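For what it's worth, the externally maintained pattern-to-destination mapping described above could be sketched roughly like this (the rules file format, helper names, and paths are hypothetical illustrations, not anything NiFi ships with; in a flow this logic might live in an ExecuteScript processor or a custom processor):

```python
import re

# Hypothetical rules file format: one "regex<TAB>target" pair per line,
# reloadable at runtime without touching the NiFi workflow itself.
def load_rules(lines):
    rules = []
    for line in lines:
        line = line.strip()
        if not line or line.startswith('#'):
            continue  # skip blank lines and comments
        pattern, target = line.split('\t', 1)
        rules.append((re.compile(pattern), target))
    return rules

def route(filename, rules):
    # Return the target of the first matching pattern, or None if nothing matches.
    for pattern, target in rules:
        if pattern.search(filename):
            return target
    return None
```

First match wins, so the rules file would need to be ordered from most to least specific.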
I saw an issue in a test :(. I will continue looking into current approach.
On Mon, 19 Feb 2018 at 22:23 Juan Pablo Gardella <
gardellajuanpa...@gmail.com> wrote:
> Hello team,
>
> I filed an issue at https://issues.apache.org/jira/browse/NIFI-4893. I
> discovered it while using a complex Avro schema.
Hello team,
I filed an issue at https://issues.apache.org/jira/browse/NIFI-4893. I
discovered it while using a complex Avro schema. I've isolated the issue and
also wrote a patch. At least it solves the issue, but I don't fully understand
the implications of that solution.
Please let me know what you
You could also use RouteOnAttribute [1], doing the same regular
expression checks using NiFi Expression Language [2].
For instance, I can create a route for "DATA A" if my attribute "filename"
matches a regular expression:
${filename:find('SomeJavaRegEx')}
If you can control your workflow maybe bring
The problem I see with the Scan processors is that they don’t return anything
other than matched or not matched. And Lookup Services assume you have a specific
key you’re trying to match. I need to look for regular expression matches by
comparing against a list of regular expressions, not specific
Georg
Expanding on where I think Andrew was going...
With the NiFi 1.5.0 release and the associated NiFi Registry release
you can now do this in a really powerful way. You can now create an
'Error Handling' flow which you can version control in the nifi
registry. You can then use this
A Process Group would serve as a reusable unit, which then has version
control applied to it. You would model your flows to abstract interactions
via e.g. input/output ports.
They also mentioned that nested Process Groups can be versioned separately,
which looks like git submodule behavior, but I haven't
Maybe take a look at one of the ScanContent/ScanAttribute processors? They
allow you to map in a reloadable external file.
Or, better yet, one of the LookupService variants, which is more generic.
HTH,
Andrew
On Mon, Feb 19, 2018, 3:10 PM Shawn Weeks wrote:
> Hi, I’m looking
How can I modularize and reuse parts of my nifi flows?
I.e. have a shared logging and error handling strategy which can be
centrally configured for all process groups.
Best Georg
Hello all,
I'm trying to use NiFi behind a proxy (tried both options: Nginx and
Apache2). In every case the NiFi UI loads properly and I'm able to
compose a simple flow and define a controller service. But when I try to
disable/edit the controller service I receive an ERROR dialog with the
Hi, I'm looking for some ideas on how to handle a workflow I'm developing. I
have NiFi monitoring a drop off location where files are delivered. Files from
this drop off location need to be routed to various target directories based on
a series of file name patterns. Currently I have a custom
Matt,
Regarding all mentioned points:
> The post you mention has a comment in there suggesting perhaps the OS
> user running NiFi/Java might somehow be restricted from obtaining its own
> IP address, could that be an issue? Alternatively, if it cannot get a
> SessionState for some reason, I
After some trial and error I discovered that my check whether the sys.argv
list was empty was failing. I had been trying this
if not sys.argv
which I expected to work in the case where I was running from ExecuteScript
rather than the Linux command line. In the case running from the command
line, I
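A sketch of the distinction being described, assuming (as the symptom above suggests) that under ExecuteScript's Jython engine sys.argv can be [''] rather than an empty list, so a bare `if not sys.argv` truthiness test misfires; the helper name here is made up:

```python
import sys

def invoked_from_command_line(argv):
    # Hypothetical helper: a bare "if not argv" check is unreliable here,
    # because [''] (what ExecuteScript/Jython may supply) is a non-empty,
    # truthy list even though no real arguments were passed.
    return len(argv) > 0 and argv[0] != ''

if invoked_from_command_line(sys.argv):
    print("running from the command line")
else:
    print("running inside ExecuteScript")
```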
The post you mention has a comment in there suggesting perhaps the OS
user running NiFi/Java might somehow be restricted from obtaining its
own IP address, could that be an issue? Alternatively, if it cannot
get a SessionState for some reason, I would presume there would be
error(s) in the hive
Hi Matt,
Thank you for reply.
NiFi is compiled exactly like you mentioned. Otherwise we could not use
PutHiveQL processors.
We compiled nifi 1.6 snapshot using exactly:
mvn -T 2.0C clean install -Phortonworks -Dhive.version=1.2.1000.2.6.4.0-91
-Dhive.hadoop.version=2.7.3.2.6.4.0-91
Thanks Joe!
I hope there will be a simple solution to this...
Please note the same error also exists on the Hortonworks version of NiFi...
I can only suspect the problem may be connected to the authentication process
when NiFi and the cluster are unsecured. Unlike the HiveQL processors,
HiveStreaming has no place to
Mike,
Joe is correct, in order for Apache NiFi to interact with HDP Hive,
the Hive client dependencies need to be swapped out, as HDP Hive 1.x
components are not 100% compatible with Apache Hive 1.x components.
This can be done (in general) while building NiFi with Maven, by using
a vendor
Mike - ah, I see. Thanks for clarifying.
I think the issue is that to interact properly with the Hive version
in HDP we have to swap out some dependencies at build time.
Mattyb is definitely more knowledgeable so hopefully he can comment soon.
On Mon, Feb 19, 2018 at 8:06 AM, Michal Tomaszewski
Hi Joe,
Thanks for prompt answer.
As I wrote - I'm currently using NiFi 1.6 snapshot compiled from latest
community sources https://github.com/apache/nifi.
I don't use Hortonworks' NiFi version. I verified Hortonworks' NiFi version
only in order to be sure the problem is not connected with my
Mike,
Dev is fine, but I think part of the difficulty for this group here is
that you're referring to a vendor distribution that bundles Apache NiFi.
The libraries involved are different from what we build/provide
directly. If you can recreate the same problem using Apache NiFi as
we provide it as a
Hi Team,
Should I send this question to NiFi dev list instead of NiFi users?
Regards,
Mike
>> 2018-02-15 17:42:29,901 ERROR [Timer-Driven Process Thread-11] hive.log Got
>> exception: java.lang.NullPointerException null
>> java.lang.NullPointerException: null
>>at
>>