Aaron,
Excellent! Glad that you're seeing better results. Sorry about that. Let us
know if you run into any other strangeness!
Thanks
-Mark
> On Aug 3, 2016, at 6:18 PM, Aaron Longfield wrote:
>
> I backported the patch from the master branch and it applies without
> changing much at all. Workflow processing works fine by my eye, but I do
> see quite a few provenance warnings logged. I haven't tried yet to see how
> that repository is working, but I just pushed a few million flowfiles
Carl,
This is a great suggestion. I think you can improve the performance of your
script by moving the StringBuilder outside of the loop (or even further, simply
reading all the lines into a list and joining them with the delimiter you want):
import java.nio.charset.StandardCharsets
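Mark's suggestion can be sketched in plain Java (the method and variable names below are illustrative, not from Carl's original script): hoist the StringBuilder out of the loop so one buffer is reused, or simply collect the lines into a list and join them once.

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class JoinLines {

    // Faster pattern: one StringBuilder created outside the loop and reused,
    // instead of a fresh builder (or string concatenation) per iteration.
    static String joinWithBuilder(List<String> lines, String delimiter) {
        StringBuilder sb = new StringBuilder();   // created once, outside the loop
        for (int i = 0; i < lines.size(); i++) {
            if (i > 0) sb.append(delimiter);
            sb.append(lines.get(i));
        }
        return sb.toString();
    }

    // Simpler still: read every line into a list, then join in one call.
    static String readAndJoin(byte[] content, String delimiter) {
        List<String> lines = new ArrayList<>();
        try (BufferedReader reader = new BufferedReader(new InputStreamReader(
                new ByteArrayInputStream(content), StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                lines.add(line);
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return String.join(delimiter, lines);
    }

    public static void main(String[] args) {
        byte[] content = "a\nb\nc".getBytes(StandardCharsets.UTF_8);
        System.out.println(readAndJoin(content, ","));  // a,b,c
    }
}
```

In a NiFi ExecuteScript Groovy script the same shape applies: read the flowfile's input stream line by line into a list, then join once when writing the output.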
Hello,
Adding the dev alias as well to see if anyone else knows the answer.
-Bryan
On Fri, Jul 29, 2016 at 10:37 AM, Mariama Barr-Dallas wrote:
> Hello,
> I am attempting to add a Controller Service to a processor property via
> the REST API by changing the descriptors
Hi,
I have exactly the same use case: periodically pulling rows from some
security appliances with just read-only access.
Currently (without NiFi), we use an SQL query to track the maximum
value; depending on the DB/appliance/vendor, it could be as simple as
"SELECT getdate()" or "select
For that approach I would think either the MapCache or the File would work. The
trick will be getting the max value out of the flow file. After
QueryDatabaseTable you could split the Avro and convert to JSON (or vice
versa), then update the MapCache or File. I'm not sure the order of records is
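One way to get the max value out of the flow file, once the records are JSON, is a simple scan for the tracked field. This is only a sketch of that step, not Matt's actual suggestion in detail; the field name "id" and the regex approach are assumptions for illustration.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical sketch: after converting the Avro records to JSON, scan the
// flow file content for a numeric field ("id" is an assumed name) and keep
// the largest value seen, which could then be written to the MapCache/File.
public class MaxFromJson {
    private static final Pattern ID_FIELD = Pattern.compile("\"id\"\\s*:\\s*(\\d+)");

    public static long maxId(String jsonContent) {
        long max = Long.MIN_VALUE;
        Matcher m = ID_FIELD.matcher(jsonContent);
        while (m.find()) {
            max = Math.max(max, Long.parseLong(m.group(1)));
        }
        return max;
    }

    public static void main(String[] args) {
        System.out.println(maxId("[{\"id\": 7},{\"id\": 42},{\"id\": 13}]"));  // 42
    }
}
```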
Hi,
Thanks for this.
I did think about a MV but unfortunately I don't have access to create views –
just read access. That would have been my simplest option ;-) Life's never that
easy though, is it?
The only part of the SQL I need to be dynamic is the date parameter (I could
even use the id
Conrad,
Is it possible to add a view (materialized or not) to the RDBMS? That
view could take care of the denormalization, and then
QueryDatabaseTable could point at the view. The DB would take care of
the push-down filters, which is functionally like having a
QueryDatabaseTable for each table
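For concreteness, a denormalizing view of the kind Matt describes might look like this (the table and column names are entirely invented for illustration):

```sql
-- Hypothetical denormalizing view; all names are illustrative.
CREATE VIEW order_flat AS
SELECT o.id,
       o.updated_at,          -- candidate for the maximum-value column
       c.name AS customer_name,
       p.sku  AS product_sku
FROM   orders o
JOIN   customers c ON c.id = o.customer_id
JOIN   products  p ON p.id = o.product_id;
```

QueryDatabaseTable would then be configured with `order_flat` as its table name, and the database plans the joins and filters behind the view.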
Hi,
My use case is that I want to ship a load of rows from an RDBMS periodically and
put them in HDFS as Avro.
The QueryDatabaseTable processor has functionality that would be great, i.e. the
maximum-value column (there are a couple of columns in the data I could use for
this), and it is this functionality I am looking