Re: Desire to view more than 100 items within a queue

2024-04-24 Thread Michael Moser
This issue was discussed in the past, but no consensus was ever reached on an
implementation that addressed the concerns.  My email is only to provide a
link to that pull request and discussion, for those who are curious.

https://github.com/apache/nifi/pull/4641

-- Mike



On Tue, Apr 23, 2024 at 5:47 PM Joe Witt wrote:

> Edward
>
> Moved those cc'd to bcc, including yourself.  To really send/receive notes
> you'll want to subscribe to the mailing list.
>
> We have to draw the line somewhere on how much data is retrieved, and the
> click-to-content mechanism isn't meant to be exhaustive as-is.  We can/should
> add some pagination, but given how quickly most queues are changing/evolving,
> it hasn't made sense so far.  For the pattern you use, though, which is
> effectively a dead-letter queue, your ask makes perfect sense.
>
> Thanks
>
> On Tue, Apr 23, 2024 at 2:38 PM Edward Wang wrote:
>
> > To whom it may concern,
> >
> > We are using NiFi to pass FlowFiles into ExecuteStreamCommand processors.
> > Occasionally, running the scripts with that processor results in errors
> > that route into the nonzero status relationship.
> > We are pointing the nonzero status relationships to non-functioning
> > processors, so that the FlowFiles remain within the connection
> > indefinitely and we can track them.
> >
> > However, whenever we list the contents of a connection, we can only see
> > the details of 100 FlowFiles at a time, despite the connection holding
> > more than 100.
> > This is the case both in the web interface and when calling the API.
> >
> > I was wondering if anyone had experience or ideas regarding using
> > "List queue" to show all items.
> >
> > Thank you.
> >
> > Sincerely,
> > Edward
> >
>


Re: Desire to view more than 100 items within a queue

2024-04-23 Thread Joe Witt
Edward

Moved those cc'd to bcc, including yourself.  To really send/receive notes
you'll want to subscribe to the mailing list.

We have to draw the line somewhere on how much data is retrieved, and the
click-to-content mechanism isn't meant to be exhaustive as-is.  We can/should
add some pagination, but given how quickly most queues are changing/evolving,
it hasn't made sense so far.  For the pattern you use, though, which is
effectively a dead-letter queue, your ask makes perfect sense.

Thanks

On Tue, Apr 23, 2024 at 2:38 PM Edward Wang wrote:

> To whom it may concern,
>
> We are using NiFi to pass FlowFiles into ExecuteStreamCommand processors.
> Occasionally, running the scripts with that processor results in errors
> that route into the nonzero status relationship.
> We are pointing the nonzero status relationships to non-functioning
> processors, so that the FlowFiles remain within the connection indefinitely
> and we can track them.
>
> However, whenever we list the contents of a connection, we can only see
> the details of 100 FlowFiles at a time, despite the connection holding
> more than 100.
> This is the case both in the web interface and when calling the API.
>
> I was wondering if anyone had experience or ideas regarding using
> "List queue" to show all items.
>
> Thank you.
>
> Sincerely,
> Edward
>


Desire to view more than 100 items within a queue

2024-04-23 Thread Edward Wang
To whom it may concern,

We are using NiFi to pass FlowFiles into ExecuteStreamCommand processors.
Occasionally, running the scripts with that processor results in errors that
route into the nonzero status relationship.
We are pointing the nonzero status relationships to non-functioning processors,
so that the FlowFiles remain within the connection indefinitely and we can
track them.

However, whenever we list the contents of a connection, we can only see the
details of 100 FlowFiles at a time, despite the connection holding more than
100.
This is the case both in the web interface and when calling the API.
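
For reference, this is roughly how we exercise the API side: a minimal sketch
only, assuming an unsecured NiFi at http://localhost:8080, a placeholder
connection UUID, and the standard flowfile-queues listing-requests endpoints
and response field names as we understand them.

    # Sketch: list a connection's queue over the NiFi REST API and show the cap.
    # Assumes an unsecured NiFi at localhost:8080; the connection UUID below is
    # a hypothetical placeholder.
    import time
    import requests

    NIFI = "http://localhost:8080/nifi-api"
    CONNECTION_ID = "replace-with-connection-uuid"  # placeholder, not a real id

    # Start a listing of the connection's queue.
    resp = requests.post(f"{NIFI}/flowfile-queues/{CONNECTION_ID}/listing-requests")
    resp.raise_for_status()
    request_id = resp.json()["listingRequest"]["id"]

    # Poll the listing request until it finishes.
    url = f"{NIFI}/flowfile-queues/{CONNECTION_ID}/listing-requests/{request_id}"
    listing = requests.get(url).json()["listingRequest"]
    while not listing["finished"]:
        time.sleep(0.5)
        listing = requests.get(url).json()["listingRequest"]

    # flowFileSummaries comes back capped (100 in our case), even though
    # queueSize reports the full count of FlowFiles in the connection.
    summaries = listing["flowFileSummaries"]
    print("queue holds", listing["queueSize"]["objectCount"],
          "FlowFiles; listing returned", len(summaries))

    # Clean up the listing request on the server.
    requests.delete(url)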

I was wondering if anyone had experience or ideas regarding using "List queue"
to show all items.

Thank you.

Sincerely,
Edward