Hi Joe,

I think there is definitely some power in being able to enforce
consistent mock NiFi performance with single threading.  I honestly
haven't thought it out completely, but eyeballing it and just keeping
track of how many milliseconds things took to run would be a good
start.  Maybe something like JUnitPerf would work?
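A minimal sketch of that millisecond bookkeeping, using only the standard library; `FlowTiming` and `timeMillis` are illustrative names, and the Runnable stands in for whatever TestRunner invocation is under test:

```java
import java.util.concurrent.TimeUnit;

public class FlowTiming {

    // Time an arbitrary task in milliseconds.  In a real test the
    // Runnable would wrap something like runner.run(iterations).
    public static long timeMillis(Runnable task) {
        long start = System.nanoTime();
        task.run();
        return TimeUnit.NANOSECONDS.toMillis(System.nanoTime() - start);
    }

    public static void main(String[] args) {
        long elapsed = timeMillis(() -> {
            // Placeholder workload standing in for a flow segment.
            for (int i = 0; i < 1_000_000; i++) {
                Math.sqrt(i);
            }
        });
        System.out.println("elapsed ms: " + elapsed);
    }
}
```

Recording these numbers per run would at least allow eyeball comparison across successive runs, even if the absolute values mean little.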

Thanks,

On Wed, Jan 6, 2016 at 12:43 AM, Joe Witt <[email protected]> wrote:

> Vincent,
>
> As Aldrin mentioned there is an effort underway to help with
> integration testing.  But I'm not sure yet if it will help with what
> you're looking to do from a performance baseline perspective.  I
> totally get Mark's cautionary comments but given your response it does
> seem like there is something we can do to help.  So let's say we have
> a nice way for someone to create an integration test which wires
> together processors to do something meaningful.  This obviously
> verifies functionality of a chain of processors.  But, you also seem
> to have an idea in mind for comparing performance.
>
> I agree there could be something telling about that given the test
> environment could enforce single threading and such which might help
> it reveal interesting deviations.  Not sure. How do you see that
> working in terms of you being able to capture the data in successive
> test runs to validate whether things have deviated?  Would
> manual/eyeball processes be sufficient as a first start?
>
> Thanks
> Joe
>
> On Tue, Jan 5, 2016 at 5:59 PM, Vincent Russell
> <[email protected]> wrote:
> > Thanks Mark,
> >
> > I didn't mean to suggest that I would like to compare the results of
> > my test with actual NiFi performance, but I would think that I would
> > be able to compare different run iterations with each other.
> >
> > I did notice that the MockFlowFile is pretty much a wrapper around a
> > byte array, which may or may not be a problem for my small scale
> > testing.
> >
> >
> > On Tue, Jan 5, 2016 at 5:40 PM, Mark Payne <[email protected]> wrote:
> >>
> >> Vincent,
> >>
> >> I would be very wary about trusting performance results that you
> >> obtain by using the Mock Framework. The mock framework is intended
> >> to be used only for testing correctness, not performance. It has
> >> very different threading characteristics than the "actual" NiFi
> >> framework, and it uses very different FlowFile, Content, and
> >> Provenance Repositories. Processor A may perform far better than
> >> Processor B in the mock framework, but that does not by any means
> >> guarantee that it will also perform better in a live environment.
> >>
> >> Thanks
> >> -Mark
> >>
> >> On Jan 5, 2016, at 5:17 PM, Vincent Russell <[email protected]>
> >> wrote:
> >>
> >> Hello Aldrin,
> >>
> >> Thanks for the response.
> >>
> >> My current use case is that I would like to chain several
> >> processors together and write a performance test against that mini
> >> flow, and then be free to modify the processors in the chain and
> >> see how performance changes.  I think I may be able to chain
> >> several TestRunners together to achieve my goal, although this
> >> isn't ideal.
> >>
> >> Ideally I'd be able to provide the TestRunner with multiple
> >> Processors and identify how the processors are chained together.
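[A minimal sketch of that TestRunner chaining, assuming the nifi-mock API; `FirstProcessor`, `SecondProcessor`, and the "success" relationship name are placeholders:]

```java
import org.apache.nifi.util.MockFlowFile;
import org.apache.nifi.util.TestRunner;
import org.apache.nifi.util.TestRunners;

public class ChainedRunnersSketch {
    public static void main(String[] args) {
        // Run the first processor in isolation.
        TestRunner first = TestRunners.newTestRunner(FirstProcessor.class);
        first.enqueue("input data".getBytes());
        first.run();

        // Hand each output FlowFile (content plus attributes) to the
        // second runner, simulating a connection between the two.
        TestRunner second = TestRunners.newTestRunner(SecondProcessor.class);
        for (MockFlowFile ff : first.getFlowFilesForRelationship("success")) {
            second.enqueue(ff.toByteArray(), ff.getAttributes());
        }
        second.run();
    }
}
```

[Each hop copies content through byte arrays, so this is only a wiring convenience, not a performance-faithful simulation of a live flow.]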
> >>
> >> Thanks,
> >>
> >>
> >>
> >> On Mon, Jan 4, 2016 at 3:26 PM, Aldrin Piri <[email protected]> wrote:
> >>>
> >>> Hello Vincent,
> >>>
> >>> This is something that does not exist and there have been a few threads
> >>> on this topic [1][2].
> >>>
> >>> Summarily, these tools do not currently exist due to the
> >>> preference for interactive, real-time command and control over the
> >>> flow, as well as the increasing difficulty of maintaining flows as
> >>> they grow and evolve.
> >>>
> >>> There are some good tips on how other people have tackled the
> >>> problem in the linked message threads.  One alternative suggestion
> >>> is making use of NiFi's template functionality [3] to stub out
> >>> flows on a different instance and use that to promote an entire
> >>> flow or segment to another system.  Templates are an area we are
> >>> planning to both enhance and mature, as laid out in some of our
> >>> feature proposals [4][5].
> >>>
> >>> Please let us know if this accomplishes the functionality you are
> >>> looking for, or if we are coming up a bit short on some of your
> >>> needs for integration-level testing.  Your case is common and
> >>> certainly one we need to execute on well.  Any feedback you can
> >>> provide, both in view of the current state of templates and the
> >>> path forward as laid out in the proposals, would be much
> >>> appreciated!
> >>>
> >>> Thanks!
> >>>
> >>> --aldrin
> >>>
> >>> [1] https://mail-archives.apache.org/mod_mbox/nifi-dev/201502.mbox/%[email protected]%3E
> >>> [2] http://apache-nifi-developer-list.39713.n7.nabble.com/Great-question-on-nifi-IRC-room-today-NiFi-BPM-sharing-configuration-td787.html#a811
> >>> [3] https://nifi.apache.org/docs/nifi-docs/html/user-guide.html#templates
> >>> [4] https://cwiki.apache.org/confluence/display/NIFI/Extension%2C+Template%2C+Dataset+Registry
> >>> [5] https://cwiki.apache.org/confluence/display/NIFI/Configuration+Management+of+Flows
> >>>
> >>>
> >>> On Mon, Jan 4, 2016 at 1:37 PM, Vincent Russell
> >>> <[email protected]> wrote:
> >>>>
> >>>> All,
> >>>>
> >>>> I see that there is a way to test a single processor with the
> >>>> TestRunner (StandardProcessorTestRunner) class, but is there a
> >>>> way to set up an integration test to test a complete flow or a
> >>>> subset of a flow?
> >>>>
> >>>> Thank you,
> >>>> Vincent
> >>>
> >>>
> >>
> >>
> >
>
