We can work with INFRA to set up creds for remote services. I think
normally what we'd do is define a property in the pom for the
credentials and then keep those as config items on the jenkins job.
Such configs are visible to those with logins; on ASF that's
restricted to committers from any project who have been granted karma
by any PMC.
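As a sketch of what that could look like (the property name here is hypothetical, not an agreed convention), the pom would declare an empty property and the Jenkins job would override it on the command line:

```xml
<!-- Sketch only: "aws.test.credentials" is a hypothetical property name.
     The Jenkins job config would supply the real value, e.g.
     mvn verify -Daws.test.credentials=/path/to/creds.properties -->
<properties>
  <aws.test.credentials></aws.test.credentials>
</properties>
```

That keeps the secret out of the repo while letting the build reference it like any other Maven property.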

For things that aren't remote services, we can either make something
for our job that e.g. spins up a Kafka instance, or rely on testing on
some infra that offers quick set up of such things. For example,
Travis-CI will let you say "I need a mongo / mysql / etc" for several
kinds of data stores. I'll have to talk with infra@ to see what might
already be available.
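For reference, asking Travis-CI for data stores is just a couple of lines of config; this is a sketch of the relevant .travis.yml fragment (service names per the Travis docs, nothing we have set up yet):

```yaml
# Sketch: Travis-CI starts these services before the build runs.
services:
  - mongodb
  - mysql
```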

On Wed, Oct 14, 2015 at 2:47 PM, dan bress <[email protected]> wrote:
> I agree with what JoeS is saying.  Having these core integration aspects
> of the system tested in an automated way only when needed (like before a
> release) would be a huge plus for NiFi.
>
> I think Sean provides a great mechanism for implementing this with
> Categories.  Sean, can you speak to the build server bit?  Is this
> something we have?  Is this something we are working towards?
>
> We would still need a mechanism for capturing the credentials required to
> run tests that integrate with remote services.  Maybe this lives on the
> build server?  Maybe we have a stub .properties file that shows the
> credentials you need to interact with AWS, Kafka, Twitter, etc.
>
> Also if we can tag TestJdbcHugeStream with @SlowTests so I don't have to
> wait for it every time I run tests then I will be t h r i l l e d.
>
> Dan
>
> On Wed, Oct 14, 2015 at 3:19 PM Sean Busbey <[email protected]> wrote:
>
>> JUnit Categories would work perfectly for this:
>>
>> https://github.com/junit-team/junit/wiki/Categories
>>
>> You can group either at the class test layer or on individual tests.
>>
>> we can then run the categories that require external resources on
>> build servers that have the needed instances present.
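>>
>> To sketch the wiring (the marker interface name here is hypothetical), a
>> build-server profile could tell surefire which category to run:
>>
>> ```xml
>> <!-- Sketch: org.apache.nifi.test.ExternalResourceTests is a hypothetical
>>      marker interface applied to tests via JUnit's @Category. -->
>> <plugin>
>>   <groupId>org.apache.maven.plugins</groupId>
>>   <artifactId>maven-surefire-plugin</artifactId>
>>   <configuration>
>>     <groups>org.apache.nifi.test.ExternalResourceTests</groups>
>>   </configuration>
>> </plugin>
>> ```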
>>
>> On Wed, Oct 14, 2015 at 1:46 PM, Joe Skora <[email protected]> wrote:
>> > I agree with you that these shouldn't run as normal unit tests.  But, I'm
>> > worried that it means this functionality will not be regularly tested as
>> > external APIs change.  Mocking an external API doesn't validate much; if I
>> > don't understand the external resource I'll probably make the same
>> > mistakes in the mock as I do in the client, so testing against the real
>> > external resource is the only way to know things work.
>> >
>> > Perhaps a set of test scripts is needed to automate groups of @Ignore'd
>> > tests that share an external dependency, such as an AWS script, a Hadoop
>> > script, a Kafka script, etc.?
>> >
>> > I have never used JUnit's filtering capability, but it might be possible
>> > to use that in combination with a Maven profile to automate those test
>> > groups instead of requiring scripts to be maintained.
>> >
>> > Just brainstorming.
>> >
>> > Regards,
>> > JoeS
>> >
>> > On Mon, Oct 12, 2015 at 8:22 PM, Joe Witt <[email protected]> wrote:
>> >
>> >> Dan,
>> >>
>> >> Yeah this is a good housekeeping item to bring up.  My view here is
>> >> that B is the only answer.  Unit tests should not be calling out to
>> >> external services - period.
>> >>
>> >> Thanks
>> >> Joe
>> >>
>> >> On Mon, Oct 12, 2015 at 8:17 PM, dan bress <[email protected]> wrote:
>> >> > Devs,
>> >> >    While working on integrating and testing the work Yu did for
>> >> > NIFI-774/DeleteS3Object, I noticed that a few of the unit tests for
>> >> > the processors in that AWS bundle (Put and Fetch S3) actually interact
>> >> > with S3 directly, and were marked as @Ignore.  If I wanted to
>> >> > un-@Ignore them and actually run them, I needed to set up an AWS
>> >> > account, then copy the credentials into an aws-credentials.properties
>> >> > file and put that in my home directory to get the tests to pass.  This
>> >> > dashed my hopes of a relatively simple merge and turned it into a bit
>> >> > of work for me.  I'm not blaming Yu or anyone for this; I just wanted
>> >> > to open up a discussion on better ways of solving this.
>> >> >
>> >> > Problem:
>> >> > @Ignore'd tests don't get run, probably ever.  Why?  Because running
>> >> > them is a pain in the butt.  I agree with NIFI-438[1]: "If tests could
>> >> > talk they would say don't @Ignore me".  I appreciate that there are
>> >> > special circumstances for using this, but it would probably benefit
>> >> > everyone if we made sure we used it only for the truly special
>> >> > circumstances.
>> >> >
>> >> > Solutions:
>> >> > a. Run these tests using the failsafe plugin[2] instead of the
>> >> > surefire plugin.  This way they get run every time, and if they fail,
>> >> > that information gets reported but it does not stop the build.
>> >> > b. Mock out the service (I appreciate that this may not always be
>> >> > possible).
>> >> > c. Provide instructions somewhere so that someone with no experience
>> >> > with these processors/tests can run them.
>> >> >
>> >> > Anyone have thoughts on this?
>> >> > My vote would be B, and if that is not possible, A and C.
>> >> >
>> >> > Thanks,
>> >> > Dan
>> >> >
>> >> > [1] https://issues.apache.org/jira/browse/NIFI-438
>> >> > [2] https://maven.apache.org/surefire/maven-failsafe-plugin/
>> >>
>>
>>
>>
>> --
>> Sean
>>



-- 
Sean
