[ 
https://issues.apache.org/jira/browse/NIFI-3666?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15978184#comment-15978184
 ] 

Koji Kawamura commented on NIFI-3666:
-------------------------------------

[~joewitt] I looked at following tests, using Windows Server 2016 Datacenter, 
JDK 1.8.0_121

h4.Site-to-Site
nifi-commons/nifi-site-to-site-client/src/test/java/org/apache/nifi/remote/client/http/TestHttpClient.java
- testSendTimeout: Worked successfully.
- testSendTimeoutAfterDataExchange: Failed. The test expects a timeout error 
to occur when a client sends data. However, the test can also fail earlier, 
when the client attempts to create a transaction before sending any data. It 
seems that on Windows (at least on my VM and test server) the initial 
connection takes longer than in other environments; it timed out because it 
took longer than 500ms. Changing the timeout from 500ms to 1000ms made the 
test pass. The test does not seem OS dependent, but it certainly depends on 
server spec.

h4.WebSocket
nifi-nar-bundles/nifi-websocket-bundle/nifi-websocket-services-jetty/src/test/java/org/apache/nifi/websocket/TestJettyWebSocketCommunication.java
- testClientServerCommunication: It works fine in my environment. However, it 
uses a CountDownLatch to assert async events, which can fail in an environment 
where the events take longer than expected. The timeout is set to 5 seconds.
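The latch-based async assertion pattern described above typically looks like the following sketch (a generic illustration, not the actual TestJettyWebSocketCommunication code):

```java
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.TimeUnit;

public class LatchExample {

    // Simulates waiting for one async event (e.g. a WebSocket message callback).
    // Returns false if the event did not arrive within the timeout, which is
    // exactly how a slow environment turns into a test failure.
    public static boolean runOnce(long timeoutSeconds) {
        final CountDownLatch messageReceived = new CountDownLatch(1);
        // Simulated async producer; in the real test this is the server callback.
        new Thread(messageReceived::countDown).start();
        try {
            return messageReceived.await(timeoutSeconds, TimeUnit.SECONDS);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("received in time: " + runOnce(5));
    }
}
```

The key detail is that CountDownLatch.await with a timeout returns a boolean instead of throwing, so the test must assert on that return value; a 5-second budget is usually ample, but nothing guarantees it on a heavily loaded CI machine.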

h4.Kafka
nifi-nar-bundles/nifi-kafka-bundle/nifi-kafka-0-9-processors/src/test/java/org/apache/nifi/processors/kafka/pubsub/ConsumeKafkaTest.java
- validateConsumerRetainer: It works fine, but I was able to make it fail with 
a shorter timeout setting.

[~JPercivall], I ran the AttributeRollingWindow test on my Windows machine and 
it succeeded without issue. I think the S2S, WebSocket and Kafka tests above 
are in the same category as the AttributeRollingWindow test: they can fail, 
but not in a Windows-specific way.

Supposedly, these timing-related tests can be fixed by making the timeouts 
long enough, but that lengthens build time, too. I like the idea of 
quarantining tests based on a condition. I think it would be great if we could 
annotate these tests as time-consuming ones and only run them when we plan a 
release or at some other important phase of the development cycle. I certainly 
do not want to abandon these tests.
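A dependency-free sketch of gating time-consuming tests behind a flag is shown below. The property name "nifi.tests.slow" is hypothetical, not an existing NiFi setting; with JUnit 4 the same effect is usually achieved with org.junit.Assume or @Category plus Surefire's groups/excludedGroups configuration:

```java
public class SlowTestGate {

    // Hypothetical flag: slow tests run only with -Dnifi.tests.slow=true.
    // Boolean.getBoolean returns false when the property is absent.
    public static boolean slowTestsEnabled() {
        return Boolean.getBoolean("nifi.tests.slow");
    }

    public static void main(String[] args) {
        if (!slowTestsEnabled()) {
            System.out.println("slow tests disabled; skipping");
            return;
        }
        System.out.println("running slow, timing-sensitive tests");
    }
}
```

In a JUnit test this check would typically sit inside Assume.assumeTrue(...) at the top of the test method, so the skipped tests are reported as skipped rather than silently passing.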


> Skipped tests on windows need to be validated or fixed
> ------------------------------------------------------
>
>                 Key: NIFI-3666
>                 URL: https://issues.apache.org/jira/browse/NIFI-3666
>             Project: Apache NiFi
>          Issue Type: Bug
>          Components: Core Framework, Extensions
>            Reporter: Joseph Witt
>            Priority: Critical
>
> In NIFI-3440 a number of relatively recently created tests were failing on 
> windows.  These tests were skipped when running on windows to help keep the 
> build moving along and to continue to test regressions on older more stable 
> tests.  However, this approach leaves room for error because we must go 
> through each and validate whether it was a bad test that needs to be fixed to 
> be more stable/portable OR whether the test exposed a bug in the code and its 
> behavior on windows.



--
This message was sent by Atlassian JIRA
(v6.3.15#6346)