See 
<https://builds.apache.org/job/kafka-trunk-jdk8/3667/display/redirect?page=changes>

Changes:

[bbejeck] KAFKA-8399: bring back internal.leave.group.on.close config for 
KStream
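
For context on the change above: internal.leave.group.on.close is an internal consumer-level flag that Kafka Streams uses so that closing an instance does not immediately trigger a group rebalance. The sketch below is purely illustrative and is not taken from this build or change; the application id, broker address, topic names, and whether an application should set the flag directly (rather than relying on Streams' own default) are all assumptions.

    // Hypothetical sketch (Java), only to illustrate where a Streams-level
    // setting such as internal.leave.group.on.close would be supplied.
    import java.util.Properties;
    import org.apache.kafka.common.serialization.Serdes;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;

    public class LeaveGroupConfigSketch {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "example-app");        // assumed id
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");  // assumed broker
            props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
            // Internal flag restored by KAFKA-8399, passed here as a plain string key.
            // Whether applications should override Streams' default is not stated in this log.
            props.put("internal.leave.group.on.close", "false");

            StreamsBuilder builder = new StreamsBuilder();
            builder.<String, String>stream("example-input").to("example-output"); // assumed topics
            new KafkaStreams(builder.build(), props).start();
        }
    }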

------------------------------------------
[...truncated 2.45 MB...]
> Task :streams:upgrade-system-tests-21:processResources NO-SOURCE
> Task :streams:upgrade-system-tests-21:classes UP-TO-DATE
> Task :streams:upgrade-system-tests-21:checkstyleMain NO-SOURCE
> Task :streams:upgrade-system-tests-21:compileTestJava
> Task :streams:upgrade-system-tests-21:processTestResources NO-SOURCE
> Task :streams:upgrade-system-tests-21:testClasses
> Task :streams:upgrade-system-tests-21:checkstyleTest
> Task :streams:upgrade-system-tests-21:spotbugsMain NO-SOURCE
> Task :streams:upgrade-system-tests-21:test
> Task :streams:streams-scala:spotbugsMain

> Task :streams:streams-scala:test

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes
 > testShouldCountClicksPerRegionWithNamedRepartitionTopic STARTED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes
 > testShouldCountClicksPerRegionWithNamedRepartitionTopic PASSED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes
 > testShouldCountClicksPerRegionJava STARTED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes
 > testShouldCountClicksPerRegionJava PASSED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes
 > testShouldCountClicksPerRegion STARTED

org.apache.kafka.streams.scala.StreamToTableJoinScalaIntegrationTestImplicitSerdes
 > testShouldCountClicksPerRegion PASSED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaJoin STARTED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaJoin PASSED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaSimple STARTED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaSimple PASSED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaAggregate STARTED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaAggregate PASSED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaProperties STARTED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaProperties PASSED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaTransform STARTED

org.apache.kafka.streams.scala.TopologyTest > 
shouldBuildIdenticalTopologyInJavaNScalaTransform PASSED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsMaterialized 
STARTED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsMaterialized 
PASSED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsJava STARTED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWordsJava PASSED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWords STARTED

org.apache.kafka.streams.scala.WordCountTest > testShouldCountWords PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialized 
should create a Materialized with Serdes STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialized 
should create a Materialized with Serdes PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize 
with a store name should create a Materialized with Serdes and a store name 
STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize 
with a store name should create a Materialized with Serdes and a store name 
PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize 
with a window store supplier should create a Materialized with Serdes and a 
store supplier STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize 
with a window store supplier should create a Materialized with Serdes and a 
store supplier PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize 
with a key value store supplier should create a Materialized with Serdes and a 
store supplier STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize 
with a key value store supplier should create a Materialized with Serdes and a 
store supplier PASSED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize 
with a session store supplier should create a Materialized with Serdes and a 
store supplier STARTED

org.apache.kafka.streams.scala.kstream.MaterializedTest > Create a Materialize 
with a session store supplier should create a Materialized with Serdes and a 
store supplier PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > filter a KTable should 
filter records satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > filter a KTable should 
filter records satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > filterNot a KTable should 
filter records not satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > filterNot a KTable should 
filter records not satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables should join 
correctly records STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables should join 
correctly records PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables with a 
Materialized should join correctly records and state store STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > join 2 KTables with a 
Materialized should join correctly records and state store PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > windowed KTable#suppress 
should correctly suppress results using Suppressed.untilTimeLimit STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > windowed KTable#suppress 
should correctly suppress results using Suppressed.untilTimeLimit PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > windowed KTable#suppress 
should correctly suppress results using Suppressed.untilWindowCloses STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > windowed KTable#suppress 
should correctly suppress results using Suppressed.untilWindowCloses PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > session windowed 
KTable#suppress should correctly suppress results using 
Suppressed.untilWindowCloses STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > session windowed 
KTable#suppress should correctly suppress results using 
Suppressed.untilWindowCloses PASSED

org.apache.kafka.streams.scala.kstream.KTableTest > non-windowed 
KTable#suppress should correctly suppress results using 
Suppressed.untilTimeLimit STARTED

org.apache.kafka.streams.scala.kstream.KTableTest > non-windowed 
KTable#suppress should correctly suppress results using 
Suppressed.untilTimeLimit PASSED

org.apache.kafka.streams.scala.kstream.SuppressedTest > 
Suppressed.untilWindowCloses should produce the correct suppression STARTED

org.apache.kafka.streams.scala.kstream.SuppressedTest > 
Suppressed.untilWindowCloses should produce the correct suppression PASSED

org.apache.kafka.streams.scala.kstream.SuppressedTest > 
Suppressed.untilTimeLimit should produce the correct suppression STARTED

org.apache.kafka.streams.scala.kstream.SuppressedTest > 
Suppressed.untilTimeLimit should produce the correct suppression PASSED

org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.maxRecords 
should produce the correct buffer config STARTED

org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.maxRecords 
should produce the correct buffer config PASSED

org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.maxBytes 
should produce the correct buffer config STARTED

org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.maxBytes 
should produce the correct buffer config PASSED

org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.unbounded 
should produce the correct buffer config STARTED

org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig.unbounded 
should produce the correct buffer config PASSED

org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig should 
support very long chains of factory methods STARTED

org.apache.kafka.streams.scala.kstream.SuppressedTest > BufferConfig should 
support very long chains of factory methods PASSED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped should 
create a Grouped with Serdes STARTED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped should 
create a Grouped with Serdes PASSED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped with 
repartition topic name should create a Grouped with Serdes, and repartition 
topic name STARTED

org.apache.kafka.streams.scala.kstream.GroupedTest > Create a Grouped with 
repartition topic name should create a Grouped with Serdes, and repartition 
topic name PASSED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced should 
create a Produced with Serdes STARTED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced should 
create a Produced with Serdes PASSED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced with 
timestampExtractor and resetPolicy should create a Consumed with Serdes, 
timestampExtractor and resetPolicy STARTED

org.apache.kafka.streams.scala.kstream.ProducedTest > Create a Produced with 
timestampExtractor and resetPolicy should create a Consumed with Serdes, 
timestampExtractor and resetPolicy PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > filter a KStream should 
filter records satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > filter a KStream should 
filter records satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > filterNot a KStream should 
filter records not satisfying the predicate STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > filterNot a KStream should 
filter records not satisfying the predicate PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > foreach a KStream should 
run foreach actions on records STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > foreach a KStream should 
run foreach actions on records PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > peek a KStream should run 
peek actions on records STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > peek a KStream should run 
peek actions on records PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > selectKey a KStream should 
select a new key STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > selectKey a KStream should 
select a new key PASSED

org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreams should 
join correctly records STARTED

org.apache.kafka.streams.scala.kstream.KStreamTest > join 2 KStreams should 
join correctly records PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed should 
create a Consumed with Serdes STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed should 
create a Consumed with Serdes PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with 
timestampExtractor and resetPolicy should create a Consumed with Serdes, 
timestampExtractor and resetPolicy STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with 
timestampExtractor and resetPolicy should create a Consumed with Serdes, 
timestampExtractor and resetPolicy PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with 
timestampExtractor should create a Consumed with Serdes and timestampExtractor 
STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with 
timestampExtractor should create a Consumed with Serdes and timestampExtractor 
PASSED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with 
resetPolicy should create a Consumed with Serdes and resetPolicy STARTED

org.apache.kafka.streams.scala.kstream.ConsumedTest > Create a Consumed with 
resetPolicy should create a Consumed with Serdes and resetPolicy PASSED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should 
create a Joined with Serdes STARTED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should 
create a Joined with Serdes PASSED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should 
create a Joined with Serdes and repartition topic name STARTED

org.apache.kafka.streams.scala.kstream.JoinedTest > Create a Joined should 
create a Joined with Serdes and repartition topic name PASSED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':core:test'.
> There were failing tests. See the report at: 
> <https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/tests/test/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.4.1/userguide/command_line_interface.html#sec:command_line_warnings
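
The two Gradle hints above can be combined into a single re-run of the failing task. The command below is illustrative rather than taken from this log; it assumes the build is invoked through the project's ./gradlew wrapper from the workspace root (append --scan instead if a build scan is wanted).

    # Illustrative re-run of the failing task with the suggested diagnostics:
    ./gradlew :core:test --stacktrace --info --warning-mode all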

BUILD FAILED in 2h 39m 25s
174 actionable tasks: 149 executed, 25 up-to-date
Build step 'Execute shell' marked build as failure
[FINDBUGS] Collecting findbugs analysis files...
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
[FINDBUGS] Searching for all files in 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/> that match the pattern 
**/build/reports/*bugs/*.xml
[FINDBUGS] Parsing 17 files in 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/clients/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/api/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/basic-auth-extension/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/file/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/json/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/runtime/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/connect/transforms/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/core/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/examples/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/generator/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/jmh-benchmarks/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/log4j-appender/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/examples/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/streams-scala/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/streams/test-utils/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
[FINDBUGS] Successfully parsed file 
<https://builds.apache.org/job/kafka-trunk-jdk8/ws/tools/build/reports/spotbugs/main.xml>
 with 0 unique warnings and 0 duplicates.
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
No credentials specified
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
<Git Blamer> Using GitBlamer to create author and commit information for all 
warnings.
<Git Blamer> GIT_COMMIT=cafdc1e7df7725c92a374ba9528f7974fe592196, 
workspace=<https://builds.apache.org/job/kafka-trunk-jdk8/ws/>
[FINDBUGS] Computing warning deltas based on reference build #3623
Recording test results
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
Setting GRADLE_4_8_1_HOME=/home/jenkins/tools/gradle/4.8.1
Not sending mail to unregistered user ism...@juma.me.uk
Not sending mail to unregistered user nore...@github.com
Not sending mail to unregistered user csh...@gmail.com
Not sending mail to unregistered user wangg...@gmail.com
