See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/72/display/redirect?page=changes>

Changes:

[Kenneth Knowles] Fix checkArgument format string in ByteKeyRange

[chamikaramj] Fix release date

[chamikaramj] Few more fixes to the Website


------------------------------------------
[...truncated 331.00 KB...]
Task ':sdks:java:extensions:protobuf:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:extensions:protobuf:compileTestJava' 
with cache key b12b7c20cdbc38d89f6997323dae10ad
:sdks:java:extensions:protobuf:compileTestJava (Thread[Execution **** Thread 
2,5,main]) completed. Took 0.151 secs.
Resolve mutations for :sdks:java:extensions:protobuf:testClasses 
(Thread[included builds,5,main]) started.
Resolve mutations for :sdks:java:extensions:protobuf:testClasses 
(Thread[included builds,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:protobuf:testClasses (Thread[Execution **** Thread 
7,5,main]) started.

> Task :sdks:java:extensions:protobuf:testClasses
Skipping task ':sdks:java:extensions:protobuf:testClasses' as it has no actions.
:sdks:java:extensions:protobuf:testClasses (Thread[Execution **** Thread 
7,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:extensions:protobuf:testJar (Thread[Execution 
**** Thread 2,5,main]) started.
Resolve mutations for :sdks:java:extensions:protobuf:testJar (Thread[Execution 
**** Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:protobuf:testJar (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava 
> FROM-CACHE
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:****:legacy-****:compileJava'.
Build cache key for task 
':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is 
700ebccf0cab2e59ba7bb8fa589ffd6d
Task ':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' is not 
up-to-date because:
  No history is available.
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:****:legacy-****:compileJava' with cache 
key 700ebccf0cab2e59ba7bb8fa589ffd6d
:runners:google-cloud-dataflow-java:****:legacy-****:compileJava 
(Thread[Execution **** Thread 3,5,main]) completed. Took 0.454 secs.
Resolve mutations for 
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution 
**** Thread 7,5,main]) started.
Resolve mutations for 
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution 
**** Thread 7,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution 
**** Thread 6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:****:legacy-****:classes' as 
it has no actions.
:runners:google-cloud-dataflow-java:****:legacy-****:classes (Thread[Execution 
**** Thread 6,5,main]) completed. Took 0.0 secs.
Resolve mutations for 
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar 
(Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for 
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar 
(Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar 
(Thread[Execution **** Thread 5,5,main]) started.

> Task :sdks:java:extensions:protobuf:testJar
Caching disabled for task ':sdks:java:extensions:protobuf:testJar' because:
  Not worth caching
Task ':sdks:java:extensions:protobuf:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:protobuf:testJar (Thread[included builds,5,main]) 
completed. Took 0.071 secs.
work action resolve beam-sdks-java-extensions-protobuf-tests.jar (project 
:sdks:java:extensions:protobuf) (Thread[Execution **** Thread 6,5,main]) 
started.
work action null (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 
secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:compileTestJava 
(Thread[Execution ****,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:compileTestJava 
(Thread[Execution ****,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution **** 
Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar FROM-CACHE
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar'.
Build cache key for task 
':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is 
cf135fc68ed2695089a461112c6496f9
Task ':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' is not 
up-to-date because:
  No history is available.
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:****:legacy-****:shadowJar' with cache key 
cf135fc68ed2695089a461112c6496f9
:runners:google-cloud-dataflow-java:****:legacy-****:shadowJar 
(Thread[Execution **** Thread 5,5,main]) completed. Took 0.366 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar 
(project :runners:google-cloud-dataflow-java:****:legacy-****) 
(Thread[Execution **** Thread 7,5,main]) started.
work action null (Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 
secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar 
(project :runners:google-cloud-dataflow-java:****:legacy-****) 
(Thread[Execution ****,5,main]) started.
work action null (Thread[Execution ****,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Custom actions are attached to task 
':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' 
is a2e9f8223fbd4227565cb5d0d9dbeec5
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date 
because:
  No history is available.
Loaded cache entry for task 
':sdks:java:io:google-cloud-platform:compileTestJava' with cache key 
a2e9f8223fbd4227565cb5d0d9dbeec5
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution **** 
Thread 3,5,main]) completed. Took 0.774 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses 
(Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses 
(Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread 
6,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no 
actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread 
6,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar 
(Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar 
(Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[included builds,5,main]) 
started.

> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Not worth caching
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[included builds,5,main]) 
completed. Took 0.312 secs.
work action resolve beam-sdks-java-io-google-cloud-platform-tests.jar (project 
:sdks:java:io:google-cloud-platform) (Thread[Execution **** Thread 6,5,main]) 
started.
work action null (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 
secs.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava 
(Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava 
(Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** 
Thread 3,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' 
is 31abcf8b33a79cd4c19c4c9ed4dc6936
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date 
because:
  No history is available.
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:compileTestJava' with cache key 
31abcf8b33a79cd4c19c4c9ed4dc6936
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** 
Thread 3,5,main]) completed. Took 0.499 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses 
(Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses 
(Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 
2,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no 
actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 
2,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar 
(Thread[Execution **** Thread 2,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar 
(Thread[Execution **** Thread 2,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 
4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Not worth caching
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test'>, 
not found
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 
4,5,main]) completed. Took 0.032 secs.
work action resolve beam-runners-google-cloud-dataflow-java-tests.jar (project 
:runners:google-cloud-dataflow-java) (Thread[Execution **** Thread 3,5,main]) 
started.
work action null (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 
secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest 
(Thread[Execution **** Thread 6,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest 
(Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread 
4,5,main]) started.
producer locations for task group 0 (Thread[Execution **** Thread 6,5,main]) 
started.
producer locations for task group 0 (Thread[Execution **** Thread 6,5,main]) 
completed. Took 0.0 secs.

> Task :sdks:java:io:sparkreceiver:2:integrationTest
Custom actions are attached to task 
':sdks:java:io:sparkreceiver:2:integrationTest'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:integrationTest' is 
938288397faba27691b2eec9df70331c
Task ':sdks:java:io:sparkreceiver:2:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 69'. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"5000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=sparkreceiverioit_results","--influxMeasurement=sparkreceiverioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--rabbitMqBootstrapServerAddress=amqp://guest:[email protected]:5672","--streamName=rabbitMqTestStream","--readTimeout=900","--numWorkers=5","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.44.0-SNAPSHOT.jar","--region=us-central1";]>
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 
-Dorg.gradle.internal.****.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/tmp/integrationTest/work>
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/7.5.1/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 69'
Successfully started process 'Gradle Test Executor 69'

Gradle Test Executor 69 started executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/legacy-****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.44.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > 
testSparkReceiverIOReadsInStreamingWithOffset STANDARD_ERROR
    [Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - 
5000000 records were successfully written to RabbitMQ
    [Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO - 
ReadFromSparkReceiverWithOffsetDoFn started reading
    [pool-10-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 0
    [pool-14-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-10-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-18-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-14-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-22-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-18-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-22-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-26-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-26-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-30-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-30-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-34-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-34-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-38-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-38-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-42-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-42-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-46-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-46-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-50-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-50-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-54-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-54-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-58-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-58-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-62-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-62-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-66-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-66-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-70-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-70-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-74-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-74-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-78-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-78-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-82-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-82-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Stopping 
receiver
    [pool-86-thread-1] INFO 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Starting 
receiver with offset 5000000
    [pool-86-thread-1] ERROR 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset - Can not basic 
consume
    [AMQP Connection 34.170.36.4:5672] WARN 
com.rabbitmq.client.impl.ForgivingExceptionHandler - An unexpected connection 
driver error occurred (Exception message: Socket closed)
    java.util.concurrent.TimeoutException
        at com.rabbitmq.utility.BlockingCell.get(BlockingCell.java:77)
        at 
com.rabbitmq.utility.BlockingCell.uninterruptibleGet(BlockingCell.java:120)
        at 
com.rabbitmq.utility.BlockingValueOrException.uninterruptibleGetValue(BlockingValueOrException.java:36)
        at 
com.rabbitmq.client.impl.AMQChannel$BlockingRpcContinuation.getReply(AMQChannel.java:502)
        at com.rabbitmq.client.impl.AMQConnection.start(AMQConnection.java:326)
        at 
com.rabbitmq.client.impl.recovery.RecoveryAwareAMQConnectionFactory.newConnection(RecoveryAwareAMQConnectionFactory.java:65)
        at 
com.rabbitmq.client.impl.recovery.AutorecoveringConnection.init(AutorecoveringConnection.java:160)
        at 
com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1216)
        at 
com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1173)
        at 
com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1131)
        at 
com.rabbitmq.client.ConnectionFactory.newConnection(ConnectionFactory.java:1294)
        at 
org.apache.beam.sdk.io.sparkreceiver.RabbitMqReceiverWithOffset.receive(RabbitMqReceiverWithOffset.java:98)
        at 
java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at 
java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at 
java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
    [Test ****] ERROR org.apache.beam.sdk.testutils.metrics.MetricsReader - 
Failed to get metric spark_read_element_count, from namespace 
org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > 
testSparkReceiverIOReadsInStreamingWithOffset FAILED
    java.lang.AssertionError: expected:<5000000> but was:<-1>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at 
org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.testSparkReceiverIOReadsInStreamingWithOffset(SparkReceiverIOIT.java:347)

Gradle Test Executor 69 finished executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest

1 test completed, 1 failed
Finished generating test XML results (0.006 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.004 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest>

> Task :sdks:java:io:sparkreceiver:2:integrationTest FAILED
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread 
4,5,main]) completed. Took 25 mins 58.537 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:sparkreceiver:2:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 26m 15s
138 actionable tasks: 81 executed, 55 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/w6df74ups6eam

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
