See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/463/display/redirect>

Changes:


------------------------------------------
[...truncated 351.61 KB...]
Loaded cache entry for task ':sdks:java:io:sparkreceiver:2:compileTestJava' 
with cache key 1ea79f7fd9a13e58ff0395bd46caa2d9
:sdks:java:io:sparkreceiver:2:compileTestJava (Thread[Execution **** Thread 
5,5,main]) completed. Took 0.41 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:testClasses 
(Thread[Execution **** Thread 2,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:testClasses 
(Thread[Execution **** Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:testClasses (Thread[Execution **** Thread 
3,5,main]) started.

> Task :sdks:java:io:sparkreceiver:2:testClasses UP-TO-DATE
Skipping task ':sdks:java:io:sparkreceiver:2:testClasses' as it has no actions.
:sdks:java:io:sparkreceiver:2:testClasses (Thread[Execution **** Thread 
3,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:extensions:protobuf:generateTestProto FROM-CACHE
Build cache key for task ':sdks:java:extensions:protobuf:generateTestProto' is 
9f70d7c24f9164273841148fb10b9de9
Task ':sdks:java:extensions:protobuf:generateTestProto' is not up-to-date 
because:
  No history is available.
Loaded cache entry for task ':sdks:java:extensions:protobuf:generateTestProto' 
with cache key 9f70d7c24f9164273841148fb10b9de9
:sdks:java:extensions:protobuf:generateTestProto (Thread[included 
builds,5,main]) completed. Took 0.021 secs.
Resolve mutations for :sdks:java:extensions:protobuf:compileTestJava 
(Thread[included builds,5,main]) started.
Resolve mutations for :sdks:java:extensions:protobuf:compileTestJava 
(Thread[included builds,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:protobuf:compileTestJava (Thread[Execution **** Thread 
3,5,main]) started.

> Task :sdks:java:extensions:protobuf:compileTestJava FROM-CACHE
Custom actions are attached to task 
':sdks:java:extensions:protobuf:compileTestJava'.
Build cache key for task ':sdks:java:extensions:protobuf:compileTestJava' is 
3c0407a7097203ed02e7424714157623
Task ':sdks:java:extensions:protobuf:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:extensions:protobuf:compileTestJava' 
with cache key 3c0407a7097203ed02e7424714157623
:sdks:java:extensions:protobuf:compileTestJava (Thread[Execution **** Thread 
3,5,main]) completed. Took 0.198 secs.
Resolve mutations for :sdks:java:extensions:protobuf:testClasses 
(Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :sdks:java:extensions:protobuf:testClasses 
(Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:protobuf:testClasses (Thread[included builds,5,main]) 
started.

> Task :sdks:java:extensions:protobuf:testClasses
Skipping task ':sdks:java:extensions:protobuf:testClasses' as it has no actions.
:sdks:java:extensions:protobuf:testClasses (Thread[included builds,5,main]) 
completed. Took 0.0 secs.
Resolve mutations for :sdks:java:extensions:protobuf:testJar (Thread[included 
builds,5,main]) started.
Resolve mutations for :sdks:java:extensions:protobuf:testJar (Thread[included 
builds,5,main]) completed. Took 0.0 secs.
:sdks:java:extensions:protobuf:testJar (Thread[Execution **** Thread 5,5,main]) 
started.

> Task :sdks:java:extensions:protobuf:testJar
Caching disabled for task ':sdks:java:extensions:protobuf:testJar' because:
  Not worth caching
Task ':sdks:java:extensions:protobuf:testJar' is not up-to-date because:
  No history is available.
:sdks:java:extensions:protobuf:testJar (Thread[Execution **** Thread 5,5,main]) 
completed. Took 0.074 secs.
work action resolve beam-sdks-java-extensions-protobuf-tests.jar (project 
:sdks:java:extensions:protobuf) (Thread[Execution **** Thread 2,5,main]) 
started.
work action null (Thread[Execution **** Thread 2,5,main]) completed. Took 0.0 
secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:compileTestJava 
(Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:compileTestJava 
(Thread[Execution **** Thread 3,5,main]) completed. Took 0.001 secs.
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[included 
builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:shadowJar FROM-CACHE
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:****:shadowJar'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:shadowJar' 
is ef10afa5ea8e3974890467f081f5bc14
Task ':runners:google-cloud-dataflow-java:****:shadowJar' is not up-to-date 
because:
  No history is available.
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:****:shadowJar' with cache key 
ef10afa5ea8e3974890467f081f5bc14
:runners:google-cloud-dataflow-java:****:shadowJar (Thread[Execution **** 
Thread 7,5,main]) completed. Took 0.386 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar 
(project :runners:google-cloud-dataflow-java:****) (Thread[Execution **** 
Thread 7,5,main]) started.
work action null (Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 
secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar 
(project :runners:google-cloud-dataflow-java:****) (Thread[Execution **** 
Thread 4,5,main]) started.
work action null (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 
secs.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Custom actions are attached to task 
':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' 
is a02ef3413f07bd75b75b556d207d9ea1
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date 
because:
  No history is available.
Loaded cache entry for task 
':sdks:java:io:google-cloud-platform:compileTestJava' with cache key 
a02ef3413f07bd75b75b556d207d9ea1
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[included 
builds,5,main]) completed. Took 0.945 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses 
(Thread[Execution **** Thread 7,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses 
(Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread 
3,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no 
actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread 
3,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar 
(Thread[Execution ****,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar 
(Thread[Execution ****,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread 
2,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Not worth caching
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread 
2,5,main]) completed. Took 0.329 secs.
work action resolve beam-sdks-java-io-google-cloud-platform-tests.jar (project 
:sdks:java:io:google-cloud-platform) (Thread[Execution **** Thread 3,5,main]) 
started.
work action null (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 
secs.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava 
(Thread[Execution **** Thread 3,5,main]) started.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** 
Thread 7,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava 
(Thread[Execution **** Thread 3,5,main]) completed. Took 0.001 secs.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' 
is 925d65530557bcf9f85a357362027419
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date 
because:
  No history is available.
Loaded cache entry for task 
':runners:google-cloud-dataflow-java:compileTestJava' with cache key 
925d65530557bcf9f85a357362027419
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** 
Thread 7,5,main]) completed. Took 0.499 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses 
(Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses 
(Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 
6,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no 
actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 
6,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar 
(Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar 
(Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 
7,5,main]) started.

> Task :runners:google-cloud-dataflow-java:testJar
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Not worth caching
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test'>, 
not found
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 
7,5,main]) completed. Took 0.036 secs.
work action resolve beam-runners-google-cloud-dataflow-java-tests.jar (project 
:runners:google-cloud-dataflow-java) (Thread[Execution **** Thread 3,5,main]) 
started.
work action null (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 
secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest 
(Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest 
(Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread 
3,5,main]) started.
producer locations for task group 0 (Thread[Execution **** Thread 6,5,main]) 
started.
producer locations for task group 0 (Thread[Execution **** Thread 6,5,main]) 
completed. Took 0.0 secs.
Gradle Test Executor 1 started executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest
Custom actions are attached to task 
':sdks:java:io:sparkreceiver:2:integrationTest'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:integrationTest' is 
212944878302eae033db5fb174f24634
Task ':sdks:java:io:sparkreceiver:2:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 1'. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"5000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=sparkreceiverioit_results","--influxMeasurement=sparkreceiverioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--rabbitMqBootstrapServerAddress=amqp://guest:[email protected]:5672","--streamName=rabbitMqTestStream","--readTimeout=1800","--numWorkers=1","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.49.0-SNAPSHOT.jar>","--region=us-central1"]
 
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
 
-Dorg.gradle.internal.****.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/tmp/integrationTest/work>
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/7.5.1/****Main/gradle-****.jar 
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 1'
Successfully started process 'Gradle Test Executor 1'

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.49.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > 
testSparkReceiverIOReadsInStreamingWithOffset STANDARD_ERROR
    [Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - 
5000000 records were successfully written to RabbitMQ
    [Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO - 
ReadFromSparkReceiverWithOffsetDoFn started reading
    [Test ****] WARN org.apache.beam.runners.dataflow.DataflowRunner - Prefer 
--sdkContainerImage over deprecated legacy option --****HarnessContainerImage.
    [Test ****] INFO 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory - 
No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    [Test ****] WARN 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer - Request 
failed with code 409, performed 0 retries due to IOExceptions, performed 0 
retries due to unsuccessful status codes, HTTP framework says request can be 
retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    [Test ****] INFO 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 - No stagingLocation provided, falling back to gcpTempLocation
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath: will stage 427 files. Enable logging at DEBUG level to see which 
files will be staged.
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - 
Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    [Test ****] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading 428 files from PipelineOptions.filesToStage to staging location to 
prepare for execution.
    [pool-8-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.49.0-SNAPSHOT-NjK7EMYMJApmUrAO-XsFK7DyVs6EHDuc3Cs6jBUn5aA.jar
    [pool-8-thread-17] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading /tmp/test6897490177666483431.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-sjNfVTFTRrx-VyZCbYrC2neVdQffAC-t31FJVGdGWLY.jar
    [pool-8-thread-30] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading /tmp/test3880380616652807554.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-8iEGArtTYymcOAC1erESnCahhmGwEnJ84ufr3lfZr-s.jar
    [pool-8-thread-19] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading /tmp/main4746778566700390649.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-cmm5XTAow8AFZzKYLHYYgvD62pf4gKnWDkeTI3AGSxY.jar
    [Test ****] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Staging files complete: 425 files cached, 3 files newly uploaded in 2 seconds
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Staging 
portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    [pool-15-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading <160956 bytes, hash 
d17df8967d0f314a741e0cb2de0840183eaf4bf10aacad6816ac8a237f1705da> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-0X34ln0PMUp0Hgyy3ghAGD6vS_EKrK1oFqyKI38XBdo.pb
    [Test ****] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse as step 
s1
    [Test ****] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction as step s2
    [Test ****] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction as step s3
    [Test ****] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows as step s4
    [Test ****] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map as step s5
    [Test ****] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/ProcessKeyedElements
 as step s6
    [Test ****] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Measure 
read time as step s7
    [Test ****] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Counting 
element as step s8
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Dataflow 
SDK version: 2.49.0-SNAPSHOT
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - To 
access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-05-23_19_26_20-2594505962094058868?project=apache-beam-testing
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - 
Submitted job: 2023-05-23_19_26_20-2594505962094058868
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - To 
cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2023-05-23_19_26_20-2594505962094058868
    [Test ****] WARN 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:23.761Z: The workflow name is not a valid Cloud Label. Labels 
applied to Cloud resources (such as GCE Instances) for monitoring will be 
labeled with this modified job name: 
sparkreceiverioit0testsparkreceiverioreadsinstreamingwitho-rd0v. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:40.792Z: Worker configuration: e2-standard-4 in us-central1-b.
    [Test ****] WARN 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.177Z: Querying the configuration of Pub/Sub subscription 
_starting_signal/ failed.  If the ack deadline on the subscription has been 
changed from the default value, this may affect performance.  Also, this 
prevents Dataflow from checking for unsupported settings on the subscription.  
See 
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub
 for further details.  If the error is not correctable, the job may be updated 
with a valid Pub/Sub configuration.  Specific error: INVALID_ARGUMENT:  
[type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto]
 { stack_top_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 598 } address: 93833268861963 address: 93833451728724 address: 
93833451708165 address: 93833451695325 address: 93833451699115 address: 
93833452117787 address: 93833408628690 address: 93833416879117 address: 
93833360476404 address: 139996944910297 address: 139996944340639 } 
[dist_proc.dax.internal.TrailProto] { trail_point { source_file_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 598 } } } [dist_proc.dax.MessageCode] { argument { string_val: 
"_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid 
resource name given (name=_starting_signal/). Refer to 
https://cloud.google.com/pubsub/docs/admin#resource_names for more 
information." key: "error" } origin_id: 17657936690485424902 
[dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED 
}']
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.243Z: Expanding CoGroupByKey operations into optimizable 
parts.
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.283Z: Expanding SplittableProcessKeyed operations into 
optimizable parts.
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.311Z: Expanding GroupByKey operations into streaming 
Read/Write steps
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.359Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.441Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.469Z: Fusing consumer Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction into Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.490Z: Fusing consumer Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction into Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.567Z: Fusing consumer Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows into Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.602Z: Fusing consumer Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map into Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.634Z: Fusing consumer s6/GroupByKeyRaw/WriteStream into 
Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.671Z: Fusing consumer s6/SplittableProcess into 
s6/GroupByKeyRaw/ReadStream
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.693Z: Fusing consumer Measure read time into 
s6/SplittableProcess
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:43.720Z: Fusing consumer Counting element into Measure read 
time
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:44.116Z: Executing operation 
s6/GroupByKeyRaw/ReadStream+s6/SplittableProcess+Measure read time+Counting 
element
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:44.206Z: Starting 1 ****s in us-central1-b...
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:45.295Z: Executing operation Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse+Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map+s6/GroupByKeyRaw/WriteStream
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:26:52.277Z: Your project already contains 100 Dataflow-created 
metric descriptors, so new user metrics of the form custom.googleapis.com/* 
will not be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:30:21.404Z: Executing operation 
s6/GroupByKeyRaw/ReadStream+s6/SplittableProcess+Measure read time+Counting 
element
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:30:21.405Z: Executing operation Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse+Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map+s6/GroupByKeyRaw/WriteStream
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:30:38.382Z: Worker configuration: e2-standard-4 in us-central1-b.
    [Test ****] WARN 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:30:40.156Z: Querying the configuration of Pub/Sub subscription 
_starting_signal/ failed.  If the ack deadline on the subscription has been 
changed from the default value, this may affect performance.  Also, this 
prevents Dataflow from checking for unsupported settings on the subscription.  
See 
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub
 for further details.  If the error is not correctable, the job may be updated 
with a valid Pub/Sub configuration.  Specific error: INVALID_ARGUMENT:  
[type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto]
 { stack_top_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 598 } address: 94590042938379 address: 94590225805140 address: 
94590225784581 address: 94590225771741 address: 94590225775531 address: 
94590226194203 address: 94590225765892 address: 94590134561657 address: 
139899516717017 address: 139899516147359 } [dist_proc.dax.internal.TrailProto] 
{ trail_point { source_file_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 598 } } } [dist_proc.dax.MessageCode] { argument { string_val: 
"_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid 
resource name given (name=_starting_signal/). Refer to 
https://cloud.google.com/pubsub/docs/admin#resource_names for more 
information." key: "error" } origin_id: 5281600288170018335 
[dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED 
}']
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:30:52.064Z: Your project already contains 100 Dataflow-created 
metric descriptors, so new user metrics of the form custom.googleapis.com/* 
will not be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:31:41.421Z: Autoscaling: Raised the number of ****s to 1 so that 
the pipeline can catch up with its backlog and keep up with its input rate.
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:31:50.330Z: Workers have started successfully.
    [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:32:36.265Z: All ****s have finished the startup processes and 
began to receive work requests.
    [Test ****] WARN 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:42:39.037Z: Querying the configuration of Pub/Sub subscription 
_starting_signal/ failed.  If the ack deadline on the subscription has been 
changed from the default value, this may affect performance.  Also, this 
prevents Dataflow from checking for unsupported settings on the subscription.  
See 
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub
 for further details.  If the error is not correctable, the job may be updated 
with a valid Pub/Sub configuration.  Specific error: INVALID_ARGUMENT:  
[type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto]
 { stack_top_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 598 } address: 94590042938379 address: 94590225805140 address: 
94590225784581 address: 94590225771741 address: 94590225775531 address: 
94590226194203 address: 94590225765892 address: 94590134561657 address: 
139899516717017 address: 139899516147359 } [dist_proc.dax.internal.TrailProto] 
{ trail_point { source_file_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 598 } } } [dist_proc.dax.MessageCode] { argument { string_val: 
"_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid 
resource name given (name=_starting_signal/). Refer to 
https://cloud.google.com/pubsub/docs/admin#resource_names for more 
information." key: "error" } origin_id: 5281600288170017590 
[dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED 
}']
    [Test ****] WARN 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-05-24T02:54:38.447Z: Querying the configuration of Pub/Sub subscription 
_starting_signal/ failed.  If the ack deadline on the subscription has been 
changed from the default value, this may affect performance.  Also, this 
prevents Dataflow from checking for unsupported settings on the subscription.  
See 
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub
 for further details.  If the error is not correctable, the job may be updated 
with a valid Pub/Sub configuration.  Specific error: INVALID_ARGUMENT:  
[type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto]
 { stack_top_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 598 } address: 94590042938379 address: 94590225805140 address: 
94590225784581 address: 94590225771741 address: 94590225775531 address: 
94590226194203 address: 94590225765892 address: 94590134561657 address: 
139899516717017 address: 139899516147359 } [dist_proc.dax.internal.TrailProto] 
{ trail_point { source_file_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 598 } } } [dist_proc.dax.MessageCode] { argument { string_val: 
"_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid 
resource name given (name=_starting_signal/). Refer to 
https://cloud.google.com/pubsub/docs/admin#resource_names for more 
information." key: "error" } origin_id: 5281600288170016845 
[dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED 
}']
    [Test ****] WARN org.apache.beam.runners.dataflow.DataflowPipelineJob - No 
terminal state was returned within allotted timeout. State value RUNNING

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > 
testSparkReceiverIOReadsInStreamingWithOffset FAILED
    java.lang.AssertionError: expected:<5000000> but was:<4290000>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at 
org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.testSparkReceiverIOReadsInStreamingWithOffset(SparkReceiverIOIT.java:337)

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest

1 test completed, 1 failed
Finished generating test XML results (0.028 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.034 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest>

> Task :sdks:java:io:sparkreceiver:2:integrationTest FAILED
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread 
3,5,main]) completed. Took 34 mins 49.111 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:sparkreceiver:2:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 35m 25s
145 actionable tasks: 86 executed, 57 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/y6g5yupdq3o32

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure


---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]

