See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/352/display/redirect>

Changes:


------------------------------------------
[...truncated 386.20 KB...]
:sdks:java:io:synthetic:jar (Thread[Execution worker Thread 2,5,main]) started.

> Task :sdks:java:io:synthetic:jar
Caching disabled for task ':sdks:java:io:synthetic:jar' because:
  Not worth caching
Task ':sdks:java:io:synthetic:jar' is not up-to-date because:
  No history is available.
file or directory 
'<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/synthetic/build/resources/main>', 
not found
:sdks:java:io:synthetic:jar (Thread[Execution worker Thread 2,5,main]) completed. 
Took 0.012 secs.
work action resolve beam-sdks-java-io-synthetic.jar (project 
:sdks:java:io:synthetic) (Thread[Execution worker,5,main]) started.
work action null (Thread[Execution worker,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:sparkreceiver:2:compileJava
Stored cache entry for task ':sdks:java:io:sparkreceiver:2:compileJava' with 
cache key fb3d3f4c528894260234bc461a6e0a9c
:sdks:java:io:sparkreceiver:2:compileJava (Thread[Execution worker Thread 
7,5,main]) completed. Took 13.341 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:classes (Thread[Execution 
worker Thread 4,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:classes (Thread[Execution 
worker Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:classes (Thread[Execution worker,5,main]) started.

> Task :sdks:java:io:sparkreceiver:2:classes
Skipping task ':sdks:java:io:sparkreceiver:2:classes' as it has no actions.
:sdks:java:io:sparkreceiver:2:classes (Thread[Execution worker,5,main]) 
completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:compileTestJava 
(Thread[Execution worker Thread 5,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:compileTestJava 
(Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:compileTestJava (Thread[Execution worker Thread 
4,5,main]) started.

> Task :sdks:java:io:sparkreceiver:2:compileTestJava
Custom actions are attached to task 
':sdks:java:io:sparkreceiver:2:compileTestJava'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:compileTestJava' is 
344983fb258e27e504ed797abf42f885
Task ':sdks:java:io:sparkreceiver:2:compileTestJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task 
':sdks:java:io:sparkreceiver:2:compileTestJava'.
Full recompilation is required because no incremental change information is 
available. This is usually caused by clean builds or changing compiler 
arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
Compiling with JDK Java compiler API.
Class dependency analysis for incremental compilation took 0.043 secs.
Created classpath snapshot for incremental compilation in 0.307 secs.
Stored cache entry for task ':sdks:java:io:sparkreceiver:2:compileTestJava' 
with cache key 344983fb258e27e504ed797abf42f885
:sdks:java:io:sparkreceiver:2:compileTestJava (Thread[Execution worker Thread 
4,5,main]) completed. Took 3.485 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:testClasses 
(Thread[Execution worker Thread 5,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:testClasses 
(Thread[Execution worker Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:testClasses (Thread[Execution worker Thread 
2,5,main]) started.

> Task :sdks:java:io:sparkreceiver:2:testClasses
Skipping task ':sdks:java:io:sparkreceiver:2:testClasses' as it has no actions.
:sdks:java:io:sparkreceiver:2:testClasses (Thread[Execution worker Thread 
2,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest 
(Thread[Execution worker Thread 6,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest 
(Thread[Execution worker Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution worker Thread 
5,5,main]) started.
producer locations for task group 0 (Thread[Execution worker Thread 6,5,main]) 
started.
producer locations for task group 0 (Thread[Execution worker Thread 6,5,main]) 
completed. Took 0.0 secs.
Gradle Test Executor 3 started executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest
Custom actions are attached to task 
':sdks:java:io:sparkreceiver:2:integrationTest'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:integrationTest' is 
4d756da66374261468d9f1256b8afbca
Task ':sdks:java:io:sparkreceiver:2:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 3'. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2>
 Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java 
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"5000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=sparkreceiverioit_results","--influxMeasurement=sparkreceiverioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--rabbitMqBootstrapServerAddress=amqp://guest:[email protected]:5672","--streamName=rabbitMqTestStream","--readTimeout=1800","--numWorkers=5","--autoscalingAlgorithm=NONE","--workerHarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.48.0-SNAPSHOT.jar>","--region=us-central1"]
 
-Djava.security.manager=worker.org.gradle.process.internal.worker.child.BootstrapSecurityManager
 
-Dorg.gradle.internal.worker.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/tmp/integrationTest/work>
 -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US 
-Duser.language=en -Duser.variant -ea -cp 
/home/jenkins/.gradle/caches/7.5.1/workerMain/gradle-worker.jar 
worker.org.gradle.process.internal.worker.GradleWorkerMain 'Gradle Test Executor 3'
Successfully started process 'Gradle Test Executor 3'
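
The --sourceOptions JSON in the command above is what sizes this run: 5000000 
records with 1-byte keys and 90-byte values, produced by the 
sdks:java:io:synthetic module built earlier in this log. A minimal sketch of 
how such a JSON string maps onto the synthetic source configuration, assuming 
the SyntheticOptions.fromJsonString helper and the public fields of 
SyntheticSourceOptions (an approximation, not necessarily the test's exact 
wiring):

    import org.apache.beam.sdk.io.synthetic.SyntheticOptions;
    import org.apache.beam.sdk.io.synthetic.SyntheticSourceOptions;

    public class SourceOptionsSketch {
      public static void main(String[] args) throws Exception {
        // Same JSON as the --sourceOptions value in the command line above.
        String json =
            "{\"numRecords\":\"5000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}";
        SyntheticSourceOptions options =
            SyntheticOptions.fromJsonString(json, SyntheticSourceOptions.class);
        // 5000000 records of 1-byte keys + 90-byte values: the count the
        // test asserts on at the end of this log.
        System.out.println(options.numRecords);
      }
    }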

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in 
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in 
[jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-legacy-worker-2.48.0-SNAPSHOT.jar>!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an 
explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > 
testSparkReceiverIOReadsInStreamingWithOffset STANDARD_ERROR
    [Test worker] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - 
5000000 records were successfully written to RabbitMQ
    [Test worker] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO - 
ReadFromSparkReceiverWithOffsetDoFn started reading
    [Test worker] WARN org.apache.beam.runners.dataflow.DataflowRunner - Prefer 
--sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
    [Test worker] INFO 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory - 
No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    [Test worker] WARN 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer - Request 
failed with code 409, performed 0 retries due to IOExceptions, performed 0 
retries due to unsuccessful status codes, HTTP framework says request can be 
retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    [Test worker] INFO 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 - No stagingLocation provided, falling back to gcpTempLocation
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath: will stage 428 files. Enable logging at DEBUG level to see which 
files will be staged.
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - 
Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    [Test worker] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading 429 files from PipelineOptions.filesToStage to staging location to 
prepare for execution.
    [pool-8-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Staging custom dataflow-worker.jar as 
beam-runners-google-cloud-dataflow-java-legacy-worker-2.48.0-SNAPSHOT-MMCuEYwPYdGiPxjVjXX9rQvJSsVVzLbx43buGdMPrSw.jar
    [pool-8-thread-22] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading /tmp/main8829966533648966662.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-4dFL6nXBJlwDP_025ytxKgcD2bgQGXkVLcGjPpvQGmM.jar
    [pool-8-thread-18] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading /tmp/test2072630998877454857.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-O8-_h9PAwWcivB1HhAQkWeoetEugtOsDFTTne9lXjBM.jar
    [pool-8-thread-7] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading /tmp/test5728466884203300756.zip to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-LiNs7MmSaDfj2XyK3X8mYvFzwFi9AEIxGKZ70AnxohM.jar
    [Test worker] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Staging files complete: 426 files cached, 3 files newly uploaded in 0 seconds
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - Staging 
portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    [pool-15-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil - 
Uploading <161276 bytes, hash 
4fe3228e35cf48d7b38057cf1ae0a54d74f0091d9b0c27cddb06de3e8679e740> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-T-MijjXPSNezgFfPGuClTXTwCR2bDCfN2wbePoZ550A.pb
    [Test worker] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse as step 
s1
    [Test worker] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction as step s2
    [Test worker] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction as step s3
    [Test worker] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows as step s4
    [Test worker] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map as step s5
    [Test worker] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/ProcessKeyedElements
 as step s6
    [Test worker] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Measure 
read time as step s7
    [Test worker] INFO 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Counting 
element as step s8
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - Dataflow 
SDK version: 2.48.0-SNAPSHOT
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - To 
access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-06_03_41_57-6965879967075332291?project=apache-beam-testing
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - 
Submitted job: 2023-04-06_03_41_57-6965879967075332291
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowRunner - To 
cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2023-04-06_03_41_57-6965879967075332291
    [Test worker] WARN 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:01.186Z: The workflow name is not a valid Cloud Label. Labels 
applied to Cloud resources (such as GCE Instances) for monitoring will be 
labeled with this modified job name: 
sparkreceiverioit0testsparkreceiverioreadsinstreamingwitho-ttl3. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:20.703Z: Worker configuration: e2-standard-4 in us-central1-b.
    [Test worker] WARN 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:22.884Z: Querying the configuration of Pub/Sub subscription 
_starting_signal/ failed.  If the ack deadline on the subscription has been 
changed from the default value, this may affect performance.  Also, this 
prevents Dataflow from checking for unsupported settings on the subscription.  
See 
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub
 for further details.  If the error is not correctable, the job may be updated 
with a valid Pub/Sub configuration.  Specific error: INVALID_ARGUMENT:  
[type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto]
 { stack_top_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 592 } address: 94436958225195 address: 94437139146817 address: 
94437139125592 address: 94437139113460 address: 94437139116907 address: 
94437139557703 address: 94437085050162 address: 94437101928934 address: 
94437018591173 address: 139809065469913 address: 139809064900255 } 
[dist_proc.dax.internal.TrailProto] { trail_point { source_file_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 592 } } } [dist_proc.dax.MessageCode] { argument { string_val: 
"_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid 
resource name given (name=_starting_signal/). Refer to 
https://cloud.google.com/pubsub/docs/admin#resource_names for more 
information." key: "error" } origin_id: 3403797184338075933 
[dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED 
}']
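
The warning above is cosmetic: _starting_signal/ is an internal Dataflow 
placeholder rather than a real subscription, and it fails Pub/Sub's 
resource-name rules (3-255 characters, must start with a letter, only 
letters, digits and -._~%+ allowed, must not begin with "goog"). A quick 
sketch of that check, with the regex written from the linked documentation 
rather than taken from any official validator, so treat it as approximate:

    import java.util.regex.Pattern;

    public class PubsubNameCheck {
      // Approximation of the rules documented at
      // https://cloud.google.com/pubsub/docs/admin#resource_names
      private static final Pattern VALID =
          Pattern.compile("[a-zA-Z][a-zA-Z0-9\\-._~%+]{2,254}");

      public static boolean isValid(String name) {
        return VALID.matcher(name).matches()
            && !name.toLowerCase().startsWith("goog");
      }

      public static void main(String[] args) {
        System.out.println(isValid("rabbitMqTestStream")); // true
        System.out.println(isValid("_starting_signal/"));  // false: leading '_' and trailing '/'
      }
    }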
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:22.953Z: Expanding CoGroupByKey operations into optimizable 
parts.
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:22.985Z: Expanding SplittableProcessKeyed operations into 
optimizable parts.
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.021Z: Expanding GroupByKey operations into streaming 
Read/Write steps
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.085Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.157Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.185Z: Fusing consumer Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction into Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.201Z: Fusing consumer Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction into Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.225Z: Fusing consumer Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows into Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.254Z: Fusing consumer Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map into Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.288Z: Fusing consumer s6/GroupByKeyRaw/WriteStream into 
Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.321Z: Fusing consumer s6/SplittableProcess into 
s6/GroupByKeyRaw/ReadStream
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.346Z: Fusing consumer Measure read time into 
s6/SplittableProcess
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.375Z: Fusing consumer Counting element into Measure read 
time
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.709Z: Executing operation 
s6/GroupByKeyRaw/ReadStream+s6/SplittableProcess+Measure read time+Counting 
element
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:23.777Z: Starting 5 workers in us-central1-b...
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:24.845Z: Executing operation Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse+Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map+s6/GroupByKeyRaw/WriteStream
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:42:43.347Z: Your project already contains 100 Dataflow-created 
metric descriptors, so new user metrics of the form custom.googleapis.com/* 
will not be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:43:07.244Z: Autoscaling: Raised the number of workers to 5 so that 
the pipeline can catch up with its backlog and keep up with its input rate.
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:43:43.413Z: Workers have started successfully.
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:43:44.678Z: All workers have finished the startup processes and 
began to receive work requests.
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:44:44.688Z: java.lang.IllegalArgumentException: Trying to claim 
offset 88166 before start of the range [106150, 9223372036854775807)
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:97)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
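
Every one of the errors below comes from the same place: the SDK's 
OffsetRangeTracker enforces two preconditions on tryClaim -- a claim may not 
fall below the start of the restriction (OffsetRangeTracker.java:97, the 
variant above) and successive claims must be strictly increasing 
(OffsetRangeTracker.java:92, the variant further down). A minimal sketch that 
reproduces both, using offsets copied from this log purely for illustration:

    import org.apache.beam.sdk.io.range.OffsetRange;
    import org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker;

    public class TryClaimSketch {
      public static void main(String[] args) {
        // Same restriction shape as in the log: [106150, Long.MAX_VALUE).
        OffsetRangeTracker tracker =
            new OffsetRangeTracker(new OffsetRange(106150L, Long.MAX_VALUE));

        // A first claim below the range start fails with
        // "Trying to claim offset 88166 before start of the range ...".
        try {
          tracker.tryClaim(88166L);
        } catch (IllegalArgumentException e) {
          System.out.println(e.getMessage());
        }

        // Claims must also strictly increase: after a successful claim of
        // 705615, claiming 621716 fails with
        // "Trying to claim offset 621716 while last attempted was 705615".
        tracker.tryClaim(705615L);
        try {
          tracker.tryClaim(621716L);
        } catch (IllegalArgumentException e) {
          System.out.println(e.getMessage());
        }
      }
    }

In short, the DoFn is being handed offsets the tracker has already moved 
past, so those work items fail and are retried, which appears to be the 
proximate cause of the shortfall in the final assertion below.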
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:45:04.779Z: java.lang.IllegalArgumentException: Trying to claim 
offset 334716 before start of the range [374266, 9223372036854775807)
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:97)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:45:26.887Z: java.lang.IllegalArgumentException: Trying to claim 
offset 621716 while last attempted was 705615
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:45:57.019Z: java.lang.IllegalArgumentException: Trying to claim 
offset 984888 while last attempted was 1045515
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:46:17.102Z: java.lang.IllegalArgumentException: Trying to claim 
offset 1259388 while last attempted was 1324787
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:46:37.216Z: java.lang.IllegalArgumentException: Trying to claim 
offset 1536338 while last attempted was 1590287
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:46:57.293Z: java.lang.IllegalArgumentException: Trying to claim 
offset 1814038 while last attempted was 1845587
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:47:17.384Z: java.lang.IllegalArgumentException: Trying to claim 
offset 2053038 while last attempted was 2102287
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:47:37.465Z: java.lang.IllegalArgumentException: Trying to claim 
offset 2342038 while last attempted was 2375587
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:47:57.560Z: java.lang.IllegalArgumentException: Trying to claim 
offset 2660338 while last attempted was 2677087
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:48:47.721Z: java.lang.IllegalArgumentException: Trying to claim 
offset 3237928 while last attempted was 3246737
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] ERROR 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:50:58.082Z: java.lang.IllegalArgumentException: Trying to claim 
offset 4872978 while last attempted was 4929977
            
org.apache.beam.vendor.guava.v26_0_jre.com.google.common.base.Preconditions.checkArgument(Preconditions.java:440)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:92)
            
org.apache.beam.sdk.transforms.splittabledofn.OffsetRangeTracker.tryClaim(OffsetRangeTracker.java:38)
            
org.apache.beam.runners.dataflow.worker.repackaged.org.apache.beam.sdk.fn.splittabledofn.RestrictionTrackers$RestrictionTrackerObserver.tryClaim(RestrictionTrackers.java:59)
            
org.apache.beam.sdk.io.sparkreceiver.ReadFromSparkReceiverWithOffsetDoFn.processElement(ReadFromSparkReceiverWithOffsetDoFn.java:328)
    [Test worker] WARN 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T10:54:40.014Z: Querying the configuration of Pub/Sub subscription 
_starting_signal/ failed.  If the ack deadline on the subscription has been 
changed from the default value, this may affect performance.  Also, this 
prevents Dataflow from checking for unsupported settings on the subscription.  
See 
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub
 for further details.  If the error is not correctable, the job may be updated 
with a valid Pub/Sub configuration.  Specific error: INVALID_ARGUMENT:  
[type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto]
 { stack_top_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 592 } address: 94436958225195 address: 94437139146817 address: 
94437139125592 address: 94437139113460 address: 94437139116907 address: 
94437139557703 address: 94437139107812 address: 94437018593602 address: 
139809065469913 address: 139809064900255 } [dist_proc.dax.internal.TrailProto] 
{ trail_point { source_file_loc { filepath: 
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" 
line: 592 } } } [dist_proc.dax.MessageCode] { argument { string_val: 
"_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid 
resource name given (name=_starting_signal/). Refer to 
https://cloud.google.com/pubsub/docs/admin#resource_names for more 
information." key: "error" } origin_id: 14937997910113353838 
[dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED 
}']
    [Test worker] WARN org.apache.beam.sdk.metrics.MetricsEnvironment - Reporting 
metrics are not supported in the current execution environment.
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T11:03:19.119Z: Cancel request is committed for workflow job: 
2023-04-06_03_41_57-6965879967075332291.
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T11:03:19.142Z: Finished operation Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse+Read from 
unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
 with initial restriction+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
 restriction+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
 windows+Read from unbounded 
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
 unique key/AddKeys/Map+s6/GroupByKeyRaw/WriteStream
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T11:03:19.143Z: Finished operation 
s6/GroupByKeyRaw/ReadStream+s6/SplittableProcess+Measure read time+Counting 
element
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T11:03:19.236Z: Cleaning up.
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T11:03:19.323Z: Stopping worker pool...
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T11:05:31.770Z: Autoscaling: Reduced the number of workers to 0 based 
on low average worker CPU utilization, and the pipeline having sufficiently low 
backlog and keeping up with input rate.
    [Test worker] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2023-04-06T11:05:31.824Z: Worker pool stopped.
    [Test worker] INFO org.apache.beam.runners.dataflow.DataflowPipelineJob - Job 
2023-04-06_03_41_57-6965879967075332291 finished with status CANCELLED.

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > 
testSparkReceiverIOReadsInStreamingWithOffset FAILED
    java.lang.AssertionError: expected:<5000000> but was:<307324>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at 
org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.testSparkReceiverIOReadsInStreamingWithOffset(SparkReceiverIOIT.java:337)
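
The test wrote 5000000 records to RabbitMQ (first log line of the test) but 
only 307324 were counted back before the pipeline was cancelled, which is 
consistent with the tryClaim failures above discarding and retrying read work 
items. The exact metric names SparkReceiverIOIT uses are not visible in this 
log, so the following is only a sketch of how such a count is typically read 
back from a PipelineResult with Beam's metrics API (the namespace and counter 
names here are hypothetical):

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;

    public class ReadCounterSketch {
      // Sums the attempted values of a named counter across all steps.
      static long readCounter(PipelineResult result, String namespace, String name) {
        MetricQueryResults metrics =
            result.metrics()
                .queryMetrics(
                    MetricsFilter.builder()
                        .addNameFilter(MetricNameFilter.named(namespace, name))
                        .build());
        long total = 0;
        for (MetricResult<Long> counter : metrics.getCounters()) {
          total += counter.getAttempted();
        }
        return total;
      }

      // Hypothetical use, mirroring the failing assertion:
      // assertEquals(5_000_000L, readCounter(result, "sparkreceiverioit", "counting_element"));
    }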

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.037 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest>
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution worker Thread 
5,5,main]) completed. Took 28 mins 20.991 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:sparkreceiver:2:integrationTest'.
> There were failing tests. See the report at: 
> <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 29m 4s
145 actionable tasks: 89 executed, 54 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/6d63jrhmiy2uq

Stopped 2 worker daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

