See
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/370/display/redirect>
Changes:
------------------------------------------
[...truncated 356.58 KB...]
No history is available.
Loaded cache entry for task
':runners:google-cloud-dataflow-java:****:shadowJar' with cache key
172d6db0e53ce55573e7f83f28df593e
:runners:google-cloud-dataflow-java:****:shadowJar (Thread[included
builds,5,main]) completed. Took 0.459 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar
(project :runners:google-cloud-dataflow-java:****) (Thread[Execution
****,5,main]) started.
work action null (Thread[Execution ****,5,main]) completed. Took 0.0 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar
(project :runners:google-cloud-dataflow-java:****) (Thread[Execution ****
Thread 7,5,main]) started.
work action null (Thread[Execution **** Thread 7,5,main]) completed. Took 0.0
secs.
> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Custom actions are attached to task
':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava'
is 81ac90136f4e0d1611237ab9102dde1f
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date
because:
No history is available.
Loaded cache entry for task
':sdks:java:io:google-cloud-platform:compileTestJava' with cache key
81ac90136f4e0d1611237ab9102dde1f
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution ****
Thread 5,5,main]) completed. Took 1.419 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses
(Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses
(Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread
2,5,main]) started.
> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no
actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread
2,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar
(Thread[Execution **** Thread 2,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar
(Thread[Execution **** Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread
4,5,main]) started.
> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
Not worth caching
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread
4,5,main]) completed. Took 0.421 secs.
work action resolve beam-sdks-java-io-google-cloud-platform-tests.jar (project
:sdks:java:io:google-cloud-platform) (Thread[Execution **** Thread 3,5,main])
started.
work action null (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0
secs.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava
(Thread[Execution **** Thread 2,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava
(Thread[Execution **** Thread 2,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution ****
Thread 5,5,main]) started.
This JVM does not support getting OS memory, so no OS memory status updates
will be broadcast
> Task :runners:google-cloud-dataflow-java:compileTestJava
Custom actions are attached to task
':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava'
is 95d62ac4b18d1b1f58cd82d03fd1bcdf
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date
because:
No history is available.
The input changes require a full rebuild for incremental task
':runners:google-cloud-dataflow-java:compileTestJava'.
Full recompilation is required because no incremental change information is
available. This is usually caused by clean builds or changing compiler
arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
Starting process 'Gradle Worker Daemon 2'. Working directory:
/home/jenkins/.gradle/****s Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
-Xbootclasspath/p:/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.errorprone/javac/9+181-r4173-1/bdf4c0aa7d540ee1f7bf14d47447aea4bbf450c5/javac-9+181-r4173-1.jar
-Xmx512m -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en
-Duser.variant -cp /home/jenkins/.gradle/caches/7.5.1/****Main/gradle-****.jar
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Worker Daemon 2'
Successfully started process 'Gradle Worker Daemon 2'
Started Gradle **** daemon (0.382 secs) with fork options
DaemonForkOptions{executable=/usr/lib/jvm/java-8-openjdk-amd64/bin/java,
minHeapSize=null, maxHeapSize=null,
jvmArgs=[-Xbootclasspath/p:/home/jenkins/.gradle/caches/modules-2/files-2.1/com.google.errorprone/javac/9+181-r4173-1/bdf4c0aa7d540ee1f7bf14d47447aea4bbf450c5/javac-9+181-r4173-1.jar],
keepAliveMode=SESSION}.
Compiling with JDK Java compiler API.
> Task :sdks:java:io:sparkreceiver:2:compileJava
Class dependency analysis for incremental compilation took 0.079 secs.
Created classpath snapshot for incremental compilation in 0.716 secs.
Stored cache entry for task ':sdks:java:io:sparkreceiver:2:compileJava' with
cache key 106fab98282da7d36d385add4e5c5eac
:sdks:java:io:sparkreceiver:2:compileJava (Thread[Execution **** Thread
6,5,main]) completed. Took 11.224 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:classes (Thread[Execution
**** Thread 7,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:classes (Thread[Execution
**** Thread 7,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:classes (Thread[Execution **** Thread 7,5,main])
started.
> Task :sdks:java:io:sparkreceiver:2:classes
Skipping task ':sdks:java:io:sparkreceiver:2:classes' as it has no actions.
:sdks:java:io:sparkreceiver:2:classes (Thread[Execution **** Thread 7,5,main])
completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:compileTestJava
(Thread[Execution **** Thread 7,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:compileTestJava
(Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:compileTestJava (Thread[Execution **** Thread
7,5,main]) started.
> Task :sdks:java:io:sparkreceiver:2:compileTestJava
Custom actions are attached to task
':sdks:java:io:sparkreceiver:2:compileTestJava'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:compileTestJava' is
f0f350724f16fe5ea1d56e91e42c21bb
Task ':sdks:java:io:sparkreceiver:2:compileTestJava' is not up-to-date because:
No history is available.
The input changes require a full rebuild for incremental task
':sdks:java:io:sparkreceiver:2:compileTestJava'.
Full recompilation is required because no incremental change information is
available. This is usually caused by clean builds or changing compiler
arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
Compiling with JDK Java compiler API.
Class dependency analysis for incremental compilation took 0.03 secs.
Created classpath snapshot for incremental compilation in 0.298 secs.
Stored cache entry for task ':sdks:java:io:sparkreceiver:2:compileTestJava'
with cache key f0f350724f16fe5ea1d56e91e42c21bb
:sdks:java:io:sparkreceiver:2:compileTestJava (Thread[Execution **** Thread
7,5,main]) completed. Took 3.992 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:testClasses
(Thread[Execution **** Thread 2,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:testClasses
(Thread[Execution **** Thread 2,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:testClasses (Thread[Execution **** Thread
6,5,main]) started.
> Task :sdks:java:io:sparkreceiver:2:testClasses
Skipping task ':sdks:java:io:sparkreceiver:2:testClasses' as it has no actions.
:sdks:java:io:sparkreceiver:2:testClasses (Thread[Execution **** Thread
6,5,main]) completed. Took 0.0 secs.
> Task :runners:google-cloud-dataflow-java:compileTestJava
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Class dependency analysis for incremental compilation took 0.286 secs.
Created classpath snapshot for incremental compilation in 0.369 secs.
Stored cache entry for task
':runners:google-cloud-dataflow-java:compileTestJava' with cache key
95d62ac4b18d1b1f58cd82d03fd1bcdf
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution ****
Thread 5,5,main]) completed. Took 12.995 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses
(Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses
(Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread
3,5,main]) started.
> Task :runners:google-cloud-dataflow-java:testClasses
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no
actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread
3,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar
(Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar
(Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread
4,5,main]) started.
> Task :runners:google-cloud-dataflow-java:testJar
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
Not worth caching
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
No history is available.
file or directory
'<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test>',
not found
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread
4,5,main]) completed. Took 0.031 secs.
work action resolve beam-runners-google-cloud-dataflow-java-tests.jar (project
:runners:google-cloud-dataflow-java) (Thread[Execution **** Thread 3,5,main])
started.
work action null (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0
secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest
(Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest
(Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread
4,5,main]) started.
producer locations for task group 0 (Thread[Execution **** Thread 5,5,main])
started.
producer locations for task group 0 (Thread[Execution **** Thread 5,5,main])
completed. Took 0.0 secs.
Gradle Test Executor 3 started executing tests.
> Task :sdks:java:io:sparkreceiver:2:integrationTest
Custom actions are attached to task
':sdks:java:io:sparkreceiver:2:integrationTest'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:integrationTest' is
00d710d8cd22f77e983509844b726dee
Task ':sdks:java:io:sparkreceiver:2:integrationTest' is not up-to-date because:
Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 3'. Working directory:
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2>
Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java
-DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"5000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=sparkreceiverioit_results","--influxMeasurement=sparkreceiverioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--rabbitMqBootstrapServerAddress=amqp://guest:[email protected]:5672","--streamName=rabbitMqTestStream","--readTimeout=1800","--numWorkers=5","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.48.0-SNAPSHOT.jar>","--region=us-central1"]
-Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager
-Dorg.gradle.internal.****.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/tmp/integrationTest/work>
-Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US
-Duser.language=en -Duser.variant -ea -cp
/home/jenkins/.gradle/caches/7.5.1/****Main/gradle-****.jar
****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 3'
Successfully started process 'Gradle Test Executor 3'
org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in
[jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in
[jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.48.0-SNAPSHOT.jar>!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an
explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]
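The "Actual binding" line above shows which of the two bindings SLF4J actually selected. A minimal way to confirm this programmatically (valid only for SLF4J 1.x, as on this classpath with 1.7.30; StaticLoggerBinder was removed in SLF4J 2.x):

    // Prints the logger factory class of the binding SLF4J selected at runtime,
    // e.g. org.slf4j.impl.SimpleLoggerFactory when slf4j-simple wins.
    public class Slf4jBindingCheck {
      public static void main(String[] args) {
        System.out.println(
            org.slf4j.impl.StaticLoggerBinder.getSingleton().getLoggerFactoryClassStr());
      }
    }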
org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT >
testSparkReceiverIOReadsInStreamingWithOffset STANDARD_ERROR
[Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT -
5000000 records were successfully written to RabbitMQ
[Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO -
ReadFromSparkReceiverWithOffsetDoFn started reading
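For context, the read under test goes through Beam's SparkReceiverIO, which wraps a Spark Receiver in a splittable DoFn (the ReadFromSparkReceiverWithOffsetDoFn above). A minimal sketch of such a read, with a stub receiver standing in for the test's RabbitMQ receiver and an illustrative offset function (both are assumptions, not the IT's actual code):

    import org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder;
    import org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO;
    import org.apache.spark.storage.StorageLevel;
    import org.apache.spark.streaming.receiver.Receiver;

    /** Stand-in for the test's RabbitMQ receiver; connection logic omitted. */
    class StubReceiver extends Receiver<String> {
      StubReceiver() {
        super(StorageLevel.MEMORY_AND_DISK());
      }

      @Override
      public void onStart() {
        // Connect to the source and call store(record) for each message.
      }

      @Override
      public void onStop() {
        // Close the connection.
      }
    }

    class ReadSketch {
      static SparkReceiverIO.Read<String> buildRead() {
        // Records must expose a monotonically increasing offset so the SDF can
        // track restrictions; treating the record itself as its offset is
        // illustrative only.
        ReceiverBuilder<String, StubReceiver> builder =
            new ReceiverBuilder<>(StubReceiver.class).withConstructorArgs();
        return SparkReceiverIO.<String>read()
            .withGetOffsetFn(Long::valueOf)
            .withSparkReceiverBuilder(builder);
      }
    }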
[Test ****] WARN org.apache.beam.runners.dataflow.DataflowRunner - Prefer
--sdkContainerImage over deprecated legacy option --****HarnessContainerImage.
[Test ****] INFO
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory -
No tempLocation specified, attempting to use default bucket:
dataflow-staging-us-central1-844138762903
[Test ****] WARN
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer - Request
failed with code 409, performed 0 retries due to IOExceptions, performed 0
retries due to unsuccessful status codes, HTTP framework says request can be
retried, (caller responsible for retrying):
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
[Test ****] INFO
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
- No stagingLocation provided, falling back to gcpTempLocation
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner -
PipelineOptions.filesToStage was not specified. Defaulting to files from the
classpath: will stage 428 files. Enable logging at DEBUG level to see which
files will be staged.
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner -
Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
[Test ****] INFO org.apache.beam.runners.dataflow.util.PackageUtil -
Uploading 429 files from PipelineOptions.filesToStage to staging location to
prepare for execution.
[pool-8-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil -
Staging custom dataflow-****.jar as
beam-runners-google-cloud-dataflow-java-legacy-****-2.48.0-SNAPSHOT-MMCuEYwPYdGiPxjVjXX9rQvJSsVVzLbx43buGdMPrSw.jar
[pool-8-thread-21] INFO org.apache.beam.runners.dataflow.util.PackageUtil -
Uploading /tmp/test4404503858427917633.zip to
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-wMOfhiKWbWUzM4C-iBM_F6U9_eLCaZfOD0EDDT3_M3c.jar
[pool-8-thread-20] INFO org.apache.beam.runners.dataflow.util.PackageUtil -
Uploading /tmp/main2265577627382573694.zip to
gs://dataflow-staging-us-central1-844138762903/temp/staging/main-MKofF4soHKEmv-gg-s50hghDdW8uACvmxLjS8buwT9U.jar
[pool-8-thread-18] INFO org.apache.beam.runners.dataflow.util.PackageUtil -
Uploading /tmp/test7394440248743383075.zip to
gs://dataflow-staging-us-central1-844138762903/temp/staging/test-DDPw0kuOIcEcdZxAQDMS1OaUdF_em7uMeLrGT1bV_8w.jar
[Test ****] INFO org.apache.beam.runners.dataflow.util.PackageUtil -
Staging files complete: 426 files cached, 3 files newly uploaded in 0 seconds
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Staging
portable pipeline proto to
gs://dataflow-staging-us-central1-844138762903/temp/staging/
[pool-15-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil -
Uploading <161323 bytes, hash
9c9f79105974f6e8ce751f993a28c4d355f582d1d9c9f973e34dba41baa28ecb> to
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-nJ95EFl09ujOdR-ZOijE01X1gtHZyflz4026Qbqijss.pb
[Test ****] INFO
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from
unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse as step
s1
[Test ****] INFO
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from
unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
with initial restriction as step s2
[Test ****] INFO
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from
unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
restriction as step s3
[Test ****] INFO
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from
unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
windows as step s4
[Test ****] INFO
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from
unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
unique key/AddKeys/Map as step s5
[Test ****] INFO
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from
unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/ProcessKeyedElements
as step s6
[Test ****] INFO
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Measure
read time as step s7
[Test ****] INFO
org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Counting
element as step s8
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Dataflow
SDK version: 2.48.0-SNAPSHOT
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - To
access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-12_04_55_37-1482351204628370748?project=apache-beam-testing
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner -
Submitted job: 2023-04-12_04_55_37-1482351204628370748
[Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - To
cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2023-04-12_04_55_37-1482351204628370748
[Test ****] WARN
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:55:41.399Z: The workflow name is not a valid Cloud Label. Labels
applied to Cloud resources (such as GCE Instances) for monitoring will be
labeled with this modified job name:
sparkreceiverioit0testsparkreceiverioreadsinstreamingwitho-wk7u. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:55:58.154Z: Worker configuration: e2-standard-4 in us-central1-b.
[Test ****] WARN
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.416Z: Querying the configuration of Pub/Sub subscription
_starting_signal/ failed. If the ack deadline on the subscription has been
changed from the default value, this may affect performance. Also, this
prevents Dataflow from checking for unsupported settings on the subscription.
See
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub
for further details. If the error is not correctable, the job may be updated
with a valid Pub/Sub configuration. Specific error: INVALID_ARGUMENT:
[type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto]
{ stack_top_loc { filepath:
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc"
line: 592 } address: 94235765623985 address: 94235765603983 address:
94235765591590 address: 94235765595211 address: 94235766052738 address:
94235718774577 address: 94235650880854 address: 94235727427033 address:
94235643674270 address: 139910964234201 address: 139910963664543 }
[dist_proc.dax.internal.TrailProto] { trail_point { source_file_loc { filepath:
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc"
line: 592 } } } [dist_proc.dax.MessageCode] { argument { string_val:
"_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid
resource name given (name=_starting_signal/). Refer to
https://cloud.google.com/pubsub/docs/admin#resource_names for more
information." key: "error" } origin_id: 14023884861357736677
[dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED
}']
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.480Z: Expanding CoGroupByKey operations into optimizable
parts.
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.509Z: Expanding SplittableProcessKeyed operations into
optimizable parts.
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.537Z: Expanding GroupByKey operations into streaming
Read/Write steps
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.592Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.673Z: Fusing adjacent ParDo, Read, Write, and Flatten
operations
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.698Z: Fusing consumer Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
with initial restriction into Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.730Z: Fusing consumer Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
restriction into Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
with initial restriction
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.764Z: Fusing consumer Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
windows into Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
restriction
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.798Z: Fusing consumer Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
unique key/AddKeys/Map into Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
windows
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.821Z: Fusing consumer s6/GroupByKeyRaw/WriteStream into
Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
unique key/AddKeys/Map
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.852Z: Fusing consumer s6/SplittableProcess into
s6/GroupByKeyRaw/ReadStream
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.885Z: Fusing consumer Measure read time into
s6/SplittableProcess
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:00.916Z: Fusing consumer Counting element into Measure read
time
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:01.279Z: Executing operation
s6/GroupByKeyRaw/ReadStream+s6/SplittableProcess+Measure read time+Counting
element
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:01.355Z: Starting 5 ****s in us-central1-b...
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:02.415Z: Executing operation Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse+Read from
unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
with initial restriction+Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
restriction+Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
windows+Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
unique key/AddKeys/Map+s6/GroupByKeyRaw/WriteStream
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:03.046Z: Your project already contains 100 Dataflow-created
metric descriptors, so new user metrics of the form custom.googleapis.com/*
will not be created. However, all user metrics are also available in the metric
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics,
you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
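If the descriptor quota actually needs freeing, the listing and deleting suggested above can be scripted; a hedged sketch with the google-cloud-monitoring Java client (project ID taken from this job's options; the selection rule is illustrative, and the delete call is left commented out deliberately):

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ProjectName;

    public class CustomMetricCleanup {
      public static void main(String[] args) throws Exception {
        try (MetricServiceClient client = MetricServiceClient.create()) {
          // Walk all metric descriptors and report the custom ones that count
          // against the limit mentioned in the log entry above.
          for (MetricDescriptor d :
              client
                  .listMetricDescriptors(ProjectName.of("apache-beam-testing"))
                  .iterateAll()) {
            if (d.getType().startsWith("custom.googleapis.com/")) {
              System.out.println("candidate for deletion: " + d.getName());
              // client.deleteMetricDescriptor(d.getName()); // uncomment after review
            }
          }
        }
      }
    }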
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:30.408Z: Autoscaling: Raised the number of ****s to 1 so that
the pipeline can catch up with its backlog and keep up with its input rate.
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:30.443Z: Autoscaling: Resized **** pool to 1, though goal was
5. This could be a quota issue.
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:56:40.052Z: Autoscaling: Raised the number of ****s to 5 so that
the pipeline can catch up with its backlog and keep up with its input rate.
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:57:08.478Z: Workers have started successfully.
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T11:57:11.253Z: All ****s have finished the startup processes and
began to receive work requests.
[Test ****] WARN
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T12:08:17.201Z: Querying the configuration of Pub/Sub subscription
_starting_signal/ failed. If the ack deadline on the subscription has been
changed from the default value, this may affect performance. Also, this
prevents Dataflow from checking for unsupported settings on the subscription.
See
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub
for further details. If the error is not correctable, the job may be updated
with a valid Pub/Sub configuration. Specific error: INVALID_ARGUMENT:
[type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto]
{ stack_top_loc { filepath:
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc"
line: 592 } address: 94235765623985 address: 94235765603983 address:
94235765591590 address: 94235765595211 address: 94235766052738 address:
94235765585188 address: 94235643676714 address: 139910964234201 address:
139910963664543 } [dist_proc.dax.internal.TrailProto] { trail_point {
source_file_loc { filepath:
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc"
line: 592 } } } [dist_proc.dax.MessageCode] { argument { string_val:
"_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid
resource name given (name=_starting_signal/). Refer to
https://cloud.google.com/pubsub/docs/admin#resource_names for more
information." key: "error" } origin_id: 2561395843697536624
[dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED
}']
[Test ****] WARN
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T12:20:15.976Z: Querying the configuration of Pub/Sub subscription
_starting_signal/ failed. If the ack deadline on the subscription has been
changed from the default value, this may affect performance. Also, this
prevents Dataflow from checking for unsupported settings on the subscription.
See
https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub
for further details. If the error is not correctable, the job may be updated
with a valid Pub/Sub configuration. Specific error: INVALID_ARGUMENT:
[type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto]
{ stack_top_loc { filepath:
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc"
line: 592 } address: 94235765623985 address: 94235765603983 address:
94235765591590 address: 94235765595211 address: 94235766052738 address:
94235765585188 address: 94235643676714 address: 139910964234201 address:
139910963664543 } [dist_proc.dax.internal.TrailProto] { trail_point {
source_file_loc { filepath:
"dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc"
line: 592 } } } [dist_proc.dax.MessageCode] { argument { string_val:
"_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid
resource name given (name=_starting_signal/). Refer to
https://cloud.google.com/pubsub/docs/admin#resource_names for more
information." key: "error" } origin_id: 2561395843697534977
[dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED
}']
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T12:24:13.920Z: Cancel request is committed for workflow job:
2023-04-12_04_55_37-1482351204628370748.
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T12:24:13.941Z: Finished operation
s6/GroupByKeyRaw/ReadStream+s6/SplittableProcess+Measure read time+Counting
element
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T12:24:13.941Z: Finished operation Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse+Read from
unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair
with initial restriction+Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split
restriction+Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode
windows+Read from unbounded
RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign
unique key/AddKeys/Map+s6/GroupByKeyRaw/WriteStream
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T12:24:14.002Z: Cleaning up.
[Test ****] INFO
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler -
2023-04-12T12:24:14.063Z: Stopping **** pool...
[Test ****] WARN org.apache.beam.runners.dataflow.DataflowPipelineJob - No
terminal state was returned within allotted timeout. State value RUNNING
org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT >
testSparkReceiverIOReadsInStreamingWithOffset FAILED
java.lang.AssertionError: expected:<5000000> but was:<1026114>
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failNotEquals(Assert.java:835)
at org.junit.Assert.assertEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:633)
at
org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.testSparkReceiverIOReadsInStreamingWithOffset(SparkReceiverIOIT.java:337)
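The failure itself is a plain record-count mismatch: only 1,026,114 of the 5,000,000 records written to RabbitMQ were read before the 1800-second --readTimeout configured above elapsed, after which waitUntilFinish returned with the job still RUNNING and the cancel request was issued. A hedged sketch of the pattern behind SparkReceiverIOIT.java:337, using Beam's metrics API to query the counter that the "Counting element" step increments (the namespace and counter name are assumptions):

    import static org.junit.Assert.assertEquals;

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricsFilter;
    import org.joda.time.Duration;

    class CountAssertionSketch {
      static void assertAllRecordsRead(PipelineResult result) {
        // May return while the job is still RUNNING, as happened here.
        result.waitUntilFinish(Duration.standardSeconds(1800));
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(
                        MetricNameFilter.named("SparkReceiverIOIT", "counting_element"))
                    .build());
        long readCount = metrics.getCounters().iterator().next().getAttempted();
        // Line 337 equivalent: expected:<5000000> but was:<1026114>.
        assertEquals(5_000_000L, readCount);
      }
    }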
Gradle Test Executor 3 finished executing tests.
> Task :sdks:java:io:sparkreceiver:2:integrationTest FAILED
1 test completed, 1 failed
Finished generating test XML results (0.024 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest>
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread
4,5,main]) completed. Took 35 mins 31.57 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:sparkreceiver:2:integrationTest'.
> There were failing tests. See the report at:
> <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest/index.html>
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings
and determine if they come from your own scripts or plugins.
See
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 36m 14s
145 actionable tasks: 89 executed, 54 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/esvizgwxrkxgg
Stopped 2 **** daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]