See <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/350/display/redirect>
Changes:

------------------------------------------
[...truncated 374.60 KB...]
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:compileJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:compileJava' is 77df19078b518be4acf405bd52d589dd
Task ':runners:google-cloud-dataflow-java:****:compileJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:****:compileJava' with cache key 77df19078b518be4acf405bd52d589dd
:runners:google-cloud-dataflow-java:****:compileJava (Thread[Execution **** Thread 4,5,main]) completed. Took 0.663 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:****:classes (Thread[included builds,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:****:classes (Thread[included builds,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:classes (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:classes
Skipping task ':runners:google-cloud-dataflow-java:****:classes' as it has no actions.
:runners:google-cloud-dataflow-java:****:classes (Thread[included builds,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:****:shadowJar (Thread[included builds,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:****:shadowJar (Thread[included builds,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:****:shadowJar (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:****:shadowJar FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:****:shadowJar'.
Build cache key for task ':runners:google-cloud-dataflow-java:****:shadowJar' is cad968b5130cde5ddffc3a7d0fccde4e
Task ':runners:google-cloud-dataflow-java:****:shadowJar' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:****:shadowJar' with cache key cad968b5130cde5ddffc3a7d0fccde4e
:runners:google-cloud-dataflow-java:****:shadowJar (Thread[included builds,5,main]) completed. Took 0.873 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar (project :runners:google-cloud-dataflow-java:****) (Thread[Execution **** Thread 3,5,main]) started.
work action null (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
work action resolve beam-runners-google-cloud-dataflow-java-legacy-****.jar (project :runners:google-cloud-dataflow-java:****) (Thread[Execution **** Thread 3,5,main]) started.
work action null (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:google-cloud-platform:compileTestJava FROM-CACHE
Custom actions are attached to task ':sdks:java:io:google-cloud-platform:compileTestJava'.
Build cache key for task ':sdks:java:io:google-cloud-platform:compileTestJava' is c27230c57cdde802291de1ac72b556e9
Task ':sdks:java:io:google-cloud-platform:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':sdks:java:io:google-cloud-platform:compileTestJava' with cache key c27230c57cdde802291de1ac72b556e9
:sdks:java:io:google-cloud-platform:compileTestJava (Thread[Execution **** Thread 7,5,main]) completed. Took 1.842 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses (Thread[included builds,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testClasses (Thread[included builds,5,main]) completed. Took 0.0 secs.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread 3,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testClasses
Skipping task ':sdks:java:io:google-cloud-platform:testClasses' as it has no actions.
:sdks:java:io:google-cloud-platform:testClasses (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread 3,5,main]) completed. Took 0.001 secs.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread 3,5,main]) started.

> Task :sdks:java:io:google-cloud-platform:testJar
Caching disabled for task ':sdks:java:io:google-cloud-platform:testJar' because:
  Not worth caching
Task ':sdks:java:io:google-cloud-platform:testJar' is not up-to-date because:
  No history is available.
:sdks:java:io:google-cloud-platform:testJar (Thread[Execution **** Thread 3,5,main]) completed. Took 0.512 secs.
work action resolve beam-sdks-java-io-google-cloud-platform-tests.jar (project :sdks:java:io:google-cloud-platform) (Thread[Execution **** Thread 7,5,main]) started.
work action null (Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** Thread 4,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** Thread 4,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** Thread 4,5,main]) started.

> Task :runners:google-cloud-dataflow-java:compileTestJava FROM-CACHE
Custom actions are attached to task ':runners:google-cloud-dataflow-java:compileTestJava'.
Build cache key for task ':runners:google-cloud-dataflow-java:compileTestJava' is 35c9070a70466320bf3d25fcf7fdca32
Task ':runners:google-cloud-dataflow-java:compileTestJava' is not up-to-date because:
  No history is available.
Loaded cache entry for task ':runners:google-cloud-dataflow-java:compileTestJava' with cache key 35c9070a70466320bf3d25fcf7fdca32
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 5,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 3,5,main]) started.
:runners:google-cloud-dataflow-java:compileTestJava (Thread[Execution **** Thread 4,5,main]) completed. Took 0.526 secs.

> Task :runners:google-cloud-dataflow-java:testClasses UP-TO-DATE
Skipping task ':runners:google-cloud-dataflow-java:testClasses' as it has no actions.
:runners:google-cloud-dataflow-java:testClasses (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 3,5,main]) started.
> Task :runners:google-cloud-dataflow-java:testJar
Caching disabled for task ':runners:google-cloud-dataflow-java:testJar' because:
  Not worth caching
Task ':runners:google-cloud-dataflow-java:testJar' is not up-to-date because:
  No history is available.
file or directory '<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/build/resources/test'>, not found
:runners:google-cloud-dataflow-java:testJar (Thread[Execution **** Thread 3,5,main]) completed. Took 0.045 secs.
work action resolve beam-runners-google-cloud-dataflow-java-tests.jar (project :runners:google-cloud-dataflow-java) (Thread[Execution **** Thread 5,5,main]) started.
work action null (Thread[Execution **** Thread 5,5,main]) completed. Took 0.0 secs.

> Task :sdks:java:io:sparkreceiver:2:compileJava
Class dependency analysis for incremental compilation took 0.157 secs.
Created classpath snapshot for incremental compilation in 0.73 secs.
Stored cache entry for task ':sdks:java:io:sparkreceiver:2:compileJava' with cache key 0fb5ab5bad125982a1bc4f86e3af7f4a
:sdks:java:io:sparkreceiver:2:compileJava (Thread[Execution ****,5,main]) completed. Took 16.24 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:classes (Thread[Execution **** Thread 3,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:classes (Thread[Execution **** Thread 3,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:classes (Thread[Execution **** Thread 6,5,main]) started.

> Task :sdks:java:io:sparkreceiver:2:classes
Skipping task ':sdks:java:io:sparkreceiver:2:classes' as it has no actions.
:sdks:java:io:sparkreceiver:2:classes (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:compileTestJava (Thread[Execution **** Thread 6,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:compileTestJava (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:compileTestJava (Thread[Execution **** Thread 6,5,main]) started.

> Task :sdks:java:io:sparkreceiver:2:compileTestJava
Custom actions are attached to task ':sdks:java:io:sparkreceiver:2:compileTestJava'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:compileTestJava' is 34f23f7968238464f4a34bd3169413a3
Task ':sdks:java:io:sparkreceiver:2:compileTestJava' is not up-to-date because:
  No history is available.
The input changes require a full rebuild for incremental task ':sdks:java:io:sparkreceiver:2:compileTestJava'.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
Compiling with toolchain '/usr/lib/jvm/java-8-openjdk-amd64'.
Compiling with JDK Java compiler API.
Class dependency analysis for incremental compilation took 0.008 secs.
Created classpath snapshot for incremental compilation in 0.4 secs.
Stored cache entry for task ':sdks:java:io:sparkreceiver:2:compileTestJava' with cache key 34f23f7968238464f4a34bd3169413a3
:sdks:java:io:sparkreceiver:2:compileTestJava (Thread[Execution **** Thread 6,5,main]) completed. Took 4.589 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:testClasses (Thread[Execution **** Thread 6,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:testClasses (Thread[Execution **** Thread 6,5,main]) completed. Took 0.0 secs.
:sdks:java:io:sparkreceiver:2:testClasses (Thread[included builds,5,main]) started.
> Task :sdks:java:io:sparkreceiver:2:testClasses
Skipping task ':sdks:java:io:sparkreceiver:2:testClasses' as it has no actions.
:sdks:java:io:sparkreceiver:2:testClasses (Thread[included builds,5,main]) completed. Took 0.0 secs.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution ****,5,main]) started.
Resolve mutations for :sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution ****,5,main]) completed. Took 0.001 secs.
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread 6,5,main]) started.
producer locations for task group 0 (Thread[Execution **** Thread 7,5,main]) started.
producer locations for task group 0 (Thread[Execution **** Thread 7,5,main]) completed. Took 0.0 secs.
Gradle Test Executor 2 started executing tests.

> Task :sdks:java:io:sparkreceiver:2:integrationTest
Custom actions are attached to task ':sdks:java:io:sparkreceiver:2:integrationTest'.
Build cache key for task ':sdks:java:io:sparkreceiver:2:integrationTest' is 5f13e548c4a517f48a435b45950702bb
Task ':sdks:java:io:sparkreceiver:2:integrationTest' is not up-to-date because:
  Task.upToDateWhen is false.
Starting process 'Gradle Test Executor 2'. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2> Command: /usr/lib/jvm/java-8-openjdk-amd64/bin/java -DbeamTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--runner=DataflowRunner","--sourceOptions={\"numRecords\":\"5000000\",\"keySizeBytes\":\"1\",\"valueSizeBytes\":\"90\"}","--bigQueryDataset=beam_performance","--bigQueryTable=sparkreceiverioit_results","--influxMeasurement=sparkreceiverioit_results","--influxDatabase=beam_test_metrics","--influxHost=http://10.128.0.96:8086","--rabbitMqBootstrapServerAddress=amqp://guest:[email protected]:5672","--streamName=rabbitMqTestStream","--readTimeout=1800","--numWorkers=5","--autoscalingAlgorithm=NONE","--****HarnessContainerImage=","--dataflowWorkerJar=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.47.0-SNAPSHOT.jar","--region=us-central1"]> -Djava.security.manager=****.org.gradle.process.internal.****.child.BootstrapSecurityManager -Dorg.gradle.internal.****.tmpdir=<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/tmp/integrationTest/work> -Dorg.gradle.native=false -Xmx2g -Dfile.encoding=UTF-8 -Duser.country=US -Duser.language=en -Duser.variant -ea -cp /home/jenkins/.gradle/caches/7.5.1/****Main/gradle-****.jar ****.org.gradle.process.internal.****.GradleWorkerMain 'Gradle Test Executor 2'
Successfully started process 'Gradle Test Executor 2'

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT STANDARD_ERROR
    SLF4J: Class path contains multiple SLF4J bindings.
    SLF4J: Found binding in [jar:file:/home/jenkins/.gradle/caches/modules-2/files-2.1/org.slf4j/slf4j-simple/1.7.30/e606eac955f55ecf1d8edcccba04eb8ac98088dd/slf4j-simple-1.7.30.jar!/org/slf4j/impl/StaticLoggerBinder.class]
    SLF4J: Found binding in [jar:<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/runners/google-cloud-dataflow-java/****/build/libs/beam-runners-google-cloud-dataflow-java-legacy-****-2.47.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]>
    SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
    SLF4J: Actual binding is of type [org.slf4j.impl.SimpleLoggerFactory]

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset STANDARD_ERROR
    [Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT - 5000000 records were successfully written to RabbitMQ
    [Test ****] INFO org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO - ReadFromSparkReceiverWithOffsetDoFn started reading
    [Test ****] WARN org.apache.beam.runners.dataflow.DataflowRunner - Prefer --sdkContainerImage over deprecated legacy option --****HarnessContainerImage.
    [Test ****] INFO org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory - No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    [Test ****] WARN org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer - Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
    [Test ****] INFO org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory - No stagingLocation provided, falling back to gcpTempLocation
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 428 files. Enable logging at DEBUG level to see which files will be staged.
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    [Test ****] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading 429 files from PipelineOptions.filesToStage to staging location to prepare for execution.
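
The -DbeamTestPipelineOptions JSON in the command line above is how the Jenkins job parameterizes this test. As a minimal sketch of the consuming side, assuming Beam's standard TestPipeline plumbing (the option interface below is a hypothetical subset, not the test's actual options class):

    // Hypothetical subset of the test's pipeline options; Beam generates the
    // implementation of this interface via PipelineOptionsFactory.
    import org.apache.beam.sdk.options.Description;
    import org.apache.beam.sdk.options.PipelineOptions;

    public interface SparkReceiverTestOptions extends PipelineOptions {
      @Description("AMQP address of the RabbitMQ broker under test")
      String getRabbitMqBootstrapServerAddress();
      void setRabbitMqBootstrapServerAddress(String value);

      @Description("Seconds to wait for the streaming read before checking results")
      Integer getReadTimeout();
      void setReadTimeout(Integer value);
    }

    // Usage in a test: TestPipeline reads the -DbeamTestPipelineOptions system
    // property and parses the JSON list of flags shown in the command above.
    //   SparkReceiverTestOptions options =
    //       org.apache.beam.sdk.testing.TestPipeline.testingPipelineOptions()
    //           .as(SparkReceiverTestOptions.class);
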
    [pool-8-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Staging custom dataflow-****.jar as beam-runners-google-cloud-dataflow-java-legacy-****-2.47.0-SNAPSHOT-MMCuEYwPYdGiPxjVjXX9rQvJSsVVzLbx43buGdMPrSw.jar
    [pool-8-thread-17] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading /tmp/test6686037462124770443.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-c4pPnA61TT8PIun33aP2EVBgQ6xYPnSmtuZ5H9Y0-00.jar
    [pool-8-thread-19] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading /tmp/main6707177944023059690.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/main-IbMlAtdW4VNxv5JgjObJTv1cwCy6enzqVGHTDAVZ2ZQ.jar
    [pool-8-thread-9] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading /tmp/test8150924053621533643.zip to gs://dataflow-staging-us-central1-844138762903/temp/staging/test-GX12M89gVY4YvFK707Dxqw-cbcvH_14G5UBoX-Jg-bI.jar
    [Test ****] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Staging files complete: 426 files cached, 3 files newly uploaded in 3 seconds
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    [pool-15-thread-1] INFO org.apache.beam.runners.dataflow.util.PackageUtil - Uploading <161273 bytes, hash b96fb081fe0dd3c12aeb74ddc550d8746bff86cfa88f93307dd01ee11239f90d> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-uW-wgf4N08Eq63TdxVDYdGv_hs-oj5MwfdAe4RI5-Q0.pb
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse as step s1
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair with initial restriction as step s2
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split restriction as step s3
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode windows as step s4
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign unique key/AddKeys/Map as step s5
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/ProcessKeyedElements as step s6
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Measure read time as step s7
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowPipelineTranslator - Adding Counting element as step s8
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Dataflow SDK version: 2.47.0-SNAPSHOT
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2023-04-05_08_46_19-4327393257194877782?project=apache-beam-testing
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - Submitted job: 2023-04-05_08_46_19-4327393257194877782
    [Test ****] INFO org.apache.beam.runners.dataflow.DataflowRunner - To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2023-04-05_08_46_19-4327393257194877782
    [Test ****] WARN org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:23.314Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: sparkreceiverioit0testsparkreceiverioreadsinstreamingwitho-ljcf. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:39.336Z: Worker configuration: e2-standard-4 in us-central1-b.
    [Test ****] WARN org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.084Z: Querying the configuration of Pub/Sub subscription _starting_signal/ failed. If the ack deadline on the subscription has been changed from the default value, this may affect performance. Also, this prevents Dataflow from checking for unsupported settings on the subscription. See https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub for further details. If the error is not correctable, the job may be updated with a valid Pub/Sub configuration. Specific error: INVALID_ARGUMENT: [type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto] { stack_top_loc { filepath: "dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" line: 592 } address: 94485358396203 address: 94485539317825 address: 94485539296600 address: 94485539284468 address: 94485539287915 address: 94485539728711 address: 94485485221170 address: 94485502099942 address: 94485418762181 address: 139656162658265 address: 139656162088607 } [dist_proc.dax.internal.TrailProto] { trail_point { source_file_loc { filepath: "dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" line: 592 } } } [dist_proc.dax.MessageCode] { argument { string_val: "_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid resource name given (name=_starting_signal/). Refer to https://cloud.google.com/pubsub/docs/admin#resource_names for more information." key: "error" } origin_id: 16405582797144002674 [dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED }']
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.171Z: Expanding CoGroupByKey operations into optimizable parts.
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.207Z: Expanding SplittableProcessKeyed operations into optimizable parts.
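
The translation steps s1-s8 above describe a read pipeline of roughly the following shape. This is a minimal sketch assuming the SparkReceiverIO builder API from the Beam javadoc; the receiver builder argument, the offset parsing, and the metric namespace/name are illustrative, and the real test also inserts the "Measure read time" step (s7) between the read and the counter:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.sparkreceiver.ReceiverBuilder;
    import org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIO;
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.spark.streaming.receiver.Receiver;

    public class SparkReceiverReadSketch {

      // Backs the "Counting element" step (s8); its value is what the test later
      // compares against the 5000000 records written to RabbitMQ.
      static class CountingFn extends DoFn<String, String> {
        private final Counter elements = Metrics.counter("sparkreceiverio", "elements_read");

        @ProcessElement
        public void processElement(ProcessContext c) {
          elements.inc();
          c.output(c.element());
        }
      }

      static <T extends Receiver<String>> void buildReadPipeline(
          Pipeline p, ReceiverBuilder<String, T> receiverBuilder) {
        p.apply("Read from unbounded RabbitMq",
                SparkReceiverIO.<String>read()
                    .withSparkReceiverBuilder(receiverBuilder)
                    // The offset function drives the SDF restriction seen in the
                    // "ReadFromSparkReceiverWithOffset" steps; this parsing is made up.
                    .withGetOffsetFn(record -> Long.valueOf(record.split(",")[0])))
            .apply("Counting element", ParDo.of(new CountingFn()));
      }
    }
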
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.255Z: Expanding GroupByKey operations into streaming Read/Write steps
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.327Z: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.423Z: Fusing adjacent ParDo, Read, Write, and Flatten operations
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.450Z: Fusing consumer Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair with initial restriction into Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.481Z: Fusing consumer Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split restriction into Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair with initial restriction
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.521Z: Fusing consumer Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode windows into Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split restriction
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.565Z: Fusing consumer Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign unique key/AddKeys/Map into Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode windows
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.597Z: Fusing consumer s6/GroupByKeyRaw/WriteStream into Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign unique key/AddKeys/Map
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.636Z: Fusing consumer s6/SplittableProcess into s6/GroupByKeyRaw/ReadStream
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.668Z: Fusing consumer Measure read time into s6/SplittableProcess
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:41.702Z: Fusing consumer Counting element into Measure read time
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:42.079Z: Executing operation s6/GroupByKeyRaw/ReadStream+s6/SplittableProcess+Measure read time+Counting element
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:42.156Z: Starting 5 ****s in us-central1-b...
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:43.199Z: Executing operation Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/Impulse+Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Pair with initial restriction+Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Split restriction+Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Explode windows+Read from unbounded RabbitMq/SparkReceiverIO.ReadFromSparkReceiverViaSdf/ParDo(ReadFromSparkReceiverWithOffset)/ParMultiDo(ReadFromSparkReceiverWithOffset)/Assign unique key/AddKeys/Map+s6/GroupByKeyRaw/WriteStream
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:46:58.595Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:47:26.645Z: Autoscaling: Raised the number of ****s to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:47:26.669Z: Autoscaling: Resized **** pool to 1, though goal was 5. This could be a quota issue.
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:47:36.276Z: Autoscaling: Raised the number of ****s to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:48:04.344Z: Workers have started successfully.
    [Test ****] INFO org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:48:06.161Z: All ****s have finished the startup processes and began to receive work requests.
    [Test ****] WARN org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T15:58:58.082Z: Querying the configuration of Pub/Sub subscription _starting_signal/ failed. If the ack deadline on the subscription has been changed from the default value, this may affect performance. Also, this prevents Dataflow from checking for unsupported settings on the subscription. See https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub for further details. If the error is not correctable, the job may be updated with a valid Pub/Sub configuration. Specific error: INVALID_ARGUMENT: [type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto] { stack_top_loc { filepath: "dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" line: 592 } address: 94485358396203 address: 94485539317825 address: 94485539296600 address: 94485539284468 address: 94485539287915 address: 94485539728711 address: 94485539278820 address: 94485418764610 address: 139656162658265 address: 139656162088607 } [dist_proc.dax.internal.TrailProto] { trail_point { source_file_loc { filepath: "dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" line: 592 } } } [dist_proc.dax.MessageCode] { argument { string_val: "_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid resource name given (name=_starting_signal/). Refer to https://cloud.google.com/pubsub/docs/admin#resource_names for more information." key: "error" } origin_id: 14815091023875082671 [dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED }']
    [Test ****] WARN org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 2023-04-05T16:10:57.846Z: Querying the configuration of Pub/Sub subscription _starting_signal/ failed. If the ack deadline on the subscription has been changed from the default value, this may affect performance. Also, this prevents Dataflow from checking for unsupported settings on the subscription. See https://cloud.google.com/dataflow/docs/concepts/security-and-permissions#accessing_pubsub for further details. If the error is not correctable, the job may be updated with a valid Pub/Sub configuration. Specific error: INVALID_ARGUMENT: [type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.StackTraceProto] { stack_top_loc { filepath: "dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" line: 592 } address: 94485358396203 address: 94485539317825 address: 94485539296600 address: 94485539284468 address: 94485539287915 address: 94485539728711 address: 94485539278820 address: 94485418764610 address: 139656162658265 address: 139656162088607 } [dist_proc.dax.internal.TrailProto] { trail_point { source_file_loc { filepath: "dist_proc/dax_extensions/workflow/cloud/service/pubsub_subscription_util.cc" line: 592 } } } [dist_proc.dax.MessageCode] { argument { string_val: "_starting_signal/" key: "subscription_name" } argument { string_val: "Invalid resource name given (name=_starting_signal/). Refer to https://cloud.google.com/pubsub/docs/admin#resource_names for more information." key: "error" } origin_id: 14815091023875083340 [dist_proc.dax.pubsub_resource_message_ext]: GETTING_PUBSUB_SUBSCRIPTION_FAILED }']
    [Test ****] WARN org.apache.beam.runners.dataflow.DataflowPipelineJob - No terminal state was returned within allotted timeout. State value RUNNING

org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT > testSparkReceiverIOReadsInStreamingWithOffset FAILED
    java.lang.AssertionError: expected:<5000000> but was:<890000>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at org.apache.beam.sdk.io.sparkreceiver.SparkReceiverIOIT.testSparkReceiverIOReadsInStreamingWithOffset(SparkReceiverIOIT.java:337)

Gradle Test Executor 2 finished executing tests.
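
The failure above is consistent with the preceding warnings: the test's bounded wait returned while the job was still RUNNING once the --readTimeout=1800 window elapsed, at which point only 890000 of the 5000000 published records had been counted. A minimal sketch of that final check, reusing the illustrative metric names from the sketch earlier (the real assertion lives at SparkReceiverIOIT.java:337):

    import static org.junit.Assert.assertEquals;

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.metrics.MetricNameFilter;
    import org.apache.beam.sdk.metrics.MetricQueryResults;
    import org.apache.beam.sdk.metrics.MetricResult;
    import org.apache.beam.sdk.metrics.MetricsFilter;
    import org.joda.time.Duration;

    public class ReadAssertionSketch {
      static void assertAllRecordsRead(
          PipelineResult result, long expectedRecords, int readTimeoutSeconds) {
        // For a streaming job this returns once the timeout elapses, even if the
        // job never reaches a terminal state -- hence the "No terminal state was
        // returned within allotted timeout" warning above.
        result.waitUntilFinish(Duration.standardSeconds(readTimeoutSeconds));

        // Query the counter incremented by the "Counting element" step.
        MetricQueryResults metrics =
            result.metrics().queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named("sparkreceiverio", "elements_read"))
                    .build());

        long actual = 0;
        for (MetricResult<Long> counter : metrics.getCounters()) {
          actual += counter.getAttempted();
        }
        // Fails as "expected:<5000000> but was:<890000>" when the stream was only
        // partially consumed before the timeout.
        assertEquals(expectedRecords, actual);
      }
    }
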
> Task :sdks:java:io:sparkreceiver:2:integrationTest FAILED

1 test completed, 1 failed
Finished generating test XML results (0.024 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.027 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest>
:sdks:java:io:sparkreceiver:2:integrationTest (Thread[Execution **** Thread 6,5,main]) completed. Took 34 mins 35.653 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:sparkreceiver:2:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_SparkReceiver_IO/ws/src/sdks/java/io/sparkreceiver/2/build/reports/tests/integrationTest/index.html>

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

See https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 35m 34s
145 actionable tasks: 88 executed, 55 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/venziphyfxnha

Stopped 1 **** daemon(s).
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
