See
<https://ci-beam.apache.org/job/beam_LoadTests_Java_ParDo_Dataflow_Batch/738/display/redirect?page=changes>
Changes:
[Ismaël Mejía] [BEAM-12342] Upgrade Spark 2 to version 2.4.8
[Udi Meiri] [BEAM-12352] Skip GcsIOIntegrationTest.test_copy{,_batch}_rewrite_token
[noreply] [BEAM-3713] Move validatesRunnerBatchTests and validatesRunnerStreaming
------------------------------------------
[...truncated 16.46 KB...]
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :model:fn-execution:processResources
> Task :model:job-management:processResources
> Task :runners:google-cloud-dataflow-java:processResources
> Task :sdks:java:core:processResources
> Task :model:pipeline:extractIncludeProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto FROM-CACHE
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar FROM-CACHE
> Task :model:pipeline:jar
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:classes
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :model:job-management:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :sdks:java:testing:load-tests:compileJava FROM-CACHE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :sdks:java:testing:load-tests:jar
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar FROM-CACHE
> Task :sdks:java:testing:load-tests:run
May 21, 2021 12:27:51 PM org.apache.beam.runners.dataflow.DataflowRunner
validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --workerHarnessContainerImage.
May 21, 2021 12:27:52 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 21, 2021 12:27:52 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from
the classpath: will stage 188 files. Enable logging at DEBUG level to see which
files will be staged.
May 21, 2021 12:27:53 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names:
ParDo(TimeMonitor)
May 21, 2021 12:27:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
May 21, 2021 12:27:55 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://temp-storage-for-perf-tests/loadtests/staging/
May 21, 2021 12:27:56 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 189 files from PipelineOptions.filesToStage to staging location
to prepare for execution.
May 21, 2021 12:27:56 PM
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes
forFileToStage
INFO: Staging custom dataflow-worker.jar as
beam-runners-google-cloud-dataflow-java-legacy-worker-2.31.0-SNAPSHOT-p8DZqfSGxUnLeoUu48qOOTbZOXglhdbI2awWBgSdBQs.jar
May 21, 2021 12:27:57 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 189 files cached, 0 files newly uploaded in 0
seconds
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input as step s1
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor) as step s2
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(ByteMonitor) as step s3
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 0 as step s4
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 1 as step s5
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 2 as step s6
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 3 as step s7
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 4 as step s8
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 5 as step s9
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 6 as step s10
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 7 as step s11
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 8 as step s12
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Step: 9 as step s13
May 21, 2021 12:27:57 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding ParDo(TimeMonitor)2 as step s14
May 21, 2021 12:27:57 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.31.0-SNAPSHOT
May 21, 2021 12:27:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-05-21_05_27_57-17233117034791124479?project=apache-beam-testing
May 21, 2021 12:27:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-05-21_05_27_57-17233117034791124479
May 21, 2021 12:27:58 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-05-21_05_27_57-17233117034791124479
May 21, 2021 12:28:11 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-05-21T12:28:10.467Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
load0tests0java0dataflow0batch0pardo01-jenkins-0521122752--bjey. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 21, 2021 12:28:15 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:15.008Z: Worker configuration: n1-standard-1 in
us-central1-f.
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:15.638Z: Expanding CoGroupByKey operations into
optimizable parts.
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:15.666Z: Expanding GroupByKey operations into
optimizable parts.
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:15.697Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:15.884Z: Fusing adjacent ParDo, Read, Write, and Flatten
operations
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.084Z: Fusing consumer ParDo(TimeMonitor) into Read
input
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.276Z: Fusing consumer ParDo(ByteMonitor) into
ParDo(TimeMonitor)
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.378Z: Fusing consumer Step: 0 into ParDo(ByteMonitor)
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.484Z: Fusing consumer Step: 1 into Step: 0
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.761Z: Fusing consumer Step: 2 into Step: 1
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.795Z: Fusing consumer Step: 3 into Step: 2
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.829Z: Fusing consumer Step: 4 into Step: 3
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.858Z: Fusing consumer Step: 5 into Step: 4
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.878Z: Fusing consumer Step: 6 into Step: 5
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.913Z: Fusing consumer Step: 7 into Step: 6
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.950Z: Fusing consumer Step: 8 into Step: 7
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:16.979Z: Fusing consumer Step: 9 into Step: 8
May 21, 2021 12:28:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:17.020Z: Fusing consumer ParDo(TimeMonitor)2 into Step: 9
May 21, 2021 12:28:18 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:17.407Z: Executing operation Read
input+ParDo(TimeMonitor)+ParDo(ByteMonitor)+Step: 0+Step: 1+Step: 2+Step:
3+Step: 4+Step: 5+Step: 6+Step: 7+Step: 8+Step: 9+ParDo(TimeMonitor)2
May 21, 2021 12:28:18 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:17.522Z: Starting 5 workers in us-central1-f...
May 21, 2021 12:28:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:28:48.557Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 21, 2021 12:29:08 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:29:08.015Z: Autoscaling: Raised the number of workers to 5
based on the rate of progress in the currently running stage(s).
May 21, 2021 12:29:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:29:32.688Z: Workers have started successfully.
May 21, 2021 12:29:33 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:29:32.800Z: Workers have started successfully.
May 21, 2021 12:30:13 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:30:13.131Z: Finished operation Read
input+ParDo(TimeMonitor)+ParDo(ByteMonitor)+Step: 0+Step: 1+Step: 2+Step:
3+Step: 4+Step: 5+Step: 6+Step: 7+Step: 8+Step: 9+ParDo(TimeMonitor)2
May 21, 2021 12:30:13 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:30:13.256Z: Cleaning up.
May 21, 2021 12:30:13 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:30:13.339Z: Stopping worker pool...
May 21, 2021 12:31:10 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:31:10.391Z: Autoscaling: Resized worker pool from 5 to 0.
May 21, 2021 12:31:10 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-05-21T12:31:10.437Z: Worker pool stopped.
May 21, 2021 12:36:15 PM org.apache.beam.runners.dataflow.DataflowPipelineJob
logTerminalState
INFO: Job 2021-05-21_05_27_57-17233117034791124479 finished with status DONE.
May 21, 2021 12:36:15 PM org.apache.beam.sdk.testutils.metrics.MetricsReader
getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace pardo
Load test results for test (ID): 6ce93015-7d9a-4e63-8ffd-591ecd09cd33 and
timestamp: 2021-05-21T12:27:53.074000000Z:
Metric:                        Value:
dataflow_runtime_sec              0.0
dataflow_total_bytes_count       -1.0
Exception in thread "main" java.lang.RuntimeException: Invalid test results
        at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:137)
        at org.apache.beam.sdk.loadtests.ParDoLoadTest.run(ParDoLoadTest.java:53)
        at org.apache.beam.sdk.loadtests.ParDoLoadTest.main(ParDoLoadTest.java:103)
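
For context on the SEVERE line above: the load test reads its own "pardo" / "totalBytes.count" counter back from the finished job, and the failed lookup is what shows up as the -1.0 value in the results table. The following is a minimal, hypothetical debugging sketch (class and method names are illustrative, not the test's own code) that queries the same counter through Beam's public metrics API, given the PipelineResult of the run:

import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.metrics.MetricNameFilter;
import org.apache.beam.sdk.metrics.MetricQueryResults;
import org.apache.beam.sdk.metrics.MetricResult;
import org.apache.beam.sdk.metrics.MetricsFilter;

// Hypothetical helper for debugging the missing counter; not part of the load-test code.
public class MissingCounterDebug {

  static void printTotalBytesCounter(PipelineResult result) {
    // Ask only for the counter the load test failed to read:
    // namespace "pardo", name "totalBytes.count" (as reported in the SEVERE message).
    MetricQueryResults metrics =
        result.metrics()
            .queryMetrics(
                MetricsFilter.builder()
                    .addNameFilter(MetricNameFilter.named("pardo", "totalBytes.count"))
                    .build());

    // If the service returns no matching counter, this loop prints nothing,
    // which is the situation the load test reports as invalid results.
    for (MetricResult<Long> counter : metrics.getCounters()) {
      System.out.println(counter.getName() + " attempted=" + counter.getAttempted());
    }
  }
}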
> Task :sdks:java:testing:load-tests:run FAILED
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings
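
To see the individual deprecation warnings locally, the same task can be rerun with the suggested flag; an illustrative invocation from the Beam repository root (the load test's pipeline options are omitted here):

  ./gradlew :sdks:java:testing:load-tests:run --warning-mode all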
BUILD FAILED in 8m 56s
87 actionable tasks: 54 executed, 33 from cache
Publishing build scan...
https://gradle.com/s/vyqso5tw53fdm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]