See <https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/1008/display/redirect?page=changes>

Changes:

[stranniknm] [BEAM-13785] playground - enable scio sdk

[mattcasters] [BEAM-13854] Document casting trick for Avro value serializer in KafkaIO

[noreply] Merge pull request #16838 from [BEAM-13931] - make sure large rows cause

[noreply] Seznam Case Study (#16825)

[noreply] [Website] Apache Hop Case Study (#16824)

[noreply] [BEAM-13694] Force hadoop-hdfs-client in hadoopVersion tests for hdfs

[noreply] [Website] Ricardo - added case study feedback (#16807)

[noreply] Merge pull request #16735 from [BEAM-13827] - fix medium file size

[noreply] Merge pull request #16753 from [BEAM-13837] [Playground] show graph on


------------------------------------------
[...truncated 18.53 KB...]
> Task :model:job-management:processResources
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:core:processResources
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:****:windmill:extractIncludeProto
> Task :model:pipeline:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:****:windmill:extractProto
> Task :model:pipeline:extractProto
> Task :runners:google-cloud-dataflow-java:****:windmill:generateProto FROM-CACHE
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:windmill:processResources
> Task :runners:google-cloud-dataflow-java:****:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:windmill:shadowJar FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:io:kinesis:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
Feb 16, 2022 12:18:01 PM org.apache.beam.runners.dataflow.DataflowRunner 
validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option 
--****HarnessContainerImage.
Feb 16, 2022 12:18:02 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Feb 16, 2022 12:18:02 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from 
the classpath: will stage 204 files. Enable logging at DEBUG level to see which 
files will be staged.
Feb 16, 2022 12:18:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
Feb 16, 2022 12:18:07 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Uploading 205 files from PipelineOptions.filesToStage to staging location 
to prepare for execution.
Feb 16, 2022 12:18:07 PM 
org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes 
forFileToStage
INFO: Staging custom dataflow-****.jar as 
beam-runners-google-cloud-dataflow-java-legacy-****-2.38.0-SNAPSHOT-QuGnur7irnXUBunkr5-Fg8-cA-O2Pbxk01JwQWgcxT4.jar
Feb 16, 2022 12:18:07 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/com.amazonaws/amazon-kinesis-producer/0.14.1/8953110d00d2c36834fa1f950a7839d5a8432c4d/amazon-kinesis-producer-0.14.1.jar
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/amazon-kinesis-producer-0.14.1-5YYT_oyDFV17VWs8Of43O7N08sRfHu0SuJeqVSFCuLg.jar
Feb 16, 2022 12:18:07 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading 
/home/jenkins/.gradle/caches/modules-2/files-2.1/software.amazon.ion/ion-java/1.0.2/ee9dacea7726e495f8352b81c12c23834ffbc564/ion-java-1.0.2.jar
 to 
gs://temp-storage-for-perf-tests/loadtests/staging/ion-java-1.0.2-DRJ7IFofzgq8KjdXoEF0hlG8ZsFc9MBZusWDOyfUcaU.jar
Feb 16, 2022 12:18:09 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
INFO: Staging files complete: 203 files cached, 2 files newly uploaded in 2 
seconds
Feb 16, 2022 12:18:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to 
gs://temp-storage-for-perf-tests/loadtests/staging/
Feb 16, 2022 12:18:09 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
INFO: Uploading <92330 bytes, hash 
a090a7277ac8f7e8c6d0f1733b3cbd77ff47b0e9e704c93de6baa54cf7014a33> to 
gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-oJCnJ3rI9-jG0PFzOzy9d_9HsOnnBMk95rqlTPcBSjM.pb
Feb 16, 2022 12:18:11 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as 
step s1
Feb 16, 2022 12:18:11 PM 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: 
[org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4d41ba0f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c87e6b7, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@77bbadc, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3c3a0032, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ceb4478, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7fdab70c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@25ad4f71, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@49faf066, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f94a5a5, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@455c1d8c, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a451491, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1422ac7f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5e519ad3, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7bc44ce8, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@59072e9d, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@58472096, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a92be4f, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@53e800f9, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@337bbfdf, 
org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@52d97ab6]
Feb 16, 2022 12:18:11 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Feb 16, 2022 12:18:11 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics as step s3
Feb 16, 2022 12:18:11 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Total bytes monitor as step s4
Feb 16, 2022 12:18:11 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s5
Feb 16, 2022 12:18:11 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Group by key (0) as step s6
Feb 16, 2022 12:18:11 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate (0) as step s7
Feb 16, 2022 12:18:11 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics (0) as step s8
Feb 16, 2022 12:18:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.38.0-SNAPSHOT
Feb 16, 2022 12:18:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-02-16_04_18_11-6737523243826606533?project=apache-beam-testing
Feb 16, 2022 12:18:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2022-02-16_04_18_11-6737523243826606533
Feb 16, 2022 12:18:11 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2022-02-16_04_18_11-6737523243826606533
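The same cancellation can also be requested from the submitting JVM through the Beam PipelineResult handle; a minimal sketch, assuming the Pipeline object is still in scope (the class and method names below are illustrative, not part of this load test):

import java.io.IOException;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.PipelineResult;

public class CancelFromClientSketch {
  static void runThenCancel(Pipeline pipeline) throws IOException {
    // run() submits the job to the Dataflow service and returns a handle immediately.
    PipelineResult result = pipeline.run();
    // ... later, e.g. when a test timeout fires, ask the service to cancel the running job.
    result.cancel();
  }
}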
Feb 16, 2022 12:18:14 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2022-02-16T12:18:14.361Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
load0tests0java0dataflow0streaming0gbk01-jenkins-021612180-wk1d. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
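A minimal sketch of giving the job a name that is already a valid Cloud Label value (lowercase letters, digits, underscores and hyphens, at most 63 characters), so Dataflow does not have to rewrite it for monitoring; the job name shown is illustrative, not the one used by this run:

import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
import org.apache.beam.sdk.options.PipelineOptionsFactory;

public class LabelSafeJobNameSketch {
  static DataflowPipelineOptions optionsWithLabelSafeName(String[] args) {
    DataflowPipelineOptions options =
        PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
    // Lowercase letters, digits and hyphens only.
    options.setJobName("load-tests-java-gbk-dataflow-streaming-1");
    return options;
  }
}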
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:18.582Z: Worker configuration: e2-standard-4 in 
us-central1-c.
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.260Z: Expanding CoGroupByKey operations into 
optimizable parts.
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.323Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.355Z: Expanding GroupByKey operations into streaming 
Read/Write steps
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.414Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.522Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.575Z: Fusing consumer Read input/StripIds into Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.611Z: Fusing consumer Collect start time metrics into 
Read input/StripIds
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.639Z: Fusing consumer Total bytes monitor into 
Collect start time metrics
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.673Z: Fusing consumer Window.Into()/Window.Assign 
into Total bytes monitor
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.718Z: Fusing consumer Group by key (0)/WriteStream 
into Window.Into()/Window.Assign
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.751Z: Fusing consumer Group by key (0)/MergeBuckets 
into Group by key (0)/ReadStream
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.776Z: Fusing consumer Ungroup and reiterate (0) into 
Group by key (0)/MergeBuckets
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:19.807Z: Fusing consumer Collect end time metrics (0) 
into Ungroup and reiterate (0)
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:20.136Z: Executing operation Group by key 
(0)/ReadStream+Group by key (0)/MergeBuckets+Ungroup and reiterate (0)+Collect 
end time metrics (0)
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:20.159Z: Executing operation Read 
input/DataflowRunner.StreamingUnboundedRead.ReadWithIds+Read 
input/StripIds+Collect start time metrics+Total bytes 
monitor+Window.Into()/Window.Assign+Group by key (0)/WriteStream
Feb 16, 2022 12:18:20 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:20.220Z: Starting 5 ****s in us-central1-c...
Feb 16, 2022 12:18:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:18:31.137Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
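A minimal cleanup sketch, assuming the Cloud Monitoring Java client is available (it is not a dependency of this build); only the project id is taken from this log, the rest is illustrative. It lists the custom metric descriptors so unused ones can be reviewed and, if appropriate, deleted to free quota:

import com.google.api.MetricDescriptor;
import com.google.cloud.monitoring.v3.MetricServiceClient;

public class MetricDescriptorCleanupSketch {
  public static void main(String[] args) throws Exception {
    String parent = "projects/apache-beam-testing";
    try (MetricServiceClient client = MetricServiceClient.create()) {
      for (MetricDescriptor descriptor : client.listMetricDescriptors(parent).iterateAll()) {
        if (descriptor.getType().startsWith("custom.googleapis.com/")) {
          System.out.println(descriptor.getName());
          // After reviewing the list, an unused descriptor could be removed with:
          // client.deleteMetricDescriptor(descriptor.getName());
        }
      }
    }
  }
}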
Feb 16, 2022 12:19:07 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:19:05.816Z: Autoscaling: Raised the number of ****s to 5 so 
that the pipeline can catch up with its backlog and keep up with its input rate.
Feb 16, 2022 12:19:38 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:19:37.325Z: Workers have started successfully.
Feb 16, 2022 12:19:38 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2022-02-16T12:19:37.387Z: Workers have started successfully.

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 143

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 7m 44s
99 actionable tasks: 61 executed, 36 from cache, 2 up-to-date

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=91e8560e-9015-4f6a-8c7d-1be27079e2bd, currentDir=<https://ci-beam.apache.org/job/beam_LoadTests_Java_GBK_Dataflow_Streaming/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 4166311
  log file: /home/jenkins/.gradle/daemon/7.3.2/daemon-4166311.out.log
----- Last  20 lines from daemon log file - daemon-4166311.out.log -----
INFO: 2022-02-16T12:19:37.387Z: Workers have started successfully.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 143

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 7m 44s
99 actionable tasks: 61 executed, 36 from cache, 2 up-to-date

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
