See <https://ci-beam.apache.org/job/beam_LoadTests_Java_CoGBK_Dataflow_Streaming/764/display/redirect?page=changes>
Changes:

[heejong] [BEAM-12838] Update artifact local path for DataflowRunner Java
[kwu] [BEAM-12828] Convert UseTestStream tests to use Long instead of Integer
[kwu] Apply SpotlessJava
[kwu] Apply SpotlessJava
[aydar.zaynutdinov] [BEAM-3385] Add requires about `equals()` and `hashMethod()` to
[aydar.zaynutdinov] [BEAM-3385] Changes regarding spotlessApply task
[noreply] Update runners/flink/job-server/flink_job_server.gradle
[Valentyn Tymofieiev] Disable Kafka perf tests.
[heejong] separate into resolveArtifacts method
[heejong] add test
[aydar.zaynutdinov] [BEAM-3385] wrap up equals() and hashCode() methods into links
[Andrew Pilloud] [BEAM-12850] Calcite drops empty Calc now
[Andrew Pilloud] [BEAM-12853] VALUES produces a UNION, window can't be set afterwards
[Andrew Pilloud] [BEAM-12852] Revert BigTable changes, just cast to bigint
[heejong] update
[heejong] fix formatting
[Andrew Pilloud] [BEAM-12851] Map output table names
[Luke Cwik] [BEAM-12802] Define a prefetchable iterator and iterable and utility

------------------------------------------
[...truncated 9.80 KB...]
> Task :runners:core-java:createCheckerFrameworkManifest
> Task :runners:java-fn-execution:createCheckerFrameworkManifest
> Task :model:job-management:createCheckerFrameworkManifest
> Task :sdks:java:extensions:google-cloud-platform-core:createCheckerFrameworkManifest
> Task :runners:core-construction-java:createCheckerFrameworkManifest
> Task :sdks:java:core:createCheckerFrameworkManifest
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :sdks:java:extensions:arrow:createCheckerFrameworkManifest
> Task :model:fn-execution:extractProto
> Task :model:job-management:extractProto
> Task :sdks:java:extensions:protobuf:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:createCheckerFrameworkManifest
> Task :sdks:java:io:google-cloud-platform:createCheckerFrameworkManifest
> Task :sdks:java:io:kafka:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:****:windmill:createCheckerFrameworkManifest
> Task :runners:google-cloud-dataflow-java:****:legacy-****:createCheckerFrameworkManifest
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :sdks:java:extensions:arrow:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:createCheckerFrameworkManifest
> Task :sdks:java:testing:load-tests:createCheckerFrameworkManifest
> Task :sdks:java:testing:test-utils:createCheckerFrameworkManifest
> Task :sdks:java:io:kinesis:createCheckerFrameworkManifest
> Task :model:job-management:processResources
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource FROM-CACHE
> Task :model:fn-execution:processResources
> Task :runners:google-cloud-dataflow-java:processResources
> Task :sdks:java:core:processResources
> Task :runners:google-cloud-dataflow-java:****:windmill:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:****:windmill:extractProto
> Task :model:pipeline:extractIncludeProto
> Task :runners:google-cloud-dataflow-java:****:windmill:generateProto FROM-CACHE
> Task :model:pipeline:extractProto
> Task :model:pipeline:generateProto FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:windmill:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:windmill:processResources
> Task :runners:google-cloud-dataflow-java:****:windmill:classes
> Task :model:pipeline:compileJava FROM-CACHE
> Task :model:pipeline:processResources
> Task :model:pipeline:classes
> Task :model:pipeline:jar
> Task :runners:google-cloud-dataflow-java:****:windmill:shadowJar FROM-CACHE
> Task :model:job-management:extractIncludeProto
> Task :model:fn-execution:extractIncludeProto
> Task :model:job-management:generateProto FROM-CACHE
> Task :model:fn-execution:generateProto FROM-CACHE
> Task :model:job-management:compileJava FROM-CACHE
> Task :model:job-management:classes
> Task :model:fn-execution:compileJava FROM-CACHE
> Task :model:fn-execution:classes
> Task :model:pipeline:shadowJar FROM-CACHE
> Task :model:job-management:shadowJar FROM-CACHE
> Task :model:fn-execution:shadowJar FROM-CACHE
> Task :sdks:java:core:compileJava FROM-CACHE
> Task :sdks:java:core:classes
> Task :sdks:java:core:shadowJar FROM-CACHE
> Task :sdks:java:extensions:protobuf:extractIncludeProto
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :sdks:java:extensions:protobuf:compileJava FROM-CACHE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:compileJava FROM-CACHE
> Task :sdks:java:extensions:arrow:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava FROM-CACHE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar
> Task :sdks:java:io:synthetic:compileJava FROM-CACHE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:extensions:arrow:jar
> Task :sdks:java:testing:test-utils:jar
> Task :sdks:java:io:synthetic:jar
> Task :sdks:java:fn-execution:compileJava FROM-CACHE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava FROM-CACHE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar
> Task :sdks:java:io:kinesis:jar
> Task :runners:core-construction-java:compileJava FROM-CACHE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :runners:core-construction-java:jar
> Task :runners:core-java:compileJava FROM-CACHE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava FROM-CACHE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar
> Task :sdks:java:harness:compileJava FROM-CACHE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar
> Task :sdks:java:harness:shadowJar FROM-CACHE
> Task :runners:java-fn-execution:compileJava FROM-CACHE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar
> Task :sdks:java:expansion-service:compileJava FROM-CACHE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar
> Task :sdks:java:io:kafka:compileJava FROM-CACHE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar
> Task :sdks:java:io:google-cloud-platform:compileJava FROM-CACHE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar
> Task :runners:google-cloud-dataflow-java:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:classes
> Task :runners:google-cloud-dataflow-java:jar
> Task :runners:google-cloud-dataflow-java:****:legacy-****:compileJava FROM-CACHE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:****:legacy-****:shadowJar FROM-CACHE

> Task :sdks:java:testing:load-tests:compileJava
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.

> Task :sdks:java:testing:load-tests:classes
> Task :sdks:java:testing:load-tests:jar

> Task :sdks:java:testing:load-tests:run
Sep 08, 2021 12:23:04 PM org.apache.beam.runners.dataflow.DataflowRunner validateSdkContainerImageOptions
WARNING: Prefer --sdkContainerImage over deprecated legacy option --****HarnessContainerImage.
Sep 08, 2021 12:23:05 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Sep 08, 2021 12:23:06 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 193 files. Enable logging at DEBUG level to see which files will be staged.
Sep 08, 2021 12:23:06 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Window.Into()
Sep 08, 2021 12:23:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Sep 08, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 194 files from PipelineOptions.filesToStage to staging location to prepare for execution.
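The Pipeline validate warning above ("The following transforms do not have stable unique names: Window.Into()") means two windowing transforms were applied under the same default name. A minimal sketch of the usual fix, assuming two keyed input collections like the load test's; the helper name, element types, and window size are illustrative placeholders, not the actual load-test code:

import org.apache.beam.sdk.transforms.windowing.FixedWindows;
import org.apache.beam.sdk.transforms.windowing.Window;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.joda.time.Duration;

/* Hypothetical sketch: give each Window.into() application an explicit, unique name. */
class WindowNaming {
  static PCollection<KV<byte[], byte[]>> windowUniquely(
      PCollection<KV<byte[], byte[]>> pc, String name, Duration size) {
    // The two-argument apply(name, transform) avoids the shared default name
    // "Window.Into()" that triggers the "stable unique names" warning.
    return pc.apply(name, Window.into(FixedWindows.of(size)));
  }
}

Applied once per input, e.g. windowUniquely(input, "Window input", Duration.standardSeconds(60)) and windowUniquely(coInput, "Window co-input", Duration.standardSeconds(60)), each windowing step gets a distinct, stable name.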
Sep 08, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.PackageUtil$PackageAttributes forFileToStage
INFO: Staging custom dataflow-****.jar as beam-runners-google-cloud-dataflow-java-legacy-****-2.34.0-SNAPSHOT-kyLIsdCfiZiuGlhZEOOvAXu4J-2BxQlikUE0jy6y3CM.jar
Sep 08, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 194 files cached, 0 files newly uploaded in 0 seconds
Sep 08, 2021 12:23:10 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://temp-storage-for-perf-tests/loadtests/staging/
Sep 08, 2021 12:23:10 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <106136 bytes, hash 39ad254e5f4bf20b18477fe6a31d4cdb15c314809938f5f05a13fe26db593d2a> to gs://temp-storage-for-perf-tests/loadtests/staging/pipeline-Oa0lTl9L8gsYR3_mox1M2xXDFICZOPXwWhP-JttZPSo.pb
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Sep 08, 2021 12:23:12 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3a26ec8d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17143b3b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@100c8b75, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2bc378f7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@268cbb86, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@10f7918f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@64d4f7c7, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@54e02f6a, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@17f3eefb, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3ba46e63, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@788ddc1f, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2dc3271b, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@254f906e, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@3d0035d2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2bfb6b49, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1f346ad2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@46a145ba, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7ab34619, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@ae2db25, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@363c4251]
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input/StripIds as step s2
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (input) as step s3
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s5
Sep 08, 2021 12:23:12 PM org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource split
INFO: Split into 20 bundles of sizes: [org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@d5af0a5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@5981f4a6, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63dfada0, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@6f231ced, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@35a60674, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@63d4f0a2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7d78f3d5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@a4b5ce3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@f5b6e78, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4b4eced1, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@71926a36, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@216e9ca3, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@75120e58, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@48976e6d, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@2a367e93, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@7f6874f2, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@1a6dc589, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@697a34af, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@70211df5, org.apache.beam.sdk.io.synthetic.SyntheticUnboundedSource@4c5228e7]
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read co-input/StripIds as step s6
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metrics (co-input) as step s7
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()2/Window.Assign as step s8
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable0 as step s9
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/MakeUnionTable1 as step s10
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/Flatten as step s11
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/GBK as step s12
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding CoGroupByKey/ConstructCoGbkResultFn as step s13
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Ungroup and reiterate as step s14
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect total bytes as step s15
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metrics as step s16
Sep 08, 2021 12:23:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.34.0-SNAPSHOT
Sep 08, 2021 12:23:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2021-09-08_05_23_12-8580251134254452833?project=apache-beam-testing
Sep 08, 2021 12:23:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-09-08_05_23_12-8580251134254452833
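Steps s9 through s13 above are Dataflow's expansion of a single CoGroupByKey (two union tables, a Flatten, a GBK, then ConstructCoGbkResultFn). A minimal sketch of the Beam-side shape that produces that expansion, assuming two collections keyed by byte[]; the class, tag, and variable names are illustrative, not the load test's actual code:

import org.apache.beam.sdk.transforms.join.CoGbkResult;
import org.apache.beam.sdk.transforms.join.CoGroupByKey;
import org.apache.beam.sdk.transforms.join.KeyedPCollectionTuple;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TupleTag;

/* Hypothetical sketch of the CoGroupByKey shape Dataflow expands into steps s9-s13. */
class CoGbkSketch {
  static PCollection<KV<byte[], CoGbkResult>> coGroup(
      PCollection<KV<byte[], byte[]>> input, PCollection<KV<byte[], byte[]>> coInput) {
    TupleTag<byte[]> inputTag = new TupleTag<>();   // union table 0 (MakeUnionTable0)
    TupleTag<byte[]> coInputTag = new TupleTag<>(); // union table 1 (MakeUnionTable1)
    return KeyedPCollectionTuple.of(inputTag, input)
        .and(coInputTag, coInput)
        .apply("CoGroupByKey", CoGroupByKey.create()); // Flatten + GBK + ConstructCoGbkResultFn
  }
}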
Sep 08, 2021 12:23:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2021-09-08_05_23_12-8580251134254452833
Sep 08, 2021 12:23:18 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-09-08T12:23:17.484Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0streaming0cogbk01-jenkins-0908122-lv7l. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
Sep 08, 2021 12:23:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-08T12:23:22.631Z: Worker configuration: e2-standard-4 in us-central1-a.
Sep 08, 2021 12:23:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2021-09-08T12:23:23.443Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 5 instances in region us-central1. Quota summary (required/available): 5/19415 instances, 20/8 CPUs, 2150/274031 disk GB, 0/2397 SSD disk GB, 1/212 instance groups, 1/219 managed instance groups, 1/433 instance templates, 5/644 in-use IP addresses. Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Sep 08, 2021 12:23:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-08T12:23:23.482Z: Cleaning up.
Sep 08, 2021 12:23:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-08T12:23:23.523Z: Worker pool stopped.
Sep 08, 2021 12:23:24 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-09-08T12:23:24.703Z: Your project already contains 100 Dataflow-created metric descriptors, so new user metrics of the form custom.googleapis.com/* will not be created. However, all user metrics are also available in the metric dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, you can delete old / unused metric descriptors. See https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Sep 08, 2021 12:23:29 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-09-08_05_23_12-8580251134254452833 failed with status FAILED.
Sep 08, 2021 12:23:29 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace co_gbk

Load test results for test (ID): e4a8109a-1041-49b3-b470-9aedc4c880e0 and timestamp: 2021-09-08T12:23:06.411000000Z:
                    Metric:       Value:
       dataflow_runtime_sec          0.0
 dataflow_total_bytes_count         -1.0

Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
	at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
	at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:141)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.run(CoGroupByKeyLoadTest.java:62)
	at org.apache.beam.sdk.loadtests.CoGroupByKeyLoadTest.main(CoGroupByKeyLoadTest.java:157)
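The MetricsReader error and the RuntimeException above are the test harness turning the failed Dataflow job into a test failure: the totalBytes.count counter in namespace co_gbk was never reported (hence -1.0), and the FAILED terminal state is rethrown. A minimal sketch of that pattern using Beam's public metrics API; the namespace and counter name come from the log, while the class and method names are illustrative assumptions rather than the actual LoadTest/JobFailure implementation:

import org.apache.beam.sdk.PipelineResult;
import org.apache.beam.sdk.metrics.MetricNameFilter;
import org.apache.beam.sdk.metrics.MetricQueryResults;
import org.apache.beam.sdk.metrics.MetricResult;
import org.apache.beam.sdk.metrics.MetricsFilter;

/* Hypothetical sketch: read a counter after the run and fail on a FAILED terminal state. */
class LoadTestResultCheck {
  static long totalBytes(PipelineResult result) {
    // Query the "totalBytes.count" counter in the "co_gbk" namespace; fall back to -1
    // when the counter was never reported, as in the failed run above.
    MetricQueryResults metrics = result.metrics().queryMetrics(
        MetricsFilter.builder()
            .addNameFilter(MetricNameFilter.named("co_gbk", "totalBytes.count"))
            .build());
    for (MetricResult<Long> counter : metrics.getCounters()) {
      return counter.getAttempted();
    }
    return -1L;
  }

  static void failOnTerminalState(PipelineResult result) {
    PipelineResult.State state = result.getState();
    if (state == PipelineResult.State.FAILED || state == PipelineResult.State.CANCELLED) {
      // Mirrors the "Invalid job state: FAILED" RuntimeException in the stack trace above.
      throw new RuntimeException("Invalid job state: " + state);
    }
  }
}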
> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8.3/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 9s
90 actionable tasks: 57 executed, 33 from cache

Publishing build scan...
https://gradle.com/s/jjv4d7i3khytq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
