See <https://builds.apache.org/job/beam_LoadTests_Java_Combine_Dataflow_Batch/368/display/redirect?page=changes>

Changes:

[nielm] Remove spurious error message.

[alxavier] [BEAM-9911]Replace SpannerIO.write latency counter to distribution

[kcweaver] [BEAM-4782] Remove workaround in Python multimap tests.

[kamil.wasilewski] [BEAM-8132, BEAM-8133] Apply InfluxDB pipeline options in Load Tests and

[kamil.wasilewski] [BEAM-8132, BEAM-8133] Assume no trailing slash after a hostname

[iemejia] [BEAM-2530] Compile and run tests on java 11 for Precommit portability

[github] [BEAM-9883] Refactor SDF test restrictions. (#11605)

[github] [BEAM-3288] Add suggested fix to error message (#11622)

[zyichi] [BEAM-9940] Set timer family spec for TimerDeclarations in dataflow


------------------------------------------
[...truncated 62.91 KB...]
> Task :buildSrc:processTestResources NO-SOURCE
> Task :buildSrc:testClasses UP-TO-DATE
> Task :buildSrc:test NO-SOURCE
> Task :buildSrc:validateTaskProperties UP-TO-DATE
> Task :buildSrc:check UP-TO-DATE
> Task :buildSrc:build UP-TO-DATE
Configuration on demand is an incubating feature.

> Configure project :sdks:java:container
Found go 1.12 in /usr/bin/go, use it.

> Task :sdks:java:core:generateAvroProtocol NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:processResources NO-SOURCE
> Task :sdks:java:fn-execution:processResources NO-SOURCE
> Task :sdks:java:extensions:google-cloud-platform-core:processResources NO-SOURCE
> Task :runners:core-java:processResources NO-SOURCE
> Task :model:job-management:extractProto UP-TO-DATE
> Task :model:fn-execution:extractProto UP-TO-DATE
> Task :runners:core-construction-java:processResources NO-SOURCE
> Task :runners:java-fn-execution:processResources NO-SOURCE
> Task :sdks:java:harness:processResources NO-SOURCE
> Task :sdks:java:expansion-service:processResources NO-SOURCE
> Task :runners:java-job-service:processResources NO-SOURCE
> Task :sdks:java:core:generateAvroJava NO-SOURCE
> Task :runners:local-java:processResources NO-SOURCE
> Task :runners:direct-java:processResources NO-SOURCE
> Task :sdks:java:extensions:protobuf:extractProto UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:processResources NO-SOURCE
> Task :model:fn-execution:processResources UP-TO-DATE
> Task :sdks:java:extensions:protobuf:processResources NO-SOURCE
> Task :model:job-management:processResources UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:processResources NO-SOURCE
> Task :sdks:java:io:kafka:processResources NO-SOURCE
> Task :runners:google-cloud-dataflow-java:processResources UP-TO-DATE
> Task :sdks:java:io:kinesis:processResources NO-SOURCE
> Task :sdks:java:io:synthetic:processResources NO-SOURCE
> Task :sdks:java:testing:test-utils:processResources NO-SOURCE
> Task :sdks:java:testing:load-tests:processResources NO-SOURCE
> Task :sdks:java:core:generateGrammarSource UP-TO-DATE
> Task :sdks:java:core:processResources UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractIncludeProto UP-TO-DATE
> Task :model:pipeline:extractIncludeProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:extractProto UP-TO-DATE
> Task :model:pipeline:extractProto UP-TO-DATE
> Task :model:pipeline:generateProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:generateProto UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:processResources UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:classes UP-TO-DATE
> Task :model:pipeline:compileJava UP-TO-DATE
> Task :model:pipeline:processResources UP-TO-DATE
> Task :model:pipeline:classes UP-TO-DATE
> Task :model:pipeline:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:windmill:shadowJar UP-TO-DATE
> Task :model:pipeline:shadowJar UP-TO-DATE
> Task :model:job-management:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:extractIncludeProto UP-TO-DATE
> Task :model:fn-execution:generateProto UP-TO-DATE
> Task :model:job-management:generateProto UP-TO-DATE
> Task :model:job-management:compileJava UP-TO-DATE
> Task :model:job-management:classes UP-TO-DATE
> Task :model:fn-execution:compileJava UP-TO-DATE
> Task :model:fn-execution:classes UP-TO-DATE
> Task :model:job-management:shadowJar UP-TO-DATE
> Task :model:fn-execution:shadowJar UP-TO-DATE
> Task :sdks:java:core:compileJava UP-TO-DATE
> Task :sdks:java:core:classes UP-TO-DATE
> Task :sdks:java:core:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:extractIncludeProto UP-TO-DATE
> Task :sdks:java:extensions:protobuf:generateProto NO-SOURCE
> Task :vendor:sdks-java-extensions-protobuf:compileJava UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:classes UP-TO-DATE
> Task :vendor:sdks-java-extensions-protobuf:shadowJar UP-TO-DATE
> Task :sdks:java:extensions:protobuf:compileJava UP-TO-DATE
> Task :sdks:java:extensions:protobuf:classes UP-TO-DATE
> Task :sdks:java:extensions:protobuf:jar UP-TO-DATE
> Task :sdks:java:io:synthetic:compileJava UP-TO-DATE
> Task :sdks:java:io:synthetic:classes UP-TO-DATE
> Task :sdks:java:io:synthetic:jar UP-TO-DATE
> Task :sdks:java:fn-execution:compileJava UP-TO-DATE
> Task :sdks:java:fn-execution:classes UP-TO-DATE
> Task :sdks:java:fn-execution:jar UP-TO-DATE
> Task :runners:local-java:compileJava UP-TO-DATE
> Task :runners:local-java:classes UP-TO-DATE
> Task :runners:local-java:jar UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:compileJava UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:classes UP-TO-DATE
> Task :sdks:java:extensions:google-cloud-platform-core:jar UP-TO-DATE
> Task :runners:core-construction-java:compileJava UP-TO-DATE
> Task :runners:core-construction-java:classes UP-TO-DATE
> Task :sdks:java:io:kinesis:compileJava UP-TO-DATE
> Task :sdks:java:io:kinesis:classes UP-TO-DATE
> Task :runners:core-construction-java:jar UP-TO-DATE
> Task :sdks:java:io:kinesis:jar UP-TO-DATE
> Task :sdks:java:testing:test-utils:compileJava UP-TO-DATE
> Task :sdks:java:testing:test-utils:classes UP-TO-DATE
> Task :sdks:java:testing:test-utils:jar UP-TO-DATE
> Task :runners:core-java:compileJava UP-TO-DATE
> Task :runners:core-java:classes UP-TO-DATE
> Task :runners:core-java:jar UP-TO-DATE
> Task :sdks:java:harness:compileJava UP-TO-DATE
> Task :sdks:java:harness:classes UP-TO-DATE
> Task :sdks:java:harness:jar UP-TO-DATE
> Task :sdks:java:harness:shadowJar UP-TO-DATE
> Task :runners:java-fn-execution:compileJava UP-TO-DATE
> Task :runners:java-fn-execution:classes UP-TO-DATE
> Task :runners:java-fn-execution:jar UP-TO-DATE
> Task :sdks:java:expansion-service:compileJava UP-TO-DATE
> Task :sdks:java:expansion-service:classes UP-TO-DATE
> Task :sdks:java:expansion-service:jar UP-TO-DATE
> Task :runners:java-job-service:compileJava UP-TO-DATE
> Task :runners:java-job-service:classes UP-TO-DATE
> Task :runners:java-job-service:jar UP-TO-DATE
> Task :runners:direct-java:compileJava UP-TO-DATE
> Task :runners:direct-java:classes UP-TO-DATE
> Task :runners:direct-java:shadowJar UP-TO-DATE
> Task :sdks:java:io:kafka:compileJava UP-TO-DATE
> Task :sdks:java:io:kafka:classes UP-TO-DATE
> Task :sdks:java:io:kafka:jar UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:compileJava UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:classes UP-TO-DATE
> Task :sdks:java:io:google-cloud-platform:jar UP-TO-DATE
> Task :sdks:java:testing:load-tests:compileJava UP-TO-DATE
> Task :sdks:java:testing:load-tests:classes UP-TO-DATE
> Task :sdks:java:testing:load-tests:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:jar UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:compileJava UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:classes UP-TO-DATE
> Task :runners:google-cloud-dataflow-java:worker:legacy-worker:shadowJar UP-TO-DATE

> Task :sdks:java:testing:load-tests:run
May 09, 2020 12:37:46 PM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
May 09, 2020 12:37:46 PM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 176 files. Enable logging at DEBUG level to see which files will be staged.
May 09, 2020 12:37:46 PM org.apache.beam.sdk.Pipeline validate
WARNING: The following transforms do not have stable unique names: Collect end time metric
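
The warning above comes from Pipeline.validate() and means two transforms ended up with the same display name, which can make step-level metrics and pipeline updates ambiguous. A minimal sketch, assuming a toy pipeline rather than the actual CombineLoadTest code, of giving every apply() an explicit, distinct name so the warning goes away:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.values.TypeDescriptors;

    public class UniqueNamesSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        p.apply("Read input", Create.of(1L, 2L, 3L))
         // Each apply() gets an explicit, distinct name; reusing the same
         // display name twice is what triggers the "stable unique names"
         // warning from Pipeline.validate().
         .apply("Collect end time metric",
             MapElements.into(TypeDescriptors.longs()).via((Long x) -> x))
         .apply("Collect end time metric2",
             MapElements.into(TypeDescriptors.longs()).via((Long x) -> x));
        p.run().waitUntilFinish();
      }
    }

The step names "Collect end time metric2/3/4" in the addStep output below are the automatically disambiguated versions of that duplicate name.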
May 09, 2020 12:37:46 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
May 09, 2020 12:37:46 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 177 files from PipelineOptions.filesToStage to staging location to prepare for execution.
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 177 files cached, 0 files newly uploaded in 0 seconds
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read input as step s1
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect start time metric as step s2
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect metrics as step s3
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Window.Into()/Window.Assign as step s4
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Convert to Long: 0/Map as step s5
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/GroupByKey as step s6
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 0/Combine.GroupedValues as step s7
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metric as step s8
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Convert to Long: 1/Map as step s9
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 1/GroupByKey as step s10
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 1/Combine.GroupedValues as step s11
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metric2 as step s12
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Convert to Long: 2/Map as step s13
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 2/GroupByKey as step s14
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 2/Combine.GroupedValues as step s15
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metric3 as step s16
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Convert to Long: 3/Map as step s17
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 3/GroupByKey as step s18
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Combine: 3/Combine.GroupedValues as step s19
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Collect end time metric4 as step s20
May 09, 2020 12:37:47 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/loadtests/staging/
May 09, 2020 12:37:48 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.22.0-SNAPSHOT
May 09, 2020 12:37:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-05-09_05_37_48-12845657179123248579?project=apache-beam-testing
May 09, 2020 12:37:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2020-05-09_05_37_48-12845657179123248579
May 09, 2020 12:37:49 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2020-05-09_05_37_48-12845657179123248579
May 09, 2020 12:37:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-05-09T12:37:51.410Z: The workflow name is not a valid Cloud Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring will be labeled with this modified job name: load0tests0java0dataflow0batch0combine04-jenkins-050912374-1xuf. For the best monitoring experience, please name your job with a valid Cloud Label. For details, see: https://cloud.google.com/compute/docs/labeling-resources#restrictions
May 09, 2020 12:37:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-09T12:37:51.539Z: Checking permissions granted to controller Service Account.
May 09, 2020 12:37:54 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2020-05-09T12:37:54.037Z: Staged package jackson-core-asl-1.9.13-MZxJpDBOP6n-PNjc_ACdNw.jar at location 'gs://temp-storage-for-perf-tests/loadtests/staging/jackson-core-asl-1.9.13-MZxJpDBOP6n-PNjc_ACdNw.jar' is inaccessible.
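
The SEVERE message above (and the workflow failure that follows) points at the staged files under gs://temp-storage-for-perf-tests/loadtests/staging/ not being readable by the account the service checks. A hypothetical standalone diagnostic, not part of the load test, that tries to read the reported jar with the google-cloud-storage client; it uses whatever Application Default Credentials resolve locally, whereas Dataflow validates the controller service account, so it can only confirm whether the object exists and is readable by that local identity:

    import com.google.cloud.storage.Blob;
    import com.google.cloud.storage.Storage;
    import com.google.cloud.storage.StorageException;
    import com.google.cloud.storage.StorageOptions;

    public class CheckStagedJar {
      public static void main(String[] args) {
        // Uses Application Default Credentials; a success here does not rule
        // out a permissions problem on the controller service account.
        Storage storage = StorageOptions.getDefaultInstance().getService();
        try {
          Blob blob = storage.get(
              "temp-storage-for-perf-tests",
              "loadtests/staging/jackson-core-asl-1.9.13-MZxJpDBOP6n-PNjc_ACdNw.jar");
          System.out.println(blob == null
              ? "Object not found"
              : "Readable, size=" + blob.getSize() + " bytes");
        } catch (StorageException e) {
          // A 403 here would indicate the local identity also lacks access.
          System.out.println("Access check failed: " + e.getMessage());
        }
      }
    }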
May 09, 2020 12:37:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2020-05-09T12:37:55.032Z: Workflow failed. Causes: One or more access checks for temp location or staged files failed. Please refer to other error messages for details. For more information on security and permissions, please see https://cloud.google.com/dataflow/security-and-permissions.
May 09, 2020 12:37:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-09T12:37:55.106Z: Cleaning up.
May 09, 2020 12:37:56 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-09T12:37:55.213Z: Worker pool stopped.
May 09, 2020 12:37:58 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-05-09T12:37:57.700Z: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
May 09, 2020 12:38:01 PM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2020-05-09_05_37_48-12845657179123248579 failed with status FAILED.
May 09, 2020 12:38:01 PM org.apache.beam.sdk.testutils.metrics.MetricsReader getCounterMetric
SEVERE: Failed to get metric totalBytes.count, from namespace combine
Load test results for test (ID): a6be5634-7c18-42d0-b72b-b827eb4e7969 and timestamp: 2020-05-09T12:37:45.144000000Z:
                 Metric:                    Value:
    dataflow_runtime_sec                       0.0
dataflow_total_bytes_count                      -1.0
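
The -1.0 above is the reader's sentinel for a metric it could not find: the job failed before any worker processed data, so the user counter in namespace "combine" with name "totalBytes.count" was never created. As a minimal sketch of how such a counter is published with the Beam metrics API (the load test uses its own monitoring transforms; the DoFn below is illustrative only):

    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;

    // Illustrative DoFn: increments a user counter under namespace "combine",
    // name "totalBytes.count", which MetricsReader.getCounterMetric queries
    // from the job's metrics after the pipeline reaches a terminal state.
    class CountTotalBytesFn extends DoFn<byte[], byte[]> {
      private final Counter totalBytes = Metrics.counter("combine", "totalBytes.count");

      @ProcessElement
      public void processElement(ProcessContext c) {
        totalBytes.inc(c.element().length);
        c.output(c.element());
      }
    }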
Exception in thread "main" java.lang.RuntimeException: Invalid job state: FAILED.
        at org.apache.beam.sdk.loadtests.JobFailure.handleFailure(JobFailure.java:55)
        at org.apache.beam.sdk.loadtests.LoadTest.run(LoadTest.java:134)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.run(CombineLoadTest.java:66)
        at org.apache.beam.sdk.loadtests.CombineLoadTest.main(CombineLoadTest.java:169)

> Task :sdks:java:testing:load-tests:run FAILED

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:testing:load-tests:run'.
> Process 'command '/usr/lib/jvm/java-8-openjdk-amd64/bin/java'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 20s
74 actionable tasks: 1 executed, 73 up-to-date

Publishing build scan...
https://gradle.com/s/2q3aemw7ffb42

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
