See <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/311/display/redirect?page=changes>
Changes:
[thw] Mention portable Flink runner support for state and timers in 2.9.0
------------------------------------------
[...truncated 8.06 MB...]
Dec 17, 2018 12:52:12 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$13/RunChecks as step s35
Dec 17, 2018 12:52:12 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$13/VerifyAssertions/ParDo(DefaultConclude) as step s36
Dec 17, 2018 12:52:12 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to
gs://temp-storage-for-end-to-end-tests/flattentest0testflattenpcollectionsemptythenpardo-jenkins-1217125203-f12e0983/output/results/staging/
Dec 17, 2018 12:52:12 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <77886 bytes, hash Ozjcig-xo_FaPWIdRBcusw> to
gs://temp-storage-for-end-to-end-tests/flattentest0testflattenpcollectionsemptythenpardo-jenkins-1217125203-f12e0983/output/results/staging/pipeline-Ozjcig-xo_FaPWIdRBcusw.pb
org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo STANDARD_OUT
Dataflow SDK version: 2.10.0-SNAPSHOT
org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo STANDARD_ERROR
Dec 17, 2018 12:52:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-12-17_04_52_12-17815663097439164906?project=apache-beam-testing
org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo STANDARD_OUT
Submitted job: 2018-12-17_04_52_12-17815663097439164906
org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo STANDARD_ERROR
Dec 17, 2018 12:52:13 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2018-12-17_04_52_12-17815663097439164906
Dec 17, 2018 12:52:13 PM
org.apache.beam.runners.dataflow.TestDataflowRunner run
INFO: Running Dataflow job 2018-12-17_04_52_12-17815663097439164906 with 0
expected assertions.
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:12.974Z: Autoscaling is enabled for job
2018-12-17_04_52_12-17815663097439164906. The number of workers will be between
1 and 1000.
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:13.015Z: Autoscaling was automatically enabled for
job 2018-12-17_04_52_12-17815663097439164906.
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2018-12-17T12:52:15.518Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
flattentest0testflattenpcollectionsemptythenpardo-jenkins--hp55. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:15.855Z: Checking permissions granted to controller
Service Account.
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:19.904Z: Worker configuration: n1-standard-1 in
us-central1-b.
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:20.508Z: Expanding CollectionToSingleton operations
into optimizable parts.
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:20.602Z: Expanding CoGroupByKey operations into
optimizable parts.
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:20.831Z: Expanding GroupByKey operations into
optimizable parts.
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:20.927Z: Fusing adjacent ParDo, Read, Write, and
Flatten operations
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:20.975Z: Elided trivial flatten
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.017Z: Elided trivial flatten
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.056Z: Elided trivial flatten
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.100Z: Unzipping flatten s29 for input
s17.org.apache.beam.sdk.values.PCollection.<init>:402#a4f9f304fed667ee
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.146Z: Fusing unzipped copy of
PAssert$13/GroupGlobally/GroupDummyAndContents/Reify, through flatten
PAssert$13/GroupGlobally/FlattenDummyAndContents, into producer
PAssert$13/GroupGlobally/KeyForDummy/AddKeys/Map
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.193Z: Fusing consumer
PAssert$13/VerifyAssertions/ParDo(DefaultConclude) into PAssert$13/RunChecks
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.238Z: Unzipping flatten s29-u40 for input
s31-reify-value18-c38
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.285Z: Fusing unzipped copy of
PAssert$13/GroupGlobally/GroupDummyAndContents/Write, through flatten s29-u40,
into producer PAssert$13/GroupGlobally/GroupDummyAndContents/Reify
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.332Z: Fusing consumer
PAssert$13/GroupGlobally/GroupDummyAndContents/GroupByWindow into
PAssert$13/GroupGlobally/GroupDummyAndContents/Read
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.379Z: Fusing consumer
PAssert$13/GroupGlobally/Values/Values/Map into
PAssert$13/GroupGlobally/GroupDummyAndContents/GroupByWindow
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.417Z: Fusing consumer PAssert$13/GetPane/Map into
PAssert$13/GroupGlobally/ParDo(Concat)
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.464Z: Fusing consumer PAssert$13/RunChecks into
PAssert$13/GetPane/Map
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.510Z: Fusing consumer
PAssert$13/GroupGlobally/ParDo(Concat) into
PAssert$13/GroupGlobally/Values/Values/Map
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.565Z: Fusing consumer
PAssert$13/GroupGlobally/GroupDummyAndContents/Reify into
PAssert$13/GroupGlobally/WindowIntoDummy/Window.Assign
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.608Z: Fusing consumer
PAssert$13/GroupGlobally/GroupDummyAndContents/Write into
PAssert$13/GroupGlobally/GroupDummyAndContents/Reify
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.658Z: Fusing consumer
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
into
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair
with random key
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.701Z: Fusing consumer
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
into
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.742Z: Fusing consumer
PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Reify into
PAssert$13/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.790Z: Fusing consumer
PAssert$13/GroupGlobally/WindowIntoDummy/Window.Assign into
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.833Z: Fusing consumer
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
into
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.873Z: Fusing consumer
PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow into
PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Read
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.919Z: Fusing consumer
PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Write into
PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Reify
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:21.967Z: Fusing consumer
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
into
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.015Z: Fusing consumer
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
into
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.057Z: Fusing consumer
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
into
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.105Z: Fusing consumer
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
into
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.153Z: Fusing consumer
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
into
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.200Z: Fusing consumer
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
into
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.247Z: Fusing consumer
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
into
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair
with random key
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.300Z: Fusing consumer
PAssert$13/GroupGlobally/GatherAllOutputs/Values/Values/Map into
PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/GroupByWindow
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.339Z: Fusing consumer
PAssert$13/GroupGlobally/Window.Into()/Window.Assign into ParDo(Identity)
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.375Z: Fusing consumer
PAssert$13/GroupGlobally/RewindowActuals/Window.Assign into
PAssert$13/GroupGlobally/GatherAllOutputs/Values/Values/Map
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.422Z: Fusing consumer
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
into
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.459Z: Fusing consumer
Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
into
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.504Z: Fusing consumer
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair
with random key into
Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.541Z: Fusing consumer
PAssert$13/GroupGlobally/GatherAllOutputs/Window.Into()/Window.Assign into
PAssert$13/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.588Z: Fusing consumer
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Values/Values/Map
into
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.627Z: Fusing consumer
PAssert$13/GroupGlobally/GatherAllOutputs/WithKeys/AddKeys/Map into
PAssert$13/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous)
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.672Z: Fusing consumer
PAssert$13/GroupGlobally/GatherAllOutputs/Reify.Window/ParDo(Anonymous) into
PAssert$13/GroupGlobally/Window.Into()/Window.Assign
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.720Z: Fusing consumer
PAssert$13/GroupGlobally/KeyForDummy/AddKeys/Map into
PAssert$13/GroupGlobally/RewindowActuals/Window.Assign
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.765Z: Fusing consumer
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
into
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.809Z: Fusing consumer
Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)
into Flatten.PCollections/Create.Values/Read(CreateSource)/Impulse
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.852Z: Fusing consumer
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)
into PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Impulse
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.900Z: Fusing consumer
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair
with random key into
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:22.950Z: Fusing consumer ParDo(Identity) into
Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(ReadFromBoundedSource)
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:23.431Z: Executing operation
PAssert$13/GroupGlobally/GatherAllOutputs/GroupByKey/Create
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:23.478Z: Executing operation
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:23.525Z: Executing operation
PAssert$13/GroupGlobally/GroupDummyAndContents/Create
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:23.539Z: Starting 1 workers in us-central1-b...
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:23.572Z: Executing operation
Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:23.825Z: Executing operation
PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Impulse+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair
with random
key+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+PAssert$13/GroupGlobally/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Dec 17, 2018 12:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:23.872Z: Executing operation
Flatten.PCollections/Create.Values/Read(CreateSource)/Impulse+Flatten.PCollections/Create.Values/Read(CreateSource)/ParDo(SplitBoundedSource)+Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Pair
with random
key+Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Flatten.PCollections/Create.Values/Read(CreateSource)/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Dec 17, 2018 12:52:36 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:52:34.329Z: Autoscaling: Raised the number of workers
to 0 based on the rate of progress in the currently running step(s).
Dec 17, 2018 12:53:07 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:53:06.444Z: Workers have started successfully.
Dec 17, 2018 12:53:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:53:09.336Z: Autoscaling: Raised the number of workers
to 1 based on the rate of progress in the currently running step(s).
Dec 17, 2018 12:53:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:53:09.382Z: Autoscaling: Would further reduce the
number of workers but reached the minimum number allowed for the job.
Dec 17, 2018 12:53:38 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T12:53:37.505Z: Workers have started successfully.
Dec 17, 2018 1:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2018-12-17T13:52:23.945Z: Workflow failed. Causes: The Dataflow job
appears to be stuck because no worker activity has been seen in the last 1h.
You can get help with Cloud Dataflow at
https://cloud.google.com/dataflow/support.
Dec 17, 2018 1:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T13:52:24.088Z: Cancel request is committed for workflow
job: 2018-12-17_04_52_12-17815663097439164906.
Dec 17, 2018 1:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T13:52:24.175Z: Cleaning up.
Dec 17, 2018 1:52:25 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T13:52:24.306Z: Stopping worker pool...
Dec 17, 2018 1:52:25 PM
org.apache.beam.runners.dataflow.TestDataflowRunner$ErrorMonitorMessagesHandler
process
INFO: Dataflow job 2018-12-17_04_52_12-17815663097439164906 threw
exception. Failure message was: Workflow failed. Causes: The Dataflow job
appears to be stuck because no worker activity has been seen in the last 1h.
You can get help with Cloud Dataflow at
https://cloud.google.com/dataflow/support.
Dec 17, 2018 1:54:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T13:54:16.091Z: Autoscaling: Reduced the number of workers
to 0 based on the rate of progress in the currently running step(s).
Dec 17, 2018 1:54:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2018-12-17T13:54:16.135Z: Worker pool stopped.
Dec 17, 2018 1:54:23 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2018-12-17_04_52_12-17815663097439164906 failed with status
FAILED.
Dec 17, 2018 1:54:23 PM org.apache.beam.runners.dataflow.TestDataflowRunner
checkForPAssertSuccess
INFO: Success result for Dataflow job
2018-12-17_04_52_12-17815663097439164906. Found 0 success, 0 failures out of 0
expected assertions.
Gradle Test Executor 110 finished executing tests.
> Task :beam-runners-google-cloud-dataflow-java:validatesRunnerFnApiWorkerExecutableStageTest FAILED
org.apache.beam.sdk.transforms.FlattenTest > testFlattenPCollectionsEmptyThenParDo FAILED
java.lang.RuntimeException: Workflow failed. Causes: The Dataflow job
appears to be stuck because no worker activity has been seen in the last 1h.
You can get help with Cloud Dataflow at
https://cloud.google.com/dataflow/support.
at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:350)
at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:331)
at org.apache.beam.sdk.transforms.FlattenTest.testFlattenPCollectionsEmptyThenParDo(FlattenTest.java:224)
72 tests completed, 3 failed, 1 skipped
Finished generating test XML results (0.209 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/test-results/validatesRunnerFnApiWorkerExecutableStageTest>
Generating HTML test report...
Finished generating test html results (0.269 secs) into: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/validatesRunnerFnApiWorkerExecutableStageTest>
:beam-runners-google-cloud-dataflow-java:validatesRunnerFnApiWorkerExecutableStageTest (Thread[Task worker for ':' Thread 103,5,main]) completed. Took 2 hrs 46 mins 48.341 secs.
:beam-runners-google-cloud-dataflow-java:cleanUpDockerImages (Thread[Task worker for ':' Thread 103,5,main]) started.
> Task :beam-runners-google-cloud-dataflow-java:cleanUpDockerImages FAILED
Caching disabled for task ':beam-runners-google-cloud-dataflow-java:cleanUpDockerImages': Caching has not been enabled for the task
Task ':beam-runners-google-cloud-dataflow-java:cleanUpDockerImages' is not up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java>
Command: docker rmi us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181217110433
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181217110433
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:ce4d20456354233a7b02bde0697ae41f67bf04e449a17a9bf02e0d3bcd5fdf2d
Starting process 'command 'gcloud''. Working directory: <https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java>
Command: gcloud --quiet container images delete --force-delete-tags us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181217110433
Successfully started process 'command 'gcloud''
ERROR: (gcloud.container.images.delete) [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20181217110433] is not a valid name.
:beam-runners-google-cloud-dataflow-java:cleanUpDockerImages (Thread[Task worker for ':' Thread 103,5,main]) completed. Took 1.132 secs.
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:validatesRunnerFnApiWorkerExecutableStageTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java/build/reports/tests/validatesRunnerFnApiWorkerExecutableStageTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Java_ValidatesRunner_PortabilityApi_Dataflow/ws/src/runners/google-cloud-dataflow-java/build.gradle>' line: 457
* What went wrong:
Execution failed for task ':beam-runners-google-cloud-dataflow-java:cleanUpDockerImages'.
> Process 'command 'gcloud'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/4.10.2/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2h 50m 7s
75 actionable tasks: 69 executed, 5 from cache, 1 up-to-date
Publishing build scan...
https://gradle.com/s/5j5uacyuapoe4
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]