See
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/7903/display/redirect>
------------------------------------------
[...truncated 375.26 KB...]
root: INFO: 2019-04-13T18:15:19.876Z: JOB_MESSAGE_DETAILED: Fusing consumer
count into group/GroupByWindow
root: INFO: 2019-04-13T18:15:19.940Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
into
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey
root: INFO: 2019-04-13T18:15:19.980Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey1.out.0)/CreateIsmShardKeyAndSortKey
into write/Write/WriteImpl/PreFinalize/MapToVoidKey1
root: INFO: 2019-04-13T18:15:20.024Z: JOB_MESSAGE_DETAILED: Fusing consumer
read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into
read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read
root: INFO: 2019-04-13T18:15:20.062Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/Extract into
write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2019-04-13T18:15:20.109Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/GroupByKey/GroupByWindow into
write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2019-04-13T18:15:20.157Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/GroupByKey/Write into
write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2019-04-13T18:15:20.205Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/GroupByKey/Reify into
write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2019-04-13T18:15:20.256Z: JOB_MESSAGE_DETAILED: Fusing consumer
group/GroupByWindow into group/Read
root: INFO: 2019-04-13T18:15:20.292Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
into
write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey
root: INFO: 2019-04-13T18:15:20.344Z: JOB_MESSAGE_DETAILED: Fusing consumer
group/Write into group/Reify
root: INFO: 2019-04-13T18:15:20.394Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey
into write/Write/WriteImpl/WriteBundles/MapToVoidKey0
root: INFO: 2019-04-13T18:15:20.442Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
into
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read
root: INFO: 2019-04-13T18:15:20.489Z: JOB_MESSAGE_DETAILED: Fusing consumer
pair_with_one into split
root: INFO: 2019-04-13T18:15:20.530Z: JOB_MESSAGE_DETAILED: Fusing consumer
split into read/Read/ReadSplits
root: INFO: 2019-04-13T18:15:20.567Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
into
write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read
root: INFO: 2019-04-13T18:15:20.613Z: JOB_MESSAGE_DETAILED: Fusing consumer
read/Read/ReadSplits into read/Read/Reshuffle/RemoveRandomKeys
root: INFO: 2019-04-13T18:15:20.670Z: JOB_MESSAGE_DETAILED: Fusing consumer
read/Read/Reshuffle/RemoveRandomKeys into
read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
root: INFO: 2019-04-13T18:15:20.714Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
into
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read
root: INFO: 2019-04-13T18:15:20.764Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey
into write/Write/WriteImpl/PreFinalize/MapToVoidKey0
root: INFO: 2019-04-13T18:15:20.812Z: JOB_MESSAGE_DETAILED: Fusing consumer
read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write into
read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify
root: INFO: 2019-04-13T18:15:20.857Z: JOB_MESSAGE_DETAILED: Fusing consumer
read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify into
read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
root: INFO: 2019-04-13T18:15:20.902Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
into
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey
root: INFO: 2019-04-13T18:15:20.953Z: JOB_MESSAGE_DETAILED: Fusing consumer
write/Write/WriteImpl/DoOnce/Map(decode) into
write/Write/WriteImpl/DoOnce/FlatMap(<lambda at core.py:2172>)
root: INFO: 2019-04-13T18:15:20.999Z: JOB_MESSAGE_DETAILED: Fusing consumer
read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into
read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
root: INFO: 2019-04-13T18:15:21.042Z: JOB_MESSAGE_DEBUG: Workflow config is
missing a default resource spec.
root: INFO: 2019-04-13T18:15:21.093Z: JOB_MESSAGE_DEBUG: Adding StepResource
setup and teardown to workflow graph.
root: INFO: 2019-04-13T18:15:21.142Z: JOB_MESSAGE_DEBUG: Adding workflow start
and stop steps.
root: INFO: 2019-04-13T18:15:21.191Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-04-13T18:15:21.436Z: JOB_MESSAGE_DEBUG: Executing wait step
start120
root: INFO: 2019-04-13T18:15:21.550Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
root: INFO: 2019-04-13T18:15:21.604Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
root: INFO: 2019-04-13T18:15:21.617Z: JOB_MESSAGE_DEBUG: Starting worker pool
setup.
root: INFO: 2019-04-13T18:15:21.651Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
root: INFO: 2019-04-13T18:15:21.663Z: JOB_MESSAGE_BASIC: Starting 1 workers in
us-central1-b...
root: INFO: 2019-04-13T18:15:21.691Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
root: INFO: 2019-04-13T18:15:21.739Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
root: INFO: 2019-04-13T18:15:21.784Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Create
root: INFO: 2019-04-13T18:15:21.837Z: JOB_MESSAGE_BASIC: Executing operation
group/Create
root: INFO: 2019-04-13T18:15:21.880Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/GroupByKey/Create
root: INFO: 2019-04-13T18:15:21.911Z: JOB_MESSAGE_BASIC: Executing operation
read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-04-13T18:15:21.957Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session"
materialized.
root: INFO: 2019-04-13T18:15:22.009Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session"
materialized.
root: INFO: 2019-04-13T18:15:22.064Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session"
materialized.
root: INFO: 2019-04-13T18:15:22.110Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session"
materialized.
root: INFO: 2019-04-13T18:15:22.163Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session"
materialized.
root: INFO: 2019-04-13T18:15:22.210Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Session"
materialized.
root: INFO: 2019-04-13T18:15:22.257Z: JOB_MESSAGE_DEBUG: Value "group/Session"
materialized.
root: INFO: 2019-04-13T18:15:22.298Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/GroupByKey/Session" materialized.
root: INFO: 2019-04-13T18:15:22.341Z: JOB_MESSAGE_DEBUG: Value
"read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-04-13T18:15:22.382Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/DoOnce/Impulse+write/Write/WriteImpl/DoOnce/FlatMap(<lambda
at
core.py:2172>)+write/Write/WriteImpl/DoOnce/Map(decode)+write/Write/WriteImpl/InitializeWrite+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/WriteBundles/MapToVoidKey0+write/Write/WriteImpl/PreFinalize/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey0+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
root: INFO: 2019-04-13T18:15:22.434Z: JOB_MESSAGE_BASIC: Executing operation
read/Read/Impulse+read/Read/Split+read/Read/Reshuffle/AddRandomKeys+read/Read/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-04-13T18:15:32.799Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised
the number of workers to 0 based on the rate of progress in the currently
running step(s).
root: INFO: 2019-04-13T18:16:58.708Z: JOB_MESSAGE_DETAILED: Workers have
started successfully.
root: INFO: 2019-04-13T18:17:10.188Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised
the number of workers to 1 based on the rate of progress in the currently
running step(s).
root: INFO: 2019-04-13T18:17:10.246Z: JOB_MESSAGE_DETAILED: Autoscaling: Would
further reduce the number of workers but reached the minimum number allowed for
the job.
root: INFO: 2019-04-13T18:17:28.758Z: JOB_MESSAGE_DETAILED: Workers have
started successfully.
root: INFO: 2019-04-13T18:19:44.692Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/DoOnce/Map(decode).out" materialized.
root: INFO: 2019-04-13T18:19:44.740Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-13T18:19:44.786Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-13T18:19:44.821Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
root: INFO: 2019-04-13T18:19:44.879Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-13T18:19:44.925Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-13T18:19:44.971Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap
root: INFO: 2019-04-13T18:19:46.600Z: JOB_MESSAGE_BASIC: Executing operation
read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-04-13T18:19:46.696Z: JOB_MESSAGE_BASIC: Executing operation
read/Read/Reshuffle/ReshufflePerKey/GroupByKey/Read+read/Read/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read/Read/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read/Read/Reshuffle/RemoveRandomKeys+read/Read/ReadSplits+split+pair_with_one+group/Reify+group/Write
root: INFO: 2019-04-13T18:19:59.597Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0"
materialized.
root: INFO: 2019-04-13T18:19:59.708Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-13T18:19:59.829Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/WriteBundles/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0"
materialized.
root: INFO: 2019-04-13T18:20:10.868Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0"
materialized.
root: INFO: 2019-04-13T18:20:10.969Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-13T18:20:11.091Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/FinalizeWrite/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0"
materialized.
root: INFO: 2019-04-13T18:20:13.289Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/ToIsmRecordForMultimap.out0"
materialized.
root: INFO: 2019-04-13T18:20:13.374Z: JOB_MESSAGE_BASIC: Executing operation
write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize
root: INFO: 2019-04-13T18:20:13.502Z: JOB_MESSAGE_DEBUG: Value
"write/Write/WriteImpl/PreFinalize/_DataflowIterableSideInput(MapToVoidKey0.out.0)/Materialize.out0"
materialized.
root: INFO: 2019-04-13T18:20:14.759Z: JOB_MESSAGE_BASIC: Executing operation
group/Close
root: INFO: 2019-04-13T18:20:14.853Z: JOB_MESSAGE_BASIC: Executing operation
group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
root: INFO: 2019-04-13T18:20:16.792Z: JOB_MESSAGE_ERROR:
java.lang.IllegalArgumentException: This handler is only capable of dealing with urn:beam:sideinput:materialization:multimap:0.1 materializations but was asked to handle beam:side_input:multimap:v1 for PCollectionView with tag side0-write/Write/WriteImpl/WriteBundles.
	at org.apache.beam.vendor.guava.v20_0.com.google.common.base.Preconditions.checkArgument(Preconditions.java:399)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.transformSideInputForRunner(RegisterNodeFunction.java:506)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:327)
	at org.apache.beam.runners.dataflow.worker.graph.RegisterNodeFunction.apply(RegisterNodeFunction.java:97)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:208)
	at org.apache.beam.runners.dataflow.worker.graph.CreateRegisterFnOperationFunction.apply(CreateRegisterFnOperationFunction.java:75)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at java.util.function.Function.lambda$andThen$1(Function.java:88)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:347)
	at org.apache.beam.runners.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:306)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.start(DataflowRunnerHarness.java:195)
	at org.apache.beam.runners.dataflow.worker.DataflowRunnerHarness.main(DataflowRunnerHarness.java:123)
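The check that throws above is, in essence, a precondition comparing the side input's materialization urn against the single legacy urn the worker recognizes: the job was submitted with the newer portable urn `beam:side_input:multimap:v1`, but the worker only accepts `urn:beam:sideinput:materialization:multimap:0.1`. A schematic of that precondition (a Python paraphrase of the Java `Preconditions.checkArgument` call, not the actual worker code) is:

```python
# Schematic of the failing check in RegisterNodeFunction.transformSideInputForRunner:
# the Dataflow worker only recognizes the legacy materialization urn, so any
# PCollectionView carrying the newer portable urn is rejected.
LEGACY_URN = 'urn:beam:sideinput:materialization:multimap:0.1'

def check_argument(condition, message):
    """Mirror of Guava's Preconditions.checkArgument."""
    if not condition:
        raise ValueError(message)

def transform_side_input(urn, tag):
    check_argument(
        urn == LEGACY_URN,
        'This handler is only capable of dealing with %s materializations '
        'but was asked to handle %s for PCollectionView with tag %s.'
        % (LEGACY_URN, urn, tag))
```

Calling `transform_side_input('beam:side_input:multimap:v1', 'side0-write/Write/WriteImpl/WriteBundles')` raises with the same message seen in the trace above.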
root: INFO: [The identical IllegalArgumentException and stack trace repeat verbatim for the retries logged at 2019-04-13T18:20:18.043Z, 18:20:20.161Z, and 18:20:22.278Z.]
root: INFO: 2019-04-13T18:20:22.335Z: JOB_MESSAGE_DEBUG: Executing failure step
failure119
root: INFO: 2019-04-13T18:20:22.384Z: JOB_MESSAGE_ERROR: Workflow failed.
Causes:
S20:group/Read+group/GroupByWindow+count+format+write/Write/WriteImpl/WriteBundles/WriteBundles+write/Write/WriteImpl/Pair+write/Write/WriteImpl/WindowInto(WindowIntoFn)+write/Write/WriteImpl/GroupByKey/Reify+write/Write/WriteImpl/GroupByKey/Write
failed., A work item was attempted 4 times without success. Each time the
worker eventually lost contact with the service. The work item was attempted
on:
beamapp-jenkins-041318145-04131115-i6nw-harness-zmj0,
beamapp-jenkins-041318145-04131115-i6nw-harness-zmj0,
beamapp-jenkins-041318145-04131115-i6nw-harness-zmj0,
beamapp-jenkins-041318145-04131115-i6nw-harness-zmj0
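The fused step names in the failing stage (read, split, pair_with_one, group, count, format, write) are the stages of the classic Beam wordcount example. As a plain-Python sketch of what that stage computes (stripped of Beam entirely, so the sum() here stands in for the combining done by `count` after `group`):

```python
import re
from collections import defaultdict

def wordcount(lines):
    """Plain-Python sketch of the fused stage named above:
    split -> pair_with_one -> group -> count -> format."""
    # split: tokenize each line into words; pair_with_one: emit (word, 1)
    pairs = [(word, 1)
             for line in lines
             for word in re.findall(r"[a-z']+", line.lower())]
    # group: the GroupByKey step collects all the ones per word
    grouped = defaultdict(list)
    for word, one in pairs:
        grouped[word].append(one)
    # count: sum the grouped ones; format: render "word: n" lines
    counted = {word: sum(ones) for word, ones in grouped.items()}
    return ['%s: %d' % (word, n) for word, n in sorted(counted.items())]

# e.g. wordcount(['the cat sat', 'the mat'])
#   -> ['cat: 1', 'mat: 1', 'sat: 1', 'the: 2']
```

In the failing job the `write` step's WriteImpl additionally consumes side inputs (the MapToVoidKey/_DataflowIterableSideInput steps throughout this log), which is the code path the IllegalArgumentException above rejects.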
root: INFO: 2019-04-13T18:20:22.544Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-04-13T18:20:22.910Z: JOB_MESSAGE_DEBUG: Starting worker pool
teardown.
root: INFO: 2019-04-13T18:20:22.966Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-04-13T18:22:47.359Z: JOB_MESSAGE_DETAILED: Autoscaling:
Reduced the number of workers to 0 based on the rate of progress in the
currently running step(s).
root: INFO: 2019-04-13T18:22:47.424Z: JOB_MESSAGE_DETAILED: Autoscaling: Would
further reduce the number of workers but reached the minimum number allowed for
the job.
root: INFO: 2019-04-13T18:22:47.483Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-04-13T18:22:47.536Z: JOB_MESSAGE_DEBUG: Tearing down pending
resources...
root: INFO: Job 2019-04-13_11_15_13-1888490850462310806 is in state
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 34 tests in 3069.454s
FAILED (SKIP=1, errors=2)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_06_31-3428930342950480986?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_22_15-13256459757837754172?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_30_36-7859903374873003595?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_06_24-8932097969515457985?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_26_04-3474262005847236877?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_06_26-6668421654583648138?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_20_09-11203739388028176273?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_28_42-13105351882330321113?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_06_24-9670293250846227356?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_25_58-7027230000528733647?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_21-12436449932578398213?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_06_25-2177033111231092869?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_14_04-15378213939865994509?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_22_02-14595388571481945606?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_27_51-6961345264288961384?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_35_18-16249981088675371051?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_06_24-2698831291138549175?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_15_40-15998170860399364337?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_16_43-16756959766273709042?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_24_04-4978029365434610978?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_31_33-15195393278472320?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_06_25-8601306218544882749?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_15_13-1888490850462310806?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_23_06-11131141923355872471?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_29_54-9485828737771333552?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_36_32-17387506601299860525?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_43_45-9659544730735318833?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_50_28-3356260905819080298?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_06_24-13297406136134483175?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_14_50-7863486356974520971?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_25_55-9714264114400551484?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-13_11_33_29-8236249168946669961?project=apache-beam-testing.
> Task :beam-sdks-python:postCommitIT FAILED
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>'
line: 127
* What went wrong:
Execution failed for task ':beam-sdks-python:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>'
line: 268
* What went wrong:
Execution failed for task ':beam-sdks-python:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python_Verify/ws/src/sdks/python/build.gradle>'
line: 229
* What went wrong:
Execution failed for task ':beam-sdks-python:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 52m 49s
6 actionable tasks: 6 executed
Publishing build scan...
https://gradle.com/s/mrb47asqcxmgo
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]