See
<https://builds.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/2748/display/redirect>
Changes:
------------------------------------------
[...truncated 323.82 KB...]
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.405Z: Fusing consumer Get file names/Values/Map
into Gather write end times
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.445Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Pair with random key into Find files/Match
filepatterns
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.484Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into Find
files/Reshuffle.ViaRandomKey/Pair with random key
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.520Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into Find
files/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.545Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.574Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.608Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.644Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Values/Values/Map into Find
files/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.687Z: Fusing consumer Read matched
files/ParDo(ToReadableFile) into Find
files/Reshuffle.ViaRandomKey/Values/Values/Map
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.718Z: Fusing consumer Gather read start time into
Read matched files/ParDo(ToReadableFile)
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.752Z: Fusing consumer Read parquet
files/ParDo(Read) into Gather read start time
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.784Z: Fusing consumer Gather read end time into
Read parquet files/ParDo(Read)
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.817Z: Fusing consumer Map records to strings/Map
into Gather read end time
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.854Z: Fusing consumer Calculate
hashcode/WithKeys/AddKeys/Map into Map records to strings/Map
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.887Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate
hashcode/WithKeys/AddKeys/Map
May 04, 2020 6:31:01 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.925Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.958Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:01.992Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Read
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.031Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.067Z: Fusing consumer Calculate
hashcode/Values/Values/Map into Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.106Z: Unzipping flatten s9-u138 for input
s10.org.apache.beam.sdk.values.PCollection.<init>:400#4663620f501c9270-c136
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.132Z: Fusing unzipped copy of Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign,
through flatten Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections/Unzipped-1,
into producer Write Parquet files/WriteFiles/GatherTempFileResults/Add void
key/AddKeys/Map
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.167Z: Fusing consumer Write Parquet
files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map into Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.203Z: Fusing consumer Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign
into Write Parquet files/WriteFiles/GatherTempFileResults/Add void
key/AddKeys/Map
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.237Z: Fusing consumer Produce text lines into
Generate sequence/Read(BoundedCountingSource)
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.272Z: Fusing consumer Produce Avro records into
Produce text lines
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.314Z: Fusing consumer Gather write start times
into Produce Avro records
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.349Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles into
Gather write start times
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.379Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify into
Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.414Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write into
Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.450Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
into Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.506Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten into Write
Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.539Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum into Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.571Z: Fusing consumer
PAssert$0/GroupGlobally/Window.Into()/Window.Assign into Calculate
hashcode/ProduceDefault
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.604Z: Fusing consumer
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
into Calculate hashcode/ProduceDefault
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.630Z: Fusing consumer Calculate
hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.666Z: Fusing consumer
PAssert$0/GroupGlobally/GroupDummyAndContents/Reify into
PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.701Z: Fusing consumer
PAssert$0/GroupGlobally/GroupDummyAndContents/Write into
PAssert$0/GroupGlobally/GroupDummyAndContents/Reify
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:02.734Z: Fusing consumer
PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign into
PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
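For orientation, the fused "Generate sequence ... Produce Avro records ... Write Parquet files/WriteFiles/..." steps above are the write half of a ParquetIO write-then-read performance test. Below is a minimal sketch of that shape, assuming the Beam Java SDK's GenerateSequence, FileIO and ParquetIO APIs; the schema, record payload, and output path are placeholders, not the actual ParquetIOIT source.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.coders.AvroCoder;
import org.apache.beam.sdk.io.FileIO;
import org.apache.beam.sdk.io.GenerateSequence;
import org.apache.beam.sdk.io.parquet.ParquetIO;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.TypeDescriptor;
import org.apache.beam.sdk.values.TypeDescriptors;

public class ParquetWriteSketch {
  // Minimal single-field Avro schema; the real test uses a richer one.
  private static final Schema SCHEMA = new Schema.Parser().parse(
      "{\"type\":\"record\",\"name\":\"line\","
          + "\"fields\":[{\"name\":\"row\",\"type\":\"string\"}]}");

  public static void main(String[] args) {
    Pipeline pipeline = Pipeline.create();
    pipeline
        .apply("Generate sequence", GenerateSequence.from(0).to(1_000_000))
        .apply("Produce text lines", MapElements
            .into(TypeDescriptors.strings())
            .via((Long i) -> "line-" + i))              // placeholder payload
        .apply("Produce Avro records", MapElements
            .into(TypeDescriptor.of(GenericRecord.class))
            .via((String line) -> {
              GenericRecord record = new GenericData.Record(SCHEMA);
              record.put("row", line);
              return record;
            }))
        .setCoder(AvroCoder.of(GenericRecord.class, SCHEMA))
        // FileIO.write() with a ParquetIO sink expands into the
        // "Write Parquet files/WriteFiles/..." steps seen in this log.
        .apply("Write Parquet files",
            FileIO.<GenericRecord>write()
                .via(ParquetIO.sink(SCHEMA))
                .to("/tmp/parquet-io-it")   // placeholder; the real job writes to HDFS
                .withSuffix(".parquet"));
    pipeline.run().waitUntilFinish();
  }
}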
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.257Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.295Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.340Z: Executing operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.340Z: Starting 5 workers in us-central1-f...
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.372Z: Executing operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.393Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.416Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.416Z: Finished operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.417Z: Executing operation Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.444Z: Finished operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.461Z: Executing operation
PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.493Z: Executing operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.495Z: Finished operation Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.523Z: Finished operation
PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.532Z: Executing operation
PAssert$0/GroupGlobally/GroupDummyAndContents/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.564Z: Finished operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.568Z: Executing operation
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.600Z: Finished operation
PAssert$0/GroupGlobally/GroupDummyAndContents/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.605Z: Executing operation
View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.675Z: Finished operation
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
May 04, 2020 6:31:03 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:03.704Z: Finished operation
View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
May 04, 2020 6:31:05 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:04.045Z: Executing operation Generate
sequence/Read(BoundedCountingSource)+Produce text lines+Produce Avro
records+Gather write start times+Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write
Parquet files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
May 04, 2020 6:31:05 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:04.078Z: Executing operation
PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign+PAssert$0/GroupGlobally/GroupDummyAndContents/Reify+PAssert$0/GroupGlobally/GroupDummyAndContents/Write
May 04, 2020 6:31:15 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-05-04T06:31:13.435Z: Your project already contains 100
Dataflow-created metric descriptors and Stackdriver will not create new
Dataflow custom metrics for this job. Each unique user-defined metric name
(independent of the DoFn in which it is defined) produces a new metric
descriptor. To delete old / unused metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
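If this warning needs acting on, the two API Explorer links above point at the Monitoring v3 metricDescriptors.list and metricDescriptors.delete methods. The following is a hedged sketch using the google-cloud-monitoring Java client (com.google.cloud:google-cloud-monitoring); the "custom.googleapis.com/dataflow/" type prefix is an assumption, so list and inspect the descriptors before deleting anything.

import com.google.api.MetricDescriptor;
import com.google.cloud.monitoring.v3.MetricServiceClient;

public class CleanUpDataflowMetricDescriptors {
  public static void main(String[] args) throws Exception {
    String project = "projects/apache-beam-testing"; // project id taken from this log
    try (MetricServiceClient client = MetricServiceClient.create()) {
      for (MetricDescriptor md : client.listMetricDescriptors(project).iterateAll()) {
        // Assumed prefix for Dataflow-created custom metric descriptors.
        if (md.getType().startsWith("custom.googleapis.com/dataflow/")) {
          System.out.println("Deleting " + md.getName());
          client.deleteMetricDescriptor(md.getName()); // full resource name
        }
      }
    }
  }
}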
May 04, 2020 6:31:27 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:27.412Z: Autoscaling: Raised the number of workers
to 1 based on the rate of progress in the currently running stage(s).
May 04, 2020 6:31:27 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:27.456Z: Resized worker pool to 1, though goal was
5. This could be a quota issue.
May 04, 2020 6:31:34 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:32.807Z: Autoscaling: Raised the number of workers
to 5 based on the rate of progress in the currently running stage(s).
May 04, 2020 6:31:51 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:50.383Z: Workers have started successfully.
May 04, 2020 6:31:51 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:31:50.409Z: Workers have started successfully.
May 04, 2020 6:32:08 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:32:06.378Z: Finished operation
PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign+PAssert$0/GroupGlobally/GroupDummyAndContents/Reify+PAssert$0/GroupGlobally/GroupDummyAndContents/Write
May 04, 2020 6:35:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:16.453Z: Finished operation Generate
sequence/Read(BoundedCountingSource)+Produce text lines+Produce Avro
records+Gather write start times+Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write
Parquet files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
May 04, 2020 6:35:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:16.548Z: Executing operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
May 04, 2020 6:35:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:16.616Z: Finished operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
May 04, 2020 6:35:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:16.685Z: Executing operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+Write
Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+Write
Parquet files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write
May 04, 2020 6:35:17 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:17.062Z: Finished operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+Write
Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+Write
Parquet files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write
May 04, 2020 6:35:18 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:17.131Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Close
May 04, 2020 6:35:18 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:17.187Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Close
May 04, 2020 6:35:18 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:17.272Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet files/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable+Write
Parquet files/WriteFiles/GatherTempFileResults/Drop key/Values/Map+Write
Parquet files/WriteFiles/GatherTempFileResults/Gather bundles+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random
key+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
May 04, 2020 6:35:20 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:18.978Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet files/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable+Write
Parquet files/WriteFiles/GatherTempFileResults/Drop key/Values/Map+Write
Parquet files/WriteFiles/GatherTempFileResults/Gather bundles+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random
key+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
May 04, 2020 6:35:20 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:19.050Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
May 04, 2020 6:35:20 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:19.112Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
May 04, 2020 6:35:20 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:19.195Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map+Write
Parquet files/WriteFiles/FinalizeTempFileBundles/Finalize+Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with
random key+Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
May 04, 2020 6:35:23 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:21.612Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map+Write
Parquet files/WriteFiles/FinalizeTempFileBundles/Finalize+Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with
random key+Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
May 04, 2020 6:35:23 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:21.687Z: Executing operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
May 04, 2020 6:35:23 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:21.755Z: Finished operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
May 04, 2020 6:35:23 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:21.827Z: Executing operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map+Gather
write end times+Get file names/Values/Map+Find files/Match filepatterns+Find
files/Reshuffle.ViaRandomKey/Pair with random key+Find
files/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
May 04, 2020 6:35:25 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:24.888Z: Finished operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map+Gather
write end times+Get file names/Values/Map+Find files/Match filepatterns+Find
files/Reshuffle.ViaRandomKey/Pair with random key+Find
files/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
May 04, 2020 6:35:25 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:25.017Z: Executing operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
May 04, 2020 6:35:25 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:25.069Z: Finished operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
May 04, 2020 6:35:25 AM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-05-04T06:35:25.151Z: Executing operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Find
files/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Find
files/Reshuffle.ViaRandomKey/Values/Values/Map+Read matched
files/ParDo(ToReadableFile)+Gather read start time+Read parquet
files/ParDo(Read)+Gather read end time+Map records to strings/Map+Calculate
hashcode/WithKeys/AddKeys/Map+Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Write
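The stage just launched fuses the read-and-verify half of the test (Find files, Read matched files, Read parquet files, Map records to strings, Calculate hashcode, then a PAssert). Below is a minimal sketch of that shape, assuming FileIO.match/readMatches and ParquetIO.readFiles; the file pattern, schema, expected hash, and HashingCombineFn are placeholders rather than the test's actual code.

import org.apache.avro.Schema;
import org.apache.avro.generic.GenericRecord;
import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.FileIO;
import org.apache.beam.sdk.io.parquet.ParquetIO;
import org.apache.beam.sdk.testing.PAssert;
import org.apache.beam.sdk.transforms.Combine;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;

public class ParquetReadVerifySketch {
  public static void main(String[] args) {
    Schema schema = new Schema.Parser().parse(
        "{\"type\":\"record\",\"name\":\"line\","
            + "\"fields\":[{\"name\":\"row\",\"type\":\"string\"}]}");
    Pipeline pipeline = Pipeline.create();

    PCollection<String> lines =
        pipeline
            // "Find files/Match filepatterns" plus the Reshuffle.ViaRandomKey steps
            // come from FileIO.match(); "ParDo(ToReadableFile)" from readMatches().
            .apply("Find files", FileIO.match().filepattern("/tmp/parquet-io-it/*.parquet"))
            .apply("Read matched files", FileIO.readMatches())
            .apply("Read parquet files", ParquetIO.readFiles(schema))
            .apply("Map records to strings", MapElements
                .into(TypeDescriptors.strings())
                .via((GenericRecord r) -> r.toString()));

    // A global combine produces the "Calculate hashcode" steps; HashingCombineFn
    // below is a simplified stand-in, not the combiner the test actually uses.
    PCollection<String> hash =
        lines.apply("Calculate hashcode", Combine.globally(new HashingCombineFn()));

    PAssert.thatSingleton(hash).isEqualTo("<expected hash>"); // placeholder expectation
    pipeline.run().waitUntilFinish();
  }

  // Order-insensitive XOR of element hash codes, rendered as hex.
  static class HashingCombineFn extends Combine.CombineFn<String, Long, String> {
    @Override public Long createAccumulator() { return 0L; }
    @Override public Long addInput(Long acc, String input) { return acc ^ input.hashCode(); }
    @Override public Long mergeAccumulators(Iterable<Long> accs) {
      long merged = 0L;
      for (long acc : accs) { merged ^= acc; }
      return merged;
    }
    @Override public String extractOutput(Long acc) { return Long.toHexString(acc); }
  }
}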
May 04, 2020 6:36:23 AM
org.apache.beam.runners.dataflow.DataflowPipelineJob lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not
cancel it.
To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2020-05-03_23_30_43-9298347288044165331
org.apache.beam.sdk.io.parquet.ParquetIOIT > writeThenReadAll SKIPPED
> Task :sdks:java:io:file-based-io-tests:integrationTest FAILED
:sdks:java:io:file-based-io-tests:integrationTest (Thread[Execution worker for
':' Thread 3,5,main]) completed. Took 5 mins 54.598 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:file-based-io-tests:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 6m 36s
92 actionable tasks: 58 executed, 34 from cache
Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=5d54ffb0-0a0c-45f8-b646-6159833135e1,
currentDir=<https://builds.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 8508
log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-8508.out.log
----- Last 20 lines from daemon log file - daemon-8508.out.log -----
* What went wrong:
Execution failed for task ':sdks:java:io:file-based-io-tests:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 6m 36s
92 actionable tasks: 58 executed, 34 from cache
Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated
in response to a user interrupt.
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may
have crashed)
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure