See
<https://builds.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/2894/display/redirect?page=changes>
Changes:
[robertwb] [BEAM-6215] Additional tests for FlatMap label.
[github] [BEAM-2939] Fix splittable DoFn lifecycle. (#11941)
------------------------------------------
[...truncated 331.54 KB...]
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.167Z: Fusing consumer Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
into Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.199Z: Fusing consumer Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
into Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.224Z: Fusing consumer Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
into Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.263Z: Fusing consumer Gather write end times into
Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.292Z: Fusing consumer Get file names/Values/Map
into Gather write end times
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.313Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Pair with random key into Find files/Match
filepatterns
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.346Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign into Find
files/Reshuffle.ViaRandomKey/Pair with random key
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.373Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify into Find
files/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.405Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write into Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.435Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow into Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.470Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable into Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.501Z: Fusing consumer Find
files/Reshuffle.ViaRandomKey/Values/Values/Map into Find
files/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.535Z: Fusing consumer Read matched
files/ParDo(ToReadableFile) into Find
files/Reshuffle.ViaRandomKey/Values/Values/Map
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.561Z: Fusing consumer Gather read start time into
Read matched files/ParDo(ToReadableFile)
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.585Z: Fusing consumer Read parquet
files/ParDo(Read) into Gather read start time
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.624Z: Fusing consumer Gather read end time into
Read parquet files/ParDo(Read)
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.649Z: Fusing consumer Map records to strings/Map
into Gather read end time
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.674Z: Fusing consumer Calculate
hashcode/WithKeys/AddKeys/Map into Map records to strings/Map
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.709Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial into Calculate
hashcode/WithKeys/AddKeys/Map
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.743Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Reify into Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.779Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Write into Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Reify
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.899Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues into Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Read
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.934Z: Fusing consumer Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract into Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.965Z: Fusing consumer Calculate
hashcode/Values/Values/Map into Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Extract
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:46.998Z: Unzipping flatten s9-u138 for input
s10.org.apache.beam.sdk.values.PCollection.<init>:400#4663620f501c9270-c136
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.034Z: Fusing unzipped copy of Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign,
through flatten Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/Flatten.PCollections/Unzipped-1,
into producer Write Parquet files/WriteFiles/GatherTempFileResults/Add void
key/AddKeys/Map
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.070Z: Fusing consumer Write Parquet
files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map into Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.105Z: Fusing consumer Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign
into Write Parquet files/WriteFiles/GatherTempFileResults/Add void
key/AddKeys/Map
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.141Z: Fusing consumer Produce text lines into
Generate sequence/Read(BoundedCountingSource)
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.176Z: Fusing consumer Produce Avro records into
Produce text lines
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.205Z: Fusing consumer Gather write start times
into Produce Avro records
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.240Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles into
Gather write start times
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.268Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify into
Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.299Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write into
Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify
Jun 09, 2020 6:37:47 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.326Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
into Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.363Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten into Write
Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.400Z: Fusing consumer Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum into Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.437Z: Fusing consumer
PAssert$0/GroupGlobally/Window.Into()/Window.Assign into Calculate
hashcode/ProduceDefault
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.474Z: Fusing consumer
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
into Calculate hashcode/ProduceDefault
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.510Z: Fusing consumer Calculate
hashcode/ProduceDefault into Calculate hashcode/CreateVoid/Read(CreateSource)
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.547Z: Fusing consumer
PAssert$0/GroupGlobally/GroupDummyAndContents/Reify into
PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.576Z: Fusing consumer
PAssert$0/GroupGlobally/GroupDummyAndContents/Write into
PAssert$0/GroupGlobally/GroupDummyAndContents/Reify
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:47.605Z: Fusing consumer
PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign into
PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.184Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.234Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.258Z: Executing operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.275Z: Starting 5 workers in us-central1-a...
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.302Z: Executing operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.319Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.339Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.339Z: Finished operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.343Z: Executing operation Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.366Z: Finished operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.379Z: Executing operation
PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.397Z: Finished operation Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.424Z: Executing operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.444Z: Finished operation
PAssert$0/GroupGlobally/GatherAllOutputs/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.458Z: Executing operation
PAssert$0/GroupGlobally/GroupDummyAndContents/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.491Z: Finished operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.501Z: Executing operation
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.521Z: Finished operation
PAssert$0/GroupGlobally/GroupDummyAndContents/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.558Z: Executing operation
View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.577Z: Finished operation
View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.637Z: Finished operation
View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:48.956Z: Executing operation Generate
sequence/Read(BoundedCountingSource)+Produce text lines+Produce Avro
records+Gather write start times+Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write
Parquet files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Jun 09, 2020 6:37:49 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:37:49.001Z: Executing operation
PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign+PAssert$0/GroupGlobally/GroupDummyAndContents/Reify+PAssert$0/GroupGlobally/GroupDummyAndContents/Write
Jun 09, 2020 6:37:58 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2020-06-09T18:37:57.670Z: Your project already contains 100
Dataflow-created metric descriptors and Stackdriver will not create new
Dataflow custom metrics for this job. Each unique user-defined metric name
(independent of the DoFn in which it is defined) produces a new metric
descriptor. To delete old / unused metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
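(The two API-explorer links above correspond to the Cloud Monitoring methods
monitoring.projects.metricDescriptors.list and
monitoring.projects.metricDescriptors.delete. As a rough illustration only, and
not part of this job, the same cleanup can be scripted with the
google-cloud-monitoring client library; the project id argument and the
metric.type prefix in the filter below are assumptions and should be checked
against the descriptors the jobs actually created before deleting anything.)

import com.google.cloud.monitoring.v3.MetricServiceClient
import com.google.monitoring.v3.ListMetricDescriptorsRequest
import com.google.monitoring.v3.ProjectName

// Lists custom metric descriptors in a project and deletes them, mirroring the
// metricDescriptors.list / metricDescriptors.delete calls linked above.
fun main(args: Array<String>) {
    val projectId = args[0]  // GCP project running the Dataflow jobs (assumed argument)
    MetricServiceClient.create().use { client ->
        val request = ListMetricDescriptorsRequest.newBuilder()
            .setName(ProjectName.of(projectId).toString())
            // Assumed prefix for Dataflow-created custom metrics; narrow this before deleting.
            .setFilter("metric.type = starts_with(\"custom.googleapis.com/dataflow\")")
            .build()
        for (descriptor in client.listMetricDescriptors(request).iterateAll()) {
            println("Deleting " + descriptor.name)
            client.deleteMetricDescriptor(descriptor.name)
        }
    }
}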
Jun 09, 2020 6:38:24 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:38:21.674Z: Autoscaling: Raised the number of workers to
5 based on the rate of progress in the currently running stage(s).
Jun 09, 2020 6:38:39 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:38:38.514Z: Workers have started successfully.
Jun 09, 2020 6:38:39 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:38:38.540Z: Workers have started successfully.
Jun 09, 2020 6:38:52 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:38:51.474Z: Finished operation
PAssert$0/GroupGlobally/Create.Values/Read(CreateSource)+PAssert$0/GroupGlobally/WindowIntoDummy/Window.Assign+PAssert$0/GroupGlobally/GroupDummyAndContents/Reify+PAssert$0/GroupGlobally/GroupDummyAndContents/Write
Jun 09, 2020 6:42:06 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:05.444Z: Finished operation Generate
sequence/Read(BoundedCountingSource)+Produce text lines+Produce Avro
records+Gather write start times+Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnshardedBundles+Write
Parquet files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write+Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Reify+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Write
Jun 09, 2020 6:42:06 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:05.518Z: Executing operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Jun 09, 2020 6:42:06 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:05.586Z: Finished operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Close
Jun 09, 2020 6:42:06 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:05.676Z: Executing operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+Write
Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+Write
Parquet files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write
Jun 09, 2020 6:42:06 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:06.031Z: Finished operation Write Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/Read+Write
Parquet
files/WriteFiles/WriteUnshardedBundlesToTempFiles/GroupUnwritten/GroupByWindow+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/WriteUnwritten+Write
Parquet files/WriteFiles/WriteUnshardedBundlesToTempFiles/DropShardNum+Write
Parquet files/WriteFiles/GatherTempFileResults/Add void key/AddKeys/Map+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Reify+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Write
Jun 09, 2020 6:42:07 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:06.100Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Close
Jun 09, 2020 6:42:07 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:06.165Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Close
Jun 09, 2020 6:42:07 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:06.305Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet files/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable+Write
Parquet files/WriteFiles/GatherTempFileResults/Drop key/Values/Map+Write
Parquet files/WriteFiles/GatherTempFileResults/Gather bundles+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random
key+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Jun 09, 2020 6:42:07 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:06.902Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/Read+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet files/WriteFiles/GatherTempFileResults/Reshuffle/ExpandIterable+Write
Parquet files/WriteFiles/GatherTempFileResults/Drop key/Values/Map+Write
Parquet files/WriteFiles/GatherTempFileResults/Gather bundles+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Pair with random
key+Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Jun 09, 2020 6:42:07 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:06.979Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Jun 09, 2020 6:42:07 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:07.044Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Jun 09, 2020 6:42:07 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:07.118Z: Executing operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map+Write
Parquet files/WriteFiles/FinalizeTempFileBundles/Finalize+Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with
random key+Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Jun 09, 2020 6:42:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:07.821Z: Finished operation Write Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Write
Parquet
files/WriteFiles/GatherTempFileResults/Reshuffle.ViaRandomKey/Values/Values/Map+Write
Parquet files/WriteFiles/FinalizeTempFileBundles/Finalize+Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Pair with
random key+Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Jun 09, 2020 6:42:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:07.904Z: Executing operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Jun 09, 2020 6:42:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:07.965Z: Finished operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Jun 09, 2020 6:42:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:08.048Z: Executing operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map+Gather
write end times+Get file names/Values/Map+Find files/Match filepatterns+Find
files/Reshuffle.ViaRandomKey/Pair with random key+Find
files/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Jun 09, 2020 6:42:12 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:10.605Z: Finished operation Write Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Write
Parquet
files/WriteFiles/FinalizeTempFileBundles/Reshuffle.ViaRandomKey/Values/Values/Map+Gather
write end times+Get file names/Values/Map+Find files/Match filepatterns+Find
files/Reshuffle.ViaRandomKey/Pair with random key+Find
files/Reshuffle.ViaRandomKey/Reshuffle/Window.Into()/Window.Assign+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Reify+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Write
Jun 09, 2020 6:42:12 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:10.715Z: Executing operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Jun 09, 2020 6:42:12 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:10.783Z: Finished operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Close
Jun 09, 2020 6:42:12 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2020-06-09T18:42:10.858Z: Executing operation Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/Read+Find
files/Reshuffle.ViaRandomKey/Reshuffle/GroupByKey/GroupByWindow+Find
files/Reshuffle.ViaRandomKey/Reshuffle/ExpandIterable+Find
files/Reshuffle.ViaRandomKey/Values/Values/Map+Read matched
files/ParDo(ToReadableFile)+Gather read start time+Read parquet
files/ParDo(Read)+Gather read end time+Map records to strings/Map+Calculate
hashcode/WithKeys/AddKeys/Map+Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate
hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate
hashcode/Combine.perKey(Hashing)/GroupByKey/Write
org.apache.beam.sdk.io.parquet.ParquetIOIT > writeThenReadAll SKIPPED
> Task :sdks:java:io:file-based-io-tests:integrationTest FAILED
:sdks:java:io:file-based-io-tests:integrationTest (Thread[Execution worker for
':' Thread 10,5,main]) completed. Took 6 mins 4.587 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:file-based-io-tests:integrationTest'.
> Process 'Gradle Test Executor 14' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
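(For context on the failure above: exit value 143 is 128 + 15, i.e. the forked
test JVM received SIGTERM and was terminated externally, for example by a
CI-level timeout or the machine's OOM killer, rather than failing an assertion;
this is also consistent with writeThenReadAll being reported as SKIPPED. The
"test process configuration" the hint points at is the Gradle Test task's fork
options. A minimal sketch in Gradle Kotlin DSL follows; it assumes a
build.gradle.kts file and that integrationTest is a plain Test task, whereas
Beam actually wires this task up through its own Groovy build plugin, and the
heap size is only a placeholder.)

// build.gradle.kts (illustrative only)
tasks.named<Test>("integrationTest") {
    maxHeapSize = "4g"             // more headroom for the forked test JVM (placeholder size)
    forkEvery = 1L                 // fresh JVM per test class to limit heap/native leaks
    testLogging {
        events("passed", "skipped", "failed")
        showStandardStreams = true // keep the test's stdout/stderr if the JVM dies again
    }
}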
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 6m 41s
95 actionable tasks: 60 executed, 35 from cache
Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=4c79d252-4689-4cc1-84f1-d3fa502f66d7,
currentDir=<https://builds.apache.org/job/beam_PerformanceTests_ParquetIOIT_HDFS/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 19913
log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-19913.out.log
----- Last 20 lines from daemon log file - daemon-19913.out.log -----
* What went wrong:
Execution failed for task ':sdks:java:io:file-based-io-tests:integrationTest'.
> Process 'Gradle Test Executor 14' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at
https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 6m 41s
95 actionable tasks: 60 executed, 35 from cache
Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated
in response to a user interrupt.
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may
have crashed)
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure