See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/230/display/redirect>

Changes:


------------------------------------------
[...truncated 2.38 MB...]
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Prevent fusion before writing/Reshuffle, 
transform=ReshuffleOverrideFactory.ReshuffleWithOnlyTrigger}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Prevent fusion before writing/Reshuffle/Window.Into(), 
transform=Window.Into()}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Prevent fusion before 
writing/Reshuffle/Window.Into()/Window.Assign, transform=Window.Assign}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Prevent 
fusion before writing/Reshuffle/Window.Into()/Window.Assign.out 
[PCollection@1189316844]
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Prevent fusion before writing/Reshuffle/GroupByKey, 
transform=GroupByKey}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Prevent 
fusion before writing/Reshuffle/GroupByKey.out [PCollection@1593727781]
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Prevent fusion before writing/Reshuffle/ExpandIterable, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Prevent 
fusion before 
writing/Reshuffle/RestoreOriginalTimestamps/Reify.ExtractTimestampsFromValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output
 [PCollection@1202547191]
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Prevent fusion before writing/Values, transform=Values}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Prevent fusion before writing/Values/Values, 
transform=MapElements}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Prevent fusion before writing/Values/Values/Map, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Prevent 
fusion before writing/Values/Values/Map/ParMultiDo(Anonymous).output 
[PCollection@881977454]
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Collect write time, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Collect 
write time/ParMultiDo(TimeMonitor).output [PCollection@2113891589]
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Construct rows for DBOutputFormat, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value 
Construct rows for DBOutputFormat/ParMultiDo(ConstructDBOutputFormatRow).output 
[PCollection@1568159144]
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using CdapIO, transform=CdapIO.Write}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using CdapIO/HadoopFormatIO.Write, 
transform=HadoopFormatIO.Write}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CreateOutputConfig, 
transform=Create.Values}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource), 
transform=Read(CreateSource)}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource)/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper).output
 [PCollection@1500079441]
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using CdapIO/HadoopFormatIO.Write/View.AsSingleton, 
transform=View.AsSingleton}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView, 
transform=BatchViewAsSingleton}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton),
 transform=Combine.globally(Singleton)}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys,
 transform=WithKeys}
    14:41:51.006 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys,
 transform=MapElements}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map,
 transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map/ParMultiDo(Anonymous).output
 [PCollection@892230274]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton),
 transform=Combine.perKey(Singleton)}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey,
 transform=GroupByKey}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey.out
 [PCollection@88226327]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues,
 transform=DataflowRunner.CombineGroupedValues}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output
 [PCollection@177589009]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values,
 transform=Values}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values,
 transform=MapElements}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map,
 transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map/ParMultiDo(Anonymous).output
 [PCollection@1576141372]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey,
 transform=BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey),
 transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)/ParMultiDo(UseWindowHashAsKeyAndWindowAsSortKey).output
 [PCollection@842924653]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly,
 transform=BatchViewOverrides.GroupByKeyAndSortValuesOnly}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly.out
 [PCollection@2041983781]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow),
 transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)/ParMultiDo(IsmRecordForSingularValuePerWindow).output
 [PCollection@1623822117]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView,
 transform=CreateDataflowView}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CombineValues/Values/Values/Map/ParMultiDo(Anonymous).output
 [PCollection@1901018532]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob), 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob)/ParMultiDo(SetupJob).output 
[PCollection@925024581]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using CdapIO/HadoopFormatIO.Write/GroupDataByPartition, 
transform=HadoopFormatIO.GroupDataByPartition}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask/ParMultiDo(AssignTask).output
 [PCollection@2005293363]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId, 
transform=GroupByKey}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId.out 
[PCollection@1277882374]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks/ParMultiDo(FlattenGroupedTasks).output
 [PCollection@325674467]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using CdapIO/HadoopFormatIO.Write/Write, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using CdapIO/HadoopFormatIO.Write/Write/ParMultiDo(Write).output 
[PCollection@987255094]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks, 
transform=Combine.globally(IterableCombiner)}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys, transform=WithKeys}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys, 
transform=MapElements}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map/ParMultiDo(Anonymous).output
 [PCollection@1117747481]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner), 
transform=Combine.perKey(IterableCombiner)}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey,
 transform=GroupByKey}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey.out
 [PCollection@2038185019]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues,
 transform=DataflowRunner.CombineGroupedValues}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/ParDo(Anonymous)/ParMultiDo(Anonymous).output
 [PCollection@913148823]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values, 
transform=Values}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting composite node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values, 
transform=MapElements}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map/ParMultiDo(Anonymous).output
 [PCollection@256522893]
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting primitive node 
Node{fullName=Write using CdapIO/HadoopFormatIO.Write/CommitWriteJob, 
transform=PrimitiveParDoSingleFactory.ParDoSingle}
    14:41:51.007 [Test ****] DEBUG 
org.apache.beam.sdk.runners.TransformHierarchy - Visiting output value Write 
using CdapIO/HadoopFormatIO.Write/CommitWriteJob/ParMultiDo(CommitJob).output 
[PCollection@881513107]
    14:41:53.397 [Test ****] WARN 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:41:52.392Z: The workflow name is not a valid Cloud Label. Labels 
applied to Cloud resources (such as GCE Instances) for monitoring will be 
labeled with this modified job name: 
cdapioit0testcdapioreadsandwritescorrectlyinbatch-jenkins--bwaa. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
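For reference, the warning above can be avoided by naming the job with a valid Cloud Label up front. A minimal sketch with the Beam Java SDK and the Dataflow runner (illustrative only; the class name and job name below are not taken from this test):

    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.beam.sdk.transforms.Create;

    public class ValidJobNameExample {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).withValidation().as(DataflowPipelineOptions.class);
        // Cloud Labels allow only lowercase letters, digits and hyphens (up to 63 chars),
        // so a name in that form is used as-is instead of being rewritten for monitoring.
        options.setJobName("cdapioit-batch-write-read"); // hypothetical name
        Pipeline pipeline = Pipeline.create(options);
        pipeline.apply(Create.of(1, 2, 3)); // placeholder transform
        pipeline.run();
      }
    }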
    14:42:08.054 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:05.962Z: Your project already contains 100 Dataflow-created 
metric descriptors, so new user metrics of the form custom.googleapis.com/* 
will not be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
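The message above suggests deleting old or unused custom metric descriptors once the 100 Dataflow-created descriptors are in place. A hedged sketch using the Cloud Monitoring Java client (assumes google-cloud-monitoring is on the classpath; the project id is the one from this log, and the delete call is left commented out):

    import com.google.api.MetricDescriptor;
    import com.google.cloud.monitoring.v3.MetricServiceClient;
    import com.google.monitoring.v3.ProjectName;

    public class ListCustomMetricDescriptors {
      public static void main(String[] args) throws Exception {
        ProjectName project = ProjectName.of("apache-beam-testing");
        try (MetricServiceClient client = MetricServiceClient.create()) {
          // Walk all metric descriptors and print the custom.googleapis.com/* ones,
          // i.e. the user-metric descriptors the log message refers to.
          for (MetricDescriptor descriptor : client.listMetricDescriptors(project).iterateAll()) {
            if (descriptor.getType().startsWith("custom.googleapis.com/")) {
              System.out.println(descriptor.getName());
              // Uncomment to actually remove a descriptor that is no longer needed:
              // client.deleteMetricDescriptor(descriptor.getName());
            }
          }
        }
      }
    }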
    14:42:11.221 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:11.102Z: Worker configuration: e2-standard-2 in us-central1-a.
    14:42:13.723 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.444Z: Expanding CoGroupByKey operations into optimizable 
parts.
    14:42:13.723 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.497Z: Combiner lifting skipped for step Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId: GroupByKey not 
followed by a combiner.
    14:42:13.723 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.530Z: Combiner lifting skipped for step Prevent fusion 
before writing/Reshuffle/GroupByKey: GroupByKey not followed by a combiner.
    14:42:13.723 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.643Z: Expanding GroupByKey operations into optimizable 
parts.
    14:42:13.723 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.671Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    14:42:13.723 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.843Z: Annotating graph with Autotuner information.
    14:42:13.723 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.884Z: Fusing adjacent ParDo, Read, Write, and Flatten 
operations
    14:42:13.723 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.920Z: Fusing consumer Produce db rows into Generate 
sequence/Read(BoundedCountingSource)
    14:42:13.723 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.954Z: Fusing consumer Prevent fusion before writing/Pair 
with random key into Produce db rows
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:12.988Z: Fusing consumer Prevent fusion before 
writing/Reshuffle/Window.Into()/Window.Assign into Prevent fusion before 
writing/Pair with random key
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.023Z: Fusing consumer Prevent fusion before 
writing/Reshuffle/GroupByKey/Reify into Prevent fusion before 
writing/Reshuffle/Window.Into()/Window.Assign
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.058Z: Fusing consumer Prevent fusion before 
writing/Reshuffle/GroupByKey/Write into Prevent fusion before 
writing/Reshuffle/GroupByKey/Reify
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.091Z: Fusing consumer Prevent fusion before 
writing/Reshuffle/GroupByKey/GroupByWindow into Prevent fusion before 
writing/Reshuffle/GroupByKey/Read
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.126Z: Fusing consumer Prevent fusion before 
writing/Reshuffle/ExpandIterable into Prevent fusion before 
writing/Reshuffle/GroupByKey/GroupByWindow
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.160Z: Fusing consumer Prevent fusion before 
writing/Values/Values/Map into Prevent fusion before 
writing/Reshuffle/ExpandIterable
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.182Z: Fusing consumer Collect write time into Prevent 
fusion before writing/Values/Values/Map
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.215Z: Fusing consumer Construct rows for DBOutputFormat 
into Collect write time
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.249Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/ParDo(SetupJob) into Construct rows for 
DBOutputFormat
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.282Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask into Write using 
CdapIO/HadoopFormatIO.Write/ParDo(SetupJob)
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.318Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Reify into Write 
using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.341Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Write into Write 
using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Reify
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.376Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/GroupByWindow 
into Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Read
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.412Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks into Write 
using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/GroupByWindow
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.440Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/Write into Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.474Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map into Write 
using CdapIO/HadoopFormatIO.Write/Write
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.508Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial
 into Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map
    14:42:13.724 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.539Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify
 into Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial
    14:42:13.725 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.564Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Write
 into Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify
    14:42:13.725 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.589Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues
 into Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Read
    14:42:13.725 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.623Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract
 into Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues
    14:42:13.725 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.657Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map into Write 
using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract
    14:42:14.935 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.692Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/CommitWriteJob into Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map
    14:42:14.935 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.728Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
 into Write using 
CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource)
    14:42:14.935 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.754Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
 into Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.789Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
 into Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.824Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
 into Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.858Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
 into Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.894Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
 into Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.926Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
 into Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:13.956Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 into Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.016Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
 into Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.051Z: Fusing consumer Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 into Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
    14:42:14.936 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.094Z: Workflow config is missing a default resource spec.
    14:42:14.936 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.121Z: Adding StepResource setup and teardown to workflow 
graph.
    14:42:14.936 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.155Z: Adding workflow start and stop steps.
    14:42:14.936 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.188Z: Assigning stage ids.
    14:42:14.936 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.346Z: Executing wait step start68
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.425Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.462Z: Executing operation Prevent fusion before 
writing/Reshuffle/GroupByKey/Create
    14:42:14.936 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.475Z: Starting **** pool setup.
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.512Z: Starting 5 ****s in us-central1-a...
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.754Z: Finished operation Prevent fusion before 
writing/Reshuffle/GroupByKey/Create
    14:42:14.936 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.771Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
    14:42:14.936 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.828Z: Value "Prevent fusion before 
writing/Reshuffle/GroupByKey/Session" materialized.
    14:42:14.937 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.863Z: Value "Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Session"
 materialized.
    14:42:14.937 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.903Z: Executing operation Generate 
sequence/Read(BoundedCountingSource)+Produce db rows+Prevent fusion before 
writing/Pair with random key+Prevent fusion before 
writing/Reshuffle/Window.Into()/Window.Assign+Prevent fusion before 
writing/Reshuffle/GroupByKey/Reify+Prevent fusion before 
writing/Reshuffle/GroupByKey/Write
    14:42:17.364 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:14.936Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource)+Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    14:42:57.841 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:42:56.043Z: Autoscaling: Raised the number of ****s to 5 based on 
the rate of progress in the currently running stage(s).
    14:43:28.593 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:43:27.453Z: Workers have started successfully.
    14:43:59.565 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:43:58.270Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/CreateOutputConfig/Read(CreateSource)+Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
    14:43:59.565 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:43:58.346Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    14:43:59.565 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:43:58.407Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Close
    14:43:59.565 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:43:58.473Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    14:43:59.565 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:43:58.624Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
    14:43:59.565 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:43:58.687Z: Value "Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Session"
 materialized.
    14:43:59.565 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:43:58.753Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:01.073Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:01.141Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:01.190Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Close
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:01.250Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:02.648Z: Finished operation Generate 
sequence/Read(BoundedCountingSource)+Produce db rows+Prevent fusion before 
writing/Pair with random key+Prevent fusion before 
writing/Reshuffle/Window.Into()/Window.Assign+Prevent fusion before 
writing/Reshuffle/GroupByKey/Reify+Prevent fusion before 
writing/Reshuffle/GroupByKey/Write
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:02.713Z: Executing operation Prevent fusion before 
writing/Reshuffle/GroupByKey/Close
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:02.760Z: Finished operation Prevent fusion before 
writing/Reshuffle/GroupByKey/Close
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:03.308Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read+Write
 using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
    14:44:03.790 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:03.375Z: Value "Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow).out0"
 materialized.
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:03.447Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:03.503Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
    14:44:03.790 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:03.565Z: Value "Write using 
CdapIO/HadoopFormatIO.Write/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView.out0"
 materialized.
    14:44:03.790 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:03.622Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Create
    14:44:06.708 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:03.782Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Create
    14:44:06.708 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:03.852Z: Value "Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Session" 
materialized.
    14:44:06.708 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:03.913Z: Executing operation Prevent fusion before 
writing/Reshuffle/GroupByKey/Read+Prevent fusion before 
writing/Reshuffle/GroupByKey/GroupByWindow+Prevent fusion before 
writing/Reshuffle/ExpandIterable+Prevent fusion before 
writing/Values/Values/Map+Collect write time+Construct rows for 
DBOutputFormat+Write using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob)+Write 
using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask+Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Reify+Write 
using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Write
    14:44:13.356 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:12.226Z: Finished operation Prevent fusion before 
writing/Reshuffle/GroupByKey/Read+Prevent fusion before 
writing/Reshuffle/GroupByKey/GroupByWindow+Prevent fusion before 
writing/Reshuffle/ExpandIterable+Prevent fusion before 
writing/Values/Values/Map+Collect write time+Construct rows for 
DBOutputFormat+Write using CdapIO/HadoopFormatIO.Write/ParDo(SetupJob)+Write 
using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/AssignTask+Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Reify+Write 
using CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Write
    14:44:13.356 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:12.281Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Close
    14:44:13.357 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:12.339Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Close
    14:44:13.357 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:12.408Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Create
    14:44:13.357 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:12.560Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Create
    14:44:13.357 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:12.632Z: Value "Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Session"
 materialized.
    14:44:13.357 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:12.690Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Read+Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/GroupByWindow+Write
 using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks+Write 
using CdapIO/HadoopFormatIO.Write/Write+Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map+Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Write
    14:44:25.496 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:24.082Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/Read+Write using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/GroupByTaskId/GroupByWindow+Write
 using 
CdapIO/HadoopFormatIO.Write/GroupDataByPartition/FlattenGroupedTasks+Write 
using CdapIO/HadoopFormatIO.Write/Write+Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/WithKeys/AddKeys/Map+Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Write
    14:44:25.496 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:24.137Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Close
    14:44:25.496 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:24.972Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Close
    14:44:25.496 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:25.032Z: Executing operation Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Read+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract+Write
 using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map+Write 
using CdapIO/HadoopFormatIO.Write/CommitWriteJob
    14:44:26.875 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:26.372Z: Finished operation Write using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Read+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues+Write
 using 
CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract+Write
 using CdapIO/HadoopFormatIO.Write/CollectWriteTasks/Values/Values/Map+Write 
using CdapIO/HadoopFormatIO.Write/CommitWriteJob
    14:44:26.875 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:26.443Z: Executing success step success66
    14:44:26.875 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:26.535Z: Cleaning up.
    14:44:26.875 [Test ****] DEBUG 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:26.578Z: Starting **** pool teardown.
    14:44:26.875 [Test ****] INFO 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler - 
2022-09-30T14:44:26.607Z: Stopping **** pool...
    14:46:04.060 [Thread-7] WARN 
org.apache.beam.runners.dataflow.DataflowPipelineJob - Job is already running 
in Google Cloud Platform, Ctrl-C will not cancel it.
    To cancel the job in the cloud, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2022-09-30_07_41_50-17365050359063832274
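    Alongside the gcloud command above, a job can also be cancelled through the handle returned by run(). A minimal sketch (illustrative, not taken from the test code) with the Beam Java SDK:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    public class CancelAfterTimeout {
      // Runs the pipeline and cancels the remote job if it is still running after the timeout.
      static void runWithTimeout(Pipeline pipeline, Duration timeout) throws IOException {
        PipelineResult result = pipeline.run();
        PipelineResult.State state = result.waitUntilFinish(timeout);
        if (state == null || !state.isTerminal()) {
          result.cancel(); // asks the runner (Dataflow here) to cancel the job in the cloud
        }
      }
    }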

org.apache.beam.sdk.io.cdap.CdapIOIT > testCdapIOReadsAndWritesCorrectlyInBatch 
SKIPPED

> Task :sdks:java:io:cdap:integrationTest FAILED
:sdks:java:io:cdap:integrationTest (Thread[Execution **** Thread 6,5,main]) 
completed. Took 4 mins 50.196 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:cdap:integrationTest'.
> Process 'Gradle Test Executor 3' finished with non-zero exit value 143
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/7.5.1/userguide/java_testing.html#sec:test_execution

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
142 actionable tasks: 84 executed, 56 from cache, 2 up-to-date

Publishing build scan...
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=8b949c0d-382b-4bb7-8689-99deb7c051aa, 
currentDir=<https://ci-beam.apache.org/job/beam_PerformanceTests_Cdap/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 598774
  log file: /home/jenkins/.gradle/daemon/7.5.1/daemon-598774.out.log
----- Last  20 lines from daemon log file - daemon-598774.out.log -----
  This problem might be caused by incorrect test process configuration.
  Please refer to the test execution section in the User Manual at 
https://docs.gradle.org/7.5.1/userguide/java_testing.html#sec:test_execution

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 5m 2s
142 actionable tasks: 84 executed, 56 from cache, 2 up-to-date

Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated 
in response to a user interrupt.
----- End of the daemon log -----


FAILURE: Build failed with an exception.

* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may 
have crashed)

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

