See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/2042/display/redirect?page=changes>

Changes:

[yoshiki.obata] [BEAM-9372] remove python version check for whether python is 3.6 or above

[shehzaad] [BEAM-10961] enable strict dependency checking for 
sdks/java/io/debezium

[shehzaad] [BEAM-10961] Enable strict dependency checking on Google Cloud 
Dataflow

[Kenneth Knowles] Include Cron run in postcommit health dashboard

[Robert Bradshaw] [BEAM-11719] Allow encoding protos and dataclasses deterministically.

[Robert Bradshaw] Better type inference for GroupBy.

[Robert Bradshaw] Add support for named tuples.

[Robert Bradshaw] Named tuple pickling fix for Python 3.6.

[noreply] [BEAM-11962] Disable failing test (#14202)

[Kenneth Knowles] GroupIntoBatches test uses stateful ParDo

[Kenneth Knowles] Always use portable job submission for Dataflow runner v2

[noreply] [BEAM-11715] [BEAM-11694] Re-enable (conditional) combiner packing.


------------------------------------------
[...truncated 1.22 MB...]
    Mar 12, 2021 12:58:03 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 225 files. Enable logging at DEBUG level to see 
which files will be staged.
    Mar 12, 2021 12:58:03 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Mar 12, 2021 12:58:06 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Mar 12, 2021 12:58:06 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <103163 bytes, hash 
3718a328a103f9c9aa135f2a6cc5b9a52cfa6926fbfba01d68905098ce57e698> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-NxijKKED-cmqE18qbMW5pSz6aSb7-6AdaJBQmM5X5pg.pb
    Mar 12, 2021 12:58:07 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 225 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Mar 12, 2021 12:58:08 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 225 files cached, 0 files newly uploaded in 0 
seconds
    Mar 12, 2021 12:58:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/Impulse as step s1
    Mar 12, 2021 12:58:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(OutputSingleSource) as step s2
    Mar 12, 2021 12:58:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Generate records/ParDo(BoundedSourceAsSDFWrapper) as step s3
    Mar 12, 2021 12:58:08 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure write time as step s4
    Mar 12, 2021 12:58:09 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/Kafka ProducerRecord/Map as step s5
    Mar 12, 2021 12:58:09 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) as step 
s6
    Mar 12, 2021 12:58:09 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
    Mar 12, 2021 12:58:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-03-11_16_58_09-814643611957500039?project=apache-beam-testing
    Mar 12, 2021 12:58:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-03-11_16_58_09-814643611957500039
    Mar 12, 2021 12:58:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2021-03-11_16_58_09-814643611957500039
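
For readers mapping the step names above (s1-s6) to user code, here is a minimal sketch of the write half of this test pipeline. It is a reconstruction inferred from the step names, not the test's exact code: the option parameters (bootstrapServers, topic), the SyntheticSourceOptions setup, and the "write_time" metric name are all assumptions. The runner expands these three transforms into the Impulse/BoundedSourceAsSDFWrapper and Kafka ProducerRecord/KafkaWriter steps logged above.

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.Read;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.io.synthetic.SyntheticBoundedSource;
    import org.apache.beam.sdk.io.synthetic.SyntheticSourceOptions;
    import org.apache.beam.sdk.testutils.metrics.TimeMonitor;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.kafka.common.serialization.ByteArraySerializer;

    static void buildWritePipeline(
        Pipeline pipeline,
        SyntheticSourceOptions sourceOptions,  // assumed: configures the synthetic records
        String bootstrapServers,               // assumed option
        String topic) {                        // assumed option
      pipeline
          // s1-s3: the runner expands this bounded source into Impulse +
          // ParDo(OutputSingleSource) + ParDo(BoundedSourceAsSDFWrapper).
          .apply("Generate records", Read.from(new SyntheticBoundedSource(sourceOptions)))
          // s4: stamps element transit time into a Beam metric ("write_time" is assumed).
          .apply("Measure write time",
              ParDo.of(new TimeMonitor<>("org.apache.beam.sdk.io.kafka.KafkaIOIT", "write_time")))
          // s5-s6: expanded into "Kafka ProducerRecord/Map" + ParDo(KafkaWriter).
          .apply("Write to Kafka", KafkaIO.<byte[], byte[]>write()
              .withBootstrapServers(bootstrapServers)
              .withTopic(topic)
              .withKeySerializer(ByteArraySerializer.class)
              .withValueSerializer(ByteArraySerializer.class));
    }
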
    Mar 12, 2021 12:58:15 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-03-12T00:58:15.194Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
kafkaioit0testkafkaioreadsandwritescorrectlyinstreaming-je-5v23. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Mar 12, 2021 12:58:28 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:26.378Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:31.546Z: Worker configuration: n1-standard-1 in 
us-central1-f.
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.221Z: Expanding SplittableParDo operations into 
optimizable parts.
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.259Z: Expanding CollectionToSingleton operations 
into optimizable parts.
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.333Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.375Z: Expanding GroupByKey operations into 
optimizable parts.
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.456Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.483Z: Fusing consumer Generate 
records/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Generate 
records/Impulse
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.511Z: Fusing consumer Generate 
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction
 into Generate records/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.565Z: Fusing consumer Generate 
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitWithSizing
 into Generate 
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.666Z: Fusing consumer Measure write 
time/ParMultiDo(TimeMonitor) into Generate 
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/ProcessElementAndRestrictionWithSizing
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.708Z: Fusing consumer Write to Kafka/Kafka 
ProducerRecord/Map/ParMultiDo(Anonymous) into Measure write 
time/ParMultiDo(TimeMonitor)
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:32.767Z: Fusing consumer Write to 
Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter) into 
Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:33.326Z: Executing operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Generate records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction+Generate records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitWithSizing
    Mar 12, 2021 12:58:33 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:58:33.416Z: Starting 5 workers in us-central1-f...
    Mar 12, 2021 12:59:09 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:59:09.377Z: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running stage(s).
    Mar 12, 2021 12:59:10 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:59:09.417Z: Resized worker pool to 1, though goal was 5. This could be a quota issue.
    Mar 12, 2021 12:59:20 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:59:19.850Z: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
    Mar 12, 2021 12:59:37 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:59:36.985Z: Workers have started successfully.
    Mar 12, 2021 12:59:37 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T00:59:37.021Z: Workers have started successfully.
    Mar 12, 2021 1:00:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:00:26.494Z: Finished operation Generate records/Impulse+Generate records/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Generate records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction+Generate records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitWithSizing
    Mar 12, 2021 1:00:26 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:00:26.689Z: Executing operation Generate records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/ProcessElementAndRestrictionWithSizing+Measure write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
    Mar 12, 2021 1:00:47 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:00:45.329Z: Finished operation Generate records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/ProcessElementAndRestrictionWithSizing+Measure write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
    Mar 12, 2021 1:00:47 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:00:45.524Z: Cleaning up.
    Mar 12, 2021 1:00:47 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:00:45.614Z: Stopping worker pool...
    Mar 12, 2021 1:01:41 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:01:41.479Z: Autoscaling: Resized worker pool from 5 to 0.
    Mar 12, 2021 1:01:43 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:01:41.526Z: Worker pool stopped.
    Mar 12, 2021 1:01:49 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2021-03-11_16_58_09-814643611957500039 finished with status DONE.
    Mar 12, 2021 1:01:49 AM 
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory 
tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Mar 12, 2021 1:01:49 AM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
    Mar 12, 2021 1:01:49 AM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Mar 12, 2021 1:01:50 AM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 225 files. Enable logging at DEBUG level to see 
which files will be staged.
    Mar 12, 2021 1:01:50 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
    Mar 12, 2021 1:01:51 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Mar 12, 2021 1:01:51 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <109819 bytes, hash 
490e45b07e7e35cd79fe89cf9f0f114eb0e5cf5b8568a1c9d9d0174e8b63d432> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-SQ5FsH5-Nc15_onPnw8RTrDlz1uFaKHJ2dAXTotj1DI.pb
    Mar 12, 2021 1:01:53 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 225 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Mar 12, 2021 1:01:54 AM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 225 files cached, 0 files newly uploaded in 0 
seconds
    Mar 12, 2021 1:01:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from unbounded Kafka/Impulse as step s1
    Mar 12, 2021 1:01:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from unbounded Kafka/ParDo(GenerateKafkaSourceDescriptor) 
as step s2
    Mar 12, 2021 1:01:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from unbounded 
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka) as step s3
    Mar 12, 2021 1:01:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from unbounded 
Kafka/KafkaIO.ReadSourceDescriptors/MapElements/Map as step s4
    Mar 12, 2021 1:01:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure read time as step s5
    Mar 12, 2021 1:01:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records to strings/Map as step s6
    Mar 12, 2021 1:01:54 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Counting element as step s7
    Mar 12, 2021 1:01:54 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.29.0-SNAPSHOT
    Mar 12, 2021 1:01:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-03-11_17_01_54-8775954733455006680?project=apache-beam-testing
    Mar 12, 2021 1:01:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2021-03-11_17_01_54-8775954733455006680
    Mar 12, 2021 1:01:55 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2021-03-11_17_01_54-8775954733455006680
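
Correspondingly, a minimal sketch of the streaming read half behind steps s1-s7 above. KafkaIO.read() is expanded by the runner into the Impulse / GenerateKafkaSourceDescriptor / KafkaIO.ReadSourceDescriptors steps; the option parameters, the "read_time" metric name, and the string mapping are assumptions, but the counter namespace and name are taken from the metric query that fails at the end of this log.

    import java.nio.charset.StandardCharsets;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.metrics.Counter;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.testutils.metrics.TimeMonitor;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.apache.beam.sdk.transforms.MapElements;
    import org.apache.beam.sdk.transforms.ParDo;
    import org.apache.beam.sdk.transforms.SimpleFunction;
    import org.apache.beam.sdk.values.KV;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    static void buildReadPipeline(Pipeline pipeline, String bootstrapServers, String topic) {
      pipeline
          // s1-s4: expanded into Impulse + ParDo(GenerateKafkaSourceDescriptor)
          // + KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka) + MapElements/Map.
          .apply("Read from unbounded Kafka", KafkaIO.<byte[], byte[]>read()
              .withBootstrapServers(bootstrapServers)   // assumed option
              .withTopic(topic)                         // assumed option
              .withKeyDeserializer(ByteArrayDeserializer.class)
              .withValueDeserializer(ByteArrayDeserializer.class)
              .withoutMetadata())
          // s5: metric name "read_time" is an assumption.
          .apply("Measure read time",
              ParDo.of(new TimeMonitor<>("org.apache.beam.sdk.io.kafka.KafkaIOIT", "read_time")))
          // s6: assumed mapping from KV<byte[], byte[]> to String.
          .apply("Map records to strings", MapElements.via(
              new SimpleFunction<KV<byte[], byte[]>, String>() {
                @Override
                public String apply(KV<byte[], byte[]> kv) {
                  return new String(kv.getValue(), StandardCharsets.UTF_8);
                }
              }))
          // s7: namespace and counter name match the metric queried at the end of the log.
          .apply("Counting element", ParDo.of(new DoFn<String, Void>() {
            private final Counter counter =
                Metrics.counter("org.apache.beam.sdk.io.kafka.KafkaIOIT", "kafka_read_element_count");
            @ProcessElement
            public void processElement(ProcessContext c) {
              counter.inc();
            }
          }));
    }
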
    Mar 12, 2021 1:02:02 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2021-03-12T01:01:59.968Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
kafkaioit0testkafkaioreadsandwritescorrectlyinstreaming-je-8ckk. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:09.571Z: Worker configuration: n1-standard-2 in 
us-central1-f.
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.050Z: Expanding SplittableParDo operations into 
optimizable parts.
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.104Z: Expanding CollectionToSingleton operations 
into optimizable parts.
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.289Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.344Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.384Z: Expanding GroupByKey operations into 
streaming Read/Write steps
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.420Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.513Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.558Z: Fusing consumer Read from unbounded 
Kafka/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor)
 into Read from unbounded Kafka/Impulse
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.588Z: Fusing consumer Read from unbounded 
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka)/ParMultiDo(ReadFromKafka)/PairWithRestriction
 into Read from unbounded 
Kafka/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor)
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.621Z: Fusing consumer Read from unbounded 
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka)/ParMultiDo(ReadFromKafka)/SplitWithSizing
 into Read from unbounded 
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka)/ParMultiDo(ReadFromKafka)/PairWithRestriction
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.667Z: Fusing consumer Read from unbounded 
Kafka/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous) into 
Read from unbounded 
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka)/ParMultiDo(ReadFromKafka)/ProcessElementAndRestrictionWithSizing
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.698Z: Fusing consumer Measure read 
time/ParMultiDo(TimeMonitor) into Read from unbounded 
Kafka/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous)
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.732Z: Fusing consumer Map records to 
strings/Map/ParMultiDo(Anonymous) into Measure read time/ParMultiDo(TimeMonitor)
    Mar 12, 2021 1:02:11 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:10.770Z: Fusing consumer Counting 
element/ParMultiDo(Counting) into Map records to 
strings/Map/ParMultiDo(Anonymous)
    Mar 12, 2021 1:02:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:12.230Z: Starting 5 workers in us-central1-f...
    Mar 12, 2021 1:02:13 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:12.426Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Mar 12, 2021 1:02:57 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:02:55.932Z: Autoscaling: Raised the number of workers to 5 so that the pipeline can catch up with its backlog and keep up with its input rate.
    Mar 12, 2021 1:03:35 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:03:34.272Z: Workers have started successfully.
    Mar 12, 2021 1:03:35 AM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2021-03-12T01:03:34.308Z: Workers have started successfully.
    Mar 12, 2021 1:16:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    WARNING: No terminal state was returned within the allotted timeout. State value: RUNNING
    Mar 12, 2021 1:16:56 AM org.apache.beam.sdk.testutils.metrics.MetricsReader 
getCounterMetric
    SEVERE: Failed to get metric kafka_read_element_count, from namespace 
org.apache.beam.sdk.io.kafka.KafkaIOIT
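
The warning and the error above come from the read-side check: waitUntilFinish(Duration) returns the current non-terminal state when a streaming job is still RUNNING at the timeout, and MetricsReader.getCounterMetric falls back to -1 when the named counter cannot be found in the job's metrics. A minimal sketch of that sequence, using the read pipeline sketched earlier (the timeout value is an assumption):

    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.testutils.metrics.MetricsReader;
    import org.joda.time.Duration;

    PipelineResult readResult = readPipeline.run();
    // Streaming job: the timeout can elapse while the job is still RUNNING,
    // which produces the "No terminal state..." warning above.
    PipelineResult.State state = readResult.waitUntilFinish(Duration.standardMinutes(15));

    // Returns -1 (and logs the SEVERE message above) when the counter is missing.
    MetricsReader reader =
        new MetricsReader(readResult, "org.apache.beam.sdk.io.kafka.KafkaIOIT");
    long elementCount = reader.getCounterMetric("kafka_read_element_count");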

Gradle Test Executor 3 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest FAILED

org.apache.beam.sdk.io.kafka.KafkaIOIT > 
testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
    java.lang.AssertionError: expected:<100000> but was:<-1>
        at org.junit.Assert.fail(Assert.java:89)
        at org.junit.Assert.failNotEquals(Assert.java:835)
        at org.junit.Assert.assertEquals(Assert.java:647)
        at org.junit.Assert.assertEquals(Assert.java:633)
        at 
org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:158)
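
In other words, KafkaIOIT.java:158 compares the expected record count against the -1 sentinel returned by the failed counter lookup. A minimal sketch of the failing check (the variable name is assumed; the expected value comes from the assertion message above):

    import static org.junit.Assert.assertEquals;

    // elementCount is the -1 returned by getCounterMetric above, so this
    // fails with "expected:<100000> but was:<-1>".
    assertEquals(100000, elementCount);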

1 test completed, 1 failed
Finished generating test XML results (0.007 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.007 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Daemon worker,5,main]) completed. 
Took 18 mins 58.946 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Daemon 
worker,5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
Caching disabled for task 
':runners:google-cloud-dataflow-java:cleanUpDockerImages' because:
  Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' is not 
up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java>
 Command: docker rmi --force 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312005235
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312005235
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:35acd32b6668deaf66859607f2a8bc1b1e5a39611c231fc20e5ecfcaaaba8a71
Starting process 'command 'gcloud''. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java>
 Command: gcloud --quiet container images delete --force-delete-tags 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312005235
Successfully started process 'command 'gcloud''
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:35acd32b6668deaf66859607f2a8bc1b1e5a39611c231fc20e5ecfcaaaba8a71
  Associated tags:
 - 20210312005235
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312005235
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312005235].
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:35acd32b6668deaf66859607f2a8bc1b1e5a39611c231fc20e5ecfcaaaba8a71].
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Daemon 
worker,5,main]) completed. Took 2.779 secs.
:sdks:java:io:kafka:cleanUp (Thread[Daemon worker,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Daemon worker,5,main]) completed. Took 0.0 secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: 
> <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to 
get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 24m 25s
134 actionable tasks: 26 executed, 108 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches 
limit is too low.
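
This inotify warning is incidental to the test failure. The usual remedy on Linux build hosts, per the Gradle file-system-watching documentation, is to raise the limit, e.g. (the value shown is only an example):
> sudo sysctl fs.inotify.max_user_watches=524288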

Publishing build scan...
https://gradle.com/s/tyv7zqy6ypzhi

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
