See
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/2044/display/redirect>
Changes:
------------------------------------------
[...truncated 1.06 MB...]
Mar 12, 2021 12:57:32 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files
from the classpath: will stage 225 files. Enable logging at DEBUG level to see
which files will be staged.
Mar 12, 2021 12:57:32 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Mar 12, 2021 12:57:35 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://dataflow-staging-us-central1-844138762903/temp/staging/
Mar 12, 2021 12:57:35 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <103163 bytes, hash
09fabbac7e2528ac464d97d83e711cf83539d2e58d757e5a31014869f77921c7> to
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-Cfq7rH4lKKxGTZfYPnEc-DU50uWNdX5aMQFIafd5Icc.pb
Mar 12, 2021 12:57:36 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 225 files from PipelineOptions.filesToStage to staging
location to prepare for execution.
Mar 12, 2021 12:57:37 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 225 files cached, 0 files newly uploaded in 0
seconds
Mar 12, 2021 12:57:37 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Generate records/Impulse as step s1
Mar 12, 2021 12:57:37 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Generate records/ParDo(OutputSingleSource) as step s2
Mar 12, 2021 12:57:37 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Generate records/ParDo(BoundedSourceAsSDFWrapper) as step s3
Mar 12, 2021 12:57:37 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Measure write time as step s4
Mar 12, 2021 12:57:37 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to Kafka/Kafka ProducerRecord/Map as step s5
Mar 12, 2021 12:57:37 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter) as step
s6
Mar 12, 2021 12:57:37 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.30.0-SNAPSHOT
Mar 12, 2021 12:57:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-03-12_04_57_37-8470489006326966015?project=apache-beam-testing
Mar 12, 2021 12:57:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-03-12_04_57_37-8470489006326966015
Mar 12, 2021 12:57:39 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2021-03-12_04_57_37-8470489006326966015
Mar 12, 2021 12:57:44 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-03-12T12:57:42.823Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
kafkaioit0testkafkaioreadsandwritescorrectlyinstreaming-je-r7br. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Mar 12, 2021 12:57:51 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:50.762Z: Worker configuration: n1-standard-1 in
us-central1-f.
Mar 12, 2021 12:57:51 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.235Z: Expanding SplittableParDo operations into
optimizable parts.
Mar 12, 2021 12:57:51 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.271Z: Expanding CollectionToSingleton operations
into optimizable parts.
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.545Z: Expanding CoGroupByKey operations into
optimizable parts.
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.578Z: Expanding GroupByKey operations into
optimizable parts.
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.659Z: Fusing adjacent ParDo, Read, Write, and
Flatten operations
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.691Z: Fusing consumer Generate
records/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource) into Generate
records/Impulse
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.726Z: Fusing consumer Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction
into Generate records/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.768Z: Fusing consumer Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitWithSizing
into Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.806Z: Fusing consumer Measure write
time/ParMultiDo(TimeMonitor) into Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/ProcessElementAndRestrictionWithSizing
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.833Z: Fusing consumer Write to Kafka/Kafka
ProducerRecord/Map/ParMultiDo(Anonymous) into Measure write
time/ParMultiDo(TimeMonitor)
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:51.858Z: Fusing consumer Write to
Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter) into
Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:52.489Z: Executing operation Generate
records/Impulse+Generate
records/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction+Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitWithSizing
Mar 12, 2021 12:57:53 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:57:52.590Z: Starting 5 workers in us-central1-f...
Mar 12, 2021 12:58:17 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:58:17.181Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Mar 12, 2021 12:58:43 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:58:42.292Z: Autoscaling: Raised the number of workers to
4 based on the rate of progress in the currently running stage(s).
Mar 12, 2021 12:58:43 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:58:42.330Z: Resized worker pool to 4, though goal was 5.
This could be a quota issue.
Mar 12, 2021 12:59:03 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:59:02.688Z: Workers have started successfully.
Mar 12, 2021 12:59:03 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:59:02.730Z: Workers have started successfully.
Mar 12, 2021 12:59:51 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:59:51.057Z: Finished operation Generate
records/Impulse+Generate
records/ParDo(OutputSingleSource)/ParMultiDo(OutputSingleSource)+Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/PairWithRestriction+Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/SplitWithSizing
Mar 12, 2021 12:59:51 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:59:51.185Z: Executing operation Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/ProcessElementAndRestrictionWithSizing+Measure
write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka
ProducerRecord/Map/ParMultiDo(Anonymous)+Write to
Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
Mar 12, 2021 12:59:54 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T12:59:54.453Z: Autoscaling: Raised the number of workers to
5 based on the rate of progress in the currently running stage(s).
Mar 12, 2021 1:00:14 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:00:11.569Z: Finished operation Generate
records/ParDo(BoundedSourceAsSDFWrapper)/ParMultiDo(BoundedSourceAsSDFWrapper)/ProcessElementAndRestrictionWithSizing+Measure
write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka
ProducerRecord/Map/ParMultiDo(Anonymous)+Write to
Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
Mar 12, 2021 1:00:14 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:00:11.710Z: Cleaning up.
Mar 12, 2021 1:00:14 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:00:11.788Z: Stopping worker pool...
Mar 12, 2021 1:00:55 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:00:54.229Z: Autoscaling: Resized worker pool from 5 to 0.
Mar 12, 2021 1:00:55 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:00:54.274Z: Worker pool stopped.
Mar 12, 2021 1:01:01 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2021-03-12_04_57_37-8470489006326966015 finished with status DONE.
Mar 12, 2021 1:01:01 PM
org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory
tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket:
dataflow-staging-us-central1-844138762903
Mar 12, 2021 1:01:01 PM
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
handleResponse
WARNING: Request failed with code 409, performed 0 retries due to
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP
framework says request can be retried, (caller responsible for retrying):
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Mar 12, 2021 1:01:01 PM
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Mar 12, 2021 1:01:01 PM org.apache.beam.runners.dataflow.DataflowRunner
fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files
from the classpath: will stage 225 files. Enable logging at DEBUG level to see
which files will be staged.
Mar 12, 2021 1:01:01 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing
implications related to Google Compute Engine usage and other Google Cloud
Services.
Mar 12, 2021 1:01:03 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to
gs://dataflow-staging-us-central1-844138762903/temp/staging/
Mar 12, 2021 1:01:03 PM org.apache.beam.runners.dataflow.util.PackageUtil
tryStagePackage
INFO: Uploading <109812 bytes, hash
9eae7cc4ce7797b7e95df8857fca44df847cc8a1dd8710e63dfb2ef45586c6fc> to
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-nq58xM53l7fpXfiFf8pE34R8yKHdhxDmPfsu9FWGxvw.pb
Mar 12, 2021 1:01:05 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Uploading 225 files from PipelineOptions.filesToStage to staging
location to prepare for execution.
Mar 12, 2021 1:01:07 PM org.apache.beam.runners.dataflow.util.PackageUtil
stageClasspathElements
INFO: Staging files complete: 225 files cached, 0 files newly uploaded in 2
seconds
Mar 12, 2021 1:01:07 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded Kafka/Impulse as step s1
Mar 12, 2021 1:01:07 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded Kafka/ParDo(GenerateKafkaSourceDescriptor)
as step s2
Mar 12, 2021 1:01:07 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka) as step s3
Mar 12, 2021 1:01:07 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded
Kafka/KafkaIO.ReadSourceDescriptors/MapElements/Map as step s4
Mar 12, 2021 1:01:07 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Measure read time as step s5
Mar 12, 2021 1:01:07 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Map records to strings/Map as step s6
Mar 12, 2021 1:01:07 PM
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Counting element as step s7
Mar 12, 2021 1:01:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Dataflow SDK version: 2.30.0-SNAPSHOT
Mar 12, 2021 1:01:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-03-12_05_01_07-14411601291430249040?project=apache-beam-testing
Mar 12, 2021 1:01:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Submitted job: 2021-03-12_05_01_07-14411601291430249040
Mar 12, 2021 1:01:09 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel
--region=us-central1 2021-03-12_05_01_07-14411601291430249040
Mar 12, 2021 1:01:14 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
WARNING: 2021-03-12T13:01:13.907Z: The workflow name is not a valid Cloud
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring
will be labeled with this modified job name:
kafkaioit0testkafkaioreadsandwritescorrectlyinstreaming-je-ui93. For the best
monitoring experience, please name your job with a valid Cloud Label. For
details, see:
https://cloud.google.com/compute/docs/labeling-resources#restrictions
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:24.877Z: Worker configuration: n1-standard-2 in
us-central1-f.
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.365Z: Expanding SplittableParDo operations into
optimizable parts.
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.412Z: Expanding CollectionToSingleton operations
into optimizable parts.
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.479Z: Expanding CoGroupByKey operations into
optimizable parts.
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.537Z: Expanding SplittableProcessKeyed operations
into optimizable parts.
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.574Z: Expanding GroupByKey operations into
streaming Read/Write steps
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.608Z: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.684Z: Fusing adjacent ParDo, Read, Write, and
Flatten operations
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.727Z: Fusing consumer Read from unbounded
Kafka/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor)
into Read from unbounded Kafka/Impulse
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.770Z: Fusing consumer Read from unbounded
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka)/ParMultiDo(ReadFromKafka)/PairWithRestriction
into Read from unbounded
Kafka/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor)
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.817Z: Fusing consumer Read from unbounded
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka)/ParMultiDo(ReadFromKafka)/SplitWithSizing
into Read from unbounded
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka)/ParMultiDo(ReadFromKafka)/PairWithRestriction
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.855Z: Fusing consumer Read from unbounded
Kafka/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous) into
Read from unbounded
Kafka/KafkaIO.ReadSourceDescriptors/ParDo(ReadFromKafka)/ParMultiDo(ReadFromKafka)/ProcessElementAndRestrictionWithSizing
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.889Z: Fusing consumer Measure read
time/ParMultiDo(TimeMonitor) into Read from unbounded
Kafka/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous)
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.929Z: Fusing consumer Map records to
strings/Map/ParMultiDo(Anonymous) into Measure read time/ParMultiDo(TimeMonitor)
Mar 12, 2021 1:01:26 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:25.974Z: Fusing consumer Counting
element/ParMultiDo(Counting) into Map records to
strings/Map/ParMultiDo(Anonymous)
Mar 12, 2021 1:01:29 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:27.575Z: Starting 5 workers in us-central1-f...
Mar 12, 2021 1:01:38 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:01:37.197Z: Your project already contains 100
Dataflow-created metric descriptors, so new user metrics of the form
custom.googleapis.com/* will not be created. However, all user metrics are also
available in the metric dataflow.googleapis.com/job/user_counter. If you rely
on the custom metrics, you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
Mar 12, 2021 1:02:09 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:02:09.064Z: Autoscaling: Raised the number of workers to
5 so that the pipeline can catch up with its backlog and keep up with its input
rate.
Mar 12, 2021 1:02:40 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:02:39.245Z: Workers have started successfully.
Mar 12, 2021 1:02:40 PM
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2021-03-12T13:02:39.283Z: Workers have started successfully.
Mar 12, 2021 1:16:09 PM
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
WARNING: No terminal state was returned within allotted timeout. State
value RUNNING
Mar 12, 2021 1:16:09 PM org.apache.beam.sdk.testutils.metrics.MetricsReader
getCounterMetric
SEVERE: Failed to get metric kafka_read_element_count, from namespace
org.apache.beam.sdk.io.kafka.KafkaIOIT
Gradle Test Executor 2 finished executing tests.
> Task :sdks:java:io:kafka:integrationTest FAILED
org.apache.beam.sdk.io.kafka.KafkaIOIT >
testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
java.lang.AssertionError: expected:<100000> but was:<-1>
at org.junit.Assert.fail(Assert.java:89)
at org.junit.Assert.failNotEquals(Assert.java:835)
at org.junit.Assert.assertEquals(Assert.java:647)
at org.junit.Assert.assertEquals(Assert.java:633)
at
org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:158)
1 test completed, 1 failed
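The assertion failure follows from the SEVERE message above: the streaming read job was still RUNNING when waitUntilFinish timed out, the kafka_read_element_count counter could not be resolved, and MetricsReader reports a missing counter as -1, which the test then compared against the expected 100000 records. A minimal sketch of that read-then-assert pattern (class, method, and variable names and the timeout are illustrative, not taken from KafkaIOIT):

    import static org.junit.Assert.assertEquals;

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.apache.beam.sdk.testutils.metrics.MetricsReader;
    import org.joda.time.Duration;

    /** Illustrative only: checks the Kafka read counter after a streaming run. */
    class KafkaReadCountCheck {
      static void assertReadCount(Pipeline readPipeline, long expectedRecords) {
        PipelineResult result = readPipeline.run();
        // In streaming mode waitUntilFinish can hit the timeout and return a
        // non-terminal state (RUNNING), as in the WARNING logged above.
        result.waitUntilFinish(Duration.standardMinutes(15));

        MetricsReader reader =
            new MetricsReader(result, "org.apache.beam.sdk.io.kafka.KafkaIOIT");
        // getCounterMetric returns -1 when the counter cannot be found, which is
        // what produced "expected:<100000> but was:<-1>" in the failure above.
        long readElements = reader.getCounterMetric("kafka_read_element_count");
        assertEquals(expectedRecords, readElements);
      }
    }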
Finished generating test XML results (0.006 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.007 secs) into:
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
:sdks:java:io:kafka:integrationTest (Thread[Execution worker for ':' Thread
3,5,main]) completed. Took 18 mins 43.122 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker
for ':' Thread 3,5,main]) started.
> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Custom actions are attached to task
':runners:google-cloud-dataflow-java:cleanUpDockerImages'.
Caching disabled for task
':runners:google-cloud-dataflow-java:cleanUpDockerImages' because:
Caching has not been enabled for the task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerImages' is not
up-to-date because:
Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory:
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java>
Command: docker rmi --force
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312125220
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312125220
Untagged:
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1c4c2461c2b2f5f600fa4c038236ef9cce12b99db7b639222a8317c0ee04c962
Starting process 'command 'gcloud''. Working directory:
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java>
Command: gcloud --quiet container images delete --force-delete-tags
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312125220
Successfully started process 'command 'gcloud''
Digests:
-
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1c4c2461c2b2f5f600fa4c038236ef9cce12b99db7b639222a8317c0ee04c962
Associated tags:
- 20210312125220
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312125220
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210312125220].
Deleted
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:1c4c2461c2b2f5f600fa4c038236ef9cce12b99db7b639222a8317c0ee04c962].
:runners:google-cloud-dataflow-java:cleanUpDockerImages (Thread[Execution worker
for ':' Thread 3,5,main]) completed. Took 2.847 secs.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker for ':' Thread 3,5,main])
started.
> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Execution **** for ':' Thread 3,5,main])
completed. Took 0.0 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at:
> <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 23m 55s
134 actionable tasks: 26 executed, 108 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches
limit is too low.
Publishing build scan...
https://gradle.com/s/oxxrpuf7evoua
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]