See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/3111/display/redirect?page=changes>

Changes:

[noreply] [Java SDK core] emit watermark from PeriodicSequence (#23301) (#23302)
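The change above concerns the Java SDK's PeriodicSequence transform. For context, a hedged sketch of how PeriodicSequence is commonly reached through the PeriodicImpulse entry point; the interval and time bounds below are illustrative and not taken from this build:

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.transforms.PeriodicImpulse;
    import org.apache.beam.sdk.values.PCollection;
    import org.joda.time.Duration;
    import org.joda.time.Instant;

    public class PeriodicImpulseSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // PeriodicImpulse expands to PeriodicSequence; per #23301 the sequence now
        // also emits a watermark as elements are produced.
        PCollection<Instant> ticks =
            p.apply(
                "PeriodicTicks",
                PeriodicImpulse.create()
                    .startAt(Instant.now())
                    .stopAt(Instant.now().plus(Duration.standardMinutes(10)))
                    .withInterval(Duration.standardSeconds(5)));
        p.run().waitUntilFinish();
      }
    }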


------------------------------------------
[...truncated 919.90 KB...]
    INFO: No tempLocation specified, attempting to use default bucket: 
dataflow-staging-us-central1-844138762903
    Sep 21, 2022 6:52:43 PM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to 
IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP 
framework says request can be retried, (caller responsible for retrying): 
https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
 
    Sep 21, 2022 6:52:43 PM 
org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory
 create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Sep 21, 2022 6:52:44 PM org.apache.beam.runners.dataflow.DataflowRunner 
fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files 
from the classpath: will stage 279 files. Enable logging at DEBUG level to see 
which files will be staged.
    Sep 21, 2022 6:52:44 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing 
implications related to Google Compute Engine usage and other Google Cloud 
Services.
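The INFO lines above about the default temp bucket, the stagingLocation fallback, and filesToStage defaulting all follow from Dataflow pipeline options that were left unset. As a hedged sketch only, with placeholder project and bucket names rather than this test's real values, those options are normally set like this:

    import org.apache.beam.runners.dataflow.DataflowRunner;
    import org.apache.beam.runners.dataflow.options.DataflowPipelineOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class DataflowOptionsSketch {
      public static void main(String[] args) {
        DataflowPipelineOptions options =
            PipelineOptionsFactory.fromArgs(args).as(DataflowPipelineOptions.class);
        options.setRunner(DataflowRunner.class);
        options.setProject("my-gcp-project");             // placeholder
        options.setRegion("us-central1");
        // Setting tempLocation explicitly avoids the default-bucket fallback logged above.
        options.setTempLocation("gs://my-bucket/temp");
        // If stagingLocation is unset, the runner falls back to gcpTempLocation as logged.
        options.setStagingLocation("gs://my-bucket/staging");
      }
    }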
    Sep 21, 2022 6:52:52 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Uploading 279 files from PipelineOptions.filesToStage to staging 
location to prepare for execution.
    Sep 21, 2022 6:52:53 PM org.apache.beam.runners.dataflow.util.PackageUtil 
stageClasspathElements
    INFO: Staging files complete: 279 files cached, 0 files newly uploaded in 1 
seconds
    Sep 21, 2022 6:52:53 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Sep 21, 2022 6:52:53 PM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <127517 bytes, hash 
864137b6e884eac918c968cd543d5d44bd2941608e7d2b1477f5736263c6ed09> to 
gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-hkE3tuiE6skYyWjNVD1dRL0pQWCOfSsUd_VzYmPG7Qk.pb
    Sep 21, 2022 6:53:01 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from unbounded 
Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds
 as step s1
    Sep 21, 2022 6:53:01 PM org.apache.kafka.common.config.AbstractConfig logAll
    INFO: ConsumerConfig values: 
        allow.auto.create.topics = true
        auto.commit.interval.ms = 5000
        auto.offset.reset = earliest
        bootstrap.servers = [104.197.30.63:32401, 35.238.12.200:32402, 
34.67.121.146:32403]
        check.crcs = true
        client.dns.lookup = default
        client.id = 
        client.rack = 
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        isolation.level = read_uncommitted
        key.deserializer = class 
org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = []
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class 
org.apache.kafka.clients.consumer.RangeAssignor]
        receive.buffer.bytes = 524288
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.mechanism = GSSAPI
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
        ssl.endpoint.identification.algorithm = https
        ssl.key.password = null
        ssl.keymanager.algorithm = SunX509
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLS
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class 
org.apache.kafka.common.serialization.ByteArrayDeserializer
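The ConsumerConfig dump above is the Kafka consumer that KafkaIO builds for the SDF-based read. A hedged sketch of a KafkaIO.read() configured along the same lines; the topic, broker addresses, and class name below are placeholders, not this test's actual values:

    import java.util.HashMap;
    import java.util.Map;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.io.kafka.KafkaRecord;
    import org.apache.beam.sdk.values.PCollection;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    public class KafkaReadSketch {
      public static PCollection<KafkaRecord<byte[], byte[]>> addKafkaRead(Pipeline p) {
        // Mirrors auto.offset.reset=earliest from the logged ConsumerConfig.
        Map<String, Object> consumerConfig = new HashMap<>();
        consumerConfig.put("auto.offset.reset", "earliest");
        return p.apply(
            "Read from unbounded Kafka",
            KafkaIO.<byte[], byte[]>read()
                .withBootstrapServers("broker-1:32401,broker-2:32402,broker-3:32403") // placeholders
                .withTopic("test-topic")                                              // placeholder
                .withKeyDeserializer(ByteArrayDeserializer.class)
                .withValueDeserializer(ByteArrayDeserializer.class)
                .withConsumerConfigUpdates(consumerConfig));
      }
    }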

    Sep 21, 2022 6:53:02 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo 
<init>
    INFO: Kafka version: 2.4.1
    Sep 21, 2022 6:53:02 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo 
<init>
    INFO: Kafka commitId: c57222ae8cd7866b
    Sep 21, 2022 6:53:02 PM org.apache.kafka.common.utils.AppInfoParser$AppInfo 
<init>
    INFO: Kafka startTimeMs: 1663786382313
    Sep 21, 2022 6:53:04 PM org.apache.kafka.clients.Metadata update
    INFO: [Consumer clientId=consumer-1, groupId=null] Cluster ID: 
lFHwL8pIRZmQcPaVsUBTbw
    Sep 21, 2022 6:53:04 PM org.apache.beam.sdk.io.kafka.KafkaUnboundedSource 
split
    INFO: Partitions assigned to split 0 (total 1): beam-sdf-0
    Sep 21, 2022 6:53:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from unbounded 
Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/StripIds as 
step s2
    Sep 21, 2022 6:53:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Measure read time as step s3
    Sep 21, 2022 6:53:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Map records to strings/Map as step s4
    Sep 21, 2022 6:53:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Counting element as step s5
    Sep 21, 2022 6:53:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Dataflow SDK version: 2.43.0-SNAPSHOT
    Sep 21, 2022 6:53:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2022-09-21_11_53_04-6761674015015921799?project=apache-beam-testing
    Sep 21, 2022 6:53:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Submitted job: 2022-09-21_11_53_04-6761674015015921799
    Sep 21, 2022 6:53:04 PM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: To cancel the job using the 'gcloud' tool, run:
    > gcloud dataflow jobs --project=apache-beam-testing cancel 
--region=us-central1 2022-09-21_11_53_04-6761674015015921799
    Sep 21, 2022 6:53:16 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    WARNING: 2022-09-21T18:53:10.516Z: The workflow name is not a valid Cloud 
Label. Labels applied to Cloud resources (such as GCE Instances) for monitoring 
will be labeled with this modified job name: 
kafkaioit0testkafkaioreadsandwritescorrectlyinstreaming-je-75v3. For the best 
monitoring experience, please name your job with a valid Cloud Label. For 
details, see: 
https://cloud.google.com/compute/docs/labeling-resources#restrictions
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:29.254Z: Worker configuration: e2-standard-2 in 
us-central1-a.
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:30.612Z: Expanding SplittableParDo operations into 
optimizable parts.
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:30.642Z: Expanding CollectionToSingleton operations 
into optimizable parts.
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:30.718Z: Expanding CoGroupByKey operations into 
optimizable parts.
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:30.764Z: Expanding SplittableProcessKeyed operations 
into optimizable parts.
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:30.799Z: Expanding GroupByKey operations into 
streaming Read/Write steps
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:30.841Z: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:30.917Z: Fusing adjacent ParDo, Read, Write, and 
Flatten operations
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:30.951Z: Fusing consumer Read from unbounded 
Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor)
 into Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Impulse
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:31.003Z: Fusing consumer 
Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/PairWithRestriction
 into Read from unbounded 
Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/ParDo(GenerateKafkaSourceDescriptor)/ParMultiDo(GenerateKafkaSourceDescriptor)
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:31.082Z: Fusing consumer 
Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/SplitWithSizing
 into 
Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/PairWithRestriction
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:31.118Z: Fusing consumer Read from unbounded 
Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous)
 into 
Read-from-unbounded-Kafka-KafkaIO-Read-ReadFromKafkaViaSDF-KafkaIO-ReadSourceDescriptors-ParDo-Unbou/ProcessElementAndRestrictionWithSizing
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:31.155Z: Fusing consumer Measure read 
time/ParMultiDo(TimeMonitor) into Read from unbounded 
Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/KafkaIO.ReadSourceDescriptors/MapElements/Map/ParMultiDo(Anonymous)
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:31.187Z: Fusing consumer Map records to 
strings/Map/ParMultiDo(Anonymous) into Measure read time/ParMultiDo(TimeMonitor)
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:31.221Z: Fusing consumer Counting 
element/ParMultiDo(Counting) into Map records to 
strings/Map/ParMultiDo(Anonymous)
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:31.338Z: Running job using Streaming Engine
    Sep 21, 2022 6:53:32 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:31.586Z: Starting 5 workers in us-central1-a...
    Sep 21, 2022 6:53:53 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:53:52.092Z: Your project already contains 100 
Dataflow-created metric descriptors, so new user metrics of the form 
custom.googleapis.com/* will not be created. However, all user metrics are also 
available in the metric dataflow.googleapis.com/job/user_counter. If you rely 
on the custom metrics, you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
    Sep 21, 2022 6:54:14 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:54:14.862Z: Autoscaling: Raised the number of workers to 
5 so that the pipeline can catch up with its backlog and keep up with its input 
rate.
    Sep 21, 2022 6:55:18 PM 
org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2022-09-21T18:55:17.397Z: Workers have started successfully.
    Sep 21, 2022 7:08:04 PM 
org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
    WARNING: No terminal state was returned within allotted timeout. State 
value RUNNING
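The WARNING above means the streaming job was still RUNNING when the bounded waitUntilFinish timeout expired. A hedged sketch of the general wait-then-cancel pattern behind such a call; the timeout and the cancel step are illustrative, not necessarily what KafkaIOIT does:

    import java.io.IOException;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.PipelineResult;
    import org.joda.time.Duration;

    public class WaitAndCancelSketch {
      static void runWithTimeout(Pipeline pipeline) throws IOException {
        PipelineResult result = pipeline.run();
        // Returns the current (possibly non-terminal) state, or null, if the job
        // has not reached a terminal state within the timeout.
        PipelineResult.State state = result.waitUntilFinish(Duration.standardMinutes(15));
        if (state == null || !state.isTerminal()) {
          result.cancel(); // stop the streaming job so it does not keep running
        }
      }
    }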

org.apache.beam.sdk.io.kafka.KafkaIOIT > 
testKafkaIOReadsAndWritesCorrectlyInStreaming STANDARD_OUT
    Load test results for test (ID): 91020791-4a9e-4e37-b28e-6d7b8166bc75 and 
timestamp: 2022-09-21T18:47:19.907000000Z:
                     Metric:                    Value:
                   read_time                     2.752
                  write_time                     4.092
                    run_time         6.843999999999999
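The read_time, write_time and run_time figures above are produced by monitor steps such as "Measure read time/ParMultiDo(TimeMonitor)". A hedged sketch of how a step like that can record timing through Beam metrics; this is illustrative only, not the test's actual TimeMonitor code:

    import org.apache.beam.sdk.metrics.Distribution;
    import org.apache.beam.sdk.metrics.Metrics;
    import org.apache.beam.sdk.transforms.DoFn;
    import org.joda.time.Instant;

    public class TimeMonitorSketch<T> extends DoFn<T, T> {
      // Namespace and metric name are placeholders.
      private final Distribution timeDistribution =
          Metrics.distribution("kafkaioit", "read_time_ms");

      @ProcessElement
      public void processElement(ProcessContext c) {
        // Record a processing-time instant per element; the distribution's min/max
        // bound the observed read window, from which a duration can be derived.
        timeDistribution.update(Instant.now().getMillis());
        c.output(c.element());
      }
    }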

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest
Finished generating test XML results (0.278 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.432 secs) into: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>

> Task :sdks:java:io:kafka:integrationTest FAILED
Resolve mutations for 
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution 
worker,5,main]) started.
Resolve mutations for 
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution 
worker,5,main]) completed. Took 0.0 secs.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution 
worker,5,main]) started.
:sdks:java:io:kafka:integrationTest (Thread[Execution worker Thread 3,5,main]) 
completed. Took 21 mins 58.021 secs.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task 
':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task 
':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
  Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not 
up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java>
 Command: docker rmi --force 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220921184419
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220921184419
Untagged: 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:05a83918402916c84ed9d8d8a9f63fd1ed1f2499fcb5cd1063439b9b7bdc993f
Starting process 'command 'gcloud''. Working directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java>
 Command: gcloud --quiet container images untag 
us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220921184419
Successfully started process 'command 'gcloud''
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220921184419]
- referencing digest: 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:05a83918402916c84ed9d8d8a9f63fd1ed1f2499fcb5cd1063439b9b7bdc993f]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20220921184419] 
(referencing 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:05a83918402916c84ed9d8d8a9f63fd1ed1f2499fcb5cd1063439b9b7bdc993f])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working 
directory: 
<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java>
 Command: ./scripts/cleanup_untagged_gcr_images.sh 
us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command 
'./scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:05a83918402916c84ed9d8d8a9f63fd1ed1f2499fcb5cd1063439b9b7bdc993f
Digests:
- 
us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:05a83918402916c84ed9d8d8a9f63fd1ed1f2499fcb5cd1063439b9b7bdc993f
Deleted 
[us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:05a83918402916c84ed9d8d8a9f63fd1ed1f2499fcb5cd1063439b9b7bdc993f].
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution 
worker,5,main]) completed. Took 38.827 secs.
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution 
worker,5,main]) started.
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution 
worker,5,main]) completed. Took 0.0 secs.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
:sdks:java:io:kafka:cleanUp (Thread[Execution worker,5,main]) completed. Took 0.0 
secs.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> Failed to store cache entry 7b24672cc29d5258ac5de5748ee734b3 for task 
> ':sdks:java:io:kafka:integrationTest': Timeout waiting to lock Build cache 
> (/home/jenkins/.gradle/caches/build-cache-1). It is currently in use by 
> another Gradle instance.
  Owner PID: 1808577
  Our PID: 1803007
  Owner Operation: 
  Our operation: 
  Lock file: /home/jenkins/.gradle/caches/build-cache-1/build-cache-1.lock

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --debug option to get more log output.

* Get more help at https://help.gradle.org
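The failure above is a Gradle build-cache lock timeout while storing the integrationTest task's cache entry, not an assertion failure inside the Kafka test itself. If it recurs, one possible (untested here) way to take the local build cache out of the picture for a retry is:

    ./gradlew :sdks:java:io:kafka:integrationTest --no-build-cache --stacktrace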

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.5.1/userguide/command_line_interface.html#sec:command_line_warnings

Execution optimizations have been disabled for 1 invalid unit(s) of work during 
this build to ensure correctness.
Please consult deprecation warnings for more details.

BUILD FAILED in 26m 28s
156 actionable tasks: 97 executed, 55 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/j377ntf6kzp6i

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
