See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/4103/display/redirect?page=changes>

Changes:

[noreply] Merge pull request #28843: Do not rewrite project version in


------------------------------------------
[...truncated 782.21 KB...]
        at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
        at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
        at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
        at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
        at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
        at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
        at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
        at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
        at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
        at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
        at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
        at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
        at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
        at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
        at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
        at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
        at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
        at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
    Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
        at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
        at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
    Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
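
The root cause is the Kafka producer timing out while waiting for topic beam-sdf to appear in cluster metadata, which usually means the topic was never created or the brokers were unreachable. As a hedged sketch only (the topic name and broker addresses are taken from this log; the check itself is not part of the test), the topic's existence can be verified up front with the Kafka AdminClient:

    import java.util.Properties;
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;

    public class TopicCheck {
      public static void main(String[] args) throws Exception {
        // Broker list copied from the ConsumerConfig dump later in this log.
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG,
            "34.122.127.222:32404,34.121.70.76:32405,34.135.66.105:32406");
        try (AdminClient admin = AdminClient.create(props)) {
          // listTopics().names() resolves to the set of topic names the
          // brokers report; a missing beam-sdf reproduces the failure above.
          boolean present = admin.listTopics().names().get().contains("beam-sdf");
          System.out.println("Topic beam-sdf present: " + present);
        }
      }
    }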

    Oct 06, 2023 12:42:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    SEVERE: 2023-10-06T00:42:25.860Z: org.apache.beam.sdk.util.UserCodeException: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
        at org.apache.beam.sdk.util.UserCodeException.wrap(UserCodeException.java:39)
        at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeFinishBundle(Unknown Source)
        at org.apache.beam.fn.harness.FnApiDoFnRunner.finishBundle(FnApiDoFnRunner.java:1772)
        at org.apache.beam.fn.harness.data.PTransformFunctionRegistry.lambda$register$0(PTransformFunctionRegistry.java:116)
        at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:560)
        at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
        at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
        at java.util.concurrent.FutureTask.run(FutureTask.java:266)
        at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
    Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
        at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
        at org.apache.beam.sdk.io.kafka.KafkaWriter.finishBundle(KafkaWriter.java:97)
    Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
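
The same 60000 ms figure is the producer's max.block.ms expiring while it waits for topic metadata. If the cluster is merely slow to report the topic rather than missing it, the blocking window can be widened through KafkaIO's producer config hook. A minimal sketch, assuming the log's topic and brokers (the 120000 ms value is illustrative, not a recommendation):

    import java.util.Collections;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.common.serialization.ByteArraySerializer;

    public class ProducerBlockSketch {
      public static KafkaIO.Write<byte[], byte[]> write() {
        // Sketch: allow the producer more than the default 60000 ms to
        // resolve topic metadata before send() fails.
        return KafkaIO.<byte[], byte[]>write()
            .withBootstrapServers(
                "34.122.127.222:32404,34.121.70.76:32405,34.135.66.105:32406")
            .withTopic("beam-sdf")
            .withKeySerializer(ByteArraySerializer.class)
            .withValueSerializer(ByteArraySerializer.class)
            .withProducerConfigUpdates(
                Collections.<String, Object>singletonMap(
                    ProducerConfig.MAX_BLOCK_MS_CONFIG, 120000L));
      }
    }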

    Oct 06, 2023 12:42:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-10-06T00:42:25.964Z: Finished operation Generate-records-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+Measure write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
    Oct 06, 2023 12:42:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-10-06T00:42:26.068Z: Cleaning up.
    Oct 06, 2023 12:42:26 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-10-06T00:42:26.128Z: Stopping worker pool...
    Oct 06, 2023 12:44:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-10-06T00:44:48.553Z: Autoscaling: Resized worker pool from 5 to 0.
    Oct 06, 2023 12:44:50 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
    INFO: 2023-10-06T00:44:48.597Z: Worker pool stopped.
    Oct 06, 2023 12:44:55 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
    INFO: Job 2023-10-05_17_36_10-8493101983374399159 finished with status DONE.
    Oct 06, 2023 12:44:55 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
    INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
    Oct 06, 2023 12:44:56 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
    WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
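
The 409 above is the runner colliding with the already-existing default staging bucket; it is harmless, but it only occurs because no tempLocation was passed. A small sketch of pinning it explicitly (setTempLocation is standard Beam PipelineOptions API; the bucket path is this job's default from the log):

    import org.apache.beam.sdk.extensions.gcp.options.GcpOptions;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;

    public class ExplicitTempLocation {
      public static void main(String[] args) {
        GcpOptions options = PipelineOptionsFactory.fromArgs(args).as(GcpOptions.class);
        // With an explicit tempLocation the runner skips the
        // tryCreateDefaultBucket path that produced the 409 response.
        options.setTempLocation("gs://dataflow-staging-us-central1-844138762903/temp");
        System.out.println("tempLocation = " + options.getTempLocation());
      }
    }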
    Oct 06, 2023 12:44:56 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
    INFO: No stagingLocation provided, falling back to gcpTempLocation
    Oct 06, 2023 12:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
    INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 287 files. Enable logging at DEBUG level to see which files will be staged.
    Oct 06, 2023 12:44:57 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
    Oct 06, 2023 12:45:09 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Uploading 287 files from PipelineOptions.filesToStage to staging location to prepare for execution.
    Oct 06, 2023 12:45:10 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
    INFO: Staging files complete: 287 files cached, 0 files newly uploaded in 1 seconds
    Oct 06, 2023 12:45:10 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
    Oct 06, 2023 12:45:11 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
    INFO: Uploading <193246 bytes, hash 47f8c3ab6da3424d7080df9e29b81fc22671449f06e6e50ec7629533a395870f> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-R_jDq22jQk1wgN-eKbgfwiZxRJ8G5uUOx2KVM6OVhw8.pb
    Oct 06, 2023 12:45:13 AM org.apache.beam.sdk.coders.SerializableCoder checkEqualsMethodDefined
    WARNING: Can't verify serialized elements of type BoundedSource have well defined equals method. This may produce incorrect results on some PipelineRunner implementations
    Oct 06, 2023 12:45:26 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
    Oct 06, 2023 12:45:26 AM org.apache.kafka.common.config.AbstractConfig logAll
    INFO: ConsumerConfig values:
        allow.auto.create.topics = true
        auto.commit.interval.ms = 5000
        auto.offset.reset = earliest
        bootstrap.servers = [34.122.127.222:32404, 34.121.70.76:32405, 34.135.66.105:32406]
        check.crcs = true
        client.dns.lookup = default
        client.id = 
        client.rack = 
        connections.max.idle.ms = 540000
        default.api.timeout.ms = 60000
        enable.auto.commit = false
        exclude.internal.topics = true
        fetch.max.bytes = 52428800
        fetch.max.wait.ms = 500
        fetch.min.bytes = 1
        group.id = null
        group.instance.id = null
        heartbeat.interval.ms = 3000
        interceptor.classes = []
        internal.leave.group.on.close = true
        isolation.level = read_uncommitted
        key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
        max.partition.fetch.bytes = 1048576
        max.poll.interval.ms = 300000
        max.poll.records = 500
        metadata.max.age.ms = 300000
        metric.reporters = []
        metrics.num.samples = 2
        metrics.recording.level = INFO
        metrics.sample.window.ms = 30000
        partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
        receive.buffer.bytes = 524288
        reconnect.backoff.max.ms = 1000
        reconnect.backoff.ms = 50
        request.timeout.ms = 30000
        retry.backoff.ms = 100
        sasl.client.callback.handler.class = null
        sasl.jaas.config = null
        sasl.kerberos.kinit.cmd = /usr/bin/kinit
        sasl.kerberos.min.time.before.relogin = 60000
        sasl.kerberos.service.name = null
        sasl.kerberos.ticket.renew.jitter = 0.05
        sasl.kerberos.ticket.renew.window.factor = 0.8
        sasl.login.callback.handler.class = null
        sasl.login.class = null
        sasl.login.refresh.buffer.seconds = 300
        sasl.login.refresh.min.period.seconds = 60
        sasl.login.refresh.window.factor = 0.8
        sasl.login.refresh.window.jitter = 0.05
        sasl.mechanism = GSSAPI
        security.protocol = PLAINTEXT
        security.providers = null
        send.buffer.bytes = 131072
        session.timeout.ms = 10000
        ssl.cipher.suites = null
        ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
        ssl.endpoint.identification.algorithm = https
        ssl.key.password = null
        ssl.keymanager.algorithm = SunX509
        ssl.keystore.location = null
        ssl.keystore.password = null
        ssl.keystore.type = JKS
        ssl.protocol = TLS
        ssl.provider = null
        ssl.secure.random.implementation = null
        ssl.trustmanager.algorithm = PKIX
        ssl.truststore.location = null
        ssl.truststore.password = null
        ssl.truststore.type = JKS
        value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
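
For orientation, the values above are what Beam's KafkaIO hands to the Kafka consumer when the read transform is expanded. A sketch of the corresponding pipeline-side configuration, assuming this log's topic and brokers (illustrative only, not the test's actual source):

    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    public class KafkaReadSketch {
      public static void main(String[] args) {
        Pipeline p = Pipeline.create();
        // A read configured like the consumer dumped above: byte-array
        // deserializers for both keys and values, same bootstrap servers.
        p.apply(KafkaIO.<byte[], byte[]>read()
            .withBootstrapServers(
                "34.122.127.222:32404,34.121.70.76:32405,34.135.66.105:32406")
            .withTopic("beam-sdf")
            .withKeyDeserializer(ByteArrayDeserializer.class)
            .withValueDeserializer(ByteArrayDeserializer.class));
        p.run().waitUntilFinish();
      }
    }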

    Oct 06, 2023 12:45:26 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka version: 2.4.1
    Oct 06, 2023 12:45:26 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka commitId: c57222ae8cd7866b
    Oct 06, 2023 12:45:26 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
    INFO: Kafka startTimeMs: 1696553126978

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:kafka:integrationTest

org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
    java.lang.RuntimeException: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
        at org.apache.beam.runners.dataflow.ReadTranslator.translateReadHelper(ReadTranslator.java:55)
        at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2154)
        at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2149)
        at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:498)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
        at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
        at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:466)
        at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:431)
        at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:188)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1226)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:199)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:321)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:216)

        Caused by:
        org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
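
This second timeout is client-side: during pipeline translation, the Kafka read queries the brokers for partition metadata and gives up after the consumer's default.api.timeout.ms (60000 ms in the config dump above). If the brokers are reachable but slow, that window can be raised via KafkaIO's consumer config hook; a hedged sketch (the 120000 ms value is an arbitrary example):

    import java.util.Collections;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    public class SlowMetadataWorkaround {
      public static KafkaIO.Read<byte[], byte[]> read() {
        // Sketch: give metadata fetches on a slow cluster more headroom so
        // translation does not abort with the TimeoutException above.
        return KafkaIO.<byte[], byte[]>read()
            .withBootstrapServers(
                "34.122.127.222:32404,34.121.70.76:32405,34.135.66.105:32406")
            .withTopic("beam-sdf")
            .withKeyDeserializer(ByteArrayDeserializer.class)
            .withValueDeserializer(ByteArrayDeserializer.class)
            .withConsumerConfigUpdates(
                Collections.<String, Object>singletonMap(
                    ConsumerConfig.DEFAULT_API_TIMEOUT_MS_CONFIG, 120000));
      }
    }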

1 test completed, 1 failed
Finished generating test XML results (0.317 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.301 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>

> Task :sdks:java:io:kafka:integrationTest FAILED
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[included builds,5,main]) started.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[included builds,5,main]) started.

> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
  Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231006003316
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231006003316
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3212c99b297a79fd72ba8c5f66d6d7217a0ee3bb0abfe7bf2023ffe540977e1
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231006003316
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231006003316]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3212c99b297a79fd72ba8c5f66d6d7217a0ee3bb0abfe7bf2023ffe540977e1]

Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231006003316] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3212c99b297a79fd72ba8c5f66d6d7217a0ee3bb0abfe7bf2023ffe540977e1])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:a3212c99b297a79fd72ba8c5f66d6d7217a0ee3bb0abfe7bf2023ffe540977e1
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[included builds,5,main]) started.
:sdks:java:io:kafka:cleanUp (Thread[included builds,5,main]) started.

> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>

Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.

You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.

For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.

BUILD FAILED in 14m 30s
163 actionable tasks: 103 executed, 58 from cache, 2 up-to-date

Publishing build scan...
https://ge.apache.org/s/nevf7edwc5hom

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
