See <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/4119/display/redirect>
Changes:
------------------------------------------
[...truncated 714.75 KB...]
    at org.apache.beam.sdk.io.kafka.KafkaWriter$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
    at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
    at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
    at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
    at org.apache.beam.sdk.transforms.MapElements$2.processElement(MapElements.java:151)
    at org.apache.beam.sdk.transforms.MapElements$2$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
    at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
    at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
    at org.apache.beam.fn.harness.FnApiDoFnRunner$NonWindowObservingProcessBundleContext.output(FnApiDoFnRunner.java:2506)
    at org.apache.beam.sdk.testutils.metrics.TimeMonitor.processElement(TimeMonitor.java:42)
    at org.apache.beam.sdk.testutils.metrics.TimeMonitor$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForParDo(FnApiDoFnRunner.java:799)
    at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
    at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.outputTo(FnApiDoFnRunner.java:1788)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.access$3000(FnApiDoFnRunner.java:142)
    at org.apache.beam.fn.harness.FnApiDoFnRunner$WindowObservingProcessBundleContext.outputWithTimestamp(FnApiDoFnRunner.java:2214)
    at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn.processElement(Read.java:321)
    at org.apache.beam.sdk.io.Read$BoundedSourceAsSDFWrapperFn$DoFnInvoker.invokeProcessElement(Unknown Source)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.processElementForWindowObservingSizedElementAndRestriction(FnApiDoFnRunner.java:1096)
    at org.apache.beam.fn.harness.FnApiDoFnRunner.access$1500(FnApiDoFnRunner.java:142)
    at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:656)
    at org.apache.beam.fn.harness.FnApiDoFnRunner$4.accept(FnApiDoFnRunner.java:651)
    at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:348)
    at org.apache.beam.fn.harness.data.PCollectionConsumerRegistry$MetricTrackingFnDataReceiver.accept(PCollectionConsumerRegistry.java:275)
    at org.apache.beam.fn.harness.BeamFnDataReadRunner.forwardElementToConsumer(BeamFnDataReadRunner.java:213)
    at org.apache.beam.sdk.fn.data.BeamFnDataInboundObserver.multiplexElements(BeamFnDataInboundObserver.java:158)
    at org.apache.beam.fn.harness.control.ProcessBundleHandler.processBundle(ProcessBundleHandler.java:537)
    at org.apache.beam.fn.harness.control.BeamFnControlClient.delegateOnInstructionRequestType(BeamFnControlClient.java:150)
    at org.apache.beam.fn.harness.control.BeamFnControlClient$InboundObserver.lambda$onNext$0(BeamFnControlClient.java:115)
    at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at org.apache.beam.sdk.util.UnboundedScheduledExecutorService$ScheduledFutureTask.run(UnboundedScheduledExecutorService.java:163)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.io.IOException: KafkaWriter : failed to send 1 records (since last report)
    at org.apache.beam.sdk.io.kafka.KafkaWriter.checkForFailures(KafkaWriter.java:149)
    at org.apache.beam.sdk.io.kafka.KafkaWriter.processElement(KafkaWriter.java:62)
Caused by: org.apache.kafka.common.errors.TimeoutException: Topic beam-sdf not present in metadata after 60000 ms.
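The root cause above is the beam-sdf topic being missing (or unreachable) when KafkaWriter flushes. Below is a minimal sketch, not part of this log or of the test code, of pre-creating the topic with Kafka's AdminClient before the write runs; the topic name comes from the error, the broker address from the bootstrap.servers list dumped further down, and the partition/replication counts are illustrative assumptions.

    // Sketch only: pre-create the "beam-sdf" topic so producer metadata lookups succeed.
    // Broker address taken from the bootstrap.servers logged below; partition and
    // replication counts are assumptions, not values from this job.
    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.NewTopic;

    import java.util.Collections;
    import java.util.Properties;

    public class CreateBeamSdfTopic {
      public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "34.136.38.216:32404");
        try (AdminClient admin = AdminClient.create(props)) {
          admin.createTopics(Collections.singleton(new NewTopic("beam-sdf", 1, (short) 1)))
               .all()
               .get(); // block until the broker acknowledges topic creation
        }
      }
    }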
Oct 14, 2023 12:39:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-14T00:39:40.119Z: Finished operation Generate-records-ParDo-BoundedSourceAsSDFWrapper--ParMultiDo-BoundedSourceAsSDFWrapper-/ProcessElementAndRestrictionWithSizing+Measure write time/ParMultiDo(TimeMonitor)+Write to Kafka/Kafka ProducerRecord/Map/ParMultiDo(Anonymous)+Write to Kafka/KafkaIO.WriteRecords/ParDo(KafkaWriter)/ParMultiDo(KafkaWriter)
Oct 14, 2023 12:39:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-14T00:39:40.227Z: Cleaning up.
Oct 14, 2023 12:39:41 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-14T00:39:40.290Z: Stopping **** pool...
Oct 14, 2023 12:42:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-14T00:42:06.423Z: Autoscaling: Resized **** pool from 5 to 0.
Oct 14, 2023 12:42:06 AM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2023-10-14T00:42:06.484Z: Worker pool stopped.
Oct 14, 2023 12:42:13 AM org.apache.beam.runners.dataflow.DataflowPipelineJob logTerminalState
INFO: Job 2023-10-13_17_34_33-14332997471155662693 finished with status DONE.
Oct 14, 2023 12:42:13 AM org.apache.beam.sdk.extensions.gcp.options.GcpOptions$GcpTempLocationFactory tryCreateDefaultBucket
INFO: No tempLocation specified, attempting to use default bucket: dataflow-staging-us-central1-844138762903
Oct 14, 2023 12:42:13 AM org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler handleResponse
WARNING: Request failed with code 409, performed 0 retries due to IOExceptions, performed 0 retries due to unsuccessful status codes, HTTP framework says request can be retried, (caller responsible for retrying): https://storage.googleapis.com/storage/v1/b?predefinedAcl=projectPrivate&predefinedDefaultObjectAcl=projectPrivate&project=apache-beam-testing.
Oct 14, 2023 12:42:13 AM org.apache.beam.runners.dataflow.options.DataflowPipelineOptions$StagingLocationFactory create
INFO: No stagingLocation provided, falling back to gcpTempLocation
Oct 14, 2023 12:42:14 AM org.apache.beam.runners.dataflow.DataflowRunner fromOptions
INFO: PipelineOptions.filesToStage was not specified. Defaulting to files from the classpath: will stage 287 files. Enable logging at DEBUG level to see which files will be staged.
Oct 14, 2023 12:42:14 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Executing pipeline on the Dataflow Service, which will have billing implications related to Google Compute Engine usage and other Google Cloud Services.
Oct 14, 2023 12:42:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Uploading 287 files from PipelineOptions.filesToStage to staging location to prepare for execution.
Oct 14, 2023 12:42:17 AM org.apache.beam.runners.dataflow.util.PackageUtil stageClasspathElements
INFO: Staging files complete: 287 files cached, 0 files newly uploaded in 0 seconds
Oct 14, 2023 12:42:17 AM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging portable pipeline proto to gs://dataflow-staging-us-central1-844138762903/temp/staging/
Oct 14, 2023 12:42:17 AM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <193278 bytes, hash 2ff351555a474a7df8dc54936d23105204c0d3263a19119d6677dc9aba5a9458> to gs://dataflow-staging-us-central1-844138762903/temp/staging/pipeline-L_NRVVpHSn343FSTbSMQUgTA0yY6GRGdZnfcmrpalFg.pb
Oct 14, 2023 12:42:18 AM org.apache.beam.sdk.coders.SerializableCoder checkEqualsMethodDefined
WARNING: Can't verify serialized elements of type BoundedSource have well defined equals method. This may produce incorrect results on some PipelineRunner implementations
Oct 14, 2023 12:42:21 AM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Read from unbounded Kafka/KafkaIO.Read.ReadFromKafkaViaSDF/Read(KafkaUnboundedSource)/DataflowRunner.StreamingUnboundedRead.ReadWithIds as step s1
Oct 14, 2023 12:42:21 AM org.apache.kafka.common.config.AbstractConfig logAll
INFO: ConsumerConfig values:
allow.auto.create.topics = true
auto.commit.interval.ms = 5000
auto.offset.reset = earliest
bootstrap.servers = [34.136.38.216:32404, 35.226.175.100:32405, 34.66.159.44:32406]
check.crcs = true
client.dns.lookup = default
client.id =
client.rack =
connections.max.idle.ms = 540000
default.api.timeout.ms = 60000
enable.auto.commit = false
exclude.internal.topics = true
fetch.max.bytes = 52428800
fetch.max.wait.ms = 500
fetch.min.bytes = 1
group.id = null
group.instance.id = null
heartbeat.interval.ms = 3000
interceptor.classes = []
internal.leave.group.on.close = true
isolation.level = read_uncommitted
key.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
max.partition.fetch.bytes = 1048576
max.poll.interval.ms = 300000
max.poll.records = 500
metadata.max.age.ms = 300000
metric.reporters = []
metrics.num.samples = 2
metrics.recording.level = INFO
metrics.sample.window.ms = 30000
partition.assignment.strategy = [class org.apache.kafka.clients.consumer.RangeAssignor]
receive.buffer.bytes = 524288
reconnect.backoff.max.ms = 1000
reconnect.backoff.ms = 50
request.timeout.ms = 30000
retry.backoff.ms = 100
sasl.client.callback.handler.class = null
sasl.jaas.config = null
sasl.kerberos.kinit.cmd = /usr/bin/kinit
sasl.kerberos.min.time.before.relogin = 60000
sasl.kerberos.service.name = null
sasl.kerberos.ticket.renew.jitter = 0.05
sasl.kerberos.ticket.renew.window.factor = 0.8
sasl.login.callback.handler.class = null
sasl.login.class = null
sasl.login.refresh.buffer.seconds = 300
sasl.login.refresh.min.period.seconds = 60
sasl.login.refresh.window.factor = 0.8
sasl.login.refresh.window.jitter = 0.05
sasl.mechanism = GSSAPI
security.protocol = PLAINTEXT
security.providers = null
send.buffer.bytes = 131072
session.timeout.ms = 10000
ssl.cipher.suites = null
ssl.enabled.protocols = [TLSv1.2, TLSv1.1, TLSv1]
ssl.endpoint.identification.algorithm = https
ssl.key.password = null
ssl.keymanager.algorithm = SunX509
ssl.keystore.location = null
ssl.keystore.password = null
ssl.keystore.type = JKS
ssl.protocol = TLS
ssl.provider = null
ssl.secure.random.implementation = null
ssl.trustmanager.algorithm = PKIX
ssl.truststore.location = null
ssl.truststore.password = null
ssl.truststore.type = JKS
value.deserializer = class org.apache.kafka.common.serialization.ByteArrayDeserializer
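For context on where a dump like the one above comes from: in a Beam pipeline these consumer settings are the Kafka defaults plus whatever the KafkaIO read transform overrides. The following is a minimal sketch, not the actual KafkaIOIT source; the class name is hypothetical, the servers, topic, and deserializers mirror the logged values, and only auto.offset.reset is shown as an explicit override.

    // Sketch only, not the KafkaIOIT code: reading the logged topic with KafkaIO,
    // using the bootstrap servers and deserializers from the config dump above and
    // overriding auto.offset.reset the same way the dump shows (earliest).
    import java.util.Map;
    import org.apache.beam.sdk.Pipeline;
    import org.apache.beam.sdk.io.kafka.KafkaIO;
    import org.apache.beam.sdk.options.PipelineOptionsFactory;
    import org.apache.kafka.clients.consumer.ConsumerConfig;
    import org.apache.kafka.common.serialization.ByteArrayDeserializer;

    public class KafkaReadConfigSketch {
      public static void main(String[] args) {
        Pipeline pipeline = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());
        pipeline.apply(
            KafkaIO.<byte[], byte[]>read()
                .withBootstrapServers("34.136.38.216:32404,35.226.175.100:32405,34.66.159.44:32406")
                .withTopic("beam-sdf")
                .withKeyDeserializer(ByteArrayDeserializer.class)
                .withValueDeserializer(ByteArrayDeserializer.class)
                // Any entry here overrides the corresponding ConsumerConfig default.
                .withConsumerConfigUpdates(Map.of(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest"))
                .withoutMetadata());
        pipeline.run().waitUntilFinish();
      }
    }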
Oct 14, 2023 12:42:21 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka version: 2.4.1
Oct 14, 2023 12:42:21 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka commitId: c57222ae8cd7866b
Oct 14, 2023 12:42:21 AM org.apache.kafka.common.utils.AppInfoParser$AppInfo <init>
INFO: Kafka startTimeMs: 1697244141653
Gradle Test Executor 1 finished executing tests.
> Task :sdks:java:io:kafka:integrationTest
org.apache.beam.sdk.io.kafka.KafkaIOIT > testKafkaIOReadsAndWritesCorrectlyInStreaming FAILED
    java.lang.RuntimeException: org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
        at org.apache.beam.runners.dataflow.ReadTranslator.translateReadHelper(ReadTranslator.java:55)
        at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2154)
        at org.apache.beam.runners.dataflow.DataflowRunner$StreamingUnboundedRead$ReadWithIdsTranslator.translate(DataflowRunner.java:2149)
        at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.visitPrimitiveTransform(DataflowPipelineTranslator.java:498)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:593)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.visit(TransformHierarchy.java:585)
        at org.apache.beam.sdk.runners.TransformHierarchy$Node.access$500(TransformHierarchy.java:240)
        at org.apache.beam.sdk.runners.TransformHierarchy.visit(TransformHierarchy.java:214)
        at org.apache.beam.sdk.Pipeline.traverseTopologically(Pipeline.java:466)
        at org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator.translate(DataflowPipelineTranslator.java:431)
        at org.apache.beam.runners.dataflow.DataflowPipelineTranslator.translate(DataflowPipelineTranslator.java:188)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:1226)
        at org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:199)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:321)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:398)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:335)
        at org.apache.beam.sdk.io.kafka.KafkaIOIT.testKafkaIOReadsAndWritesCorrectlyInStreaming(KafkaIOIT.java:216)

        Caused by:
        org.apache.kafka.common.errors.TimeoutException: Timeout expired while fetching topic metadata
1 test completed, 1 failed
Finished generating test XML results (0.054 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.049 secs) into: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest>
> Task :sdks:java:io:kafka:integrationTest FAILED
Resolve mutations for :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[included builds,5,main]) started.
:runners:google-cloud-dataflow-java:cleanUpDockerJavaImages (Thread[Execution **** Thread 2,5,main]) started.
> Task :runners:google-cloud-dataflow-java:cleanUpDockerJavaImages
Custom actions are attached to task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages'.
Caching disabled for task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' because:
  Gradle would require more information to cache this task
Task ':runners:google-cloud-dataflow-java:cleanUpDockerJavaImages' is not up-to-date because:
  Task has not declared any outputs despite executing actions.
Starting process 'command 'docker''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: docker rmi --force us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231014003328
Successfully started process 'command 'docker''
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231014003328
Untagged: us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7dd08ccf4b88cecf3079dc746c7d14d960495e72d9fdaacd32ab0909c236719
Starting process 'command 'gcloud''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: gcloud --quiet container images untag us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231014003328
Successfully started process 'command 'gcloud''
WARNING: Successfully resolved tag to sha256, but it is recommended to use sha256 directly.
Tag: [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231014003328]
- referencing digest: [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7dd08ccf4b88cecf3079dc746c7d14d960495e72d9fdaacd32ab0909c236719]
Deleted [[us.gcr.io/apache-beam-testing/java-postcommit-it/java:20231014003328] (referencing [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7dd08ccf4b88cecf3079dc746c7d14d960495e72d9fdaacd32ab0909c236719])].
Starting process 'command './scripts/cleanup_untagged_gcr_images.sh''. Working directory: <https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/runners/google-cloud-dataflow-java> Command: ./scripts/cleanup_untagged_gcr_images.sh us.gcr.io/apache-beam-testing/java-postcommit-it/java
Successfully started process 'command './scripts/cleanup_untagged_gcr_images.sh''
Removing untagged image us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:d7dd08ccf4b88cecf3079dc746c7d14d960495e72d9fdaacd32ab0909c236719
Resolve mutations for :sdks:java:io:kafka:cleanUp (Thread[Execution **** Thread 2,5,main]) started.
:sdks:java:io:kafka:cleanUp (Thread[Execution ****,5,main]) started.
> Task :sdks:java:io:kafka:cleanUp
Skipping task ':sdks:java:io:kafka:cleanUp' as it has no actions.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:kafka:integrationTest'.
> There were failing tests. See the report at: file://<https://ci-beam.apache.org/job/beam_PerformanceTests_Kafka_IO/ws/src/sdks/java/io/kafka/build/reports/tests/integrationTest/index.html>
Deprecated Gradle features were used in this build, making it incompatible with Gradle 9.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
For more on this, please refer to https://docs.gradle.org/8.3/userguide/command_line_interface.html#sec:command_line_warnings in the Gradle documentation.
BUILD FAILED in 10m 27s
163 actionable tasks: 103 executed, 58 from cache, 2 up-to-date
Publishing build scan...
https://ge.apache.org/s/rbsekfghtz2qg
Build cache (/home/jenkins/.gradle/caches/build-cache-1) removing files not accessed on or after Sat Oct 07 00:43:29 UTC 2023.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleanup deleted 1500 files/directories.
Build cache (/home/jenkins/.gradle/caches/build-cache-1) cleaned up in 2.736 secs.
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]