See <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/723/display/redirect?page=changes>
Changes:
[noreply] Update nbconvert requirement in /sdks/python
[noreply] [CdapIO] Add readme for CdapIO. Update readme for SparkReceiverIO.
[Moritz Mack] [Spark Dataset runner] Add @Experimental and reduce visibility where
[noreply] Fix grafana dashboard id (#24524)
[noreply] [Spark runner] Support running (VR) tests with Java 17 (closes #24400)
[noreply] Replaced deprecated finalize with DoFn Teardown (#24516)
[noreply] Bump cloud.google.com/go/storage from 1.28.0 to 1.28.1 in /sdks (#24517)
[noreply] add clarifier to error message (#24449)
[noreply] Batch rename requests in fileio.WriteToFiles (#24341)
[noreply] Bump golang.org/x/text from 0.4.0 to 0.5.0 in /sdks (#24520)
[noreply] Support for JsonSchema in Kafka Read Schema Transform (#24272)
[noreply] Run go fmt over full go directory with go 1.19 (#24525)
[noreply] Cloudbuild+manualsetup+playground (#24144)
[noreply] Bump golang.org/x/sys from 0.2.0 to 0.3.0 in /sdks (#24519)
[noreply] Bump cloud.google.com/go/bigtable from 1.18.0 to 1.18.1 in /sdks
[noreply] Update from interface{} -> any for core packages (#24505)
[noreply] Implement FileWriteSchemaTransformConfiguration (#24479)
[noreply] Bump cloud.google.com/go/pubsub from 1.27.0 to 1.27.1 in /sdks (#24518)
[noreply] [Playground] Healthcheck was added (#24227)
[noreply] Update dataflow container version for Pandas upgrade (#24532)
------------------------------------------
[...truncated 41.12 KB...]
payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: <
type_urn: "beam:artifact:type:file:v1"
role_urn: "beam:artifact:role:go_****_binary:v1"
>
>
>
>
root_transform_ids: "s2"
root_transform_ids: "e8"
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "e9_cogbk"
root_transform_ids: "e10"
root_transform_ids: "e11"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/12/06 08:28:44 Using specified **** binary: 'linux_amd64/cogbk'
2022/12/06 08:28:44 Prepared job with id: load-tests-go-flink-batch-cogbk-2-1206065525_fcb08652-315f-4298-a76e-005902055bd8 and staging token: load-tests-go-flink-batch-cogbk-2-1206065525_fcb08652-315f-4298-a76e-005902055bd8
2022/12/06 08:28:51 Staged binary artifact with token:
2022/12/06 08:28:52 Submitted job: load0tests0go0flink0batch0cogbk0201206065525-root-1206082851-774cd22b_508f652f-6575-4b8a-ae47-836d6e44ab62
2022/12/06 08:28:52 Job state: STOPPED
2022/12/06 08:28:52 Job state: STARTING
2022/12/06 08:28:52 Job state: RUNNING
2022/12/06 08:40:48 (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 15a508e56bdc5b13fb6f8d156985603b)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:78)
at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:443)
at jdk.internal.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:566)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:304)
at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:302)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217)
at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
at akka.actor.Actor.aroundReceive(Actor.scala:537)
at akka.actor.Actor.aroundReceive$(Actor.scala:535)
at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
at akka.actor.ActorCell.invoke(ActorCell.scala:548)
at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
at akka.dispatch.Mailbox.run(Mailbox.scala:231)
at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.lang.OutOfMemoryError: Direct buffer memory. The direct
out-of-memory error has occurred. This can mean two things: either job(s)
require(s) a larger size of JVM direct memory or there is a direct memory leak.
The direct memory can be allocated by user code or some of its dependencies. In
this case 'taskmanager.memory.task.off-heap.size' configuration option should
be increased. Flink framework and its dependencies also consume the direct
memory, mostly for network communication. The most of network memory is managed
by Flink and should not result in out-of-memory error. In certain special
cases, in particular for jobs with high parallelism, the framework may require
more direct memory which is not managed by Flink. In this case
'taskmanager.memory.framework.off-heap.size' configuration option should be
increased. If the error persists then there is probably a direct memory leak in
user code or some of its dependencies which has to be investigated and fixed.
The task executor has to be shutdown...
at java.nio.Bits.reserveMemory(Bits.java:175)
at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:118)
at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:317)
at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena$DirectArena.allocateDirect(PoolArena.java:649)
at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena$DirectArena.newChunk(PoolArena.java:624)
at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena.allocateNormal(PoolArena.java:203)
at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena.tcacheAllocateNormal(PoolArena.java:187)
at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena.allocate(PoolArena.java:136)
at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena.allocate(PoolArena.java:126)
at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:396)
at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:188)
at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.AbstractByteBufAllocator.buffer(AbstractByteBufAllocator.java:124)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.netty.NettyWritableBufferAllocator.allocate(NettyWritableBufferAllocator.java:51)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writeRaw(MessageFramer.java:285)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.access$400(MessageFramer.java:43)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer$OutputStreamAdapter.write(MessageFramer.java:375)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.write(CodedOutputStream.java:2984)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeLazy(CodedOutputStream.java:2992)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.ByteString$LiteralByteString.writeTo(ByteString.java:1470)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:461)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:461)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeBytesNoTag(CodedOutputStream.java:2780)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeBytes(CodedOutputStream.java:2754)
at org.apache.beam.model.fnexecution.v1.BeamFnApi$Elements$Data.writeTo(BeamFnApi.java:31639)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeMessageNoTag(CodedOutputStream.java:2834)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeMessage(CodedOutputStream.java:2803)
at org.apache.beam.model.fnexecution.v1.BeamFnApi$Elements.writeTo(BeamFnApi.java:33712)
at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.AbstractMessageLite.writeTo(AbstractMessageLite.java:83)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.protobuf.lite.ProtoInputStream.drainTo(ProtoInputStream.java:52)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writeToOutputStream(MessageFramer.java:267)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writeKnownLengthUncompressed(MessageFramer.java:229)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writeUncompressed(MessageFramer.java:168)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writePayload(MessageFramer.java:141)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.AbstractStream.writeMessage(AbstractStream.java:65)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl.sendMessageInternal(ServerCallImpl.java:171)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl.sendMessage(ServerCallImpl.java:153)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onNext(ServerCalls.java:380)
at org.apache.beam.sdk.fn.stream.DirectStreamObserver.onNext(DirectStreamObserver.java:108)
at org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator.flushInternal(BeamFnDataOutboundAggregator.java:169)
at org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator.access$400(BeamFnDataOutboundAggregator.java:65)
at org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator$Receiver.accept(BeamFnDataOutboundAggregator.java:349)
at org.apache.beam.sdk.fn.data.BeamFnDataOutboundObserver.accept(BeamFnDataOutboundObserver.java:85)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$CountingFnDataReceiver.accept(SdkHarnessClient.java:716)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.processElements(FlinkExecutableStageFunction.java:363)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:268)
at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:113)
at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:514)
at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:357)
at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:948)
at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:927)
at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:741)
at org.apache.flink.runtime.taskmanager.Task.run(Task.java:563)
at java.lang.Thread.run(Thread.java:829)
Suppressed: java.lang.IllegalStateException: call is closed
at org.apache.beam.vendor.grpc.v1p48p1.com.google.common.base.Preconditions.checkState(Preconditions.java:502)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl.sendMessageInternal(ServerCallImpl.java:161)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl.sendMessage(ServerCallImpl.java:153)
at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onNext(ServerCalls.java:380)
at org.apache.beam.sdk.fn.stream.DirectStreamObserver.onNext(DirectStreamObserver.java:108)
at org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator.sendOrCollectBufferedDataAndFinishOutboundStreams(BeamFnDataOutboundAggregator.java:219)
at org.apache.beam.sdk.fn.data.BeamFnDataOutboundObserver.close(BeamFnDataOutboundObserver.java:71)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$CountingFnDataReceiver.close(SdkHarnessClient.java:727)
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:492)
at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:555)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:269)
at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:269)
... 8 more
Suppressed: java.lang.IllegalStateException: Processing bundle failed, TODO: [https://github.com/apache/beam/issues/18756] abort bundle.
at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:517)
... 11 more
2022/12/06 08:40:48 (): java.lang.OutOfMemoryError: Direct buffer memory. The
direct out-of-memory error has occurred. This can mean two things: either
job(s) require(s) a larger size of JVM direct memory or there is a direct
memory leak. The direct memory can be allocated by user code or some of its
dependencies. In this case 'taskmanager.memory.task.off-heap.size'
configuration option should be increased. Flink framework and its dependencies
also consume the direct memory, mostly for network communication. The most of
network memory is managed by Flink and should not result in out-of-memory
error. In certain special cases, in particular for jobs with high parallelism,
the framework may require more direct memory which is not managed by Flink. In
this case 'taskmanager.memory.framework.off-heap.size' configuration option
should be increased. If the error persists then there is probably a direct
memory leak in user code or some of its dependencies which has to be
investigated and fixed. The task executor has to be shutdown...
2022/12/06 08:40:48 Job state: FAILED
2022/12/06 08:40:48 Failed to execute job: job load0tests0go0flink0batch0cogbk0201206065525-root-1206082851-774cd22b_508f652f-6575-4b8a-ae47-836d6e44ab62 failed
panic: Failed to execute job: job load0tests0go0flink0batch0cogbk0201206065525-root-1206082851-774cd22b_508f652f-6575-4b8a-ae47-836d6e44ab62 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x17f59c8, 0xc00004e0c0}, {0x1649e1d?, 0x227a718?}, {0xc0006d5e60?, 0x0?, 0x3e8?})
<https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
<https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/sdks/go/test/load/cogbk/cogbk.go>:103 +0x4bb
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
* Get more help at https://help.gradle.org
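(For anyone re-running the failing task locally: this log does not show the exact Gradle invocation the CI job used, so the command below is only an assumed sketch of how the suggested flags would be appended to the task named above; the load test also expects its usual test arguments.)
    ./gradlew :sdks:go:test:load:run --stacktrace --info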
BUILD FAILED in 12m 44s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/ruq34agx67tuc
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
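Note on the root cause above: the java.lang.OutOfMemoryError message itself names the two Flink options to raise. As a minimal, hedged sketch only (the option names are taken from the error text, flink-conf.yaml on the task managers is the usual place to set them, and the sizes below are illustrative placeholders rather than tuned values):

    # Illustrative values only; size these to the job's actual direct-memory use.
    taskmanager.memory.task.off-heap.size: 512m
    taskmanager.memory.framework.off-heap.size: 256m

If raising these limits does not help, the error text points to the remaining possibility of a direct-memory leak in user code or its dependencies.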
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]