See <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/755/display/redirect?page=changes>

Changes:

[samuelw] Add thread prefix for StreamingDataflowWorker work executor

[samuelw] Modify windmill DirectStreamObserver to call isReady only every 10

[noreply] Sync script for grafana  dashboard -  github actions  postcommit

[noreply] Integration test to load the default example of the default SDK and

[noreply] issue23923 emulated data indicator (#24708)

[noreply] Bump github.com/aws/aws-sdk-go-v2/service/s3 in /sdks (#24917)

[noreply] issue23918 extract java symbols (#24588)

[noreply] Attempt deserialize all non-standard portable logical types from proto

[noreply] TFT Criteo benchmarks (#24382)

[noreply] update Java version from 11 to 17 (#24839)


------------------------------------------
[...truncated 41.55 KB...]
      capabilities: "beam:protocol:monitoring_info_short_ids:v1"
      capabilities: "beam:version:sdk_base:go:apache/beam_go_sdk:2.45.0.dev"
      capabilities: "beam:coder:bytes:v1"
      capabilities: "beam:coder:bool:v1"
      capabilities: "beam:coder:varint:v1"
      capabilities: "beam:coder:double:v1"
      capabilities: "beam:coder:string_utf8:v1"
      capabilities: "beam:coder:length_prefix:v1"
      capabilities: "beam:coder:kv:v1"
      capabilities: "beam:coder:iterable:v1"
      capabilities: "beam:coder:state_backed_iterable:v1"
      capabilities: "beam:coder:windowed_value:v1"
      capabilities: "beam:coder:global_window:v1"
      capabilities: "beam:coder:interval_window:v1"
      capabilities: "beam:coder:row:v1"
      capabilities: "beam:coder:nullable:v1"
      dependencies: <
        type_urn: "beam:artifact:type:file:v1"
        role_urn: "beam:artifact:role:go_****_binary:v1"
      >
    >
  >
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e8"
root_transform_ids: "e9_cogbk"
root_transform_ids: "e10"
root_transform_ids: "e11"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2023/01/07 08:27:44 Using specified **** binary: 'linux_amd64/cogbk'
2023/01/07 08:27:44 Prepared job with id: load-tests-go-flink-batch-cogbk-2-0107065541_68a0e477-909f-4c21-bf89-c1616a6abd8f and staging token: load-tests-go-flink-batch-cogbk-2-0107065541_68a0e477-909f-4c21-bf89-c1616a6abd8f
2023/01/07 08:27:51 Staged binary artifact with token: 
2023/01/07 08:27:53 Submitted job: load0tests0go0flink0batch0cogbk0200107065541-root-0107082751-a449dec4_f16c76d1-8821-45f2-abec-3f97f4a98426
2023/01/07 08:27:53 Job state: STOPPED
2023/01/07 08:27:53 Job state: STARTING
2023/01/07 08:27:53 Job state: RUNNING
2023/01/07 08:39:48  (): org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: bfee9e442f6f4f1923e6c44ec33c3129)
        at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:130)
        at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
        at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
        at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
        at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
        at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
        at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
        at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
        at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
        at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
        at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$26(RestClusterClient.java:708)
        at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
        at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
        at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
        at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
        at org.apache.flink.util.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:403)
        at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
        at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
        at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
        at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
        at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
        at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
        at java.lang.Thread.run(Thread.java:750)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
        at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:144)
        at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:128)
        ... 24 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
        at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:138)
        at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:82)
        at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:301)
        at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:291)
        at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:282)
        at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:739)
        at org.apache.flink.runtime.scheduler.SchedulerNG.updateTaskExecutionState(SchedulerNG.java:78)
        at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:443)
        at jdk.internal.reflect.GeneratedMethodAccessor27.invoke(Unknown Source)
        at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:566)
        at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.lambda$handleRpcInvocation$1(AkkaRpcActor.java:304)
        at org.apache.flink.runtime.concurrent.akka.ClassLoadingUtils.runWithContextClassLoader(ClassLoadingUtils.java:83)
        at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:302)
        at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:217)
        at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:78)
        at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:163)
        at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:24)
        at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:20)
        at scala.PartialFunction.applyOrElse(PartialFunction.scala:127)
        at scala.PartialFunction.applyOrElse$(PartialFunction.scala:126)
        at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:20)
        at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:175)
        at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
        at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:176)
        at akka.actor.Actor.aroundReceive(Actor.scala:537)
        at akka.actor.Actor.aroundReceive$(Actor.scala:535)
        at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:220)
        at akka.actor.ActorCell.receiveMessage(ActorCell.scala:580)
        at akka.actor.ActorCell.invoke(ActorCell.scala:548)
        at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:270)
        at akka.dispatch.Mailbox.run(Mailbox.scala:231)
        at akka.dispatch.Mailbox.exec(Mailbox.scala:243)
        at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:290)
        at java.util.concurrent.ForkJoinPool$WorkQueue.topLevelExec(ForkJoinPool.java:1020)
        at java.util.concurrent.ForkJoinPool.scan(ForkJoinPool.java:1656)
        at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1594)
        at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:183)
Caused by: java.lang.OutOfMemoryError: Direct buffer memory. The direct out-of-memory error has occurred. This can mean two things: either job(s) require(s) a larger size of JVM direct memory or there is a direct memory leak. The direct memory can be allocated by user code or some of its dependencies. In this case 'taskmanager.memory.task.off-heap.size' configuration option should be increased. Flink framework and its dependencies also consume the direct memory, mostly for network communication. The most of network memory is managed by Flink and should not result in out-of-memory error. In certain special cases, in particular for jobs with high parallelism, the framework may require more direct memory which is not managed by Flink. In this case 'taskmanager.memory.framework.off-heap.size' configuration option should be increased. If the error persists then there is probably a direct memory leak in user code or some of its dependencies which has to be investigated and fixed. The task executor has to be shutdown...
        at java.nio.Bits.reserveMemory(Bits.java:175)
        at java.nio.DirectByteBuffer.<init>(DirectByteBuffer.java:118)
        at java.nio.ByteBuffer.allocateDirect(ByteBuffer.java:317)
        at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena$DirectArena.allocateDirect(PoolArena.java:649)
        at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena$DirectArena.newChunk(PoolArena.java:624)
        at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena.allocateNormal(PoolArena.java:203)
        at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena.tcacheAllocateNormal(PoolArena.java:187)
        at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena.allocate(PoolArena.java:136)
        at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PoolArena.allocate(PoolArena.java:126)
        at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.PooledByteBufAllocator.newDirectBuffer(PooledByteBufAllocator.java:396)
        at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.AbstractByteBufAllocator.directBuffer(AbstractByteBufAllocator.java:188)
        at org.apache.beam.vendor.grpc.v1p48p1.io.netty.buffer.AbstractByteBufAllocator.buffer(AbstractByteBufAllocator.java:124)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.netty.NettyWritableBufferAllocator.allocate(NettyWritableBufferAllocator.java:51)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writeRaw(MessageFramer.java:285)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.access$400(MessageFramer.java:43)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer$OutputStreamAdapter.write(MessageFramer.java:375)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.write(CodedOutputStream.java:2984)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeLazy(CodedOutputStream.java:2992)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.ByteString$LiteralByteString.writeTo(ByteString.java:1470)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:461)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:461)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.RopeByteString.writeTo(RopeByteString.java:460)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeBytesNoTag(CodedOutputStream.java:2780)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeBytes(CodedOutputStream.java:2754)
        at org.apache.beam.model.fnexecution.v1.BeamFnApi$Elements$Data.writeTo(BeamFnApi.java:31639)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeMessageNoTag(CodedOutputStream.java:2834)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.CodedOutputStream$OutputStreamEncoder.writeMessage(CodedOutputStream.java:2803)
        at org.apache.beam.model.fnexecution.v1.BeamFnApi$Elements.writeTo(BeamFnApi.java:33712)
        at org.apache.beam.vendor.grpc.v1p48p1.com.google.protobuf.AbstractMessageLite.writeTo(AbstractMessageLite.java:83)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.protobuf.lite.ProtoInputStream.drainTo(ProtoInputStream.java:52)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writeToOutputStream(MessageFramer.java:267)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writeKnownLengthUncompressed(MessageFramer.java:229)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writeUncompressed(MessageFramer.java:168)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.MessageFramer.writePayload(MessageFramer.java:141)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.AbstractStream.writeMessage(AbstractStream.java:65)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl.sendMessageInternal(ServerCallImpl.java:171)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl.sendMessage(ServerCallImpl.java:153)
        at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onNext(ServerCalls.java:380)
        at org.apache.beam.sdk.fn.stream.DirectStreamObserver.onNext(DirectStreamObserver.java:108)
        at org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator.flushInternal(BeamFnDataOutboundAggregator.java:169)
        at org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator.access$400(BeamFnDataOutboundAggregator.java:65)
        at org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator$Receiver.accept(BeamFnDataOutboundAggregator.java:349)
        at org.apache.beam.sdk.fn.data.BeamFnDataOutboundObserver.accept(BeamFnDataOutboundObserver.java:85)
        at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$CountingFnDataReceiver.accept(SdkHarnessClient.java:716)
        at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.processElements(FlinkExecutableStageFunction.java:363)
        at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:268)
        at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:113)
        at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:514)
        at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:357)
        at org.apache.flink.runtime.taskmanager.Task.runWithSystemExitMonitoring(Task.java:948)
        at org.apache.flink.runtime.taskmanager.Task.restoreAndInvoke(Task.java:927)
        at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:741)
        at org.apache.flink.runtime.taskmanager.Task.run(Task.java:563)
        at java.lang.Thread.run(Thread.java:829)
        Suppressed: java.lang.IllegalStateException: call is closed
                at org.apache.beam.vendor.grpc.v1p48p1.com.google.common.base.Preconditions.checkState(Preconditions.java:502)
                at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl.sendMessageInternal(ServerCallImpl.java:161)
                at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.internal.ServerCallImpl.sendMessage(ServerCallImpl.java:153)
                at org.apache.beam.vendor.grpc.v1p48p1.io.grpc.stub.ServerCalls$ServerCallStreamObserverImpl.onNext(ServerCalls.java:380)
                at org.apache.beam.sdk.fn.stream.DirectStreamObserver.onNext(DirectStreamObserver.java:108)
                at org.apache.beam.sdk.fn.data.BeamFnDataOutboundAggregator.sendOrCollectBufferedDataAndFinishOutboundStreams(BeamFnDataOutboundAggregator.java:219)
                at org.apache.beam.sdk.fn.data.BeamFnDataOutboundObserver.close(BeamFnDataOutboundObserver.java:71)
                at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$CountingFnDataReceiver.close(SdkHarnessClient.java:727)
                at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:492)
                at org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:555)
                at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:269)
                at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:269)
                ... 8 more
                Suppressed: java.lang.IllegalStateException: Processing bundle failed, TODO: [https://github.com/apache/beam/issues/18756] abort bundle.
                        at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:517)
                        ... 11 more
2023/01/07 08:39:48  (): java.lang.OutOfMemoryError: Direct buffer memory. The direct out-of-memory error has occurred. This can mean two things: either job(s) require(s) a larger size of JVM direct memory or there is a direct memory leak. The direct memory can be allocated by user code or some of its dependencies. In this case 'taskmanager.memory.task.off-heap.size' configuration option should be increased. Flink framework and its dependencies also consume the direct memory, mostly for network communication. The most of network memory is managed by Flink and should not result in out-of-memory error. In certain special cases, in particular for jobs with high parallelism, the framework may require more direct memory which is not managed by Flink. In this case 'taskmanager.memory.framework.off-heap.size' configuration option should be increased. If the error persists then there is probably a direct memory leak in user code or some of its dependencies which has to be investigated and fixed. The task executor has to be shutdown...
2023/01/07 08:39:48 Job state: FAILED
2023/01/07 08:39:48 Failed to execute job: job load0tests0go0flink0batch0cogbk0200107065541-root-0107082751-a449dec4_f16c76d1-8821-45f2-abec-3f97f4a98426 failed
panic: Failed to execute job: job load0tests0go0flink0batch0cogbk0200107065541-root-0107082751-a449dec4_f16c76d1-8821-45f2-abec-3f97f4a98426 failed

goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x17f9308, 0xc000120000}, {0x164d3ab?, 0x2280e08?}, {0xc0003e7e60?, 0x0?, 0x3e8?})
        <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/sdks/go/pkg/beam/log/log.go>:162 +0x8c
main.main()
        <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/sdks/go/test/load/cogbk/cogbk.go>:103 +0x4bb

> Task :sdks:go:test:load:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/sdks/go/test/load/build.gradle'> line: 31

* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.

* Get more help at https://help.gradle.org

BUILD FAILED in 12m 26s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/dqpodtiofehbo

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
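
The OutOfMemoryError in the log above names two Flink TaskManager options as the remedies. As a minimal sketch only, the hint maps onto flink-conf.yaml entries like the following; the sizes shown are illustrative assumptions, not values validated against this load test:

```yaml
# Illustrative values only -- tune against the actual job and TaskManager sizing.
# For direct memory allocated by user code or its dependencies (here, the
# vendored gRPC/Netty buffers feeding the Beam SDK harness, per the stack trace):
taskmanager.memory.task.off-heap.size: 512mb
# For direct memory the Flink framework itself needs, e.g. at high parallelism:
taskmanager.memory.framework.off-heap.size: 256mb
```

Per the error text, raising `taskmanager.memory.task.off-heap.size` is the first option to try when user code is the consumer; `taskmanager.memory.framework.off-heap.size` applies when the framework's own network stack is; a persistent failure after both suggests a direct-memory leak to investigate instead.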

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
