See <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/80/display/redirect?page=changes>
Changes:
[relax] First step in Vortex sink - refactor CreateTablea and add TableRow ->
[tysonjh] [BEAM-11377] Fix retry & cleanup issues.
[relax] address comments
[shehzaad] [BEAM-10961] enable strict dependency checking for sdks/java/io/azure/
[shehzaad] [BEAM-10961] enable strict dependency checking for sdks/java/io/influxdb
[Kenneth Knowles] Add test for side input created once but consumed twice
[shehzaad] [BEAM-10961] (1) fix azure-storage-common version (2) add reference to
[shehzaad] [BEAM-10961] nit: spacing
[shehzaad] [BEAM-10961] enable strict dependency checking for sdks/java/io/mqtt
[shehzaad] [BEAM-10961] enable strict dependency checking for sdks/java/io/solr
[shehzaad] [BEAM-10961] enable strict dependency checking for sdks/java/io/splunk
[shehzaad] [BEAM-10961] enable strict dependency checking for
[shehzaad] [BEAM-10961] enable strict dependency checking for sdks/java/io/tika
[shehzaad] [BEAM-10961] enable strict dependency checking for sdks/java/io/xml
[shehzaad] [BEAM-10961] fix spacing
[nir.gzt] [BEAM-11859] Fixed bug in python S3 IO
[noreply] [BEAM-10961] enable strict dependency checking for sdks/java/io/hcatalog
[noreply] [BEAM-10961] enable strict dependency checking for sdks/java/io/kafka
[noreply] [BEAM-10961] enable strict dependency checking for sdks/java/io/jms
[noreply] [BEAM-10961] enable strict dependency checking for sdks/java/io/jdbc
[noreply] [BEAM-10961] enable strict dependency checking for
[noreply] [BEAM-10961] enable strict dependency checking for
[noreply] [BEAM-10961] enable strict dependency checking for
[noreply] [BEAM-10961] enable strict dependency checking for sdks/java/io/kinesis
[noreply] [BEAM-10961] enable strict dependency checking for
[noreply] [BEAM-10961] enable strict dependency checking for
[noreply] Returning successful writes in FhirIO.Write.Result (#14034)
[noreply] Merge pull request #14046 from [BEAM-11791] Adding a microbenchmark for
[noreply] [BEAM-11344] Apply "Become a Committer" changes from Website Revamp
[noreply] [BEAM-10937] Add Tour of Beam page (#13747)
[Kenneth Knowles] Remove metadata-driven triggers from capability matrix, because they do
[Kenneth Knowles] Remove retractions from capability matrix, because they do not exist yet
[Kenneth Knowles] Remove JStorm runner from capability matrix, because it is on a branch
[Kenneth Knowles] Remove MapReduce runner from capability matrix, because it is on a
[Kenneth Knowles] Merge redundant model feature columns in capability matrix
[noreply] Merge pull request #14033 from [BEAM-11408] Integrate Python BigQuery
[Kenneth Knowles] Log a warning when Dataflow returns an unrecognized state
[Kenneth Knowles] Show string from Dataflow service when job terminates in unrecognized
[Brian Hulette] Fix preview
------------------------------------------
[...truncated 55.37 KB...]
>
coders: <
key: "c4"
value: <
spec: <
urn: "beam:coder:length_prefix:v1"
>
component_coder_ids: "c3"
>
>
coders: <
key: "c5"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c4"
component_coder_ids: "c0"
>
>
coders: <
key: "c6"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c5"
>
>
coders: <
key: "c7"
value: <
spec: <
urn: "beam:coder:iterable:v1"
>
component_coder_ids: "c5"
>
>
coders: <
key: "c8"
value: <
spec: <
urn: "beam:coder:kv:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c7"
>
>
coders: <
key: "c9"
value: <
spec: <
urn: "beam:go:coder:cogbklist:v1"
>
component_coder_ids: "c0"
component_coder_ids: "c0"
>
>
environments: <
key: "go"
value: <
urn: "beam:env:docker:v1"
payload: "\n>gcr.io/apache-beam-testing/beam_portability/beam_go_sdk:latest"
capabilities: "beam:protocol:progress_reporting:v0"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
dependencies: <
type_urn: "beam:artifact:type:go_****_binary:v1"
role_urn: "beam:artifact:role:staging_to:v1"
role_payload: "\n\006****"
>
>
>
>
root_transform_ids: "s1"
root_transform_ids: "s2"
root_transform_ids: "e8"
root_transform_ids: "e4"
root_transform_ids: "e9_cogbk"
root_transform_ids: "e10"
root_transform_ids: "e11"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
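
For orientation (this note is not part of the Jenkins output): the coder graph above - KV, length-prefix, iterable and the Go-specific cogbklist coders - is the kind of structure the Go SDK emits when it expands a CoGroupByKey. Below is a minimal illustrative sketch of such a pipeline, assuming the usual Go SDK APIs of that era; kvFn and joinFn are hypothetical names and this is not the actual load-test source (cogbk.go).

// Illustrative sketch only: a minimal Go SDK CoGroupByKey pipeline of the
// shape that yields KV, length-prefix, iterable and cogbklist coders like
// those shown above.
package main

import (
	"context"
	"flag"
	"fmt"

	"github.com/apache/beam/sdks/go/pkg/beam"
	"github.com/apache/beam/sdks/go/pkg/beam/x/beamx"
)

func init() {
	beam.RegisterFunction(kvFn)
	beam.RegisterFunction(joinFn)
}

// kvFn keys each element into a small keyspace so CoGBK groups many values
// under few keys.
func kvFn(x int) (int, int) { return x % 10, x }

// joinFn drains both grouped iterators for a key and reports the count.
func joinFn(key int, left, right func(*int) bool) string {
	var v, n int
	for left(&v) {
		n++
	}
	for right(&v) {
		n++
	}
	return fmt.Sprintf("key=%d values=%d", key, n)
}

func main() {
	flag.Parse()
	beam.Init()

	p := beam.NewPipeline()
	s := p.Root()

	a := beam.ParDo(s, kvFn, beam.Create(s, 1, 2, 3, 4, 5))
	b := beam.ParDo(s, kvFn, beam.Create(s, 6, 7, 8, 9, 10))

	// CoGroupByKey joins the two keyed collections; per key, the DoFn sees
	// one iterator per input, which corresponds to the Go-specific
	// cogbklist coder above.
	grouped := beam.CoGroupByKey(s, a, b)
	beam.ParDo(s, joinFn, grouped)

	if err := beamx.Run(context.Background(), p); err != nil {
		fmt.Println("pipeline failed:", err)
	}
}
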
2021/02/25 08:31:18 Prepared job with id: load-tests-go-flink-batch-cogbk-2-0225065610_9940b085-75e1-4880-9118-1c61e1abfa0b and staging token: load-tests-go-flink-batch-cogbk-2-0225065610_9940b085-75e1-4880-9118-1c61e1abfa0b
2021/02/25 08:31:18 Using specified **** binary: 'linux_amd64/cogbk'
2021/02/25 08:31:21 Staged binary artifact with token:
2021/02/25 08:31:22 Submitted job: load0tests0go0flink0batch0cogbk0200225065610-root-0225083121-af4df7c4_04606e0d-03eb-4b67-be2f-aae7fbdca39f
2021/02/25 08:31:22 Job state: STOPPED
2021/02/25 08:31:22 Job state: STARTING
2021/02/25 08:31:22 Job state: RUNNING
2021/02/25 08:43:52 (): java.util.concurrent.ExecutionException: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 301614f32741fa99f2750ba9d0f228ba)
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
    at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:864)
    at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:199)
    at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
    at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
    at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
    at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
    at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
    at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.client.program.ProgramInvocationException: Job failed (JobID: 301614f32741fa99f2750ba9d0f228ba)
    at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:112)
    at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
    at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
    at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
    at org.apache.flink.client.program.rest.RestClusterClient.lambda$pollResourceAsync$21(RestClusterClient.java:565)
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
    at java.util.concurrent.CompletableFuture.complete(CompletableFuture.java:1975)
    at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$8(FutureUtils.java:291)
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
    at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
    at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
    at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
    ... 3 more
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
    at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:147)
    at org.apache.flink.client.deployment.ClusterClientJobClientAdapter.lambda$null$6(ClusterClientJobClientAdapter.java:110)
    ... 19 more
Caused by: org.apache.flink.runtime.JobException: Recovery is suppressed by NoRestartBackoffTimeStrategy
    at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.handleFailure(ExecutionFailureHandler.java:110)
    at org.apache.flink.runtime.executiongraph.failover.flip1.ExecutionFailureHandler.getFailureHandlingResult(ExecutionFailureHandler.java:76)
    at org.apache.flink.runtime.scheduler.DefaultScheduler.handleTaskFailure(DefaultScheduler.java:192)
    at org.apache.flink.runtime.scheduler.DefaultScheduler.maybeHandleTaskFailure(DefaultScheduler.java:186)
    at org.apache.flink.runtime.scheduler.DefaultScheduler.updateTaskExecutionStateInternal(DefaultScheduler.java:180)
    at org.apache.flink.runtime.scheduler.SchedulerBase.updateTaskExecutionState(SchedulerBase.java:496)
    at org.apache.flink.runtime.jobmaster.JobMaster.updateTaskExecutionState(JobMaster.java:380)
    at sun.reflect.GeneratedMethodAccessor23.invoke(Unknown Source)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcInvocation(AkkaRpcActor.java:284)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleRpcMessage(AkkaRpcActor.java:199)
    at org.apache.flink.runtime.rpc.akka.FencedAkkaRpcActor.handleRpcMessage(FencedAkkaRpcActor.java:74)
    at org.apache.flink.runtime.rpc.akka.AkkaRpcActor.handleMessage(AkkaRpcActor.java:152)
    at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:26)
    at akka.japi.pf.UnitCaseStatement.apply(CaseStatements.scala:21)
    at scala.PartialFunction$class.applyOrElse(PartialFunction.scala:123)
    at akka.japi.pf.UnitCaseStatement.applyOrElse(CaseStatements.scala:21)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:170)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
    at scala.PartialFunction$OrElse.applyOrElse(PartialFunction.scala:171)
    at akka.actor.Actor$class.aroundReceive(Actor.scala:517)
    at akka.actor.AbstractActor.aroundReceive(AbstractActor.scala:225)
    at akka.actor.ActorCell.receiveMessage(ActorCell.scala:592)
    at akka.actor.ActorCell.invoke(ActorCell.scala:561)
    at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:258)
    at akka.dispatch.Mailbox.run(Mailbox.scala:225)
    at akka.dispatch.Mailbox.exec(Mailbox.scala:235)
    at akka.dispatch.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
    at akka.dispatch.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
    at akka.dispatch.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
    at akka.dispatch.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
Caused by: java.lang.RuntimeException: Emitting the record caused an I/O exception: Failed to serialize element. Serialized size (> 268435834 bytes) exceeds JVM heap space
    at org.apache.flink.runtime.operators.shipping.OutputCollector.collect(OutputCollector.java:69)
    at org.apache.flink.runtime.operators.util.metrics.CountingCollector.collect(CountingCollector.java:35)
    at org.apache.beam.runners.flink.translation.functions.SortingFlinkCombineRunner.combine(SortingFlinkCombineRunner.java:141)
    at org.apache.beam.runners.flink.translation.functions.FlinkReduceFunction.reduce(FlinkReduceFunction.java:109)
    at org.apache.flink.api.java.operators.translation.PlanUnwrappingReduceGroupOperator$TupleUnwrappingNonCombinableGroupReducer.reduce(PlanUnwrappingReduceGroupOperator.java:111)
    at org.apache.flink.runtime.operators.GroupReduceDriver.run(GroupReduceDriver.java:131)
    at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:504)
    at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
    at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:708)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:533)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.io.IOException: Failed to serialize element. Serialized size (> 268435834 bytes) exceeds JVM heap space
    at org.apache.flink.core.memory.DataOutputSerializer.resize(DataOutputSerializer.java:334)
    at org.apache.flink.core.memory.DataOutputSerializer.write(DataOutputSerializer.java:152)
    at org.apache.beam.runners.flink.translation.wrappers.DataOutputViewWrapper.write(DataOutputViewWrapper.java:44)
    at java.io.DataOutputStream.write(DataOutputStream.java:107)
    at java.io.ByteArrayOutputStream.writeTo(ByteArrayOutputStream.java:167)
    at org.apache.beam.sdk.coders.LengthPrefixCoder.encode(LengthPrefixCoder.java:58)
    at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:72)
    at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:63)
    at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:37)
    at org.apache.beam.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:114)
    at org.apache.beam.sdk.coders.IterableLikeCoder.encode(IterableLikeCoder.java:60)
    at org.apache.beam.sdk.coders.Coder.encode(Coder.java:136)
    at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:73)
    at org.apache.beam.sdk.coders.KvCoder.encode(KvCoder.java:37)
    at org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:591)
    at org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:582)
    at org.apache.beam.sdk.util.WindowedValue$FullWindowedValueCoder.encode(WindowedValue.java:542)
    at org.apache.beam.runners.flink.translation.types.CoderTypeSerializer.serialize(CoderTypeSerializer.java:111)
    at org.apache.flink.runtime.plugable.SerializationDelegate.write(SerializationDelegate.java:54)
    at org.apache.flink.runtime.io.network.api.serialization.SpanningRecordSerializer.serializeRecord(SpanningRecordSerializer.java:71)
    at org.apache.flink.runtime.io.network.api.writer.RecordWriter.emit(RecordWriter.java:113)
    at org.apache.flink.runtime.io.network.api.writer.ChannelSelectorRecordWriter.emit(ChannelSelectorRecordWriter.java:60)
    at org.apache.flink.runtime.operators.shipping.OutputCollector.collect(OutputCollector.java:65)
    ... 10 more
Caused by: java.lang.OutOfMemoryError: Java heap space
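
For orientation (this note is not part of the Jenkins output): the failure chain bottoms out in an OutOfMemoryError while serializing a single co-grouped record, and the reported size is just over 256 MiB. A hedged reading of the DataOutputSerializer/KvCoder/IterableLikeCoder frames above is that one key's grouped iterable became too large to serialize within the task manager's heap. A quick illustrative arithmetic check:

// Illustrative arithmetic only (not part of the build output).
package main

import "fmt"

func main() {
	const reported = 268435834 // bytes, from the stack trace above
	fmt.Printf("%d bytes = %.4f MiB (2^28 = %d bytes)\n",
		reported, float64(reported)/(1<<20), 1<<28)
	// Output: 268435834 bytes = 256.0004 MiB (2^28 = 268435456 bytes)
}
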
2021/02/25 08:43:52 (): java.lang.OutOfMemoryError: Java heap space
2021/02/25 08:43:52 Job state: FAILED
2021/02/25 08:43:52 Failed to execute job: job load0tests0go0flink0batch0cogbk0200225065610-root-0225083121-af4df7c4_04606e0d-03eb-4b67-be2f-aae7fbdca39f failed
panic: Failed to execute job: job load0tests0go0flink0batch0cogbk0200225065610-root-0225083121-af4df7c4_04606e0d-03eb-4b67-be2f-aae7fbdca39f failed
goroutine 1 [running]:
github.com/apache/beam/sdks/go/test/load/vendor/github.com/apache/beam/sdks/go/pkg/beam/log.Fatalf(0x11a2e00, 0xc00003e0b0, 0x1060143, 0x19, 0xc000187ee8, 0x1, 0x1)
    <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/sdks/go/test/load/.gogradle/project_gopath/src/github.com/apache/beam/sdks/go/test/load/vendor/github.com/apache/beam/sdks/go/pkg/beam/log/log.go>:153 +0xec
main.main()
    <https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/sdks/go/test/load/cogbk/cogbk.go>:99 +0x577
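
For orientation (not part of the Jenkins output): the panic and goroutine dump above are the expected shape of a failed run, since the load test's main calls the Beam Go SDK's log.Fatalf, which logs at fatal severity and then panics. A minimal sketch of that driver pattern, assuming the SDK packages named in the trace (hypothetical code, not the actual cogbk.go):

package main

import (
	"context"

	"github.com/apache/beam/sdks/go/pkg/beam"
	"github.com/apache/beam/sdks/go/pkg/beam/log"
	"github.com/apache/beam/sdks/go/pkg/beam/x/beamx"
)

func main() {
	beam.Init()
	ctx := context.Background()

	p := beam.NewPipeline()
	// ... pipeline construction elided ...

	// log.Fatalf logs the error and then panics, which is what surfaces as
	// "panic: Failed to execute job: ..." followed by the goroutine trace.
	if err := beamx.Run(ctx, p); err != nil {
		log.Fatalf(ctx, "Failed to execute job: %v", err)
	}
}
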
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_CoGBK_Flink_batch/ws/src/sdks/go/test/load/build.gradle>' line: 65
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 16m 24s
6 actionable tasks: 6 executed
Publishing build scan...
https://gradle.com/s/6vzpxinqg2klm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]