See
<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/590/display/redirect?page=changes>
Changes:
[vitaly.terentyev] [BEAM-14101] Add Spark Receiver IO package and ReceiverBuilder
[noreply] Bump protobufjs from 6.11.2 to 6.11.3 in /sdks/typescript
[egalpin] Moves timestamp skew override to correct place
[egalpin] Adds TestStream to verify window preservation of ElasticsearchIO#write
[egalpin] Removes unnecessary line
[egalpin] Adds validation that ES#Write outputs are in expected windows
[egalpin] Updates window verification test to assert the exact docs in the window
[egalpin] Uses guava Iterables over shaded avro version
[Robert Bradshaw] Don't try to parse non-flags as retained pipeline options.
[chamikaramj] Enables UnboundedSource wrapped SDF Kafka source by default for x-lang
[noreply] Merge pull request #22140 from [Playground Task] Sharing any code API
[bulat.safiullin] [Website] add playground section, update playground, update get-started
[noreply] RunInference documentation updates. (#22236)
[noreply] Turn pr bot on for remaining common labels (#22257)
[noreply] Reviewing the RunInference ReadMe file for clarity. (#22069)
[noreply] Collect heap profile on OOM on Dataflow (#22225)
[noreply] fixing the missing wrap around ring range read (#21786)
[noreply] Update RunInference documentation (#22250)
[noreply] Rewrote Java multi-language pipeline quickstart (#22263)
------------------------------------------
[...truncated 33.55 KB...]
coders: <
  key: "c13"
  value: <
    spec: <
      urn: "beam:go:coder:custom:v1"
      payload: "Cgl0b3AuYWNjdW0SQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0aoQEKSGdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS90cmFuc2Zvcm1zL3RvcC5hY2N1bUVuYy5mdW5jMRJVCBYiQwgaSj9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vdHJhbnNmb3Jtcy90b3AuYWNjdW0qBggUEgIICCoECBlAASKhAQpIZ2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtRGVjLmZ1bmMxElUIFiIGCBQSAggIKkMIGko/Z2l0aHViLmNvbS9hcGFjaGUvYmVhbS9zZGtzL3YyL2dvL3BrZy9iZWFtL3RyYW5zZm9ybXMvdG9wLmFjY3VtKgQIGUAB"
    >
  >
>
coders: <
  key: "c14"
  value: <
    spec: <
      urn: "beam:coder:length_prefix:v1"
    >
    component_coder_ids: "c13"
  >
>
coders: <
  key: "c2"
  value: <
    spec: <
      urn: "beam:coder:global_window:v1"
    >
  >
>
coders: <
  key: "c3"
  value: <
    spec: <
      urn: "beam:go:coder:custom:v1"
      payload: "CgRqc29uEgoIFBIGCBQSAggIGkwKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRW5jEhYIFiIECBlADyoGCBQSAggIKgQIGUABIlIKMmdpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS5qc29uRGVjEhwIFiIECBlAAyIGCBQSAggIKgQIGUAPKgQIGUAB"
    >
  >
>
coders: <
  key: "c4"
  value: <
    spec: <
      urn: "beam:coder:length_prefix:v1"
    >
    component_coder_ids: "c3"
  >
>
coders: <
  key: "c5"
  value: <
    spec: <
      urn: "beam:coder:kv:v1"
    >
    component_coder_ids: "c0"
    component_coder_ids: "c4"
  >
>
coders: <
  key: "c6"
  value: <
    spec: <
      urn: "beam:coder:row:v1"
      payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d"
    >
  >
>
coders: <
  key: "c7"
  value: <
    spec: <
      urn: "beam:go:coder:custom:v1"
      payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE="
    >
  >
>
coders: <
  key: "c8"
  value: <
    spec: <
      urn: "beam:coder:length_prefix:v1"
    >
    component_coder_ids: "c7"
  >
>
coders: <
  key: "c9"
  value: <
    spec: <
      urn: "beam:coder:bool:v1"
    >
  >
>
environments: <
  key: "go"
  value: <
    urn: "beam:env:docker:v1"
    payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest"
    capabilities: "beam:protocol:progress_reporting:v1"
    capabilities: "beam:protocol:multi_core_bundle_processing:v1"
    capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
    capabilities: "beam:protocol:****_status:v1"
    capabilities: "beam:protocol:monitoring_info_short_ids:v1"
    capabilities: "beam:version:sdk_base:go"
    capabilities: "beam:coder:bytes:v1"
    capabilities: "beam:coder:bool:v1"
    capabilities: "beam:coder:varint:v1"
    capabilities: "beam:coder:double:v1"
    capabilities: "beam:coder:string_utf8:v1"
    capabilities: "beam:coder:length_prefix:v1"
    capabilities: "beam:coder:kv:v1"
    capabilities: "beam:coder:iterable:v1"
    capabilities: "beam:coder:state_backed_iterable:v1"
    capabilities: "beam:coder:windowed_value:v1"
    capabilities: "beam:coder:global_window:v1"
    capabilities: "beam:coder:interval_window:v1"
    capabilities: "beam:coder:row:v1"
    capabilities: "beam:coder:nullable:v1"
    dependencies: <
      type_urn: "beam:artifact:type:file:v1"
      role_urn: "beam:artifact:role:go_****_binary:v1"
    >
  >
>
>
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "s2"
root_transform_ids: "e7"
root_transform_ids: "e8"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/07/16 08:43:18 Using specified **** binary: 'linux_amd64/combine'
2022/07/16 08:43:18 Prepared job with id: load-tests-go-flink-batch-combine-1-0716065313_d321cd1f-57c0-4d5e-b456-7ad87976435f and staging token: load-tests-go-flink-batch-combine-1-0716065313_d321cd1f-57c0-4d5e-b456-7ad87976435f
2022/07/16 08:43:22 Staged binary artifact with token:
2022/07/16 08:43:23 Submitted job: load0tests0go0flink0batch0combine0100716065313-root-0716084322-c867566d_5e38d5ef-fffc-411d-a496-e55fc5215205
2022/07/16 08:43:23 Job state: STOPPED
2022/07/16 08:43:23 Job state: STARTING
2022/07/16 08:43:23 Job state: RUNNING
2022/07/16 08:44:32 (): java.lang.RuntimeException: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
    at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
    at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
    at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
    at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
    at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
    at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
    at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
    at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
    at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
    at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
    at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
    ... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
    at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
    at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
    at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
    at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
    at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
    at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
    at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
    at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
    at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
    at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
    at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
    at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
    ... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
    at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
    at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
    at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
    at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
    at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
    at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
    at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
    at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
    at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
    at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
    at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
    at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
    ... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
    at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
    at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
    at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
    ... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
 at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
    at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
    at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
    ... 7 more
2022/07/16 08:44:32 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow)
 at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
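For anyone triaging this failure: the root cause is Jackson's strict null handling while the Flink REST client parses the JobDetailsInfo response. The server returned `"maxParallelism": null`, and with FAIL_ON_NULL_FOR_PRIMITIVES enabled Jackson refuses to coerce `null` into a primitive `long`; this kind of response/model disagreement often indicates a REST API version mismatch between the submitting client and the Flink session cluster. A minimal sketch of the Jackson behavior, assuming jackson-databind is on the classpath; the `JobDetails` class below is an illustrative stand-in, not Flink's actual REST model:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.MismatchedInputException;

public class NullIntoPrimitiveDemo {
    // Illustrative stand-in for a REST response model with a primitive field.
    public static class JobDetails {
        public String name;
        public long maxParallelism; // primitive: cannot hold null
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"name\":\"combine\",\"maxParallelism\":null}";

        // Lenient mapper (Jackson's default): null is silently coerced
        // to the primitive default value, 0.
        ObjectMapper lenient = new ObjectMapper();
        lenient.configure(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES, false);
        JobDetails ok = lenient.readValue(json, JobDetails.class);
        System.out.println("lenient maxParallelism=" + ok.maxParallelism); // prints 0

        // Strict mapper: null into a primitive long raises
        // MismatchedInputException, as seen in the trace above.
        ObjectMapper strict = new ObjectMapper();
        strict.configure(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES, true);
        try {
            strict.readValue(json, JobDetails.class);
        } catch (MismatchedInputException e) {
            System.out.println("strict: " + e.getOriginalMessage());
        }
    }
}
```

Disabling the feature would only mask the mismatch; aligning the client and cluster Flink versions is the usual fix.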
2022/07/16 08:44:32 Job state: FAILED
2022/07/16 08:44:32 Failed to execute job: job load0tests0go0flink0batch0combine0100716065313-root-0716084322-c867566d_5e38d5ef-fffc-411d-a496-e55fc5215205 failed
panic: Failed to execute job: job load0tests0go0flink0batch0combine0100716065313-root-0716084322-c867566d_5e38d5ef-fffc-411d-a496-e55fc5215205 failed
goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1651e68, 0xc00012e000}, {0x14bc54e?, 0x1ff8bb8?}, {0xc0005e9e70?, 0x0?, 0x0?})
    <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
    <https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/combine/combine.go>:88 +0x3c7
> Task :sdks:go:test:load:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_Combine_Flink_Batch/ws/src/sdks/go/test/load/build.gradle>' line: 31
* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 1m 36s
12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/6soyr26zzp2e2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]