See <https://ci-beam.apache.org/job/beam_LoadTests_Go_ParDo_Flink_Batch/643/display/redirect?page=changes>
Changes:

[Brian Hulette] Extract utilities in dataframe.schemas
[Brian Hulette] Add pandas_type_compatibility with pandas BatchConverter implementations
[Brian Hulette] Use Batched DoFns at DataFrame API boundaries
[Brian Hulette] Move dtype conversion to pandas_type_compatibility
[Brian Hulette] Always register pandas BatchConverters
[Brian Hulette] Fix interactive runner tests
[Brian Hulette] Use pandas_type_compatibility BatchConverters for dataframe.schemas
[Brian Hulette] Skip test cases broken in pandas 1.1.x
[Brian Hulette] Address review comments
[Brian Hulette] yapf, typo in test
[noreply] Filter out unsupported state tests (#22963)
[noreply] Add ability to remove/clear map and set state (#22938)
[Brian Hulette] Add test to reproduce https://github.com/apache/beam/issues/22854
[Brian Hulette] Exercise row coder with nested optional struct
[Brian Hulette] Make RowTypeConstraint callable
[Brian Hulette] Add test to exercise RowTypeConstraint.__call__
[noreply] Fix gpu to cpu conversion with warning logs (#22795)
[noreply] Add Go stateful DoFns to CHANGES.md and fix linting violations (#22958)
[noreply] 22805: Upgrade Jackson version from 2.13.0 to 2.13.3 (#22806)
[noreply] Run cred rotation every month (#22977)
[noreply] [BEAM-12164] Synchronize access queue in ThroughputEstimator and

------------------------------------------
[...truncated 36.27 KB...]
trigger: < default: < > > accumulation_mode: DISCARDING output_time: END_OF_WINDOW closing_behavior: EMIT_IF_NONEMPTY on_time_behavior: FIRE_IF_NONEMPTY environment_id: "go" > >
coders: < key: "c0" value: < spec: < urn: "beam:coder:bytes:v1" > > >
coders: < key: "c1" value: < spec: < urn: "beam:coder:kv:v1" > component_coder_ids: "c0" component_coder_ids: "c0" > >
coders: < key: "c2" value: < spec: < urn: "beam:coder:global_window:v1" > > >
coders: < key: "c3" value: < spec: < urn: "beam:coder:row:v1" payload: "\n\021\n\013num_records\032\002\020\004\n\024\n\016initial_splits\032\002\020\004\n\016\n\010key_size\032\002\020\004\n\020\n\nvalue_size\032\002\020\004\n\022\n\014num_hot_keys\032\002\020\004\n\026\n\020hot_key_fraction\032\002\020\006\022$f691cccd-3963-4ed9-9f25-d9fdfd07b30d" > > >
coders: < key: "c4" value: < spec: < urn: "beam:go:coder:custom:v1" payload: "ChdvZmZzZXRyYW5nZS5SZXN0cmljdGlvbhJTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24atAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdEVuYxJlCBYiUwgaSk9naXRodWIuY29tL2FwYWNoZS9iZWFtL3Nka3MvdjIvZ28vcGtnL2JlYW0vaW8vcnRyYWNrZXJzL29mZnNldHJhbmdlLlJlc3RyaWN0aW9uKgYIFBICCAgqBAgZQAEitAEKS2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UucmVzdERlYxJlCBYiBggUEgIICCpTCBpKT2dpdGh1Yi5jb20vYXBhY2hlL2JlYW0vc2Rrcy92Mi9nby9wa2cvYmVhbS9pby9ydHJhY2tlcnMvb2Zmc2V0cmFuZ2UuUmVzdHJpY3Rpb24qBAgZQAE=" > > >
coders: < key: "c5" value: < spec: < urn: "beam:coder:length_prefix:v1" > component_coder_ids: "c4" > >
coders: < key: "c6" value: < spec: < urn: "beam:coder:bool:v1" > > >
coders: < key: "c7" value: < spec: < urn: "beam:coder:kv:v1" > component_coder_ids: "c5" component_coder_ids: "c6" > >
environments: < key: "go" value: < urn: "beam:env:docker:v1" payload: "\n6gcr.io/apache-beam-testing/beam-sdk/beam_go_sdk:latest" capabilities: "beam:protocol:progress_reporting:v1"
capabilities: "beam:protocol:multi_core_bundle_processing:v1"
capabilities: "beam:transform:sdf_truncate_sized_restrictions:v1"
capabilities: "beam:protocol:****_status:v1"
capabilities: "beam:protocol:monitoring_info_short_ids:v1"
capabilities: "beam:version:sdk_base:go"
capabilities: "beam:coder:bytes:v1"
capabilities: "beam:coder:bool:v1"
capabilities: "beam:coder:varint:v1"
capabilities: "beam:coder:double:v1"
capabilities: "beam:coder:string_utf8:v1"
capabilities: "beam:coder:length_prefix:v1"
capabilities: "beam:coder:kv:v1"
capabilities: "beam:coder:iterable:v1"
capabilities: "beam:coder:state_backed_iterable:v1"
capabilities: "beam:coder:windowed_value:v1"
capabilities: "beam:coder:global_window:v1"
capabilities: "beam:coder:interval_window:v1"
capabilities: "beam:coder:row:v1"
capabilities: "beam:coder:nullable:v1"
dependencies: < type_urn: "beam:artifact:type:file:v1" role_urn: "beam:artifact:role:go_****_binary:v1" > > > >
root_transform_ids: "s1"
root_transform_ids: "e4"
root_transform_ids: "e5"
root_transform_ids: "e6"
root_transform_ids: "e7"
root_transform_ids: "e8"
root_transform_ids: "e9"
root_transform_ids: "e10"
root_transform_ids: "e11"
root_transform_ids: "e12"
root_transform_ids: "e13"
root_transform_ids: "e14"
requirements: "beam:requirement:pardo:splittable_dofn:v1"
2022/09/01 10:36:23 Using specified **** binary: 'linux_amd64/pardo'
2022/09/01 10:36:23 Prepared job with id: load-tests-go-flink-batch-pardo-1-0901100306_1d320af5-cc1b-4b85-b06a-f5ae2b6381f4 and staging token: load-tests-go-flink-batch-pardo-1-0901100306_1d320af5-cc1b-4b85-b06a-f5ae2b6381f4
2022/09/01 10:36:27 Staged binary artifact with token: 
2022/09/01 10:36:28 Submitted job: load0tests0go0flink0batch0pardo0100901100306-root-0901103627-d903fc1c_b0ca9fdf-59a5-466b-aafa-1734591976a4
2022/09/01 10:36:28 Job state: STOPPED
2022/09/01 10:36:28 Job state: STARTING
2022/09/01 10:36:28 Job state: RUNNING
2022/09/01 10:37:37 (): java.lang.RuntimeException:
java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
	at org.apache.flink.util.ExceptionUtils.rethrow(ExceptionUtils.java:316)
	at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1061)
	at org.apache.flink.api.java.ExecutionEnvironment.execute(ExecutionEnvironment.java:958)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:195)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:118)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:85)
	at org.apache.beam.runners.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:86)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error while waiting for job to be initialized
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.flink.api.java.ExecutionEnvironment.executeAsync(ExecutionEnvironment.java:1056)
	... 11 more
Caused by: java.lang.RuntimeException: Error while waiting for job to be initialized
	at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:160)
	at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$execute$2(AbstractSessionClusterExecutor.java:82)
	at org.apache.flink.util.function.FunctionUtils.lambda$uncheckedFunction$2(FunctionUtils.java:73)
	at java.util.concurrent.CompletableFuture.uniApply(CompletableFuture.java:616)
	at java.util.concurrent.CompletableFuture$UniApply.tryFire(CompletableFuture.java:591)
	at java.util.concurrent.CompletableFuture$Completion.exec(CompletableFuture.java:457)
	at java.util.concurrent.ForkJoinTask.doExec(ForkJoinTask.java:289)
	at java.util.concurrent.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1056)
	at java.util.concurrent.ForkJoinPool.runWorker(ForkJoinPool.java:1692)
	at java.util.concurrent.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:175)
Caused by: java.util.concurrent.ExecutionException: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
	at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
	at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)
	at org.apache.flink.client.deployment.executors.AbstractSessionClusterExecutor.lambda$null$0(AbstractSessionClusterExecutor.java:83)
	at org.apache.flink.client.ClientUtils.waitUntilJobInitializationFinished(ClientUtils.java:140)
	... 9 more
Caused by: org.apache.flink.runtime.concurrent.FutureUtils$RetryException: Could not complete the operation. Number of retries has been exhausted.
	at org.apache.flink.runtime.concurrent.FutureUtils.lambda$retryOperationWithDelay$9(FutureUtils.java:386)
	at java.util.concurrent.CompletableFuture.uniWhenComplete(CompletableFuture.java:774)
	at java.util.concurrent.CompletableFuture$UniWhenComplete.tryFire(CompletableFuture.java:750)
	at java.util.concurrent.CompletableFuture.postComplete(CompletableFuture.java:488)
	at java.util.concurrent.CompletableFuture.postFire(CompletableFuture.java:575)
	at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:943)
	at java.util.concurrent.CompletableFuture$Completion.run(CompletableFuture.java:456)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:750)
Caused by: java.util.concurrent.CompletionException: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
	at java.util.concurrent.CompletableFuture.encodeRelay(CompletableFuture.java:326)
	at java.util.concurrent.CompletableFuture.completeRelay(CompletableFuture.java:338)
	at java.util.concurrent.CompletableFuture.uniRelay(CompletableFuture.java:925)
	at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:967)
	at java.util.concurrent.CompletableFuture$UniCompose.tryFire(CompletableFuture.java:940)
	... 4 more
Caused by: org.apache.flink.runtime.rest.util.RestClientException: Response was neither of the expected type([simple type, class org.apache.flink.runtime.rest.messages.job.JobDetailsInfo]) nor an error.
	at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:502)
	at org.apache.flink.runtime.rest.RestClient.lambda$submitRequest$3(RestClient.java:466)
	at java.util.concurrent.CompletableFuture.uniCompose(CompletableFuture.java:966)
	... 5 more
Caused by: org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow) at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException.from(MismatchedInputException.java:63)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.DeserializationContext.reportInputMismatch(DeserializationContext.java:1575)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.std.NumberDeserializers$PrimitiveOrWrapperDeserializer.getNullValue(NumberDeserializers.java:176)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer._findMissing(PropertyValueBuffer.java:204)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyValueBuffer.getParameters(PropertyValueBuffer.java:160)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.ValueInstantiator.createFromObjectWith(ValueInstantiator.java:288)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.impl.PropertyBasedCreator.build(PropertyBasedCreator.java:202)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer._deserializeUsingPropertyBased(BeanDeserializer.java:520)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializerBase.deserializeFromObjectUsingNonDefault(BeanDeserializerBase.java:1390)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:362)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.BeanDeserializer.deserialize(BeanDeserializer.java:195)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.deser.DefaultDeserializationContext.readRootValue(DefaultDeserializationContext.java:322)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper._readValue(ObjectMapper.java:4569)
	at org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.ObjectMapper.readValue(ObjectMapper.java:2867)
	at org.apache.flink.runtime.rest.RestClient.parseResponse(RestClient.java:475)
	... 7 more
2022/09/01 10:37:37 (): org.apache.flink.shaded.jackson2.com.fasterxml.jackson.databind.exc.MismatchedInputException: Cannot map `null` into type `long` (set DeserializationConfig.DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES to 'false' to allow) at [Source: UNKNOWN; line: -1, column: -1] (through reference chain: org.apache.flink.runtime.rest.messages.job.JobDetailsInfo["maxParallelism"])
2022/09/01 10:37:37 Job state: FAILED
2022/09/01 10:37:37 Failed to execute job: job load0tests0go0flink0batch0pardo0100901100306-root-0901103627-d903fc1c_b0ca9fdf-59a5-466b-aafa-1734591976a4 failed
panic: Failed to execute job: job load0tests0go0flink0batch0pardo0100901100306-root-0901103627-d903fc1c_b0ca9fdf-59a5-466b-aafa-1734591976a4 failed

goroutine 1 [running]:
github.com/apache/beam/sdks/v2/go/pkg/beam/log.Fatalf({0x1695908, 0xc00012a000}, {0x14faff4?, 0xc000013b60?}, {0xc000437e70?, 0x0?, 0x0?})
	<https://ci-beam.apache.org/job/beam_LoadTests_Go_ParDo_Flink_Batch/ws/src/sdks/go/pkg/beam/log/log.go>:153 +0xa5
main.main()
	<https://ci-beam.apache.org/job/beam_LoadTests_Go_ParDo_Flink_Batch/ws/src/sdks/go/test/load/pardo/pardo.go>:109 +0x3aa

> Task :sdks:go:test:load:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_LoadTests_Go_ParDo_Flink_Batch/ws/src/sdks/go/test/load/build.gradle'> line: 31

* What went wrong:
Execution failed for task ':sdks:go:test:load:run'.
> Process 'command 'sh'' finished with non-zero exit value 2

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1m 37s

12 actionable tasks: 6 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/zojp7yky7v3us

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
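A note on the root cause in the trace above: the innermost `MismatchedInputException` is Jackson refusing to map a JSON `null` (or missing field) into the primitive `long` field `JobDetailsInfo.maxParallelism`; as the error message indicates, the REST client's mapper has `FAIL_ON_NULL_FOR_PRIMITIVES` enabled. The behavior can be reproduced in isolation with plain (unshaded) Jackson; the `Details` class below is a hypothetical stand-in for Flink's `JobDetailsInfo`, not its real definition:

```java
import com.fasterxml.jackson.databind.DeserializationFeature;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.exc.MismatchedInputException;

public class NullIntoPrimitiveDemo {
    // Hypothetical stand-in for org.apache.flink.runtime.rest.messages.job.JobDetailsInfo:
    // the only detail that matters here is the primitive long field.
    public static class Details {
        public long maxParallelism;
    }

    public static void main(String[] args) throws Exception {
        String json = "{\"maxParallelism\": null}";

        // Strict mapper: mapping JSON null into a primitive long throws
        // MismatchedInputException, matching the failure in the log.
        ObjectMapper strict = new ObjectMapper()
                .enable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        try {
            strict.readValue(json, Details.class);
        } catch (MismatchedInputException e) {
            System.out.println("strict: " + e.getOriginalMessage());
        }

        // Lenient mapper (Jackson's default setting): null coerces to the
        // primitive default value, 0.
        ObjectMapper lenient = new ObjectMapper()
                .disable(DeserializationFeature.FAIL_ON_NULL_FOR_PRIMITIVES);
        Details d = lenient.readValue(json, Details.class);
        System.out.println("lenient maxParallelism = " + d.maxParallelism);
    }
}
```

In practice a `null`/missing `maxParallelism` in the `/jobs/:jobid` response usually points to a version mismatch between the Flink REST client and the cluster it is talking to, rather than something to fix by relaxing the mapper.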
