See
<https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/199/display/redirect?page=changes>
Changes:
[kcweaver] [BEAM-8321] fix Flink portable jar test
------------------------------------------
[...truncated 211.37 KB...]
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.io.network.NetworkEnvironment - Shutting down the network environment and its components.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.JobLeaderService - Stop job leader service.
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.filecache.FileCache - removed file cache directory /tmp/flink-dist-cache-ec50810a-0199-4c22-8923-1825886364dd
[flink-akka.actor.default-dispatcher-4] INFO org.apache.flink.runtime.taskexecutor.TaskExecutor - Stopped TaskExecutor akka://flink/user/taskmanager_0.
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Removing cache directory /tmp/flink-web-ui
[ForkJoinPool.commonPool-worker-9] INFO org.apache.flink.runtime.dispatcher.DispatcherRestEndpoint - Shut down complete.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.resourcemanager.StandaloneResourceManager - Shut down cluster because application is in CANCELED, diagnostics DispatcherResourceManagerComponent has been closed..
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Closing the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopping all currently running jobs of dispatcher akka://flink/user/dispatcher.
[flink-akka.actor.default-dispatcher-6] INFO org.apache.flink.runtime.resourcemanager.slotmanager.SlotManager - Suspending the SlotManager.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rest.handler.legacy.backpressure.StackTraceSampleCoordinator - Shutting down stack trace sample coordinator.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.dispatcher.StandaloneDispatcher - Stopped dispatcher akka://flink/user/dispatcher.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Shutting down remote daemon.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remote daemon shut down; proceeding with flushing remote transports.
[flink-metrics-2] INFO akka.remote.RemoteActorRefProvider$RemotingTerminator - Remoting shut down.
[flink-metrics-2] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopping Akka RPC service.
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.PermanentBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.TransientBlobCache - Shutting down BLOB cache
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.blob.BlobServer - Stopped BLOB server at 0.0.0.0:42245
[flink-akka.actor.default-dispatcher-5] INFO org.apache.flink.runtime.rpc.akka.AkkaRpcService - Stopped Akka RPC service.
Exception in thread "main" java.lang.RuntimeException: Job BeamApp-jenkins-0930205023-75ae0e20_5cb35137-ec6b-44c6-be10-ab1540295e15 failed.
    at org.apache.beam.runners.flink.FlinkPipelineRunner.main(FlinkPipelineRunner.java:166)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
    at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
    at org.apache.flink.runtime.minicluster.MiniCluster.executeJobBlocking(MiniCluster.java:638)
    at org.apache.flink.client.LocalExecutor.executePlan(LocalExecutor.java:223)
    at org.apache.flink.api.java.LocalEnvironment.execute(LocalEnvironment.java:91)
    at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
    at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:104)
    at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:80)
    at org.apache.beam.runners.flink.FlinkPipelineRunner.main(FlinkPipelineRunner.java:164)
Caused by: java.util.concurrent.ExecutionException: java.lang.RuntimeException: Error received from SDK harness for instruction 3: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 300, in get
    processor = self.cached_bundle_processors[bundle_descriptor_id].pop()
IndexError: pop from empty list
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 261, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 317, in loads
    return load(file, ignore)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 305, in load
    obj = pik.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 158, in _execute
    response = task()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 191, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 343, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 363, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in get
    self.data_channel_factory)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 580, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 624, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)])
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 623, in <listcomp>
    for transform_id in sorted(
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 548, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 607, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 606, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 605, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 548, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 607, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 606, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 605, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 548, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 607, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 606, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 605, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 548, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 610, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 869, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1112, in create
    serialized_fn, parameter)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1150, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 265, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 317, in loads
    return load(file, ignore)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 305, in load
    obj = pik.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given
    at java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)
    at java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1895)
    at org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:57)
    at org.apache.beam.runners.fnexecution.control.SdkHarnessClient$ActiveBundle.close(SdkHarnessClient.java:254)
    at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.$closeResource(FlinkExecutableStageFunction.java:195)
    at org.apache.beam.runners.flink.translation.functions.FlinkExecutableStageFunction.mapPartition(FlinkExecutableStageFunction.java:195)
    at org.apache.flink.runtime.operators.MapPartitionDriver.run(MapPartitionDriver.java:103)
    at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:503)
    at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:368)
    at org.apache.flink.runtime.taskmanager.Task.run(Task.java:711)
    at java.lang.Thread.run(Thread.java:748)
Caused by: java.lang.RuntimeException: Error received from SDK harness for instruction 3: Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 300, in get
    processor = self.cached_bundle_processors[bundle_descriptor_id].pop()
IndexError: pop from empty list
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 261, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 317, in loads
    return load(file, ignore)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 305, in load
    obj = pik.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 158, in _execute
    response = task()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 191, in <lambda>
    self._execute(lambda: worker.do_instruction(work), work)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 343, in do_instruction
    request.instruction_id)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 363, in process_bundle
    instruction_id, request.process_bundle_descriptor_id)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/sdk_worker.py", line 306, in get
    self.data_channel_factory)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 580, in __init__
    self.ops = self.create_execution_tree(self.process_bundle_descriptor)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 624, in create_execution_tree
    descriptor.transforms, key=topological_height, reverse=True)])
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 623, in <listcomp>
    for transform_id in sorted(
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 548, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 607, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 606, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 605, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 548, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 607, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 606, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 605, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 548, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 607, in get_operation
    in descriptor.transforms[transform_id].outputs.items()
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 606, in <dictcomp>
    for tag, pcoll_id
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 605, in <listcomp>
    tag: [get_operation(op) for op in pcoll_consumers[pcoll_id]]
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 548, in wrapper
    result = cache[args] = func(*args)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 610, in get_operation
    transform_id, transform_consumers)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 869, in create_operation
    return creator(self, transform_id, transform_proto, payload, consumers)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1112, in create
    serialized_fn, parameter)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/runners/worker/bundle_processor.py", line 1150, in _create_pardo_operation
    dofn_data = pickler.loads(serialized_fn)
  File "/usr/local/lib/python3.5/site-packages/apache_beam/internal/pickler.py", line 265, in loads
    return dill.loads(s)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 317, in loads
    return load(file, ignore)
  File "/usr/local/lib/python3.5/site-packages/dill/_dill.py", line 305, in load
    obj = pik.load()
TypeError: _create_function() takes from 2 to 6 positional arguments but 7 were given
    at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:157)
    at org.apache.beam.runners.fnexecution.control.FnApiControlClient$ResponseStreamObserver.onNext(FnApiControlClient.java:140)
    at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.stub.ServerCalls$StreamingServerCallHandler$StreamingServerCallListener.onMessage(ServerCalls.java:249)
    at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.ForwardingServerCallListener.onMessage(ForwardingServerCallListener.java:33)
    at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.Contexts$ContextualizedServerCallListener.onMessage(Contexts.java:76)
    at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerCallImpl$ServerStreamListenerImpl.messagesAvailable(ServerCallImpl.java:297)
    at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ServerImpl$JumpToApplicationThreadServerStreamListener$1MessagesAvailable.runInContext(ServerImpl.java:738)
    at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.ContextRunnable.run(ContextRunnable.java:37)
    at org.apache.beam.vendor.grpc.v1p21p0.io.grpc.internal.SerializingExecutor.run(SerializingExecutor.java:123)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
    ... 1 more
rm -rf $ENV_DIR
rm -f $OUTPUT_JAR
>>> FAILURE
if [[ "$TEST_EXIT_CODE" -eq 0 ]]; then
echo ">>> SUCCESS"
else
echo ">>> FAILURE"
fi
exit $TEST_EXIT_CODE
> Task :runners:flink:1.8:job-server:testPipelineJar FAILED
FAILURE: Build failed with an exception.
* Where:
Script '<https://builds.apache.org/job/beam_PostCommit_PortableJar_Flink/ws/src/runners/flink/job-server/flink_job_server.gradle>' line: 186
* What went wrong:
Execution failed for task ':runners:flink:1.8:job-server:testPipelineJar'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 7m 34s
66 actionable tasks: 50 executed, 15 from cache, 1 up-to-date
Publishing build scan...
https://gradle.com/s/n5h36oo553lnq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]