See <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/182/display/redirect?page=changes>
Changes:
[suztomo] [BEAM-8911] New Guava version: 25.1-jre
[zyichi] [BEAM-8886] Add a mongodb io dataflow integration test
[heejong] [BEAM-8905] matching Java PCollectionTuple translation naming convention
[echauchot] [BEAM-8894] exclude FlattenWithHeterogeneousCoders category because
[echauchot] [BEAM-8025] Under load we get a NoHostAvailable exception at cluster
[github] [BEAM-8786] Fixes a link in readme
[echauchot] [BEAM-8025] Remove temporary folder rule because it suppresses files on
[echauchot] [BEAM-8025] Disable auto compaction on Cassandra node to avoid race
[pawel.pasterz] [BEAM-8946] Publish collection size of data written during MongoDBIOIT
[chamikara] Merge pull request #10347: [BEAM-8885] PubsubGrpcClient doesn't respect
[kenn] [BEAM-8917] jsr305 dependency declaration for Nullable class (#10324)
[xinyuliu.us] [BEAM-8342]: upgrade to samza 1.3.0 (#10357)
[aaltay] Make model_pcollection snippet self-contained (#10343)
[pabloem] Merge pull request #10050 from [BEAM-8575] Add streaming test case for
[github] Run beam_CancelStaleDataflowJobs every 4 hours.
[tvalentyn] [BEAM-8575] Added a unit test to test Combine works with sessions.
[github] [GoSDK] Improve StateChannel resilience. (#10363)
[tweise] [BEAM-8273] Expand portability environment documentation (#10116)
------------------------------------------
[...truncated 267.75 KB...]
bool_value: false
}
}
fields {
key: "beam:option:flink_version:v1"
value {
string_value: "1.9"
}
}
fields {
key: "beam:option:gcs_performance_metrics:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:job_endpoint:v1"
value {
string_value: "localhost:8099"
}
}
fields {
key: "beam:option:job_name:v1"
value {
string_value: "load_tests_Python_Flink_Batch_GBK_3_1213101743"
}
}
fields {
key: "beam:option:job_port:v1"
value {
string_value: "0"
}
}
fields {
key: "beam:option:job_server_timeout:v1"
value {
string_value: "60"
}
}
fields {
key: "beam:option:no_auth:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:object_reuse:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:parallelism:v1"
value {
string_value: "5"
}
}
fields {
key: "beam:option:pipeline_type_check:v1"
value {
bool_value: true
}
}
fields {
key: "beam:option:profile_cpu:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:profile_memory:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:profile_sample_rate:v1"
value {
number_value: 1.0
}
}
fields {
key: "beam:option:project:v1"
value {
string_value: "apache-beam-testing"
}
}
fields {
key: "beam:option:retain_docker_containers:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:retain_externalized_checkpoints_on_cancellation:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:runtime_type_check:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:save_main_session:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:sdk_location:v1"
value {
string_value: "container"
}
}
fields {
key: "beam:option:sdk_worker_parallelism:v1"
value {
string_value: "1"
}
}
fields {
key: "beam:option:shutdown_sources_on_final_watermark:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:streaming:v1"
value {
bool_value: false
}
}
fields {
key: "beam:option:type_check_strictness:v1"
value {
string_value: "DEFAULT_TO_ANY"
}
}
fields {
key: "beam:option:update:v1"
value {
bool_value: false
}
}
}
job_name: "job"
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STOPPED
apache_beam.runners.portability.portable_runner: INFO: Job state changed to STARTING
apache_beam.runners.portability.portable_runner: INFO: Job state changed to RUNNING
root: DEBUG: org.apache.flink.client.program.ProgramInvocationException: Job failed. (JobID: 9c0195229a203121c5606cda801adf24)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:262)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:338)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:326)
	at org.apache.flink.client.program.ClusterClient.run(ClusterClient.java:301)
	at org.apache.flink.client.RemoteExecutor.executePlanWithJars(RemoteExecutor.java:209)
	at org.apache.flink.client.RemoteExecutor.executePlan(RemoteExecutor.java:186)
	at org.apache.flink.api.java.RemoteEnvironment.execute(RemoteEnvironment.java:173)
	at org.apache.beam.runners.flink.FlinkBatchPortablePipelineTranslator$BatchTranslationContext.execute(FlinkBatchPortablePipelineTranslator.java:191)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.runPipelineWithTranslator(FlinkPipelineRunner.java:116)
	at org.apache.beam.runners.flink.FlinkPipelineRunner.run(FlinkPipelineRunner.java:84)
	at org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation.runPipeline(JobInvocation.java:81)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask$TrustedFutureInterruptibleTask.runInterruptibly(TrustedListenableFutureTask.java:125)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.InterruptibleTask.run(InterruptibleTask.java:57)
	at org.apache.beam.vendor.guava.v26_0_jre.com.google.common.util.concurrent.TrustedListenableFutureTask.run(TrustedListenableFutureTask.java:78)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
	at java.lang.Thread.run(Thread.java:748)
Caused by: org.apache.flink.runtime.client.JobExecutionException: Job execution failed.
	at org.apache.flink.runtime.jobmaster.JobResult.toJobExecutionResult(JobResult.java:146)
	at org.apache.flink.client.program.rest.RestClusterClient.submitJob(RestClusterClient.java:259)
	... 16 more
Caused by: java.lang.Exception: The data preparation for task 'GroupReduce (GroupReduce at GroupByKey 0)' , caused an error: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:480)
	at org.apache.flink.runtime.operators.BatchTask.invoke(BatchTask.java:369)
	at org.apache.flink.runtime.taskmanager.Task.doRun(Task.java:705)
	at org.apache.flink.runtime.taskmanager.Task.run(Task.java:530)
	... 1 more
Caused by: java.lang.RuntimeException: Error obtaining the sorted input: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger.getIterator(UnilateralSortMerger.java:650)
	at org.apache.flink.runtime.operators.BatchTask.getInput(BatchTask.java:1109)
	at org.apache.flink.runtime.operators.GroupReduceDriver.prepare(GroupReduceDriver.java:99)
	at org.apache.flink.runtime.operators.BatchTask.run(BatchTask.java:474)
	... 4 more
Caused by: java.io.IOException: Thread 'SortMerger Reading Thread' terminated due to an exception: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.operators.sort.UnilateralSortMerger$ThreadBase.run(UnilateralSortMerger.java:831)
Caused by: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
	at org.apache.flink.runtime.io.network.netty.CreditBasedPartitionRequestClientHandler.channelInactive(CreditBasedPartitionRequestClientHandler.java:136)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInputClosed(ByteToMessageDecoder.java:390)
	at org.apache.flink.shaded.netty4.io.netty.handler.codec.ByteToMessageDecoder.channelInactive(ByteToMessageDecoder.java:355)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.fireChannelInactive(AbstractChannelHandlerContext.java:224)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline$HeadContext.channelInactive(DefaultChannelPipeline.java:1429)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:245)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannelHandlerContext.invokeChannelInactive(AbstractChannelHandlerContext.java:231)
	at org.apache.flink.shaded.netty4.io.netty.channel.DefaultChannelPipeline.fireChannelInactive(DefaultChannelPipeline.java:947)
	at org.apache.flink.shaded.netty4.io.netty.channel.AbstractChannel$AbstractUnsafe$8.run(AbstractChannel.java:826)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
	at org.apache.flink.shaded.netty4.io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:474)
	at org.apache.flink.shaded.netty4.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:909)
	at java.lang.Thread.run(Thread.java:748)
root: ERROR: org.apache.flink.runtime.io.network.netty.exception.RemoteTransportException: Connection unexpectedly closed by remote task manager 'beam-loadtests-python-gbk-flink-batch-182-w-1.c.apache-beam-testing.internal/10.128.0.125:44701'. This might indicate that the remote task manager was lost.
apache_beam.runners.portability.portable_runner: INFO: Job state changed to FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 25.545s
FAILED (errors=1)
> Task :sdks:python:apache_beam:testing:load_tests:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_LoadTests_Python_GBK_Flink_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'> line: 55
* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 30s
3 actionable tasks: 2 executed, 1 up-to-date
Publishing build scan...
https://scans.gradle.com/s/cf5fizkmatwde
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure