See <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/112/display/redirect?page=changes>
Changes:
[mxm] [BEAM-7436] Use coder for Flink's encoded key instead of falling back to
[mqi] add error log when fail to begin staging artifact
[iemejia] [BEAM-7114] Move dot pipeline graph renderer to
------------------------------------------
[...truncated 339.22 KB...]
INFO: 2019-05-28T18:23:03.918Z: Fusing consumer Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Extract into Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:03.966Z: Fusing consumer Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial into Write using Hadoop OutputFormat/CollectWriteTasks/WithKeys/AddKeys/Map
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.012Z: Fusing consumer Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Write into Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.060Z: Fusing consumer Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Reify into Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey+Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues/Partial
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.108Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Write into Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.163Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify into Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.211Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map into Write using Hadoop OutputFormat/CreateOutputConfig/Read(CreateSource)
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.236Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial into Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.276Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues into Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Read
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.334Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) into Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.384Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write into Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.441Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract into Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.495Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map into Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Extract
May 28, 2019 6:23:04 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:04.543Z: Fusing consumer Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) into Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Read
May 28, 2019 6:23:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:05.410Z: Executing operation Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Create
May 28, 2019 6:23:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:05.491Z: Executing operation Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly/Create
May 28, 2019 6:23:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:05.562Z: Executing operation Write using Hadoop OutputFormat/GroupDataByPartition/GroupByTaskId/Create
May 28, 2019 6:23:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:05.577Z: Starting 1 workers in us-central1-a...
May 28, 2019 6:23:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:05.638Z: Executing operation Write using Hadoop OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey/Create
May 28, 2019 6:23:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:05.713Z: Executing operation Prevent fusion before writing/Reshuffle/GroupByKey/Create
May 28, 2019 6:23:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:06.392Z: Executing operation Write using Hadoop OutputFormat/CreateOutputConfig/Read(CreateSource)+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues/Partial+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Reify+Write using Hadoop OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey/Write
May 28, 2019 6:23:06 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:23:06.485Z: Executing operation Generate sequence/Read(BoundedCountingSource)+Produce db rows+Prevent fusion before writing/Pair with random key+Prevent fusion before writing/Reshuffle/Window.Into()/Window.Assign+Prevent fusion before writing/Reshuffle/GroupByKey/Reify+Prevent fusion before writing/Reshuffle/GroupByKey/Write
May 28, 2019 6:24:43 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-05-28T18:24:43.328Z: Workers have started successfully.
May 28, 2019 6:24:58 PM org.apache.beam.runners.dataflow.DataflowPipelineJob lambda$waitUntilFinish$0
WARNING: Job is already running in Google Cloud Platform, Ctrl-C will not cancel it.
To cancel the job in the cloud, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2019-05-28_11_22_48-8937295034484174960
org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > writeAndReadUsingHadoopFormat SKIPPED
> Task :sdks:java:io:hadoop-format:integrationTest FAILED
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 2 mins 51.212 secs.
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
90 actionable tasks: 2 executed, 88 up-to-date
Publishing build scan...
STDERR:
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> Process 'Gradle Test Executor 1' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
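(Editorial note on the failure above, not part of the log: exit value 143 is 128 + 15, meaning the forked test JVM was terminated with SIGTERM, e.g. by the OS, a CI timeout, or the OOM killer, rather than failing a test assertion. A minimal POSIX-shell sketch of where 143 comes from:)

```shell
# A child process that terminates itself with SIGTERM; the parent shell
# then reports exit status 128 + 15 = 143, just like the Gradle test executor.
sh -c 'kill -TERM $$'
echo "exit status: $?"   # prints "exit status: 143"
```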
* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.
* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:121)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:117)
at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:184)
at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:110)
at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.process.internal.ExecException: Process 'Gradle Test Executor 1' finished with non-zero exit value 143
This problem might be caused by incorrect test process configuration.
Please refer to the test execution section in the User Manual at https://docs.gradle.org/5.2.1/userguide/java_testing.html#sec:test_execution
at org.gradle.api.internal.tasks.testing.worker.ForkingTestClassProcessor.stop(ForkingTestClassProcessor.java:163)
at org.gradle.api.internal.tasks.testing.processors.RestartEveryNTestClassProcessor.endBatch(RestartEveryNTestClassProcessor.java:77)
at org.gradle.api.internal.tasks.testing.processors.RestartEveryNTestClassProcessor.stop(RestartEveryNTestClassProcessor.java:62)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:35)
at org.gradle.internal.dispatch.ReflectionDispatch.dispatch(ReflectionDispatch.java:24)
at org.gradle.internal.dispatch.FailureHandlingDispatch.dispatch(FailureHandlingDispatch.java:29)
at org.gradle.internal.dispatch.AsyncDispatch.dispatchMessages(AsyncDispatch.java:87)
at org.gradle.internal.dispatch.AsyncDispatch.access$000(AsyncDispatch.java:36)
at org.gradle.internal.dispatch.AsyncDispatch$1.run(AsyncDispatch.java:71)
at org.gradle.internal.concurrent.InterruptibleRunnable.run(InterruptibleRunnable.java:42)
at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
... 3 more
* Get more help at https://help.gradle.org
BUILD FAILED in 2m 55s
The message received from the daemon indicates that the daemon has disappeared.
Build request sent: Build{id=44da6912-7879-43e2-a393-8790df467bc6, currentDir=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src>}
Attempting to read last messages from the daemon log...
Daemon pid: 997
log file: /home/jenkins/.gradle/daemon/5.2.1/daemon-997.out.log
----- Last 20 lines from daemon log file - daemon-997.out.log -----
at org.gradle.internal.dispatch.FailureHandlingDispatch.dispatch(FailureHandlingDispatch.java:29)
at org.gradle.internal.dispatch.AsyncDispatch.dispatchMessages(AsyncDispatch.java:87)
at org.gradle.internal.dispatch.AsyncDispatch.access$000(AsyncDispatch.java:36)
at org.gradle.internal.dispatch.AsyncDispatch$1.run(AsyncDispatch.java:71)
at org.gradle.internal.concurrent.InterruptibleRunnable.run(InterruptibleRunnable.java:42)
at org.gradle.internal.operations.CurrentBuildOperationPreservingRunnable.run(CurrentBuildOperationPreservingRunnable.java:42)
... 3 more
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 2m 55s
90 actionable tasks: 2 executed, 88 up-to-date
Publishing build scan...
Daemon vm is shutting down... The daemon has exited normally or was terminated in response to a user interrupt.
----- End of the daemon log -----
FAILURE: Build failed with an exception.
* What went wrong:
Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)
* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.
* Exception is:
org.gradle.launcher.daemon.client.DaemonDisappearedException: Gradle build daemon disappeared unexpectedly (it may have been killed or may have crashed)
at org.gradle.launcher.daemon.client.DaemonClient.handleDaemonDisappearance(DaemonClient.java:241)
at org.gradle.launcher.daemon.client.DaemonClient.monitorBuild(DaemonClient.java:217)
at org.gradle.launcher.daemon.client.DaemonClient.executeBuild(DaemonClient.java:179)
at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:142)
at org.gradle.launcher.daemon.client.DaemonClient.execute(DaemonClient.java:94)
at org.gradle.launcher.cli.RunBuildAction.run(RunBuildAction.java:55)
at org.gradle.internal.Actions$RunnableActionAdapter.execute(Actions.java:207)
at org.gradle.launcher.cli.CommandLineActionFactory$ParseAndBuildAction.execute(CommandLineActionFactory.java:403)
at org.gradle.launcher.cli.CommandLineActionFactory$ParseAndBuildAction.execute(CommandLineActionFactory.java:376)
at org.gradle.launcher.cli.ExceptionReportingAction.execute(ExceptionReportingAction.java:37)
at org.gradle.launcher.cli.ExceptionReportingAction.execute(ExceptionReportingAction.java:23)
at org.gradle.launcher.cli.CommandLineActionFactory$WithLogging.execute(CommandLineActionFactory.java:369)
at org.gradle.launcher.cli.CommandLineActionFactory$WithLogging.execute(CommandLineActionFactory.java:299)
at org.gradle.launcher.Main.doAction(Main.java:36)
at org.gradle.launcher.bootstrap.EntryPoint.run(EntryPoint.java:45)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.launcher.bootstrap.ProcessBootstrap.runNoExit(ProcessBootstrap.java:60)
at org.gradle.launcher.bootstrap.ProcessBootstrap.run(ProcessBootstrap.java:37)
at org.gradle.launcher.GradleMain.main(GradleMain.java:23)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at org.gradle.wrapper.BootstrapMainStarter.start(BootstrapMainStarter.java:31)
at org.gradle.wrapper.WrapperExecutor.execute(WrapperExecutor.java:108)
at org.gradle.wrapper.GradleWrapperMain.main(GradleWrapperMain.java:61)
* Get more help at https://help.gradle.org
2019-05-28 18:25:00,587 1324a470 MainThread beam_integration_benchmark(1/1) ERROR Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-28 18:25:00,589 1324a470 MainThread beam_integration_benchmark(1/1) INFO Cleaning up benchmark beam_integration_benchmark
2019-05-28 18:25:00,589 1324a470 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-112> delete -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml> --ignore-not-found
2019-05-28 18:25:00,953 1324a470 MainThread beam_integration_benchmark(1/1) ERROR Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-05-28 18:25:00,954 1324a470 MainThread beam_integration_benchmark(1/1) ERROR Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-05-28 18:25:00,954 1324a470 MainThread beam_integration_benchmark(1/1) INFO Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-28 18:25:00,955 1324a470 MainThread beam_integration_benchmark(1/1) INFO Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/1324a470/pkb.log>
2019-05-28 18:25:00,955 1324a470 MainThread beam_integration_benchmark(1/1) INFO Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/1324a470/completion_statuses.json>
Build step 'Execute shell' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]