See <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/1504/display/redirect?page=changes>
Changes:

[kcweaver] [BEAM-7067] make cleanArtifactsPerJob configurable for Flink job server

[iemejia] [BEAM-7075] Create Redis embedded server on @BeforeClass and simplify

[iemejia] [BEAM-7076] Update Spark runner to use spark version 2.4.1

[iemejia] [BEAM-7076] Multiple static analysis fixes on Spark runner

------------------------------------------
[...truncated 279.69 KB...]
INFO: Adding PAssert$0/GroupGlobally/Values/Values/Map as step s64
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GroupGlobally/ParDo(Concat) as step s65
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/GetPane/Map as step s66
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/RunChecks as step s67
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding PAssert$0/VerifyAssertions/ParDo(DefaultConclude) as step s68
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/WithKeys/AddKeys/Map as step s69
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/GroupByKey as step s70
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Combine.perKey(Singleton)/Combine.GroupedValues as step s71
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/Combine.globally(Singleton)/Values/Values/Map as step s72
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey) as step s73
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly as step s74
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow) as step s75
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView as step s76
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
INFO: Adding Delete test files as step s77
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: Staging pipeline description to gs://temp-storage-for-perf-tests/avroioit0writethenreadall-jenkins-0415121359-b8c96f62/output/results/staging/
Apr 15, 2019 12:14:06 PM org.apache.beam.runners.dataflow.util.PackageUtil tryStagePackage
INFO: Uploading <189700 bytes, hash FaBJUieOgvxJTfHGbNoN8Q> to gs://temp-storage-for-perf-tests/avroioit0writethenreadall-jenkins-0415121359-b8c96f62/output/results/staging/pipeline-FaBJUieOgvxJTfHGbNoN8Q.pb

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_OUT
Dataflow SDK version: 2.13.0-SNAPSHOT

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_ERROR
Apr 15, 2019 12:14:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-04-15_05_14_06-2064668138826108957?project=apache-beam-testing

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_OUT
Submitted job: 2019-04-15_05_14_06-2064668138826108957

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll STANDARD_ERROR
Apr 15, 2019 12:14:07 PM org.apache.beam.runners.dataflow.DataflowRunner run
INFO: To cancel the job using the 'gcloud' tool, run:
> gcloud dataflow jobs --project=apache-beam-testing cancel --region=us-central1 2019-04-15_05_14_06-2064668138826108957
Apr 15, 2019 12:14:07 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
INFO: Running Dataflow job 2019-04-15_05_14_06-2064668138826108957 with 1 expected assertions.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-04-15T12:14:06.807Z: Autoscaling is enabled for job 2019-04-15_05_14_06-2064668138826108957. The number of workers will be between 1 and 1000.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-04-15T12:14:06.837Z: Autoscaling was automatically enabled for job 2019-04-15_05_14_06-2064668138826108957.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-04-15T12:14:10.044Z: Checking permissions granted to controller Service Account.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-04-15T12:14:17.668Z: Worker configuration: n1-standard-1 in us-central1-b.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
SEVERE: 2019-04-15T12:14:18.218Z: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/7235 instances, 1/0 CPUs, 250/115909 disk GB, 0/4046 SSD disk GB, 1/230 instance groups, 1/230 managed instance groups, 1/169 instance templates, 1/490 in-use IP addresses. Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-04-15T12:14:18.313Z: Cleaning up.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.util.MonitoringUtil$LoggingHandler process
INFO: 2019-04-15T12:14:18.390Z: Worker pool stopped.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.TestDataflowRunner$ErrorMonitorMessagesHandler process
INFO: Dataflow job 2019-04-15_05_14_06-2064668138826108957 threw exception. Failure message was: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/7235 instances, 1/0 CPUs, 250/115909 disk GB, 0/4046 SSD disk GB, 1/230 instance groups, 1/230 managed instance groups, 1/169 instance templates, 1/490 in-use IP addresses. Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.DataflowPipelineJob waitUntilFinish
INFO: Job 2019-04-15_05_14_06-2064668138826108957 failed with status FAILED.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.TestDataflowRunner checkForPAssertSuccess
WARNING: Metrics not present for Dataflow job 2019-04-15_05_14_06-2064668138826108957.
Apr 15, 2019 12:14:36 PM org.apache.beam.runners.dataflow.TestDataflowRunner run
WARNING: Dataflow job 2019-04-15_05_14_06-2064668138826108957 did not output a success or failure metric.

Gradle Test Executor 1 finished executing tests.

> Task :beam-sdks-java-io-file-based-io-tests:integrationTest FAILED

org.apache.beam.sdk.io.avro.AvroIOIT > writeThenReadAll FAILED
    java.lang.RuntimeException: Workflow failed. Causes: Project apache-beam-testing has insufficient quota(s) to execute this workflow with 1 instances in region us-central1. Quota summary (required/available): 1/7235 instances, 1/0 CPUs, 250/115909 disk GB, 0/4046 SSD disk GB, 1/230 instance groups, 1/230 managed instance groups, 1/169 instance templates, 1/490 in-use IP addresses. Please see https://cloud.google.com/compute/docs/resource-quotas about requesting more quota.
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:350)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:331)
        at org.apache.beam.sdk.io.avro.AvroIOIT.writeThenReadAll(AvroIOIT.java:147)
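Note: the stack trace above shows the failure propagating from TestDataflowRunner.run through Pipeline.run and TestPipeline.run into AvroIOIT.writeThenReadAll. As a rough illustration only (a minimal sketch, not the actual AvroIOIT source; the class and method names below are invented for the example), a Beam integration test of this shape registers a single PAssert, which is the "1 expected assertions" logged at job submission, and a failed Dataflow workflow such as the quota error surfaces as the RuntimeException thrown from pipeline.run():

    import org.apache.beam.sdk.testing.PAssert;
    import org.apache.beam.sdk.testing.TestPipeline;
    import org.apache.beam.sdk.transforms.Count;
    import org.apache.beam.sdk.transforms.Create;
    import org.apache.beam.sdk.values.PCollection;
    import org.junit.Rule;
    import org.junit.Test;

    public class WriteThenReadSketchIT {

      // TestPipeline reads the pipeline options supplied to the test run
      // (e.g. --runner=TestDataflowRunner, project, temp locations).
      @Rule public final transient TestPipeline pipeline = TestPipeline.create();

      @Test
      public void writeThenReadSketch() {
        PCollection<Long> count =
            pipeline
                .apply("Create records", Create.of("a", "b", "c"))
                .apply("Count", Count.<String>globally());

        // The single PAssert below is what TestDataflowRunner counts as
        // "1 expected assertions" when the job is submitted.
        PAssert.thatSingleton(count).isEqualTo(3L);

        // With TestDataflowRunner, a failed workflow (here the quota error)
        // is rethrown from run() as the RuntimeException seen above.
        pipeline.run().waitUntilFinish();
      }
    }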
Finished generating test XML results (0.025 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.037 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest>
:beam-sdks-java-io-file-based-io-tests:integrationTest (Thread[Execution worker for ':' Thread 5,5,main]) completed. Took 44.609 secs.

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

93 actionable tasks: 1 executed, 92 up-to-date

Publishing build scan...
https://gradle.com/s/444cay7vx2mus

STDERR:
1 test completed, 1 failed

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
> There were failing tests. See the report at:
> file://<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-file-based-io-tests:integrationTest'.
        at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:121)
        at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:117)
        at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:184)
        at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:110)
        at org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
        at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
        at org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
        at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
        at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
        at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
        at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
        at org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
        at org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
        at org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
        at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
        at org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
        at org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
        at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
        at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
        at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
        at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
        at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
        at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
        at org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
        at org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
        at org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
        at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
        at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
        at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
        at org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
        at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
        at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
        at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
        at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
        at org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
        at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
        at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
        at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests.
See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/sdks/java/io/file-based-io-tests/build/reports/tests/integrationTest/index.html>
        at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:626)
        at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:498)
        at org.gradle.api.tasks.testing.Test.executeTests(Test.java:587)
        at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
        at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:48)
        at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)
        at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
        at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:705)
        at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:672)
        at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$4.run(ExecuteActionsTaskExecuter.java:338)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
        at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
        at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
        at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:327)
        at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:312)
        at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:75)
        at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:158)
        at org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:46)
        at org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)
        at org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)
        at org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)
        at org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
        at org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:49)
        at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:42)
        at org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:28)
        at org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:133)
        at org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$5(CacheStep.java:83)
        at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)
        at org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:37)
        at org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)
        at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)
        at org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)
        at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:95)
        at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:88)
        at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:52)
        at org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
        at org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
        at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:109)
        ... 40 more

* Get more help at https://help.gradle.org

BUILD FAILED in 1m 0s

2019-04-15 12:14:38,094 ae16d796 MainThread beam_integration_benchmark(1/1) ERROR Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 754, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 605, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-04-15 12:14:38,095 ae16d796 MainThread beam_integration_benchmark(1/1) INFO Cleaning up benchmark beam_integration_benchmark
2019-04-15 12:14:38,095 ae16d796 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/config-beam-performancetests-avroioit-hdfs-1504> delete -f <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/src/.test-infra/kubernetes/hadoop/LargeITCluster/hdfs-multi-datanode-cluster.yml> --ignore-not-found
2019-04-15 12:16:23,716 ae16d796 MainThread beam_integration_benchmark(1/1) ERROR Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 897, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 754, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py",> line 605, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py",> line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py",> line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-04-15 12:16:23,717 ae16d796 MainThread beam_integration_benchmark(1/1) ERROR Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2019-04-15 12:16:23,718 ae16d796 MainThread beam_integration_benchmark(1/1) INFO Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-04-15 12:16:23,718 ae16d796 MainThread beam_integration_benchmark(1/1) INFO Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/runs/ae16d796/pkb.log>
2019-04-15 12:16:23,718 ae16d796 MainThread beam_integration_benchmark(1/1) INFO Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_AvroIOIT_HDFS/ws/runs/ae16d796/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org