See 
<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/228/display/redirect>

------------------------------------------
[...truncated 322.52 KB...]
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/ParDo(UseWindowHashAsKeyAndWindowAsSortKey)
 as step s15
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/BatchViewOverrides.GroupByWindowHashAsKeyAndWindowAsSortKey/BatchViewOverrides.GroupByKeyAndSortValuesOnly
 as step s16
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/ParDo(IsmRecordForSingularValuePerWindow)
 as step s17
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/View.AsSingleton/Combine.GloballyAsSingletonView/CreateDataflowView
 as step s18
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop OutputFormat/ParDo(SetupJob) as step s19
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/GroupDataByPartition/AssignTask as step s20
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/GroupDataByPartition/GroupByTaskId as step s21
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/GroupDataByPartition/FlattenGroupedTasks as step s22
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop OutputFormat/Write as step s23
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/CollectWriteTasks/WithKeys/AddKeys/Map as step s24
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/GroupByKey as 
step s25
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/CollectWriteTasks/Combine.perKey(IterableCombiner)/Combine.GroupedValues
 as step s26
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop 
OutputFormat/CollectWriteTasks/Values/Values/Map as step s27
    Jun 27, 2019 12:20:56 AM 
org.apache.beam.runners.dataflow.DataflowPipelineTranslator$Translator addStep
    INFO: Adding Write using Hadoop OutputFormat/CommitWriteJob as step s28
    Jun 27, 2019 12:20:56 AM org.apache.beam.runners.dataflow.DataflowRunner run
    INFO: Staging pipeline description to 
gs://temp-storage-for-perf-tests/hadoopformatioit0writeandreadusinghadoopformat-jenkins-0627002048-3e0f9553/output/results/staging/
    Jun 27, 2019 12:20:56 AM org.apache.beam.runners.dataflow.util.PackageUtil 
tryStagePackage
    INFO: Uploading <130653 bytes, hash isNTJdKRGVV9WDmVKaPPSA> to 
gs://temp-storage-for-perf-tests/hadoopformatioit0writeandreadusinghadoopformat-jenkins-0627002048-3e0f9553/output/results/staging/pipeline-isNTJdKRGVV9WDmVKaPPSA.pb

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > 
writeAndReadUsingHadoopFormat STANDARD_OUT
    Dataflow SDK version: 2.15.0-SNAPSHOT

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > 
writeAndReadUsingHadoopFormat STANDARD_ERROR
    Jun 27, 2019 12:28:19 AM 
org.apache.beam.sdk.extensions.gcp.util.RetryHttpRequestInitializer$LoggingHttpBackOffHandler
 handleResponse
    WARNING: Request failed with code 429, performed 0 retries due to 
IOExceptions, performed 10 retries due to unsuccessful status codes, HTTP 
framework says request cannot be retried, (caller responsible for retrying): 
https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs.
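    The warning above reports that the Dataflow jobs.create request failed with HTTP 429 after 10 status-code retries and that the caller is responsible for any further retries. A minimal caller-side sketch of that retry pattern is below; the JobSubmitter interface, the backoff values, and the attempt limit are illustrative assumptions, not Beam's actual retry logic.

    import com.google.api.client.googleapis.json.GoogleJsonResponseException;

    // Sketch only: retry a hypothetical jobs.create call on HTTP 429 with
    // exponential backoff. submitJob() stands in for whatever issues the
    // request; it is not a real Beam or Dataflow API.
    public final class CreateJobWithBackoff {

      interface JobSubmitter {
        String submitJob() throws Exception; // returns a job id on success
      }

      static String createWithRetries(JobSubmitter submitter) throws Exception {
        long backoffMillis = 1_000L;   // assumed initial pause
        final int maxAttempts = 5;     // assumed attempt limit
        for (int attempt = 1; attempt <= maxAttempts; attempt++) {
          try {
            return submitter.submitJob();
          } catch (GoogleJsonResponseException e) {
            // 429 is the rateLimitExceeded / RESOURCE_EXHAUSTED case seen in this log.
            if (e.getStatusCode() != 429 || attempt == maxAttempts) {
              throw e;
            }
            Thread.sleep(backoffMillis);
            backoffMillis *= 2;        // double the pause before the next attempt
          }
        }
        throw new IllegalStateException("unreachable");
      }

      private CreateJobWithBackoff() {}
    }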
 

Gradle Test Executor 1 finished executing tests.

> Task :sdks:java:io:hadoop-format:integrationTest FAILED

org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT > 
writeAndReadUsingHadoopFormat FAILED
    java.lang.RuntimeException: Failed to create a workflow job: Quota exceeded 
for quota metric 'dataflow.googleapis.com/create_requests' and limit 
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for 
consumer 'project_number:844138762903'.
        at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:910)
        at 
org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:98)
        at 
org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
        at 
org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
        at org.apache.beam.sdk.Pipeline.run(Pipeline.java:313)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:350)
        at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:331)
        at 
org.apache.beam.sdk.io.hadoop.format.HadoopFormatIOIT.writeAndReadUsingHadoopFormat(HadoopFormatIOIT.java:182)

        Caused by:
        com.google.api.client.googleapis.json.GoogleJsonResponseException: 429 
Too Many Requests
        {
          "code" : 429,
          "errors" : [ {
            "domain" : "global",
            "message" : "Quota exceeded for quota metric 
'dataflow.googleapis.com/create_requests' and limit 
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for 
consumer 'project_number:844138762903'.",
            "reason" : "rateLimitExceeded"
          } ],
          "message" : "Quota exceeded for quota metric 
'dataflow.googleapis.com/create_requests' and limit 
'CreateRequestsPerMinutePerUser' of service 'dataflow.googleapis.com' for 
consumer 'project_number:844138762903'.",
          "status" : "RESOURCE_EXHAUSTED"
        }
            at 
com.google.api.client.googleapis.json.GoogleJsonResponseException.from(GoogleJsonResponseException.java:150)
            at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:113)
            at 
com.google.api.client.googleapis.services.json.AbstractGoogleJsonClientRequest.newExceptionOnError(AbstractGoogleJsonClientRequest.java:40)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest$1.interceptResponse(AbstractGoogleClientRequest.java:401)
            at 
com.google.api.client.http.HttpRequest.execute(HttpRequest.java:1097)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:499)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.executeUnparsed(AbstractGoogleClientRequest.java:432)
            at 
com.google.api.client.googleapis.services.AbstractGoogleClientRequest.execute(AbstractGoogleClientRequest.java:549)
            at 
org.apache.beam.runners.dataflow.DataflowClient.createJob(DataflowClient.java:61)
            at 
org.apache.beam.runners.dataflow.DataflowRunner.run(DataflowRunner.java:896)
            ... 7 more
Finished generating test XML results (0.039 secs) into: 
<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.045 secs) into: 
<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest>
:sdks:java:io:hadoop-format:integrationTest (Thread[Execution worker for 
':',5,main]) completed. Took 7 mins 37.521 secs.

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
90 actionable tasks: 50 executed, 20 from cache, 20 up-to-date

Publishing build scan...
https://gradle.com/s/f4zexj3eregco


STDERR: Note: 
<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/common/src/test/java/org/apache/beam/sdk/io/common/IOITHelperTest.java>
 uses or overrides a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.
Note: Some input files use unchecked or unsafe operations.
Note: Recompile with -Xlint:unchecked for details.
Note: Some input files use or override a deprecated API.
Note: Recompile with -Xlint:deprecation for details.

1 test completed, 1 failed

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:java:io:hadoop-format:integrationTest'.
> There were failing tests. See the report at: 
> file://<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>

* Try:
Run with --debug option to get more log output. Run with --scan to get full 
insights.

* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task 
':sdks:java:io:hadoop-format:integrationTest'.
        at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:121)
        at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$2.accept(ExecuteActionsTaskExecuter.java:117)
        at org.gradle.internal.Try$Failure.ifSuccessfulOrElse(Try.java:184)
        at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:110)
        at 
org.gradle.api.internal.tasks.execution.ResolveIncrementalChangesTaskExecuter.execute(ResolveIncrementalChangesTaskExecuter.java:84)
        at 
org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:91)
        at 
org.gradle.api.internal.tasks.execution.FinishSnapshotTaskInputsBuildOperationTaskExecuter.execute(FinishSnapshotTaskInputsBuildOperationTaskExecuter.java:51)
        at 
org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:102)
        at 
org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionStateTaskExecuter.execute(ResolveBeforeExecutionStateTaskExecuter.java:74)
        at 
org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:58)
        at 
org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:109)
        at 
org.gradle.api.internal.tasks.execution.ResolveBeforeExecutionOutputsTaskExecuter.execute(ResolveBeforeExecutionOutputsTaskExecuter.java:67)
        at 
org.gradle.api.internal.tasks.execution.StartSnapshotTaskInputsBuildOperationTaskExecuter.execute(StartSnapshotTaskInputsBuildOperationTaskExecuter.java:52)
        at 
org.gradle.api.internal.tasks.execution.ResolveAfterPreviousExecutionStateTaskExecuter.execute(ResolveAfterPreviousExecutionStateTaskExecuter.java:46)
        at 
org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:93)
        at 
org.gradle.api.internal.tasks.execution.FinalizePropertiesTaskExecuter.execute(FinalizePropertiesTaskExecuter.java:45)
        at 
org.gradle.api.internal.tasks.execution.ResolveTaskExecutionModeExecuter.execute(ResolveTaskExecutionModeExecuter.java:94)
        at 
org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:57)
        at 
org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:56)
        at 
org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:36)
        at 
org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.executeTask(EventFiringTaskExecuter.java:63)
        at 
org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:49)
        at 
org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter$1.call(EventFiringTaskExecuter.java:46)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:416)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$CallableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:406)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.call(DefaultBuildOperationExecutor.java:102)
        at 
org.gradle.internal.operations.DelegatingBuildOperationExecutor.call(DelegatingBuildOperationExecutor.java:36)
        at 
org.gradle.api.internal.tasks.execution.EventFiringTaskExecuter.execute(EventFiringTaskExecuter.java:46)
        at 
org.gradle.execution.plan.LocalTaskNodeExecutor.execute(LocalTaskNodeExecutor.java:43)
        at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:355)
        at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$InvokeNodeExecutorsAction.execute(DefaultTaskExecutionGraph.java:343)
        at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:336)
        at 
org.gradle.execution.taskgraph.DefaultTaskExecutionGraph$BuildOperationAwareExecutionAction.execute(DefaultTaskExecutionGraph.java:322)
        at 
org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:134)
        at 
org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker$1.execute(DefaultPlanExecutor.java:129)
        at 
org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.execute(DefaultPlanExecutor.java:202)
        at 
org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.executeNextNode(DefaultPlanExecutor.java:193)
        at 
org.gradle.execution.plan.DefaultPlanExecutor$ExecutorWorker.run(DefaultPlanExecutor.java:129)
        at 
org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
        at 
org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
        at 
org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the 
report at: 
file://<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/sdks/java/io/hadoop-format/build/reports/tests/integrationTest/index.html>
        at 
org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:626)
        at 
org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:498)
        at org.gradle.api.tasks.testing.Test.executeTests(Test.java:587)
        at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:103)
        at 
org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:48)
        at 
org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:41)
        at 
org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:28)
        at 
org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:705)
        at 
org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:672)
        at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$4.run(ExecuteActionsTaskExecuter.java:338)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:402)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:394)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor$1.execute(DefaultBuildOperationExecutor.java:165)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:250)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:158)
        at 
org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:92)
        at 
org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
        at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:327)
        at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:312)
        at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.access$200(ExecuteActionsTaskExecuter.java:75)
        at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$TaskExecution.execute(ExecuteActionsTaskExecuter.java:158)
        at 
org.gradle.internal.execution.impl.steps.ExecuteStep.execute(ExecuteStep.java:46)
        at 
org.gradle.internal.execution.impl.steps.CancelExecutionStep.execute(CancelExecutionStep.java:34)
        at 
org.gradle.internal.execution.impl.steps.TimeoutStep.executeWithoutTimeout(TimeoutStep.java:69)
        at 
org.gradle.internal.execution.impl.steps.TimeoutStep.execute(TimeoutStep.java:49)
        at 
org.gradle.internal.execution.impl.steps.CatchExceptionStep.execute(CatchExceptionStep.java:34)
        at 
org.gradle.internal.execution.impl.steps.CreateOutputsStep.execute(CreateOutputsStep.java:49)
        at 
org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:42)
        at 
org.gradle.internal.execution.impl.steps.SnapshotOutputStep.execute(SnapshotOutputStep.java:28)
        at 
org.gradle.internal.execution.impl.steps.CacheStep.executeWithoutCache(CacheStep.java:133)
        at 
org.gradle.internal.execution.impl.steps.CacheStep.lambda$execute$5(CacheStep.java:83)
        at 
org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:82)
        at 
org.gradle.internal.execution.impl.steps.CacheStep.execute(CacheStep.java:37)
        at 
org.gradle.internal.execution.impl.steps.PrepareCachingStep.execute(PrepareCachingStep.java:33)
        at 
org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:38)
        at 
org.gradle.internal.execution.impl.steps.StoreSnapshotsStep.execute(StoreSnapshotsStep.java:23)
        at 
org.gradle.internal.execution.impl.steps.SkipUpToDateStep.executeBecause(SkipUpToDateStep.java:95)
        at 
org.gradle.internal.execution.impl.steps.SkipUpToDateStep.lambda$execute$0(SkipUpToDateStep.java:88)
        at 
org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:52)
        at 
org.gradle.internal.execution.impl.steps.SkipUpToDateStep.execute(SkipUpToDateStep.java:36)
        at 
org.gradle.internal.execution.impl.DefaultWorkExecutor.execute(DefaultWorkExecutor.java:34)
        at 
org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:109)
        ... 40 more


* Get more help at https://help.gradle.org

BUILD FAILED in 8m 13s

2019-06-27 00:28:20,638 f29c5a35 MainThread beam_integration_benchmark(1/1) 
ERROR    Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
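    The traceback above ends in gcp_dpb_dataflow.py asserting that the submitted test process exited with code 0, so any nonzero Gradle exit code surfaces as "Integration Test Failed.". For illustration, a rough Java equivalent of that check is sketched below; the Gradle task name is taken from this log, while running ./gradlew through ProcessBuilder is an assumption about how one might reproduce the check locally.

    import java.io.IOException;

    // Sketch only: run the integration-test task and treat any nonzero exit
    // code as a failure, mirroring the retcode == 0 assertion in the
    // PerfKitBenchmarker traceback above.
    public final class RunIntegrationTest {
      public static void main(String[] args) throws IOException, InterruptedException {
        Process gradle = new ProcessBuilder(
                "./gradlew", ":sdks:java:io:hadoop-format:integrationTest")
            .inheritIO()   // stream Gradle output to this console
            .start();
        int retcode = gradle.waitFor();
        if (retcode != 0) {
          throw new AssertionError("Integration Test Failed.");
        }
      }
    }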
2019-06-27 00:28:20,639 f29c5a35 MainThread beam_integration_benchmark(1/1) 
INFO     Cleaning up benchmark beam_integration_benchmark
2019-06-27 00:28:20,640 f29c5a35 MainThread beam_integration_benchmark(1/1) 
INFO     Running: kubectl 
--kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/config-beam-performancetests-hadoopformat-228>
 delete -f 
<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml>
 --ignore-not-found
2019-06-27 00:28:21,282 f29c5a35 MainThread beam_integration_benchmark(1/1) 
ERROR    Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 760, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 609, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 160, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2019-06-27 00:28:21,282 f29c5a35 MainThread beam_integration_benchmark(1/1) 
ERROR    Benchmark 1/1 beam_integration_benchmark (UID: 
beam_integration_benchmark0) failed. Execution will continue.
2019-06-27 00:28:21,283 f29c5a35 MainThread beam_integration_benchmark(1/1) 
INFO     Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-06-27 00:28:21,283 f29c5a35 MainThread beam_integration_benchmark(1/1) 
INFO     Complete logs can be found at: 
<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/f29c5a35/pkb.log>
2019-06-27 00:28:21,284 f29c5a35 MainThread beam_integration_benchmark(1/1) 
INFO     Completion statuses can be found at: 
<https://builds.apache.org/job/beam_PerformanceTests_HadoopFormat/ws/runs/f29c5a35/completion_statuses.json>
Build step 'Execute shell' marked build as failure
