See <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/203/display/redirect?page=changes>
Changes:
[sidhom] Fix DataStreamsTest checkstyle error
[ankurgoenka] Updating python container version to beam-master-20180518
[ankurgoenka] Reverting fnapi container version
[ankurgoenka] Copying dependency.py and dependency_test.py
[ankurgoenka] reformatting code
[ankurgoenka] Introducing classes
[ankurgoenka] Grouping public and non-public methods
[ankurgoenka] Renaming method and updating reference for file_copy and file_download
[ankurgoenka] Removing Google specific code and unused code in stager.
[ankurgoenka] Removing reference to Google/GCS/Dataflow and restructuring a bit of
[ankurgoenka] Fixing Stager Tests
[ankurgoenka] Making dependency.py use stager.py and fixing dependency_test.py test
[ankurgoenka] Moving stager to portability
[ankurgoenka] Removing stage_job_resources from dependencies.py and other minor
[ankurgoenka] Merging filehandler in stager
[ankurgoenka] renaming GCSStager to _ParameterizedStager
[ankurgoenka] Making a few methods static and moving code around
[ankurgoenka] Applying changes from rebase
[ankurgoenka] Fixing lint
[iemejia] [BEAM-4343] Enforce ErrorProne analysis in HBaseIO
[iemejia] [BEAM-4344] Enforce ErrorProne analysis in the HCatalogIO
------------------------------------------
[...truncated 279.83 KB...]
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
    at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:77)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=104.154.118.62:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
    at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
    at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
    at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
    at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
    at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
    at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
    at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
    at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
    at com.mongodb.Mongo.execute(Mongo.java:772)
    at com.mongodb.Mongo$2.execute(Mongo.java:759)
    at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
    at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
    at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
    at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
    at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:77)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=104.154.118.62:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
    at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
    at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
    at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
    at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
    at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
    at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
    at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
    at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
    at com.mongodb.Mongo.execute(Mongo.java:772)
    at com.mongodb.Mongo$2.execute(Mongo.java:759)
    at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
    at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
    at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
    at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
    at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:77)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches ReadPreferenceServerSelector{readPreference=primary}. Client view of cluster state is {type=UNKNOWN, servers=[{address=104.154.118.62:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
    at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
    at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
    at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
    at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
    at com.mongodb.binding.ClusterBinding.getReadConnectionSource(ClusterBinding.java:63)
    at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:89)
    at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:84)
    at com.mongodb.operation.CommandReadOperation.execute(CommandReadOperation.java:55)
    at com.mongodb.Mongo.execute(Mongo.java:772)
    at com.mongodb.Mongo$2.execute(Mongo.java:759)
    at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:130)
    at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:124)
    at com.mongodb.MongoDatabaseImpl.runCommand(MongoDatabaseImpl.java:114)
    at org.apache.beam.sdk.io.mongodb.MongoDbIO$BoundedMongoDbSource.split(MongoDbIO.java:332)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.splitAndValidate(WorkerCustomSources.java:275)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitTyped(WorkerCustomSources.java:197)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplitWithApiLimit(WorkerCustomSources.java:181)
    at com.google.cloud.dataflow.worker.WorkerCustomSources.performSplit(WorkerCustomSources.java:160)
    at com.google.cloud.dataflow.worker.WorkerCustomSourceOperationExecutor.execute(WorkerCustomSourceOperationExecutor.java:77)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.executeWork(BatchDataflowWorker.java:383)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.doWork(BatchDataflowWorker.java:355)
    at com.google.cloud.dataflow.worker.BatchDataflowWorker.getAndPerformWork(BatchDataflowWorker.java:286)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.doWork(DataflowBatchWorkerHarness.java:134)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:114)
    at com.google.cloud.dataflow.worker.DataflowBatchWorkerHarness$WorkerThread.call(DataflowBatchWorkerHarness.java:101)
    at java.util.concurrent.FutureTask.run(FutureTask.java:266)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
    at java.lang.Thread.run(Thread.java:745)
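Editor's note: every one of the timeouts above is thrown from MongoDbIO$BoundedMongoDbSource.split (MongoDbIO.java:332) while the driver runs a command against 104.154.118.62:27017 and cannot even open a socket, so this points at the MongoDB load balancer being unreachable from the workers rather than at the IO code. Below is a minimal sketch, assuming the mongo-java-driver 3.x API visible in the trace, of exercising the same MongoDatabase.runCommand(...) path outside the pipeline to check reachability; the URI options, database name, and timeout values are illustrative placeholders, not the test's actual configuration.

import com.mongodb.MongoClient;
import com.mongodb.MongoClientURI;
import com.mongodb.MongoTimeoutException;
import org.bson.Document;

public class MongoReachabilityCheck {
  public static void main(String[] args) {
    // Placeholder URI: host/port copied from the log above; the timeout options are
    // illustrative, not the values MongoDBIOIT actually uses.
    MongoClientURI uri = new MongoClientURI(
        "mongodb://104.154.118.62:27017/?connectTimeoutMS=10000&serverSelectionTimeoutMS=30000");
    MongoClient client = new MongoClient(uri);
    try {
      // Same call chain as the failing split: MongoDatabase.runCommand(...).
      Document reply = client.getDatabase("beam").runCommand(new Document("ping", 1));
      System.out.println("Server reachable: " + reply.toJson());
    } catch (MongoTimeoutException e) {
      // This is the failure mode in the log: no server satisfied the selector in time.
      System.err.println("Server selection timed out: " + e.getMessage());
    } finally {
      client.close();
    }
  }
}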
Workflow failed. Causes: S03:Read all documents/Read(BoundedMongoDbSource)+Map documents to Strings/Map+Calculate hashcode/WithKeys/AddKeys/Map+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey+Calculate hashcode/Combine.perKey(Hashing)/Combine.GroupedValues/Partial+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Reify+Calculate hashcode/Combine.perKey(Hashing)/GroupByKey/Write failed.
    at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:134)
    at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:90)
    at org.apache.beam.runners.dataflow.TestDataflowRunner.run(TestDataflowRunner.java:55)
    at org.apache.beam.sdk.Pipeline.run(Pipeline.java:311)
    at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:348)
    at org.apache.beam.sdk.testing.TestPipeline.run(TestPipeline.java:329)
    at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.testWriteAndRead(MongoDBIOIT.java:131)
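Editor's note: the failed fused step names ("Read all documents", "Map documents to Strings", "Calculate hashcode") describe the read half of MongoDBIOIT.testWriteAndRead. The following is a rough sketch of that shape using Beam's public MongoDbIO read API; the connection values are placeholders taken from the log, and the final combine is a simple stand-in (sum of per-document hash codes), not the hashing fn the IT actually uses.

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.mongodb.MongoDbIO;
import org.apache.beam.sdk.options.PipelineOptionsFactory;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.transforms.Sum;
import org.apache.beam.sdk.values.PCollection;
import org.apache.beam.sdk.values.TypeDescriptors;
import org.bson.Document;

public class ReadAndHashSketch {
  public static void main(String[] args) {
    Pipeline p = Pipeline.create(PipelineOptionsFactory.fromArgs(args).create());

    PCollection<String> docs =
        p.apply("Read all documents",
                MongoDbIO.read()
                    .withUri("mongodb://104.154.118.62:27017") // placeholder, host from the log
                    .withDatabase("beam")                      // placeholder database name
                    .withCollection("test_collection"))        // placeholder collection name
         .apply("Map documents to Strings",
                MapElements.into(TypeDescriptors.strings())
                    .via((Document doc) -> doc.toJson()));

    // Stand-in for the IT's "Calculate hashcode" combine: an order-insensitive sum of
    // per-document hash codes, just to show where the combine sits in the graph.
    docs.apply("Hash each document",
            MapElements.into(TypeDescriptors.longs()).via((String s) -> (long) s.hashCode()))
        .apply("Calculate hashcode", Sum.longsGlobally());

    p.run().waitUntilFinish();
  }
}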
com.mongodb.MongoTimeoutException: Timed out after 30000 ms while waiting for a server that matches WritableServerSelector. Client view of cluster state is {type=UNKNOWN, servers=[{address=104.154.118.62:27017, type=UNKNOWN, state=CONNECTING, exception={com.mongodb.MongoSocketOpenException: Exception opening socket}, caused by {java.net.SocketTimeoutException: connect timed out}}]
    at com.mongodb.connection.BaseCluster.createTimeoutException(BaseCluster.java:369)
    at com.mongodb.connection.BaseCluster.selectServer(BaseCluster.java:101)
    at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:75)
    at com.mongodb.binding.ClusterBinding$ClusterBindingConnectionSource.<init>(ClusterBinding.java:71)
    at com.mongodb.binding.ClusterBinding.getWriteConnectionSource(ClusterBinding.java:68)
    at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:158)
    at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:133)
    at com.mongodb.operation.CommandOperationHelper.executeWrappedCommandProtocol(CommandOperationHelper.java:128)
    at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:51)
    at com.mongodb.operation.DropDatabaseOperation.execute(DropDatabaseOperation.java:36)
    at com.mongodb.Mongo.execute(Mongo.java:781)
    at com.mongodb.Mongo$2.execute(Mongo.java:764)
    at com.mongodb.MongoDatabaseImpl.drop(MongoDatabaseImpl.java:136)
    at org.apache.beam.sdk.io.mongodb.MongoDBIOIT.tearDown(MongoDBIOIT.java:103)
1 test completed, 1 failed
Finished generating test XML results (0.022 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/test-results/integrationTest>
Generating HTML test report...
Finished generating test html results (0.028 secs) into: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest>
> Task :beam-sdks-java-io-mongodb:integrationTest FAILED
:beam-sdks-java-io-mongodb:integrationTest (Thread[Task worker for ':',5,main]) completed. Took 16 mins 32.987 secs.
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.7/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 16m 41s
59 actionable tasks: 1 executed, 58 up-to-date
Publishing build scan...
https://gradle.com/s/d3jfpbjfpmlse
STDERR:
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task ':beam-sdks-java-io-mongodb:integrationTest'.
> There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest/index.html>
* Try:
Run with --debug option to get more log output. Run with --scan to get full insights.
* Exception is:
org.gradle.api.tasks.TaskExecutionException: Execution failed for task ':beam-sdks-java-io-mongodb:integrationTest'.
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:103)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.execute(ExecuteActionsTaskExecuter.java:73)
    at org.gradle.api.internal.tasks.execution.OutputDirectoryCreatingTaskExecuter.execute(OutputDirectoryCreatingTaskExecuter.java:51)
    at org.gradle.api.internal.tasks.execution.SkipCachedTaskExecuter.execute(SkipCachedTaskExecuter.java:105)
    at org.gradle.api.internal.tasks.execution.SkipUpToDateTaskExecuter.execute(SkipUpToDateTaskExecuter.java:59)
    at org.gradle.api.internal.tasks.execution.ResolveTaskOutputCachingStateExecuter.execute(ResolveTaskOutputCachingStateExecuter.java:54)
    at org.gradle.api.internal.tasks.execution.ResolveBuildCacheKeyExecuter.execute(ResolveBuildCacheKeyExecuter.java:66)
    at org.gradle.api.internal.tasks.execution.ValidatingTaskExecuter.execute(ValidatingTaskExecuter.java:59)
    at org.gradle.api.internal.tasks.execution.SkipEmptySourceFilesTaskExecuter.execute(SkipEmptySourceFilesTaskExecuter.java:101)
    at org.gradle.api.internal.tasks.execution.FinalizeInputFilePropertiesTaskExecuter.execute(FinalizeInputFilePropertiesTaskExecuter.java:44)
    at org.gradle.api.internal.tasks.execution.CleanupStaleOutputsExecuter.execute(CleanupStaleOutputsExecuter.java:91)
    at org.gradle.api.internal.tasks.execution.ResolveTaskArtifactStateTaskExecuter.execute(ResolveTaskArtifactStateTaskExecuter.java:62)
    at org.gradle.api.internal.tasks.execution.SkipTaskWithNoActionsExecuter.execute(SkipTaskWithNoActionsExecuter.java:59)
    at org.gradle.api.internal.tasks.execution.SkipOnlyIfTaskExecuter.execute(SkipOnlyIfTaskExecuter.java:54)
    at org.gradle.api.internal.tasks.execution.ExecuteAtMostOnceTaskExecuter.execute(ExecuteAtMostOnceTaskExecuter.java:43)
    at org.gradle.api.internal.tasks.execution.CatchExceptionTaskExecuter.execute(CatchExceptionTaskExecuter.java:34)
    at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker$1.run(DefaultTaskGraphExecuter.java:256)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
    at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
    at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:249)
    at org.gradle.execution.taskgraph.DefaultTaskGraphExecuter$EventFiringTaskWorker.execute(DefaultTaskGraphExecuter.java:238)
    at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:104)
    at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker$1.execute(DefaultTaskPlanExecutor.java:98)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.execute(DefaultTaskExecutionPlan.java:663)
    at org.gradle.execution.taskgraph.DefaultTaskExecutionPlan.executeWithTask(DefaultTaskExecutionPlan.java:596)
    at org.gradle.execution.taskgraph.DefaultTaskPlanExecutor$TaskExecutorWorker.run(DefaultTaskPlanExecutor.java:98)
    at org.gradle.internal.concurrent.ExecutorPolicy$CatchAndRecordFailures.onExecute(ExecutorPolicy.java:63)
    at org.gradle.internal.concurrent.ManagedExecutorImpl$1.run(ManagedExecutorImpl.java:46)
    at org.gradle.internal.concurrent.ThreadFactoryImpl$ManagedThreadRunnable.run(ThreadFactoryImpl.java:55)
Caused by: org.gradle.api.GradleException: There were failing tests. See the report at: file://<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/sdks/java/io/mongodb/build/reports/tests/integrationTest/index.html>
    at org.gradle.api.tasks.testing.AbstractTestTask.handleTestFailures(AbstractTestTask.java:612)
    at org.gradle.api.tasks.testing.AbstractTestTask.executeTests(AbstractTestTask.java:484)
    at org.gradle.api.tasks.testing.Test.executeTests(Test.java:583)
    at org.gradle.internal.reflect.JavaMethod.invoke(JavaMethod.java:73)
    at org.gradle.api.internal.project.taskfactory.StandardTaskAction.doExecute(StandardTaskAction.java:46)
    at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:39)
    at org.gradle.api.internal.project.taskfactory.StandardTaskAction.execute(StandardTaskAction.java:26)
    at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:794)
    at org.gradle.api.internal.AbstractTask$TaskActionWrapper.execute(AbstractTask.java:761)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter$1.run(ExecuteActionsTaskExecuter.java:124)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:317)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor$RunnableBuildOperationWorker.execute(DefaultBuildOperationExecutor.java:309)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.execute(DefaultBuildOperationExecutor.java:185)
    at org.gradle.internal.operations.DefaultBuildOperationExecutor.run(DefaultBuildOperationExecutor.java:97)
    at org.gradle.internal.operations.DelegatingBuildOperationExecutor.run(DelegatingBuildOperationExecutor.java:31)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeAction(ExecuteActionsTaskExecuter.java:113)
    at org.gradle.api.internal.tasks.execution.ExecuteActionsTaskExecuter.executeActions(ExecuteActionsTaskExecuter.java:95)
... 31 more
* Get more help at https://help.gradle.org
2018-05-22 00:20:08,252 0fab0156 MainThread beam_integration_benchmark(1/1) ERROR Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 667, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 547, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-05-22 00:20:08,252 0fab0156 MainThread beam_integration_benchmark(1/1) INFO Cleaning up benchmark beam_integration_benchmark
2018-05-22 00:20:08,253 0fab0156 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/config-mongodbioit-1526932887351> delete -f <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/src/.test-infra/kubernetes/mongodb/load-balancer/mongo.yml> --ignore-not-found
2018-05-22 00:20:08,823 0fab0156 MainThread beam_integration_benchmark(1/1) ERROR Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 801, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 667, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 547, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-05-22 00:20:08,823 0fab0156 MainThread beam_integration_benchmark(1/1) ERROR Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-05-22 00:20:08,841 0fab0156 MainThread INFO Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                           Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0   FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-05-22 00:20:08,842 0fab0156 MainThread INFO Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/0fab0156/pkb.log>
2018-05-22 00:20:08,842 0fab0156 MainThread INFO Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_MongoDBIO_IT/ws/runs/0fab0156/completion_statuses.json>
Build step 'Execute shell' marked build as failure