See <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/108/display/redirect?page=changes>
Changes:

[herohde] [BEAM-3250] Migrate Dataflow ValidatesRunner test to Gradle
[herohde] CR: use 4 forks in Dataflow test
[herohde] [BEAM-3250] Migrate Spark ValidatesRunner tests to Gradle

------------------------------------------
[...truncated 24.40 KB...]
--beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/src/.test-infra/kubernetes/postgres/pkb-config-local.yml>
--beam_it_profile=io-it
--benchmarks=beam_integration_benchmark
2018-04-06 12:02:15,940 3348fa82 MainThread WARNING The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2018-04-06 12:02:15,941 3348fa82 MainThread INFO Initializing the edw service decoder
2018-04-06 12:02:16,397 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Provisioning resources for benchmark beam_integration_benchmark
2018-04-06 12:02:16,401 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Preparing benchmark beam_integration_benchmark
2018-04-06 12:02:16,402 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: git clone https://github.com/apache/beam.git
2018-04-06 12:02:38,493 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> create -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml>
2018-04-06 12:02:40,617 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running benchmark beam_integration_benchmark
2018-04-06 12:02:40,637 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:02:51,105 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:03:01,505 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:03:11,878 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:03:22,345 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:03:32,767 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:03:43,065 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:03:53,329 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:04:03,637 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:04:13,893 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:04:24,129 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:04:34,841 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 12:04:35,610 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Using LoadBalancer IP Address: 35.193.209.106
2018-04-06 12:04:35,640 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: /home/jenkins/tools/maven/latest/bin/mvn -e verify -Dit.test=org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT -DskipITs=false -pl sdks/java/io/hadoop-input-format -Pio-it -Pdataflow-runner -DintegrationTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--postgresPort=5432","--numberOfRecords=600000","--postgresServerName=35.193.209.106","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresSsl=False","--runner=TestDataflowRunner"]
2018-04-06 12:06:34,182 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Ran: {/home/jenkins/tools/maven/latest/bin/mvn -e verify -Dit.test=org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT -DskipITs=false -pl sdks/java/io/hadoop-input-format -Pio-it -Pdataflow-runner -DintegrationTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--postgresPort=5432","--numberOfRecords=600000","--postgresServerName=35.193.209.106","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresSsl=False","--runner=TestDataflowRunner"]} ReturnCode:1
STDOUT:
[INFO] Error stacktraces are turned on.
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Detecting the operating system and CPU architecture
[INFO] ------------------------------------------------------------------------
[INFO] os.detected.name: linux
[INFO] os.detected.arch: x86_64
[INFO] os.detected.version: 4.4
[INFO] os.detected.version.major: 4
[INFO] os.detected.version.minor: 4
[INFO] os.detected.release: ubuntu
[INFO] os.detected.release.version: 14.04
[INFO] os.detected.release.like.ubuntu: true
[INFO] os.detected.release.like.debian: true
[INFO] os.detected.classifier: linux-x86_64
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Beam :: SDKs :: Java :: IO :: Hadoop Input Format 2.5.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-maven-version) @ beam-sdks-java-io-hadoop-input-format ---
[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce) @ beam-sdks-java-io-hadoop-input-format ---
[WARNING] Failure to find org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced
  Try downloading the file manually from the project website.
  Then, install it using the command:
      mvn install:install-file -DgroupId=org.pentaho -DartifactId=pentaho-aggdesigner-algorithm -Dversion=5.1.5-jhyde -Dpackaging=jar -Dfile=/path/to/file
  Alternatively, if you host your own repository you can deploy the file there:
      mvn deploy:deploy-file -DgroupId=org.pentaho -DartifactId=pentaho-aggdesigner-algorithm -Dversion=5.1.5-jhyde -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
  Path to dependency:
    1) org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
    2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
    3) org.apache.hive:hive-service:jar:1.2.1
    4) org.apache.hive:hive-exec:jar:1.2.1
    5) org.apache.calcite:calcite-core:jar:1.2.0-incubating
    6) org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde
  org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde from the specified remote repositories:
    Nexus (http://repository.apache.org/snapshots, releases=false, snapshots=true),
    central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[WARNING] Failure to find cascading:cascading-hadoop:jar:2.6.3 in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced
  Try downloading the file manually from the project website.
  Then, install it using the command:
      mvn install:install-file -DgroupId=cascading -DartifactId=cascading-hadoop -Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file
  Alternatively, if you host your own repository you can deploy the file there:
      mvn deploy:deploy-file -DgroupId=cascading -DartifactId=cascading-hadoop -Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
  Path to dependency:
    1) org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
    2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
    3) cascading:cascading-hadoop:jar:2.6.3
  cascading:cascading-hadoop:jar:2.6.3 from the specified remote repositories:
    Nexus (http://repository.apache.org/snapshots, releases=false, snapshots=true),
    central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[WARNING] Failure to find cascading:cascading-local:jar:2.6.3 in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced
  Try downloading the file manually from the project website.
  Then, install it using the command:
      mvn install:install-file -DgroupId=cascading -DartifactId=cascading-local -Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file
  Alternatively, if you host your own repository you can deploy the file there:
      mvn deploy:deploy-file -DgroupId=cascading -DartifactId=cascading-local -Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
  Path to dependency:
    1) org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
    2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
    3) cascading:cascading-local:jar:2.6.3
  cascading:cascading-local:jar:2.6.3 from the specified remote repositories:
    Nexus (http://repository.apache.org/snapshots, releases=false, snapshots=true),
    central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[INFO] Adding ignore: module-info
[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-banned-dependencies) @ beam-sdks-java-io-hadoop-input-format ---
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) @ beam-sdks-java-io-hadoop-input-format ---
[INFO]
[INFO] --- maven-resources-plugin:3.0.2:resources (default-resources) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/beam/sdks/java/io/hadoop-input-format/src/main/resources>
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/beam/sdks/java/io/hadoop-input-format/target/classes>
[INFO] <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>: uses unchecked or unsafe operations.
[INFO] <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>: Recompile with -Xlint:unchecked for details.
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>:[420,41] cannot access org.joda.time.Duration
  class file for org.joda.time.Duration not found
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:46 min
[INFO] Finished at: 2018-04-06T12:06:32Z
[INFO] Final Memory: 90M/1883M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.7.0:compile (default-compile) on project beam-sdks-java-io-hadoop-input-format: Compilation failure
[ERROR] <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>:[420,41] cannot access org.joda.time.Duration
[ERROR] class file for org.joda.time.Duration not found
[ERROR]
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.7.0:compile (default-compile) on project beam-sdks-java-io-hadoop-input-format: Compilation failure
<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>:[420,41] cannot access org.joda.time.Duration
class file for org.joda.time.Duration not found
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.compiler.CompilationFailureException: Compilation failure
<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>:[420,41] cannot access org.joda.time.Duration
class file for org.joda.time.Duration not found
    at org.apache.maven.plugin.compiler.AbstractCompilerMojo.execute (AbstractCompilerMojo.java:1161)
    at org.apache.maven.plugin.compiler.CompilerMojo.execute (CompilerMojo.java:168)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
STDERR:
2018-04-06 12:06:34,183 3348fa82 MainThread beam_integration_benchmark(1/1) ERROR Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 624, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
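The Maven failure above ("cannot access org.joda.time.Duration ... class file for org.joda.time.Duration not found") means joda-time was absent from the module's compile classpath in this run. If the fix turns out to be declaring the dependency explicitly (a guess; the regression could equally be a recent change in a transitive dependency), the sketch would be an entry in the hadoop-input-format module's pom.xml, with the version left to Beam's parent dependencyManagement:

```xml
<!-- Hypothetical fix sketch, not a confirmed change: declare joda-time
     directly so it is on the compile classpath; the version is expected
     to be managed by the parent pom's dependencyManagement. -->
<dependency>
  <groupId>joda-time</groupId>
  <artifactId>joda-time</artifactId>
</dependency>
```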
2018-04-06 12:06:34,184 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Cleaning up benchmark beam_integration_benchmark
2018-04-06 12:06:34,185 3348fa82 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1523008880029> delete -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml>
2018-04-06 12:06:35,474 3348fa82 MainThread beam_integration_benchmark(1/1) ERROR Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 758, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 624, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-06 12:06:35,475 3348fa82 MainThread beam_integration_benchmark(1/1) ERROR Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
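Both tracebacks above bottom out at PerfKitBenchmarker's `assert retcode == 0` in gcp_dpb_dataflow.py: the mvn invocation returned 1, so the benchmark is marked failed. A minimal shell analogue of that run-and-assert pattern (illustrative only; PKB itself is Python, and the helper name here is ours):

```shell
# Run a command and fail loudly on a non-zero exit code, mirroring
# PKB's SubmitJob assertion on ReturnCode.
run_and_assert() {
  "$@"
  local retcode=$?
  if [ "$retcode" -ne 0 ]; then
    echo "Integration Test Failed." >&2
    return 1
  fi
  return 0
}
```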
2018-04-06 12:06:35,537 3348fa82 MainThread INFO Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-04-06 12:06:35,537 3348fa82 MainThread INFO Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/pkb.log>
2018-04-06 12:06:35,538 3348fa82 MainThread INFO Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/3348fa82/completion_statuses.json>
Build step 'Execute shell' marked build as failure
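For reference, the long run of identical `kubectl get svc` invocations earlier in the log is a poll loop: PKB retries roughly every 10 seconds until the postgres-for-dev Service reports a LoadBalancer ingress IP. A generic shell sketch of that retry pattern (the helper name and attempt budget are ours, not PKB's):

```shell
# Retry a command until it prints non-empty output or the attempt
# budget is exhausted; echo the output on success.
wait_for_output() {
  attempts=$1; shift
  i=0
  while [ "$i" -lt "$attempts" ]; do
    out=$("$@")
    if [ -n "$out" ]; then
      echo "$out"
      return 0
    fi
    i=$((i + 1))
    sleep 10
  done
  return 1
}

# Real usage would resemble the log's poll (kubeconfig path as above):
#   wait_for_output 30 kubectl --kubeconfig="$KUBECONFIG" \
#     get svc postgres-for-dev -ojsonpath='{.status.loadBalancer.ingress[0].ip}'
```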