See <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/107/display/redirect?page=changes>
Changes:

[ehudm] Add a workaround for a tox pip invocation bug.

------------------------------------------
[...truncated 24.34 KB...]
--dpb_log_level=INFO
--beam_options_config_file=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/src/.test-infra/kubernetes/postgres/pkb-config-local.yml>
--beam_it_profile=io-it
--benchmarks=beam_integration_benchmark
2018-04-06 06:00:44,933 f8eeb131 MainThread WARNING The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2018-04-06 06:00:44,934 f8eeb131 MainThread INFO Initializing the edw service decoder
2018-04-06 06:00:45,095 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Provisioning resources for benchmark beam_integration_benchmark
2018-04-06 06:00:45,099 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Preparing benchmark beam_integration_benchmark
2018-04-06 06:00:45,100 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: git clone https://github.com/apache/beam.git
2018-04-06 06:00:53,245 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> create -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml>
2018-04-06 06:00:53,702 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running benchmark beam_integration_benchmark
2018-04-06 06:00:53,724 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:01:03,958 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:01:14,105 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:01:24,306 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:01:34,449 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:01:44,590 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:01:54,736 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:02:04,877 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:02:15,054 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:02:25,190 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:02:35,317 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}
2018-04-06 06:02:35,463 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Using LoadBalancer IP Address: 35.193.227.110
2018-04-06 06:02:35,470 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: /home/jenkins/tools/maven/latest/bin/mvn -e verify -Dit.test=org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT -DskipITs=false -pl sdks/java/io/hadoop-input-format -Pio-it -Pdataflow-runner -DintegrationTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--postgresPort=5432","--numberOfRecords=600000","--postgresServerName=35.193.227.110","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresSsl=False","--runner=TestDataflowRunner"]
2018-04-06 06:03:16,954 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Ran: {/home/jenkins/tools/maven/latest/bin/mvn -e verify -Dit.test=org.apache.beam.sdk.io.hadoop.inputformat.HadoopInputFormatIOIT -DskipITs=false -pl sdks/java/io/hadoop-input-format -Pio-it -Pdataflow-runner -DintegrationTestPipelineOptions=["--tempRoot=gs://temp-storage-for-perf-tests","--project=apache-beam-testing","--postgresPort=5432","--numberOfRecords=600000","--postgresServerName=35.193.227.110","--postgresUsername=postgres","--postgresPassword=uuinkks","--postgresDatabaseName=postgres","--postgresSsl=False","--runner=TestDataflowRunner"]} ReturnCode:1
STDOUT:
[INFO] Error stacktraces are turned on.
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Detecting the operating system and CPU architecture
[INFO] ------------------------------------------------------------------------
[INFO] os.detected.name: linux
[INFO] os.detected.arch: x86_64
[INFO] os.detected.version: 4.4
[INFO] os.detected.version.major: 4
[INFO] os.detected.version.minor: 4
[INFO] os.detected.release: ubuntu
[INFO] os.detected.release.version: 14.04
[INFO] os.detected.release.like.ubuntu: true
[INFO] os.detected.release.like.debian: true
[INFO] os.detected.classifier: linux-x86_64
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Apache Beam :: SDKs :: Java :: IO :: Hadoop Input Format 2.5.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-maven-version) @ beam-sdks-java-io-hadoop-input-format ---
[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce) @ beam-sdks-java-io-hadoop-input-format ---
[WARNING] Failure to find org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced
Try downloading the file manually from the project website.
Then, install it using the command:
    mvn install:install-file -DgroupId=org.pentaho -DartifactId=pentaho-aggdesigner-algorithm -Dversion=5.1.5-jhyde -Dpackaging=jar -Dfile=/path/to/file
Alternatively, if you host your own repository you can deploy the file there:
    mvn deploy:deploy-file -DgroupId=org.pentaho -DartifactId=pentaho-aggdesigner-algorithm -Dversion=5.1.5-jhyde -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
Path to dependency:
  1) org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
  2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
  3) org.apache.hive:hive-service:jar:1.2.1
  4) org.apache.hive:hive-exec:jar:1.2.1
  5) org.apache.calcite:calcite-core:jar:1.2.0-incubating
  6) org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde
org.pentaho:pentaho-aggdesigner-algorithm:jar:5.1.5-jhyde from the specified remote repositories:
  Nexus (http://repository.apache.org/snapshots, releases=false, snapshots=true),
  central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[WARNING] Failure to find cascading:cascading-hadoop:jar:2.6.3 in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced
Try downloading the file manually from the project website.
Then, install it using the command:
    mvn install:install-file -DgroupId=cascading -DartifactId=cascading-hadoop -Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file
Alternatively, if you host your own repository you can deploy the file there:
    mvn deploy:deploy-file -DgroupId=cascading -DartifactId=cascading-hadoop -Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
Path to dependency:
  1) org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
  2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
  3) cascading:cascading-hadoop:jar:2.6.3
cascading:cascading-hadoop:jar:2.6.3 from the specified remote repositories:
  Nexus (http://repository.apache.org/snapshots, releases=false, snapshots=true),
  central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[WARNING] Failure to find cascading:cascading-local:jar:2.6.3 in https://repo.maven.apache.org/maven2 was cached in the local repository, resolution will not be reattempted until the update interval of central has elapsed or updates are forced
Try downloading the file manually from the project website.
Then, install it using the command:
    mvn install:install-file -DgroupId=cascading -DartifactId=cascading-local -Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file
Alternatively, if you host your own repository you can deploy the file there:
    mvn deploy:deploy-file -DgroupId=cascading -DartifactId=cascading-local -Dversion=2.6.3 -Dpackaging=jar -Dfile=/path/to/file -Durl=[url] -DrepositoryId=[id]
Path to dependency:
  1) org.apache.beam:beam-sdks-java-io-hadoop-input-format:jar:2.5.0-SNAPSHOT
  2) org.elasticsearch:elasticsearch-hadoop:jar:5.0.0
  3) cascading:cascading-local:jar:2.6.3
cascading:cascading-local:jar:2.6.3 from the specified remote repositories:
  Nexus (http://repository.apache.org/snapshots, releases=false, snapshots=true),
  central (https://repo.maven.apache.org/maven2, releases=true, snapshots=false)
[INFO] Adding ignore: module-info
[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M1:enforce (enforce-banned-dependencies) @ beam-sdks-java-io-hadoop-input-format ---
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (process-resource-bundles) @ beam-sdks-java-io-hadoop-input-format ---
[INFO]
[INFO] --- maven-resources-plugin:3.0.2:resources (default-resources) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/beam/sdks/java/io/hadoop-input-format/src/main/resources>
[INFO] Copying 3 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ beam-sdks-java-io-hadoop-input-format ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/beam/sdks/java/io/hadoop-input-format/target/classes>
[INFO] <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java> uses unchecked or unsafe operations.
[INFO] <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>: Recompile with -Xlint:unchecked for details.
[INFO] -------------------------------------------------------------
[ERROR] COMPILATION ERROR :
[INFO] -------------------------------------------------------------
[ERROR] <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>:[420,41] cannot access org.joda.time.Duration
  class file for org.joda.time.Duration not found
[INFO] 1 error
[INFO] -------------------------------------------------------------
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 39.747 s
[INFO] Finished at: 2018-04-06T06:03:16Z
[INFO] Final Memory: 91M/1874M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.7.0:compile (default-compile) on project beam-sdks-java-io-hadoop-input-format: Compilation failure
[ERROR] <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>:[420,41] cannot access org.joda.time.Duration
[ERROR]   class file for org.joda.time.Duration not found
[ERROR]
[ERROR] -> [Help 1]
org.apache.maven.lifecycle.LifecycleExecutionException: Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.7.0:compile (default-compile) on project beam-sdks-java-io-hadoop-input-format: Compilation failure
<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>:[420,41] cannot access org.joda.time.Duration
class file for org.joda.time.Duration not found
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:213)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
Caused by: org.apache.maven.plugin.compiler.CompilationFailureException: Compilation failure
<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/beam/sdks/java/io/hadoop-input-format/src/main/java/org/apache/beam/sdk/io/hadoop/inputformat/HadoopInputFormatIO.java>:[420,41] cannot access org.joda.time.Duration
class file for org.joda.time.Duration not found
    at org.apache.maven.plugin.compiler.AbstractCompilerMojo.execute (AbstractCompilerMojo.java:1161)
    at org.apache.maven.plugin.compiler.CompilerMojo.execute (CompilerMojo.java:168)
    at org.apache.maven.plugin.DefaultBuildPluginManager.executeMojo (DefaultBuildPluginManager.java:134)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:208)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:154)
    at org.apache.maven.lifecycle.internal.MojoExecutor.execute (MojoExecutor.java:146)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:117)
    at org.apache.maven.lifecycle.internal.LifecycleModuleBuilder.buildProject (LifecycleModuleBuilder.java:81)
    at org.apache.maven.lifecycle.internal.builder.singlethreaded.SingleThreadedBuilder.build (SingleThreadedBuilder.java:51)
    at org.apache.maven.lifecycle.internal.LifecycleStarter.execute (LifecycleStarter.java:128)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:309)
    at org.apache.maven.DefaultMaven.doExecute (DefaultMaven.java:194)
    at org.apache.maven.DefaultMaven.execute (DefaultMaven.java:107)
    at org.apache.maven.cli.MavenCli.execute (MavenCli.java:955)
    at org.apache.maven.cli.MavenCli.doMain (MavenCli.java:290)
    at org.apache.maven.cli.MavenCli.main (MavenCli.java:194)
    at sun.reflect.NativeMethodAccessorImpl.invoke0 (Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke (NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke (DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke (Method.java:498)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launchEnhanced (Launcher.java:289)
    at org.codehaus.plexus.classworlds.launcher.Launcher.launch (Launcher.java:229)
    at org.codehaus.plexus.classworlds.launcher.Launcher.mainWithExitCode (Launcher.java:415)
    at org.codehaus.plexus.classworlds.launcher.Launcher.main (Launcher.java:356)
[ERROR]
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
STDERR:
2018-04-06 06:03:16,955 f8eeb131 MainThread beam_integration_benchmark(1/1) ERROR Error during benchmark beam_integration_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 624, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
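The STDERR traceback above shows how PerfKitBenchmarker converts the Maven exit code into a benchmark failure: `SubmitJob` asserts that the child process returned 0, and the resulting `AssertionError` is what the harness records as a failed run. A minimal sketch of that pattern, with an illustrative wrapper function (not PKB's actual API):

```python
import subprocess

def run_and_check(cmd):
    """Run an integration-test command; fail the run on a nonzero return code.

    Sketch of the pattern in the traceback above: PKB's SubmitJob asserts
    retcode == 0, so any Maven failure (like the compilation error here)
    surfaces as AssertionError("Integration Test Failed.").
    """
    proc = subprocess.run(cmd, capture_output=True, text=True)
    assert proc.returncode == 0, "Integration Test Failed."
    return proc.stdout
```

Because the assertion fires on any nonzero exit code, the benchmark status is FAILED even though the root cause (here, a missing `org.joda.time.Duration` class file at compile time) is only visible in the captured Maven output.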
2018-04-06 06:03:16,956 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Cleaning up benchmark beam_integration_benchmark
2018-04-06 06:03:16,956 f8eeb131 MainThread beam_integration_benchmark(1/1) INFO Running: kubectl --kubeconfig=<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/config-hadoopinputformatioit-1522990924665> delete -f <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/src/.test-infra/kubernetes/postgres/postgres-service-for-local-dev.yml>
2018-04-06 06:03:17,587 f8eeb131 MainThread beam_integration_benchmark(1/1) ERROR Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 758, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 624, in RunBenchmark
    DoRunPhase(spec, collector, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 527, in DoRunPhase
    samples = spec.BenchmarkRun(spec)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/linux_benchmarks/beam_integration_benchmark.py>", line 159, in Run
    job_type=job_type)
  File "<https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataflow.py>", line 90, in SubmitJob
    assert retcode == 0, "Integration Test Failed."
AssertionError: Integration Test Failed.
2018-04-06 06:03:17,587 f8eeb131 MainThread beam_integration_benchmark(1/1) ERROR Benchmark 1/1 beam_integration_benchmark (UID: beam_integration_benchmark0) failed. Execution will continue.
2018-04-06 06:03:17,631 f8eeb131 MainThread INFO Benchmark run statuses:
---------------------------------------------------------------------------------
Name                        UID                          Status  Failed Substatus
---------------------------------------------------------------------------------
beam_integration_benchmark  beam_integration_benchmark0  FAILED
---------------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2018-04-06 06:03:17,632 f8eeb131 MainThread INFO Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/pkb.log>
2018-04-06 06:03:17,634 f8eeb131 MainThread INFO Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_HadoopInputFormat/ws/runs/f8eeb131/completion_statuses.json>
Build step 'Execute shell' marked build as failure
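Earlier in the log, the harness re-runs `kubectl get svc postgres-for-dev -ojsonpath={.status.loadBalancer.ingress[0].ip}` roughly every ten seconds from 06:00:53 until 06:02:35, when the LoadBalancer IP 35.193.227.110 finally appears. That is a poll-until-nonempty loop; a sketch of the idea, where `get_ip` stands in for running the kubectl command (function name, timeout, and interval are illustrative, not PerfKitBenchmarker's API):

```python
import time

def wait_for_load_balancer_ip(get_ip, timeout=600, interval=10):
    """Poll until get_ip() returns a non-empty string.

    get_ip stands in for the kubectl jsonpath query seen in the log, which
    prints nothing until the cloud provider assigns an external IP to the
    Service's LoadBalancer ingress.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        ip = get_ip().strip()
        if ip:
            return ip
        time.sleep(interval)
    raise TimeoutError("LoadBalancer ingress IP was never assigned")
```

In this build the polling succeeded (the IP was found after about 100 seconds); the failure came later, in the Maven compile step.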