See <https://builds.apache.org/job/beam_PerformanceTests_Spark/1882/display/redirect>
------------------------------------------
[...truncated 20.39 KB...]
--dpb_log_level=INFO --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc --k8s_get_retry_count=36 --benchmarks=dpb_wordcount_benchmark
2018-06-26 13:28:53,006 5b3ed93d MainThread WARNING The key "flags" was not in the default config, but was in user overrides. This may indicate a typo.
2018-06-26 13:28:53,157 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Provisioning resources for benchmark dpb_wordcount_benchmark
2018-06-26 13:28:53,159 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Running: gcloud dataproc clusters create pkb-5b3ed93d --format json --quiet --num-workers 2 --worker-machine-type n1-standard-1 --worker-boot-disk-size 500 --master-machine-type n1-standard-1 --master-boot-disk-size 500
2018-06-26 13:30:50,538 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Running: gcloud dataproc clusters describe pkb-5b3ed93d --format json --quiet
2018-06-26 13:30:51,266 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Preparing benchmark dpb_wordcount_benchmark
2018-06-26 13:30:51,267 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Running benchmark dpb_wordcount_benchmark
2018-06-26 13:30:51,268 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Running: gcloud dataproc jobs submit spark --format json --quiet --cluster pkb-5b3ed93d --jars file:///usr/lib/spark/examples/jars/spark-examples.jar --class org.apache.spark.examples.JavaWordCount --driver-log-levels root=INFO -- gs:///etc/hosts
2018-06-26 13:31:24,692 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Ran: {gcloud dataproc jobs submit spark --format json --quiet --cluster pkb-5b3ed93d --jars file:///usr/lib/spark/examples/jars/spark-examples.jar --class org.apache.spark.examples.JavaWordCount --driver-log-levels root=INFO -- gs:///etc/hosts} ReturnCode:1
STDOUT:
STDERR: Job [8ff80af6-64b9-4321-912b-846e62f5bf4b] submitted.
Waiting for job output...
2018-06-26 13:30:55 INFO SparkContext:54 - Running Spark version 2.2.1
2018-06-26 13:30:56 INFO SparkContext:54 - Submitted application: JavaWordCount
2018-06-26 13:30:56 INFO SecurityManager:54 - Changing view acls to: root
2018-06-26 13:30:56 INFO SecurityManager:54 - Changing modify acls to: root
2018-06-26 13:30:56 INFO SecurityManager:54 - Changing view acls groups to:
2018-06-26 13:30:56 INFO SecurityManager:54 - Changing modify acls groups to:
2018-06-26 13:30:56 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2018-06-26 13:30:57 INFO Utils:54 - Successfully started service 'sparkDriver' on port 43511.
2018-06-26 13:30:57 INFO SparkEnv:54 - Registering MapOutputTracker
2018-06-26 13:30:57 INFO SparkEnv:54 - Registering BlockManagerMaster
2018-06-26 13:30:57 INFO BlockManagerMasterEndpoint:54 - Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
2018-06-26 13:30:57 INFO BlockManagerMasterEndpoint:54 - BlockManagerMasterEndpoint up
2018-06-26 13:30:57 INFO DiskBlockManager:54 - Created local directory at /hadoop/spark/tmp/blockmgr-848c4754-ff6e-4f0e-9d17-6646a1f74bd8
2018-06-26 13:30:57 INFO MemoryStore:54 - MemoryStore started with capacity 376.8 MB
2018-06-26 13:30:57 INFO SparkEnv:54 - Registering OutputCommitCoordinator
2018-06-26 13:30:58 INFO log:192 - Logging initialized @4776ms
2018-06-26 13:30:58 INFO Server:345 - jetty-9.3.z-SNAPSHOT
2018-06-26 13:30:58 INFO Server:403 - Started @4958ms
2018-06-26 13:30:58 INFO AbstractConnector:270 - Started ServerConnector@2440b67d{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-06-26 13:30:58 INFO Utils:54 - Successfully started service 'SparkUI' on port 4040.
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6e4ea0bd{/jobs,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3935e9a8{/jobs/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5b56b654{/jobs/job,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@29006752{/jobs/job/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@66d57c1b{/stages,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@d59970a{/stages/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@53b98ff6{/stages/stage,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@76adb233{/stages/stage/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@36453307{/stages/pool,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@66eb985d{/stages/pool/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@75504cef{/storage,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@56193c7d{/storage/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@5f8890c2{/storage/rdd,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7f9e1534{/storage/rdd/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@81ff872{/environment,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3e598df9{/environment/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@99a65d3{/executors,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@42cc13a0{/executors/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@6813a331{/executors/threadDump,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@39ab59f8{/executors/threadDump/json,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@111610e6{/static,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@38f2e97e{/,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@323659f8{/api,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3cc20577{/jobs/job/kill,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@775594f2{/stages/stage/kill,null,AVAILABLE,@Spark}
2018-06-26 13:30:58 INFO SparkUI:54 - Bound SparkUI to 0.0.0.0, and started at http://10.128.0.52:4040
2018-06-26 13:30:58 INFO SparkContext:54 - Added JAR file:/usr/lib/hadoop/hadoop-common.jar at spark://10.128.0.52:43511/jars/hadoop-common.jar with timestamp 1530019858438
2018-06-26 13:30:58 INFO Utils:54 - Using initial executors = 10000, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
2018-06-26 13:30:59 INFO GoogleHadoopFileSystemBase:686 - GHFS version: 1.6.7-hadoop2
2018-06-26 13:31:00 INFO RMProxy:123 - Connecting to ResourceManager at pkb-5b3ed93d-m/10.128.0.52:8032
2018-06-26 13:31:01 INFO Client:54 - Requesting a new application from cluster with 2 NodeManagers
2018-06-26 13:31:01 INFO Client:54 - Verifying our application has not requested more than the maximum memory capability of the cluster (3072 MB per container)
2018-06-26 13:31:01 INFO Client:54 - Will allocate AM container, with 1024 MB memory including 384 MB overhead
2018-06-26 13:31:01 INFO Client:54 - Setting up container launch context for our AM
2018-06-26 13:31:01 INFO Client:54 - Setting up the launch environment for our AM container
2018-06-26 13:31:01 INFO Client:54 - Preparing resources for our AM container
2018-06-26 13:31:03 INFO Client:54 - Uploading resource file:/usr/lib/spark/examples/jars/spark-examples.jar -> hdfs://pkb-5b3ed93d-m/user/root/.sparkStaging/application_1530019789314_0001/spark-examples.jar
2018-06-26 13:31:04 INFO Client:54 - Uploading resource file:/hadoop/spark/tmp/spark-5d302b60-e57b-493a-be77-7ae83ca1f71b/__spark_conf__8085077595713877624.zip -> hdfs://pkb-5b3ed93d-m/user/root/.sparkStaging/application_1530019789314_0001/__spark_conf__.zip
2018-06-26 13:31:05 INFO SecurityManager:54 - Changing view acls to: root
2018-06-26 13:31:05 INFO SecurityManager:54 - Changing modify acls to: root
2018-06-26 13:31:05 INFO SecurityManager:54 - Changing view acls groups to:
2018-06-26 13:31:05 INFO SecurityManager:54 - Changing modify acls groups to:
2018-06-26 13:31:05 INFO SecurityManager:54 - SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(root); groups with view permissions: Set(); users with modify permissions: Set(root); groups with modify permissions: Set()
2018-06-26 13:31:05 INFO Client:54 - Submitting application application_1530019789314_0001 to ResourceManager
2018-06-26 13:31:05 INFO YarnClientImpl:278 - Submitted application application_1530019789314_0001
2018-06-26 13:31:05 INFO SchedulerExtensionServices:54 - Starting Yarn extension services with app application_1530019789314_0001 and attemptId None
2018-06-26 13:31:06 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:06 INFO Client:54 - client token: N/A
	 diagnostics: [Tue Jun 26 13:31:05 +0000 2018] Scheduler has assigned a container for AM, waiting for AM container to be launched
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1530019865263
	 final status: UNDEFINED
	 tracking URL: http://pkb-5b3ed93d-m:8088/proxy/application_1530019789314_0001/
	 user: root
2018-06-26 13:31:07 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:08 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:09 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:10 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:11 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:12 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:13 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:14 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:15 INFO Client:54 - Application report for application_1530019789314_0001 (state: ACCEPTED)
2018-06-26 13:31:15 INFO YarnSchedulerBackend$YarnSchedulerEndpoint:54 - ApplicationMaster registered as NettyRpcEndpointRef(spark-client://YarnAM)
2018-06-26 13:31:15 INFO YarnClientSchedulerBackend:54 - Add WebUI Filter. org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter, Map(PROXY_HOSTS -> pkb-5b3ed93d-m, PROXY_URI_BASES -> http://pkb-5b3ed93d-m:8088/proxy/application_1530019789314_0001), /proxy/application_1530019789314_0001
2018-06-26 13:31:15 INFO JettyUtils:54 - Adding filter: org.apache.hadoop.yarn.server.webproxy.amfilter.AmIpFilter
2018-06-26 13:31:16 INFO Client:54 - Application report for application_1530019789314_0001 (state: RUNNING)
2018-06-26 13:31:16 INFO Client:54 - client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: 10.128.0.51
	 ApplicationMaster RPC port: 0
	 queue: default
	 start time: 1530019865263
	 final status: UNDEFINED
	 tracking URL: http://pkb-5b3ed93d-m:8088/proxy/application_1530019789314_0001/
	 user: root
2018-06-26 13:31:16 INFO YarnClientSchedulerBackend:54 - Application application_1530019789314_0001 has started running.
2018-06-26 13:31:16 INFO Utils:54 - Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 36416.
2018-06-26 13:31:16 INFO NettyBlockTransferService:54 - Server created on 10.128.0.52:36416
2018-06-26 13:31:16 INFO BlockManager:54 - Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
2018-06-26 13:31:16 INFO BlockManagerMaster:54 - Registering BlockManager BlockManagerId(driver, 10.128.0.52, 36416, None)
2018-06-26 13:31:16 INFO BlockManagerMasterEndpoint:54 - Registering block manager 10.128.0.52:36416 with 376.8 MB RAM, BlockManagerId(driver, 10.128.0.52, 36416, None)
2018-06-26 13:31:16 INFO BlockManagerMaster:54 - Registered BlockManager BlockManagerId(driver, 10.128.0.52, 36416, None)
2018-06-26 13:31:16 INFO BlockManager:54 - external shuffle service port = 7337
2018-06-26 13:31:16 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, 10.128.0.52, 36416, None)
2018-06-26 13:31:17 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7708b66a{/metrics/json,null,AVAILABLE,@Spark}
2018-06-26 13:31:17 INFO EventLoggingListener:54 - Logging events to hdfs://pkb-5b3ed93d-m/user/spark/eventlog/application_1530019789314_0001
2018-06-26 13:31:17 INFO Utils:54 - Using initial executors = 10000, max of spark.dynamicAllocation.initialExecutors, spark.dynamicAllocation.minExecutors and spark.executor.instances
2018-06-26 13:31:17 INFO YarnClientSchedulerBackend:54 - SchedulerBackend is ready for scheduling beginning after reached minRegisteredResourcesRatio: 0.0
2018-06-26 13:31:17 INFO SharedState:54 - loading hive config file: file:/etc/hive/conf.dist/hive-site.xml
2018-06-26 13:31:17 INFO SharedState:54 - Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/tmp/8ff80af6-64b9-4321-912b-846e62f5bf4b/spark-warehouse').
2018-06-26 13:31:18 INFO SharedState:54 - Warehouse path is 'file:/tmp/8ff80af6-64b9-4321-912b-846e62f5bf4b/spark-warehouse'.
2018-06-26 13:31:18 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@538b3c88{/SQL,null,AVAILABLE,@Spark}
2018-06-26 13:31:18 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@10e56da9{/SQL/json,null,AVAILABLE,@Spark}
2018-06-26 13:31:18 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@3b3056a6{/SQL/execution,null,AVAILABLE,@Spark}
2018-06-26 13:31:18 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@51d8f2f2{/SQL/execution/json,null,AVAILABLE,@Spark}
2018-06-26 13:31:18 INFO ContextHandler:781 - Started o.s.j.s.ServletContextHandler@7cdb7fc{/static/sql,null,AVAILABLE,@Spark}
2018-06-26 13:31:19 INFO StateStoreCoordinatorRef:54 - Registered StateStoreCoordinator endpoint
2018-06-26 13:31:20 WARN GoogleHadoopFileSystemBase:77 - GHFS.configureBuckets: Warning. No GCS bucket provided. Falling back on deprecated fs.gs.system.bucket.
Exception in thread "main" org.apache.spark.sql.AnalysisException: Path does not exist: gs://dataproc-6c5fbcbb-a2de-406e-9cf7-8c1ce0b6a604-us/etc/hosts;
	at org.apache.spark.sql.execution.datasources.DataSource$.org$apache$spark$sql$execution$datasources$DataSource$$checkAndGlobPathIfNecessary(DataSource.scala:626)
	at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:350)
	at org.apache.spark.sql.execution.datasources.DataSource$$anonfun$14.apply(DataSource.scala:350)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.TraversableLike$$anonfun$flatMap$1.apply(TraversableLike.scala:241)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at scala.collection.TraversableLike$class.flatMap(TraversableLike.scala:241)
	at scala.collection.immutable.List.flatMap(List.scala:344)
	at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:349)
	at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
	at org.apache.spark.sql.DataFrameReader.text(DataFrameReader.scala:623)
	at org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:657)
	at org.apache.spark.sql.DataFrameReader.textFile(DataFrameReader.scala:632)
	at org.apache.spark.examples.JavaWordCount.main(JavaWordCount.java:45)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:775)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:180)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:205)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:119)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
2018-06-26 13:31:21 INFO SparkContext:54 - Invoking stop() from shutdown hook
2018-06-26 13:31:21 INFO AbstractConnector:310 - Stopped Spark@2440b67d{HTTP/1.1,[http/1.1]}{0.0.0.0:4040}
2018-06-26 13:31:21 INFO SparkUI:54 - Stopped Spark web UI at http://10.128.0.52:4040
2018-06-26 13:31:21 INFO YarnClientSchedulerBackend:54 - Interrupting monitor thread
2018-06-26 13:31:21 INFO YarnClientSchedulerBackend:54 - Shutting down all executors
2018-06-26 13:31:21 INFO YarnSchedulerBackend$YarnDriverEndpoint:54 - Asking each executor to shut down
2018-06-26 13:31:21 INFO SchedulerExtensionServices:54 - Stopping SchedulerExtensionServices (serviceOption=None, services=List(), started=false)
2018-06-26 13:31:21 INFO YarnClientSchedulerBackend:54 - Stopped
2018-06-26 13:31:21 INFO MapOutputTrackerMasterEndpoint:54 - MapOutputTrackerMasterEndpoint stopped!
2018-06-26 13:31:21 INFO MemoryStore:54 - MemoryStore cleared
2018-06-26 13:31:21 INFO BlockManager:54 - BlockManager stopped
2018-06-26 13:31:21 INFO BlockManagerMaster:54 - BlockManagerMaster stopped
2018-06-26 13:31:21 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint:54 - OutputCommitCoordinator stopped!
2018-06-26 13:31:21 INFO SparkContext:54 - Successfully stopped SparkContext
2018-06-26 13:31:21 INFO ShutdownHookManager:54 - Shutdown hook called
2018-06-26 13:31:21 INFO ShutdownHookManager:54 - Deleting directory /hadoop/spark/tmp/spark-5d302b60-e57b-493a-be77-7ae83ca1f71b
ERROR: (gcloud.dataproc.jobs.submit.spark) Job [8ff80af6-64b9-4321-912b-846e62f5bf4b] entered state [ERROR] while waiting for [DONE].
2018-06-26 13:31:24,695 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Cleaning up benchmark dpb_wordcount_benchmark
2018-06-26 13:31:24,695 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Tearing down resources for benchmark dpb_wordcount_benchmark
2018-06-26 13:31:24,695 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Running: gcloud dataproc clusters delete pkb-5b3ed93d --format json --quiet
2018-06-26 13:33:01,542 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Running: gcloud dataproc clusters describe pkb-5b3ed93d --format json --quiet
2018-06-26 13:33:02,229 5b3ed93d MainThread dpb_wordcount_benchmark(1/1) INFO Ran: {gcloud dataproc clusters describe pkb-5b3ed93d --format json --quiet} ReturnCode:1
STDOUT:
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: Cluster projects/apache-beam-testing/regions/global/clusters/pkb-5b3ed93d
2018-06-26 13:33:02,260 5b3ed93d MainThread INFO Publishing 2 samples to <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5b3ed93d/perfkit-bq-pubMgwSVz.json>
2018-06-26 13:33:02,261 5b3ed93d MainThread INFO Publishing 2 samples to beam_performance.spark_pkp_results
2018-06-26 13:33:02,261 5b3ed93d MainThread INFO Running: bq load --autodetect --source_format=NEWLINE_DELIMITED_JSON beam_performance.spark_pkp_results <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5b3ed93d/perfkit-bq-pubMgwSVz.json>
2018-06-26 13:33:07,114 5b3ed93d MainThread INFO
-------------------------PerfKitBenchmarker Complete Results-------------------------
{'metadata': {'dpb_cluster_id': 'pkb-5b3ed93d', 'dpb_cluster_shape': 'n1-standard-1', 'dpb_cluster_size': 2, 'dpb_service': 'dataproc', 'input_location': 'gs:///etc/hosts', 'perfkitbenchmarker_version': 'v1.12.0-691-g79e6328', 'run_number': 0}, 'metric': 'run_time', 'official': True, 'owner': 'jenkins', 'product_name': 'PerfKitBenchmarker', 'run_uri': '5b3ed93d-8e2c5640-380b-49a4-a900-b297b921dcaf', 'sample_uri': 'e7912eb5-97e7-426f-8638-8647b7695359', 'test': 'dpb_wordcount_benchmark', 'timestamp': 1530019884.694425, 'unit': 'seconds', 'value': 33.426533}
{'metadata': {'perfkitbenchmarker_version': 'v1.12.0-691-g79e6328'}, 'metric': 'End to End Runtime', 'official': True, 'owner': 'jenkins', 'product_name': 'PerfKitBenchmarker', 'run_uri': '5b3ed93d-8e2c5640-380b-49a4-a900-b297b921dcaf', 'sample_uri': '8d748585-95df-4ae0-af37-05c3efa88bb2', 'test': 'dpb_wordcount_benchmark', 'timestamp': 1530019982.230728, 'unit': 'seconds', 'value': 249.07340097427368}

-------------------------PerfKitBenchmarker Results Summary-------------------------
DPB_WORDCOUNT_BENCHMARK:
  dpb_cluster_id="pkb-5b3ed93d" dpb_cluster_shape="n1-standard-1" dpb_cluster_size="2" dpb_service="dataproc" input_location="gs:///etc/hosts" run_number="0"
  run_time             33.426533 seconds
  End to End Runtime  249.073401 seconds
-------------------------
For all tests:
  perfkitbenchmarker_version="v1.12.0-691-g79e6328"
2018-06-26 13:33:07,115 5b3ed93d MainThread INFO Publishing 2 samples to <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5b3ed93d/perfkitbenchmarker_results.json>
2018-06-26 13:33:07,115 5b3ed93d MainThread INFO Benchmark run statuses:
------------------------------------------------------------------------------
Name                     UID                       Status     Failed Substatus
------------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  SUCCEEDED
------------------------------------------------------------------------------
Success rate: 100.00% (1/1)
2018-06-26 13:33:07,115 5b3ed93d MainThread INFO Complete logs can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5b3ed93d/pkb.log>
2018-06-26 13:33:07,116 5b3ed93d MainThread INFO Completion statuses can be found at: <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5b3ed93d/completion_statuses.json>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins397550869953959357.sh
+ .env/bin/python src.test-infra/jenkins/verify_performance_tests_results.py
.env/bin/python: can't open file 'src.test-infra/jenkins/verify_performance_tests_results.py': [Errno 2] No such file or directory
Build step 'Execute shell' marked build as failure
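Note on the Spark job failure above: the benchmark passed gs:///etc/hosts (an empty bucket name) as the wordcount input, so the GCS connector fell back on the deprecated fs.gs.system.bucket and resolved the path against the Dataproc staging bucket, where /etc/hosts does not exist, producing the AnalysisException. A hedged sketch of the same submission with a fully qualified input path; the bucket name my-input-bucket is hypothetical and the input would have to be staged there first:

    # Stage the input file to a real GCS bucket (bucket name is an assumption), then resubmit.
    gsutil cp /etc/hosts gs://my-input-bucket/etc/hosts
    gcloud dataproc jobs submit spark --format json --quiet --cluster pkb-5b3ed93d \
        --jars file:///usr/lib/spark/examples/jars/spark-examples.jar \
        --class org.apache.spark.examples.JavaWordCount \
        --driver-log-levels root=INFO -- gs://my-input-bucket/etc/hosts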
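Note on the final shell-step failure: the interpreter could not find src.test-infra/jenkins/verify_performance_tests_results.py. A hedged sketch of what the step may have intended, assuming the script lives under the Beam checkout at src/.test-infra/jenkins/ and that the missing path separator between src and .test-infra is the cause; neither is confirmed by this log:

    # Hypothetical corrected invocation; the path is an assumption.
    .env/bin/python src/.test-infra/jenkins/verify_performance_tests_results.py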
