See <https://builds.apache.org/job/beam_PerformanceTests_Spark/3188/display/redirect?page=changes>

Changes:

[robertwb] [BEAM-7364] Avoid possible signed integer overflow in hash.

[altay] [BEAM-6695] Latest PTransform for Python SDK (#8206)

[altay] [BEAM-7354] Starcgen fix when no identifiers specified. (#8611)

------------------------------------------
Started by timer
[EnvInject] - Loading node environment variables.
Building remotely on apache-beam-jenkins-14 (beam) in workspace <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
No credentials specified
Cloning the remote Git repository
Cloning repository https://github.com/apache/beam.git
 > git init <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git --version # timeout=10
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/*
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url https://github.com/apache/beam.git # timeout=10
Fetching upstream changes from https://github.com/apache/beam.git
 > git fetch --tags --progress https://github.com/apache/beam.git +refs/heads/*:refs/remotes/origin/* +refs/pull/${ghprbPullId}/*:refs/remotes/origin/pr/${ghprbPullId}/*
 > git rev-parse origin/master^{commit} # timeout=10
Checking out Revision 938ee3e805b40fdd5b1712a8bda9c9e0623fa470 (origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 938ee3e805b40fdd5b1712a8bda9c9e0623fa470
Commit message: "[BEAM-7354] Starcgen fix when no identifiers specified. (#8611)"
 > git rev-list --no-walk a71f305402efe050c9dcf5ef305141a66efb2953 # timeout=10
Cleaning workspace
 > git rev-parse --verify HEAD # timeout=10
Resetting working tree
 > git reset --hard # timeout=10
 > git clean -fdx # timeout=10
[EnvInject] - Executing scripts and injecting environment variables after the SCM step.
[EnvInject] - Injecting as environment variables the properties content 
SPARK_LOCAL_IP=127.0.0.1

[EnvInject] - Variables injected successfully.
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins9105363418224454324.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins7926813998748061563.sh
+ rm -rf <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env>
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins3065737733774763476.sh
+ virtualenv <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env> --python=python2.7
New python executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python2.7>
Also creating executable in <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python>
Installing setuptools, pip, wheel...
done.
Running virtualenv with interpreter /usr/bin/python2.7
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins2884389708385235710.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install --upgrade setuptools pip
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7.
Requirement already up-to-date: setuptools in ./env/.perfkit_env/lib/python2.7/site-packages (41.0.1)
Requirement already up-to-date: pip in ./env/.perfkit_env/lib/python2.7/site-packages (19.1.1)
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins1707758000261342209.sh
+ git clone https://github.com/GoogleCloudPlatform/PerfKitBenchmarker.git <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>
Cloning into '<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker>'...
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins6477954645871488847.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/pip> install -r <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
DEPRECATION: Python 2.7 will reach the end of its life on January 1st, 2020. 
Please upgrade your Python as Python 2.7 won't be maintained after that date. A 
future version of pip will drop support for Python 2.7.
Collecting absl-py (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 14))
  Downloading 
https://files.pythonhosted.org/packages/da/3f/9b0355080b81b15ba6a9ffcf1f5ea39e307a2778b2f2dc8694724e8abd5b/absl-py-0.7.1.tar.gz
 (99kB)
Collecting jinja2>=2.7 (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 15))
  Using cached 
https://files.pythonhosted.org/packages/1d/e7/fd8b501e7a6dfe492a433deb7b9d833d39ca74916fa8bc63dd1a4947a671/Jinja2-2.10.1-py2.py3-none-any.whl
Requirement already satisfied: setuptools in 
./env/.perfkit_env/lib/python2.7/site-packages (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 16)) (41.0.1)
Collecting colorlog[windows]==2.6.0 (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 17))
  Downloading 
https://files.pythonhosted.org/packages/59/1a/46a1bf2044ad8b30b52fed0f389338c85747e093fe7f51a567f4cb525892/colorlog-2.6.0-py2.py3-none-any.whl
Collecting blinker>=1.3 (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 18))
  Downloading 
https://files.pythonhosted.org/packages/1b/51/e2a9f3b757eb802f61dc1f2b09c8c99f6eb01cf06416c0671253536517b6/blinker-1.4.tar.gz
 (111kB)
Collecting futures>=3.0.3 (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 19))
  Using cached 
https://files.pythonhosted.org/packages/2d/99/b2c4e9d5a30f6471e410a146232b4118e697fa3ffc06d6a65efde84debd0/futures-3.2.0-py2-none-any.whl
Collecting PyYAML==3.12 (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 20))
  Downloading 
https://files.pythonhosted.org/packages/4a/85/db5a2df477072b2902b0eb892feb37d88ac635d36245a72a6a69b23b383a/PyYAML-3.12.tar.gz
 (253kB)
Collecting pint>=0.7 (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 21))
  Downloading 
https://files.pythonhosted.org/packages/15/9d/bf177ebbc57d25e9e296addc14a1303d1e34d7964af5df428a8332349c42/Pint-0.9-py2.py3-none-any.whl
 (138kB)
Collecting numpy==1.13.3 (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 22))
  Downloading 
https://files.pythonhosted.org/packages/eb/be/737f3df5806192ac4096e549e48c8c76cfaa2fb880a1c62a7bb085adaa9b/numpy-1.13.3-cp27-cp27mu-manylinux1_x86_64.whl
 (16.6MB)
Collecting functools32 (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 23))
  Downloading 
https://files.pythonhosted.org/packages/c5/60/6ac26ad05857c601308d8fb9e87fa36d0ebf889423f47c3502ef034365db/functools32-3.2.3-2.tar.gz
Collecting contextlib2>=0.5.1 (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 24))
  Downloading 
https://files.pythonhosted.org/packages/a2/71/8273a7eeed0aff6a854237ab5453bc9aa67deb49df4832801c21f0ff3782/contextlib2-0.5.5-py2.py3-none-any.whl
Collecting pywinrm (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Downloading 
https://files.pythonhosted.org/packages/0d/12/13a3117bbd2230043aa32dcfa2198c33269665eaa1a8fa26174ce49b338f/pywinrm-0.3.0-py2.py3-none-any.whl
Collecting timeout-decorator (from -r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 26))
  Downloading 
https://files.pythonhosted.org/packages/07/1c/0d9adcb848f1690f3253dcb1c1557b6cf229a93e724977cb83f266cbd0ae/timeout-decorator-0.4.1.tar.gz
Collecting six (from absl-py->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 14))
  Using cached 
https://files.pythonhosted.org/packages/73/fb/00a976f728d0d1fecfe898238ce23f502a721c0ac0ecfedb80e0d88c64e9/six-1.12.0-py2.py3-none-any.whl
Collecting enum34 (from absl-py->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 14))
  Using cached 
https://files.pythonhosted.org/packages/c5/db/e56e6b4bbac7c4a06de1c50de6fe1ef3810018ae11732a50f15f62c7d050/enum34-1.1.6-py2-none-any.whl
Collecting MarkupSafe>=0.23 (from jinja2>=2.7->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 15))
  Using cached 
https://files.pythonhosted.org/packages/fb/40/f3adb7cf24a8012813c5edb20329eb22d5d8e2a0ecf73d21d6b85865da11/MarkupSafe-1.1.1-cp27-cp27mu-manylinux1_x86_64.whl
Collecting colorama; extra == "windows" (from colorlog[windows]==2.6.0->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 17))
  Downloading 
https://files.pythonhosted.org/packages/4f/a6/728666f39bfff1719fc94c481890b2106837da9318031f71a8424b662e12/colorama-0.4.1-py2.py3-none-any.whl
Collecting funcsigs; python_version == "2.7" (from pint>=0.7->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 21))
  Using cached 
https://files.pythonhosted.org/packages/69/cb/f5be453359271714c01b9bd06126eaf2e368f1fddfff30818754b5ac2328/funcsigs-1.0.2-py2.py3-none-any.whl
Collecting requests-ntlm>=0.3.0 (from pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Downloading 
https://files.pythonhosted.org/packages/03/4b/8b9a1afde8072c4d5710d9fa91433d504325821b038e00237dc8d6d833dc/requests_ntlm-1.1.0-py2.py3-none-any.whl
Collecting requests>=2.9.1 (from pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Using cached 
https://files.pythonhosted.org/packages/51/bd/23c926cd341ea6b7dd0b2a00aba99ae0f828be89d72b2190f27c11d4b7fb/requests-2.22.0-py2.py3-none-any.whl
Collecting xmltodict (from pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Downloading 
https://files.pythonhosted.org/packages/28/fd/30d5c1d3ac29ce229f6bdc40bbc20b28f716e8b363140c26eff19122d8a5/xmltodict-0.12.0-py2.py3-none-any.whl
Collecting cryptography>=1.3 (from requests-ntlm>=0.3.0->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Downloading 
https://files.pythonhosted.org/packages/c3/c1/cf8665c955c9393e9ff0872ba6cd3dc6f46ef915e94afcf6e0410508ca69/cryptography-2.6.1-cp27-cp27mu-manylinux1_x86_64.whl
 (2.3MB)
Collecting ntlm-auth>=1.0.2 (from requests-ntlm>=0.3.0->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Downloading 
https://files.pythonhosted.org/packages/d5/3d/1c54e92f62bbc747a638da94adb439f99dc2d2f3041fe41a06b0da4f2808/ntlm_auth-1.3.0-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1 (from 
requests>=2.9.1->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Using cached 
https://files.pythonhosted.org/packages/39/ec/d93dfc69617a028915df914339ef66936ea976ef24fa62940fd86ba0326e/urllib3-1.25.2-py2.py3-none-any.whl
Collecting certifi>=2017.4.17 (from requests>=2.9.1->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Using cached 
https://files.pythonhosted.org/packages/60/75/f692a584e85b7eaba0e03827b3d51f45f571c2e793dd731e598828d380aa/certifi-2019.3.9-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2 (from requests>=2.9.1->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Using cached 
https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5 (from requests>=2.9.1->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Using cached 
https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting cffi!=1.11.3,>=1.8 (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Downloading 
https://files.pythonhosted.org/packages/8d/e9/0c8afd1579e5cf7bc0f06fbcd7cdb954cbc0baadd505973949a99337da1c/cffi-1.12.3-cp27-cp27mu-manylinux1_x86_64.whl
 (415kB)
Collecting asn1crypto>=0.21.0 (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Downloading 
https://files.pythonhosted.org/packages/ea/cd/35485615f45f30a510576f1a56d1e0a7ad7bd8ab5ed7cdc600ef7cd06222/asn1crypto-0.24.0-py2.py3-none-any.whl
 (101kB)
Collecting ipaddress; python_version < "3" (from 
cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Downloading 
https://files.pythonhosted.org/packages/fc/d0/7fc3a811e011d4b388be48a0e381db8d990042df54aa4ef4599a31d39853/ipaddress-1.0.22-py2.py3-none-any.whl
Collecting pycparser (from 
cffi!=1.11.3,>=1.8->cryptography>=1.3->requests-ntlm>=0.3.0->pywinrm->-r 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/requirements.txt>
 (line 25))
  Downloading 
https://files.pythonhosted.org/packages/68/9e/49196946aee219aead1290e00d1e7fdeab8567783e83e1b9ab5585e6206a/pycparser-2.19.tar.gz
 (158kB)
Building wheels for collected packages: absl-py, blinker, PyYAML, functools32, 
timeout-decorator, pycparser
  Building wheel for absl-py (setup.py): started
  Building wheel for absl-py (setup.py): finished with status 'done'
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/ee/98/38/46cbcc5a93cfea5492d19c38562691ddb23b940176c14f7b48
  Building wheel for blinker (setup.py): started
  Building wheel for blinker (setup.py): finished with status 'done'
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/92/a0/00/8690a57883956a301d91cf4ec999cc0b258b01e3f548f86e89
  Building wheel for PyYAML (setup.py): started
  Building wheel for PyYAML (setup.py): finished with status 'done'
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/03/05/65/bdc14f2c6e09e82ae3e0f13d021e1b6b2481437ea2f207df3f
  Building wheel for functools32 (setup.py): started
  Building wheel for functools32 (setup.py): finished with status 'done'
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/b5/18/32/77a1030457155606ba5e3ec3a8a57132b1a04b1c4f765177b2
  Building wheel for timeout-decorator (setup.py): started
  Building wheel for timeout-decorator (setup.py): finished with status 'done'
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/f1/e6/ea/7387e3629cb46ba65140141f972745b823f4486c6fe884ccb8
  Building wheel for pycparser (setup.py): started
  Building wheel for pycparser (setup.py): finished with status 'done'
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/f2/9a/90/de94f8556265ddc9d9c8b271b0f63e57b26fb1d67a45564511
Successfully built absl-py blinker PyYAML functools32 timeout-decorator 
pycparser
Installing collected packages: six, enum34, absl-py, MarkupSafe, jinja2, 
colorama, colorlog, blinker, futures, PyYAML, funcsigs, pint, numpy, 
functools32, contextlib2, urllib3, certifi, chardet, idna, requests, pycparser, 
cffi, asn1crypto, ipaddress, cryptography, ntlm-auth, requests-ntlm, xmltodict, 
pywinrm, timeout-decorator
Successfully installed MarkupSafe-1.1.1 PyYAML-3.12 absl-py-0.7.1 
asn1crypto-0.24.0 blinker-1.4 certifi-2019.3.9 cffi-1.12.3 chardet-3.0.4 
colorama-0.4.1 colorlog-2.6.0 contextlib2-0.5.5 cryptography-2.6.1 enum34-1.1.6 
funcsigs-1.0.2 functools32-3.2.3.post2 futures-3.2.0 idna-2.8 ipaddress-1.0.22 
jinja2-2.10.1 ntlm-auth-1.3.0 numpy-1.13.3 pint-0.9 pycparser-2.19 
pywinrm-0.3.0 requests-2.22.0 requests-ntlm-1.1.0 six-1.12.0 
timeout-decorator-0.4.1 urllib3-1.25.2 xmltodict-0.12.0
[beam_PerformanceTests_Spark] $ /bin/bash -xe /tmp/jenkins4724622526031149707.sh
+ <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/env/.perfkit_env/bin/python> <https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/pkb.py> --project=apache-beam-testing --dpb_log_level=INFO --bigquery_table=beam_performance.spark_pkp_results --k8s_get_retry_count=36 --k8s_get_wait_interval=10 --temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/> --beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src> --official=true --dpb_service_zone=fake_zone --benchmarks=dpb_wordcount_benchmark --dpb_wordcount_input=/etc/hosts --config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
2019-05-22 06:18:43,236 5e4f0a44 MainThread INFO     Verbose logging to: 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5e4f0a44/pkb.log>
2019-05-22 06:18:43,236 5e4f0a44 MainThread INFO     PerfKitBenchmarker 
version: v1.12.0-1191-g6feeabf
2019-05-22 06:18:43,238 5e4f0a44 MainThread INFO     Flag values:
--k8s_get_wait_interval=10
--bigquery_table=beam_performance.spark_pkp_results
--temp_dir=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/>
--dpb_service_zone=fake_zone
--project=apache-beam-testing
--dpb_wordcount_input=/etc/hosts
--official
--dpb_log_level=INFO
--beam_location=<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/src>
--config_override=dpb_wordcount_benchmark.dpb_service.service_type=dataproc
--k8s_get_retry_count=36
--benchmarks=dpb_wordcount_benchmark
2019-05-22 06:18:43,459 5e4f0a44 MainThread INFO     Setting 
--max_concurrent_threads=200.
2019-05-22 06:18:43,481 5e4f0a44 MainThread WARNING  The key "flags" was not in 
the default config, but was in user overrides. This may indicate a typo.
2019-05-22 06:18:43,501 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) INFO   
  Provisioning resources for benchmark dpb_wordcount_benchmark
2019-05-22 06:18:43,504 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) ERROR  
  Error during benchmark dpb_wordcount_benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
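For context, the KeyError above is an ordinary Python dict lookup failing: `_Create` indexes a disk-type mapping with `self.spec.worker_group.disk_spec.disk_type`, and the value `'nodisk'` has no entry in that mapping. A minimal sketch of the failure mode, and one defensive alternative; the mapping contents, function names, and default value here are hypothetical illustrations, not PerfKitBenchmarker's actual code:

```python
# Hypothetical stand-in for the provider's disk-type mapping; the real
# mapping in gcp_dpb_dataproc.py may contain different keys and values.
DISK_TYPE_MAP = {
    'pd-standard': 'pd-standard',
    'pd-ssd': 'pd-ssd',
}

def lookup_disk_type(disk_type):
    # Same shape as the failing line: a plain subscript lookup raises
    # KeyError when disk_type (e.g. 'nodisk') is not in the mapping.
    return DISK_TYPE_MAP[disk_type]

def lookup_disk_type_safe(disk_type, default='pd-standard'):
    # Defensive variant: dict.get with a fallback instead of raising.
    return DISK_TYPE_MAP.get(disk_type, default)

try:
    lookup_disk_type('nodisk')
except KeyError as e:
    print('KeyError:', e)  # prints: KeyError: 'nodisk'
```

Whether falling back to a default is appropriate (versus validating the spec earlier and failing with a clearer message) depends on what `'nodisk'` is meant to signal in the benchmark config.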
2019-05-22 06:18:43,505 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) INFO   
  Running: gcloud dataproc clusters delete pkb-5e4f0a44 --format json --quiet 
--project apache-beam-testing
2019-05-22 06:18:45,062 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) INFO   
  Ran: {gcloud dataproc clusters delete pkb-5e4f0a44 --format json --quiet 
--project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.delete) NOT_FOUND: Not found: Cluster 
projects/apache-beam-testing/regions/global/clusters/pkb-5e4f0a44

2019-05-22 06:18:45,063 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) INFO   
  Running: gcloud dataproc clusters describe pkb-5e4f0a44 --format json --quiet 
--project apache-beam-testing
2019-05-22 06:18:45,634 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) INFO   
  Ran: {gcloud dataproc clusters describe pkb-5e4f0a44 --format json --quiet 
--project apache-beam-testing}  ReturnCode:1
STDOUT: 
STDERR: ERROR: (gcloud.dataproc.clusters.describe) NOT_FOUND: Not found: 
Cluster projects/apache-beam-testing/regions/global/clusters/pkb-5e4f0a44

2019-05-22 06:18:45,636 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) ERROR  
  Exception running benchmark
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 903, in RunBenchmarkTask
    RunBenchmark(spec, collector)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 752, in RunBenchmark
    DoProvisionPhase(spec, detailed_timer)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/pkb.py>", line 555, in DoProvisionPhase
    spec.Provision()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/benchmark_spec.py>", line 533, in Provision
    self.dpb_service.Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 257, in Create
    self._CreateResource()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/vm_util.py>", line 285, in WrappedFunction
    return f(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/resource.py>", line 208, in _CreateResource
    self._Create()
  File "<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/PerfKitBenchmarker/perfkitbenchmarker/providers/gcp/gcp_dpb_dataproc.py>", line 130, in _Create
    self.spec.worker_group.disk_spec.disk_type]
KeyError: 'nodisk'
2019-05-22 06:18:45,637 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) ERROR  
  Benchmark 1/1 dpb_wordcount_benchmark (UID: dpb_wordcount_benchmark0) failed. 
Execution will continue.
2019-05-22 06:18:45,637 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) INFO   
  Benchmark run statuses:
---------------------------------------------------------------------------
Name                     UID                       Status  Failed Substatus
---------------------------------------------------------------------------
dpb_wordcount_benchmark  dpb_wordcount_benchmark0  FAILED                  
---------------------------------------------------------------------------
Success rate: 0.00% (0/1)
2019-05-22 06:18:45,637 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) INFO   
  Complete logs can be found at: 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5e4f0a44/pkb.log>
2019-05-22 06:18:45,637 5e4f0a44 MainThread dpb_wordcount_benchmark(1/1) INFO   
  Completion statuses can be found at: 
<https://builds.apache.org/job/beam_PerformanceTests_Spark/ws/runs/5e4f0a44/completion_statuses.json>
Build step 'Execute shell' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
