See
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/416/display/redirect?page=changes>
Changes:
[douglas.damon] Initiate Go module
[douglas.damon] Rename Introduction-> introduction
[douglas.damon] Fix go.mod vanity URL
[douglas.damon] Rename Core Transforms -> core_transforms
[douglas.damon] Rename core_transforms/{Map -> map}
[douglas.damon] Rename core_transforms/map/{ParDo -> pardo}
[douglas.damon] Rename core_transforms/map/{ParDo OneToMany -> pardo_onetomany}
[douglas.damon] Rename core_transforms/map/{ParDo Struct -> pardo_struct}
[douglas.damon] Rename core_transforms/{GroupByKey/GroupByKey ->
groupbykey/groupbykey}
[douglas.damon] Rename core_transforms/{CoGroupByKey/CoGroupByKey ->
[douglas.damon] Combine/Simple Function => combine/simple_function
[douglas.damon] Rename Flatten -> flatten
[douglas.damon] Rename Partition -> partition
[douglas.damon] Rename {Side Input -> side_input}
[douglas.damon] Rename {Additional Outputs -> additional_outputs}
[douglas.damon] Rename {Branching -> branching}
[Maximilian Michels] [BEAM-10306] Add latency measurements to Python Flink
ParDo load test
[Maximilian Michels] [BEAM-10306] Add latency measurements to Python Flink
GroupByKey load
[Damian Gadomski] removing slack token credentials binding from all CI jobs
except the one
[douglas.damon] Rename CombineFn -> combinefn
[douglas.damon] Rename {Combine Per Key -> combine_perkey}
[noreply] [BEAM-9702] Update Java KinesisIO to support AWS SDK v2 (#11318)
[dcavazos] [BEAM-7390] Add groupintobatches code snippets
[Maximilian Michels] [BEAM-9976] Increase timeout for FlinkSavepointTest
[Maximilian Michels] [BEAM-9976] Add a check to FlinkSavepointTest to ensure no
job is
[Maximilian Michels] [BEAM-9976] Remove legacy workaround to retrieve rest
address
[aromanenko.dev] [BEAM-9702] Update CHANGES.md
[Robert Bradshaw] Fix code that was tripping up the linter.
[Robert Bradshaw] Remove trailing whitespace.
[noreply] [BEAM-10559] Add some comments and clean up SQL example. (#12355)
[douglas.damon] Rename {GroupByKey -> groupbykey}
[douglas.damon] Update stepik
[noreply] [BEAM-7390] Add groupbykey code snippets (#12370)
[noreply] [BEAM-10398] Use GitHub Actions in wheels release process for Python
[noreply] [BEAM-10552] Update protobuf package to 3.12.2 (#12334)
[kcweaver] Fix spotless groovy formatting.
[je.ik] [BEAM-8648] Deprecate OutputHints from Euphoria API.
[noreply] [BEAM-10294] using SparkMetricsContainerStepMap for readable metrics
------------------------------------------
[...truncated 26.92 KB...]
Collecting google-cloud-core<2,>=0.28.1
Using cached google_cloud_core-1.3.0-py2.py3-none-any.whl (26 kB)
Collecting google-cloud-bigtable<1.1.0,>=0.31.1
Using cached google_cloud_bigtable-1.0.0-py2.py3-none-any.whl (232 kB)
Collecting google-cloud-spanner<1.14.0,>=1.13.0
Using cached google_cloud_spanner-1.13.0-py2.py3-none-any.whl (212 kB)
Collecting grpcio-gcp<1,>=0.2.2
Using cached grpcio_gcp-0.2.2-py2.py3-none-any.whl (9.4 kB)
Collecting google-cloud-dlp<=0.13.0,>=0.12.0
Using cached google_cloud_dlp-0.13.0-py2.py3-none-any.whl (151 kB)
Collecting google-cloud-language<2,>=1.3.0
Using cached google_cloud_language-1.3.0-py2.py3-none-any.whl (83 kB)
Collecting google-cloud-videointelligence<1.14.0,>=1.8.0
Using cached google_cloud_videointelligence-1.13.0-py2.py3-none-any.whl (177
kB)
Collecting google-cloud-vision<0.43.0,>=0.38.0
Using cached google_cloud_vision-0.42.0-py2.py3-none-any.whl (435 kB)
Collecting freezegun>=0.3.12
Using cached freezegun-0.3.15-py2.py3-none-any.whl (14 kB)
Collecting nose>=1.3.7
Using cached nose-1.3.7-py3-none-any.whl (154 kB)
Processing
/home/jenkins/.cache/pip/wheels/2e/29/a9/431158315f33abeaad2905460f5ffb18fc30f7ed7c66c47dee/nose_xunitmp-0.4.1-py3-none-any.whl
Collecting parameterized<0.8.0,>=0.7.1
Using cached parameterized-0.7.4-py2.py3-none-any.whl (25 kB)
Collecting pyhamcrest!=1.10.0,<2.0.0,>=1.9
Using cached PyHamcrest-1.10.1-py3-none-any.whl (48 kB)
Processing
/home/jenkins/.cache/pip/wheels/5e/03/1e/e1e954795d6f35dfc7b637fe2277bff021303bd9570ecea653/PyYAML-5.3.1-cp37-cp37m-linux_x86_64.whl
Collecting requests_mock<2.0,>=1.7
Using cached requests_mock-1.8.0-py2.py3-none-any.whl (23 kB)
Collecting tenacity<6.0,>=5.0.2
Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting pytest<5.0,>=4.4.0
Using cached pytest-4.6.11-py2.py3-none-any.whl (231 kB)
Collecting pytest-xdist<2,>=1.29.0
Using cached pytest_xdist-1.34.0-py2.py3-none-any.whl (36 kB)
Collecting pytest-timeout<2,>=1.3.3
Using cached pytest_timeout-1.4.2-py2.py3-none-any.whl (10 kB)
Collecting pandas<1,>=0.25.2
Using cached pandas-0.25.3-cp37-cp37m-manylinux1_x86_64.whl (10.4 MB)
Collecting sqlalchemy<2.0,>=1.3
Using cached SQLAlchemy-1.3.18-cp37-cp37m-manylinux2010_x86_64.whl (1.3 MB)
Collecting psycopg2-binary<3.0.0,>=2.8.5
Using cached psycopg2_binary-2.8.5-cp37-cp37m-manylinux1_x86_64.whl (2.9 MB)
Processing
/home/jenkins/.cache/pip/wheels/22/11/6e/126cbd2f544f9966ee22e074e5a3b7ab9b22bb2d4a6217e599/testcontainers-3.0.3-py2.py3-none-any.whl
Requirement already satisfied: six>=1.5.2 in
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
(from grpcio<2,>=1.29.0->apache-beam==2.24.0.dev0) (1.15.0)
Processing
/home/jenkins/.cache/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e/docopt-0.6.2-py2.py3-none-any.whl
Collecting pbr>=0.11
Using cached pbr-5.4.5-py2.py3-none-any.whl (110 kB)
Collecting pyasn1>=0.1.7
Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting rsa>=3.1.4
Using cached rsa-4.6-py3-none-any.whl (47 kB)
Collecting pyasn1-modules>=0.0.5
Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Requirement already satisfied: setuptools in
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
(from protobuf<4,>=3.12.2->apache-beam==2.24.0.dev0) (49.2.0)
Collecting pyparsing>=2.1.4
Using cached pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
Collecting chardet<4,>=3.0.2
Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting certifi>=2017.4.17
Using cached certifi-2020.6.20-py2.py3-none-any.whl (156 kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
Using cached urllib3-1.25.10-py2.py3-none-any.whl (127 kB)
Collecting idna<3,>=2.5
Using cached idna-2.10-py2.py3-none-any.whl (58 kB)
Collecting botocore<1.18.0,>=1.17.29
Using cached botocore-1.17.29-py2.py3-none-any.whl (6.4 MB)
Collecting jmespath<1.0.0,>=0.7.1
Using cached jmespath-0.10.0-py2.py3-none-any.whl (24 kB)
Collecting s3transfer<0.4.0,>=0.3.0
Using cached s3transfer-0.3.3-py2.py3-none-any.whl (69 kB)
Collecting fasteners>=0.14
Using cached fasteners-0.15-py2.py3-none-any.whl (23 kB)
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0
Using cached google_api_core-1.22.0-py2.py3-none-any.whl (91 kB)
Processing
/home/jenkins/.cache/pip/wheels/b9/ee/67/2e444183030cb8d31ce8b34cee34a7afdbd3ba5959ea846380/grpc_google_iam_v1-0.12.3-py3-none-any.whl
Collecting google-auth<2.0dev,>=1.9.0
Using cached google_auth-1.19.2-py2.py3-none-any.whl (91 kB)
Collecting google-resumable-media<0.6dev,>=0.5.0
Using cached google_resumable_media-0.5.1-py2.py3-none-any.whl (38 kB)
Collecting atomicwrites>=1.0
Using cached atomicwrites-1.4.0-py2.py3-none-any.whl (6.8 kB)
Collecting attrs>=17.4.0
Using cached attrs-19.3.0-py2.py3-none-any.whl (39 kB)
Collecting packaging
Using cached packaging-20.4-py2.py3-none-any.whl (37 kB)
Collecting wcwidth
Using cached wcwidth-0.2.5-py2.py3-none-any.whl (30 kB)
Requirement already satisfied: py>=1.5.0 in
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
(from pytest<5.0,>=4.4.0->apache-beam==2.24.0.dev0) (1.9.0)
Requirement already satisfied: pluggy<1.0,>=0.12 in
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
(from pytest<5.0,>=4.4.0->apache-beam==2.24.0.dev0) (0.13.1)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8"
in
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
(from pytest<5.0,>=4.4.0->apache-beam==2.24.0.dev0) (1.7.0)
Collecting more-itertools>=4.0.0; python_version > "2.7"
Using cached more_itertools-8.4.0-py3-none-any.whl (43 kB)
Collecting execnet>=1.1
Using cached execnet-1.7.1-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
Using cached pytest_forked-1.3.0-py2.py3-none-any.whl (4.7 kB)
Processing
/home/jenkins/.cache/pip/wheels/62/76/4c/aa25851149f3f6d9785f6c869387ad82b3fd37582fa8147ac6/wrapt-1.12.1-cp37-cp37m-linux_x86_64.whl
Processing
/home/jenkins/.cache/pip/wheels/4e/29/41/2a34b96f3d4bf9ab8ccaac5d86db4334fa17d79e7027a5b86f/blindspin-2.0.1-py3-none-any.whl
Collecting crayons
Using cached crayons-0.3.1-py2.py3-none-any.whl (4.6 kB)
Collecting docker
Using cached docker-4.2.2-py2.py3-none-any.whl (144 kB)
Collecting deprecation
Using cached deprecation-2.1.0-py2.py3-none-any.whl (11 kB)
Collecting docutils<0.16,>=0.10
Using cached docutils-0.15.2-py3-none-any.whl (547 kB)
Collecting monotonic>=0.1
Using cached monotonic-1.5-py2.py3-none-any.whl (5.3 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
Using cached googleapis_common_protos-1.52.0-py2.py3-none-any.whl (100 kB)
Requirement already satisfied: zipp>=0.5 in
<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages>
(from importlib-metadata>=0.12; python_version <
"3.8"->pytest<5.0,>=4.4.0->apache-beam==2.24.0.dev0) (3.1.0)
Collecting apipkg>=1.4
Using cached apipkg-1.5-py2.py3-none-any.whl (4.9 kB)
Collecting colorama
Using cached colorama-0.4.3-py2.py3-none-any.whl (15 kB)
Collecting websocket-client>=0.32.0
Using cached websocket_client-0.57.0-py2.py3-none-any.whl (200 kB)
Building wheels for collected packages: apache-beam
Building wheel for apache-beam (setup.py): started
Building wheel for apache-beam (setup.py): finished with status 'done'
Created wheel for apache-beam:
filename=apache_beam-2.24.0.dev0-py3-none-any.whl size=2122571
sha256=8009d03e387af559ae5b61114745fb95e6a52c110b4ebac269cc8d39841f8d45
Stored in directory:
/home/jenkins/.cache/pip/wheels/7b/59/6b/8a29b310e3f8232e5e9f8ba3c9c5d92a4bf2dc03051999ab40
Successfully built apache-beam
Installing collected packages: crcmod, dill, pytz, fastavro, docopt, chardet,
certifi, urllib3, idna, requests, hdfs, httplib2, pbr, mock, numpy, pymongo,
pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, python-dateutil,
typing-extensions, avro-python3, pyarrow, jmespath, docutils, botocore,
s3transfer, boto3, cachetools, monotonic, fasteners, google-apitools,
google-auth, googleapis-common-protos, google-api-core, google-cloud-core,
google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub,
google-resumable-media, google-cloud-bigquery, google-cloud-bigtable,
google-cloud-spanner, grpcio-gcp, google-cloud-dlp, google-cloud-language,
google-cloud-videointelligence, google-cloud-vision, freezegun, nose,
nose-xunitmp, parameterized, pyhamcrest, pyyaml, requests-mock, tenacity,
atomicwrites, attrs, packaging, wcwidth, more-itertools, pytest, apipkg,
execnet, pytest-forked, pytest-xdist, pytest-timeout, pandas, sqlalchemy,
psycopg2-binary, wrapt, blindspin, colorama, crayons, websocket-client, docker,
deprecation, testcontainers, apache-beam
Successfully installed apache-beam-2.24.0.dev0 apipkg-1.5 atomicwrites-1.4.0
attrs-19.3.0 avro-python3-1.9.2.1 blindspin-2.0.1 boto3-1.14.29
botocore-1.17.29 cachetools-3.1.1 certifi-2020.6.20 chardet-3.0.4
colorama-0.4.3 crayons-0.3.1 crcmod-1.7 deprecation-2.1.0 dill-0.3.1.1
docker-4.2.2 docopt-0.6.2 docutils-0.15.2 execnet-1.7.1 fastavro-0.23.6
fasteners-0.15 freezegun-0.3.15 google-api-core-1.22.0 google-apitools-0.5.31
google-auth-1.19.2 google-cloud-bigquery-1.24.0 google-cloud-bigtable-1.0.0
google-cloud-core-1.3.0 google-cloud-datastore-1.7.4 google-cloud-dlp-0.13.0
google-cloud-language-1.3.0 google-cloud-pubsub-1.0.2
google-cloud-spanner-1.13.0 google-cloud-videointelligence-1.13.0
google-cloud-vision-0.42.0 google-resumable-media-0.5.1
googleapis-common-protos-1.52.0 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2
hdfs-2.5.8 httplib2-0.17.4 idna-2.10 jmespath-0.10.0 mock-2.0.0 monotonic-1.5
more-itertools-8.4.0 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.19.1
oauth2client-3.0.0 packaging-20.4 pandas-0.25.3 parameterized-0.7.4 pbr-5.4.5
psycopg2-binary-2.8.5 pyarrow-0.17.1 pyasn1-0.4.8 pyasn1-modules-0.2.8
pydot-1.4.1 pyhamcrest-1.10.1 pymongo-3.10.1 pyparsing-2.4.7 pytest-4.6.11
pytest-forked-1.3.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0
python-dateutil-2.8.1 pytz-2020.1 pyyaml-5.3.1 requests-2.24.0
requests-mock-1.8.0 rsa-4.6 s3transfer-0.3.3 sqlalchemy-1.3.18 tenacity-5.1.5
testcontainers-3.0.3 typing-extensions-3.7.4.2 urllib3-1.25.10 wcwidth-0.2.5
websocket-client-0.57.0 wrapt-1.12.1
> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.runners.portability.stager:Copying Beam SDK
"<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/sdks/python/build/apache-beam.tar.gz>"
to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python
3.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.7_sdk:2.24.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0728125337.1595947048.919972/pipeline.pb...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0728125337.1595947048.919972/pipeline.pb
in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0728125337.1595947048.919972/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to
gs://temp-storage-for-perf-tests/loadtests/load-tests-python-dataflow-batch-gbk-6-0728125337.1595947048.919972/dataflow_python_sdk.tar
in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args:
['--fanout=1', '--iterations=4']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args:
['--fanout=1', '--iterations=4']
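The "Discarding unparseable args" warning above comes from Beam's pipeline-option parsing, which is built on argparse-style `parse_known_args`: flags the pipeline options themselves do not declare, such as the load-test harness's `--fanout` and `--iterations`, are set aside and warned about rather than rejected. A minimal sketch of that pattern (the flag names here mirror the log; the parser setup is illustrative, not Beam's internals):

```python
import argparse

# A parser that knows only the "pipeline" flags; anything else is left over.
parser = argparse.ArgumentParser()
parser.add_argument('--project')
parser.add_argument('--region')

argv = ['--project=apache-beam-testing', '--fanout=1', '--iterations=4']
known, unknown = parser.parse_known_args(argv)

# The unrecognized test flags survive in `unknown`; a caller can log a
# warning about them instead of failing, as the SDK does above.
print(unknown)  # ['--fanout=1', '--iterations=4']
```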
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
createTime: '2020-07-28T14:37:30.978681Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2020-07-28_07_37_29-14988388262633426939'
location: 'us-central1'
name: 'load-tests-python-dataflow-batch-gbk-6-0728125337'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2020-07-28T14:37:30.978681Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id:
[2020-07-28_07_37_29-14988388262633426939]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job:
2020-07-28_07_37_29-14988388262633426939
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-07-28_07_37_29-14988388262633426939?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-07-28_07_37_29-14988388262633426939 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:33.707Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:34.634Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:34.685Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey 0: GroupByKey
not followed by a combiner.
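"Combiner lifting" is the optimization the debug message refers to: when a GroupByKey is immediately followed by a combiner, the combining step can be pushed ahead of the shuffle so each worker ships one partial aggregate per key instead of every raw value. It is skipped here because this GroupByKey has no downstream combiner. A rough pure-Python sketch of the idea (not Dataflow's implementation; the two-worker split is hypothetical):

```python
from collections import defaultdict

records = [('a', 1), ('b', 2), ('a', 3), ('b', 4), ('a', 5)]

# Without lifting: every (key, value) pair crosses the shuffle boundary.
shuffled = defaultdict(list)
for k, v in records:
    shuffled[k].append(v)
summed = {k: sum(vs) for k, vs in shuffled.items()}

# With lifting: each worker pre-combines locally, so only one partial
# sum per key per worker needs to be shuffled.
def lift(worker_records):
    partial = defaultdict(int)
    for k, v in worker_records:
        partial[k] += v
    return dict(partial)

workers = [records[:3], records[3:]]          # two hypothetical workers
merged = defaultdict(int)
for partial in (lift(w) for w in workers):
    for k, v in partial.items():
        merged[k] += v

assert dict(merged) == summed == {'a': 9, 'b': 6}
```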
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:34.768Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:34.815Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:34.961Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.037Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.077Z:
JOB_MESSAGE_DETAILED: Fusing consumer Measure time: Start into Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.116Z:
JOB_MESSAGE_DETAILED: Fusing consumer Assign timestamps into Measure time: Start
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.155Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey 0/Reify into Assign timestamps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.183Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey 0/Write into GroupByKey 0/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.206Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey 0/GroupByWindow into
GroupByKey 0/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.235Z:
JOB_MESSAGE_DETAILED: Fusing consumer Ungroup 0 into GroupByKey 0/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.261Z:
JOB_MESSAGE_DETAILED: Fusing consumer Measure latency into Ungroup 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.290Z:
JOB_MESSAGE_DETAILED: Fusing consumer Measure time: End 0 into Measure latency
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.326Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.361Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.383Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.406Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.536Z:
JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.593Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey 0/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.629Z:
JOB_MESSAGE_DEBUG: Starting **** pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.656Z:
JOB_MESSAGE_BASIC: Starting 5 ****s in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.696Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey 0/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.884Z:
JOB_MESSAGE_DEBUG: Value "GroupByKey 0/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:37:35.941Z:
JOB_MESSAGE_BASIC: Executing operation Read+Measure time: Start+Assign
timestamps+GroupByKey 0/Reify+GroupByKey 0/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-07-28_07_37_29-14988388262633426939 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:38:03.014Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 2 based on the
rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:38:03.054Z:
JOB_MESSAGE_DETAILED: Resized **** pool to 2, though goal was 5. This could be
a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:38:06.496Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:38:08.445Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of ****s to 5 based on the
rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:39:41.356Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:39:41.423Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:47:17.317Z:
JOB_MESSAGE_BASIC: Finished operation Read+Measure time: Start+Assign
timestamps+GroupByKey 0/Reify+GroupByKey 0/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:47:17.396Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey 0/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:47:17.501Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey 0/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:47:17.576Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey 0/Read+GroupByKey
0/GroupByWindow+Ungroup 0+Measure latency+Measure time: End 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:49:13.664Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey 0/Read+GroupByKey
0/GroupByWindow+Ungroup 0+Measure latency+Measure time: End 0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:49:13.722Z:
JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:49:13.849Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:49:13.905Z:
JOB_MESSAGE_DEBUG: Starting **** pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:49:13.925Z:
JOB_MESSAGE_BASIC: Stopping **** pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:50:15.123Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized **** pool from 5 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:50:15.170Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-07-28T14:50:15.204Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-07-28_07_37_29-14988388262633426939 is in state JOB_STATE_DONE
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results
for test: e3e0b4acdfad45e48f6be8de95a3bfc6 and timestamp: 1595947821.801332:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric:
python_dataflow_batch_gbk_6_runtime Value: 381
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric:
python_dataflow_batch_gbk_6_measure_latency_count_latency Value: 400
Traceback (most recent call last):
File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
exec(code, run_globals)
File
"<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/group_by_key_test.py>",
line 120, in <module>
GroupByKeyTest().run()
File
"<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>",
line 153, in run
self._metrics_monitor.publish_metrics(self.result)
File
"<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test_metrics_utils.py>",
line 227, in publish_metrics
publisher.publish(insert_dicts)
File
"<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test_metrics_utils.py>",
line 375, in publish
% (result[METRICS_TYPE_LABEL], result[VALUE_LABEL])
TypeError: %d format: a number is required, not NoneType
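The TypeError that fails the build is Python's %-formatting rejecting `None` where `%d` expects a number: the metric row handed to `publish` evidently carried a `None` value under the utility's value key (`METRICS_TYPE_LABEL` and `VALUE_LABEL` are keys from `load_test_metrics_utils`, per the traceback). A minimal reproduction of the failure, plus one illustrative guard (not the utility's actual fix):

```python
# A row shaped like the one reaching the failing format string; the key
# names here are stand-ins for the utility's label constants.
result = {'metric_type': 'runtime', 'value': None}

raised = False
try:
    # The failing pattern: %d requires a real number, and the value is None.
    'Metric: %s Value: %d' % (result['metric_type'], result['value'])
except TypeError:
    raised = True  # same TypeError as in the traceback above
assert raised

# One defensive option: fall back to %s with an explicit placeholder.
value = result['value']
msg = 'Metric: %s Value: %s' % (
    result['metric_type'], value if value is not None else '<missing>')
assert msg == 'Metric: runtime Value: <missing>'
```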
> Task :sdks:python:apache_beam:testing:load_tests:run FAILED
FAILURE: Build failed with an exception.
* Where:
Build file
'<https://ci-beam.apache.org/job/beam_LoadTests_Python_GBK_reiterate_Dataflow_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle'>
line: 58
* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 14m 2s
5 actionable tasks: 5 executed
Publishing build scan...
https://gradle.com/s/ydvztl7j3ctxi
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]