See <https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Read_Python/2/display/redirect?page=changes>
Changes:
[Kenneth Knowles] Split Nexmark QueryTest and SqlQueryTest for clarity
[chuck.yang] Support STRUCT, FLOAT64, INT64 BigQuery types
[Robert Bradshaw] [BEAM-9577] Migrate PortablePipelineJarCreator to new artifact service.
[Robert Bradshaw] spotless
[Robert Bradshaw] [BEAM-9577] Remove use of legacy artifact service in Python.
[Robert Bradshaw] [BEAM-6215] Additional tests for FlatMap label.
[Robert Bradshaw] Simplify Python on Flink runner instructions.
[Robert Bradshaw] Fix stray paragraph, separate and rework python.
[Kenneth Knowles] Add ZetaSQL Nexmark variant
[noreply] fixup! roll back changes (#11958)
[noreply] [BEAM-3788] Updates kafka.py pydocs (#11928)
[annaqin] [BEAM-10225] Add log message when starting job server
[tysonjh] [BEAM-9999] Remove Gearpump runner.
[noreply] [BEAM-8828] Added BigQueryTableProvider WriteDisposition configuration
[valentyn] [BEAM-10227] Switches typing version modifier to python_full_version so
[noreply] Prototype schema-inferring Row constructor. (#11901)
[noreply] [BEAM-10144] Update PipelineOptions snippets for best practices (#11851)
[Kamil Wasilewski] [BEAM-8134] Grafana dashboards for Nexmark tests
[Robert Bradshaw] Expand note on runner selection.
[Rui Wang] [BEAM-10230] @Ignore: BYTES works with LIKE.
[Robert Bradshaw] Move Beam Compatibility table below instructions.
[noreply] [BEAM-9742] Add Configurable FluentBackoff to JdbcIO Write (#11396)
[noreply] Finalize CHANGES.md for 2.22.0 (#11973)
[noreply] [BEAM-9679] Add CombinePerKey to Core Transforms Go Katas (#11936)

------------------------------------------
[...truncated 24.65 KB...]
  Using cached python_dateutil-2.8.1-py2.py3-none-any.whl (227 kB)
Collecting pytz>=2018.3
  Using cached pytz-2020.1-py2.py3-none-any.whl (510 kB)
Collecting typing-extensions<3.8.0,>=3.7.0
  Using cached typing_extensions-3.7.4.2-py3-none-any.whl (22 kB)
Processing /home/jenkins/.cache/pip/wheels/bc/49/5f/fdb5b9d85055c478213e0158ac122b596816149a02d82e0ab1/avro_python3-1.9.2.1-py3-none-any.whl
Collecting pyarrow<0.18.0,>=0.15.1
  Using cached pyarrow-0.17.1-cp37-cp37m-manylinux2014_x86_64.whl (63.8 MB)
Collecting boto3>=1.9
  Using cached boto3-1.14.0-py2.py3-none-any.whl (128 kB)
Collecting cachetools<4,>=3.1.0
  Using cached cachetools-3.1.1-py2.py3-none-any.whl (11 kB)
Processing /home/jenkins/.cache/pip/wheels/19/b5/2f/1cc3cf2b31e7a9cd1508731212526d9550271274d351c96f16/google_apitools-0.5.31-py3-none-any.whl
Collecting google-cloud-datastore<1.8.0,>=1.7.1
  Using cached google_cloud_datastore-1.7.4-py2.py3-none-any.whl (82 kB)
Collecting google-cloud-pubsub<1.1.0,>=0.39.0
  Using cached google_cloud_pubsub-1.0.2-py2.py3-none-any.whl (118 kB)
Collecting google-cloud-bigquery<=1.24.0,>=1.6.0
  Using cached google_cloud_bigquery-1.24.0-py2.py3-none-any.whl (165 kB)
Collecting google-cloud-core<2,>=0.28.1
  Using cached google_cloud_core-1.3.0-py2.py3-none-any.whl (26 kB)
Collecting google-cloud-bigtable<1.1.0,>=0.31.1
  Using cached google_cloud_bigtable-1.0.0-py2.py3-none-any.whl (232 kB)
Collecting google-cloud-spanner<1.14.0,>=1.13.0
  Using cached google_cloud_spanner-1.13.0-py2.py3-none-any.whl (212 kB)
Collecting grpcio-gcp<1,>=0.2.2
  Using cached grpcio_gcp-0.2.2-py2.py3-none-any.whl (9.4 kB)
Collecting google-cloud-dlp<=0.13.0,>=0.12.0
  Using cached google_cloud_dlp-0.13.0-py2.py3-none-any.whl (151 kB)
Collecting google-cloud-language<2,>=1.3.0
  Using cached google_cloud_language-1.3.0-py2.py3-none-any.whl (83 kB)
Collecting google-cloud-videointelligence<1.14.0,>=1.8.0
  Using cached google_cloud_videointelligence-1.13.0-py2.py3-none-any.whl (177 kB)
Collecting google-cloud-vision<0.43.0,>=0.38.0
  Using cached google_cloud_vision-0.42.0-py2.py3-none-any.whl (435 kB)
Collecting freezegun>=0.3.12
  Using cached freezegun-0.3.15-py2.py3-none-any.whl (14 kB)
Collecting nose>=1.3.7
  Using cached nose-1.3.7-py3-none-any.whl (154 kB)
Processing /home/jenkins/.cache/pip/wheels/2e/29/a9/431158315f33abeaad2905460f5ffb18fc30f7ed7c66c47dee/nose_xunitmp-0.4.1-py3-none-any.whl
Collecting pandas<0.25,>=0.23.4
  Using cached pandas-0.24.2-cp37-cp37m-manylinux1_x86_64.whl (10.1 MB)
Collecting parameterized<0.8.0,>=0.7.1
  Using cached parameterized-0.7.4-py2.py3-none-any.whl (25 kB)
Collecting pyhamcrest!=1.10.0,<2.0.0,>=1.9
  Using cached PyHamcrest-1.10.1-py3-none-any.whl (48 kB)
Processing /home/jenkins/.cache/pip/wheels/5e/03/1e/e1e954795d6f35dfc7b637fe2277bff021303bd9570ecea653/PyYAML-5.3.1-cp37-cp37m-linux_x86_64.whl
Collecting requests_mock<2.0,>=1.7
  Using cached requests_mock-1.8.0-py2.py3-none-any.whl (23 kB)
Collecting tenacity<6.0,>=5.0.2
  Using cached tenacity-5.1.5-py2.py3-none-any.whl (34 kB)
Collecting pytest<5.0,>=4.4.0
  Using cached pytest-4.6.11-py2.py3-none-any.whl (231 kB)
Collecting pytest-xdist<2,>=1.29.0
  Using cached pytest_xdist-1.32.0-py2.py3-none-any.whl (36 kB)
Collecting pytest-timeout<2,>=1.3.3
  Using cached pytest_timeout-1.3.4-py2.py3-none-any.whl (10 kB)
Requirement already satisfied: six>=1.5.2 in /home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_BiqQueryIO_Read_Python/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from grpcio<2,>=1.12.1->apache-beam==2.23.0.dev0) (1.15.0)
Collecting requests>=2.7.0
  Using cached requests-2.23.0-py2.py3-none-any.whl (58 kB)
Processing /home/jenkins/.cache/pip/wheels/9b/04/dd/7daf4150b6d9b12949298737de9431a324d4b797ffd63f526e/docopt-0.6.2-py2.py3-none-any.whl
Collecting pbr>=0.11
  Using cached pbr-5.4.5-py2.py3-none-any.whl (110 kB)
Collecting pyasn1-modules>=0.0.5
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting pyasn1>=0.1.7
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Collecting rsa>=3.1.4
  Using cached rsa-4.1-py3-none-any.whl (32 kB)
Requirement already satisfied: setuptools in /home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_BiqQueryIO_Read_Python/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from protobuf<4,>=3.5.0.post1->apache-beam==2.23.0.dev0) (47.1.1)
Collecting pyparsing>=2.1.4
  Using cached pyparsing-2.4.7-py2.py3-none-any.whl (67 kB)
Collecting s3transfer<0.4.0,>=0.3.0
  Using cached s3transfer-0.3.3-py2.py3-none-any.whl (69 kB)
Collecting jmespath<1.0.0,>=0.7.1
  Using cached jmespath-0.10.0-py2.py3-none-any.whl (24 kB)
Collecting botocore<1.18.0,>=1.17.0
  Using cached botocore-1.17.0-py2.py3-none-any.whl (6.3 MB)
Collecting fasteners>=0.14
  Using cached fasteners-0.15-py2.py3-none-any.whl (23 kB)
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0
  Using cached google_api_core-1.20.0-py2.py3-none-any.whl (90 kB)
Processing /home/jenkins/.cache/pip/wheels/b9/ee/67/2e444183030cb8d31ce8b34cee34a7afdbd3ba5959ea846380/grpc_google_iam_v1-0.12.3-py3-none-any.whl
Collecting google-auth<2.0dev,>=1.9.0
  Using cached google_auth-1.16.1-py2.py3-none-any.whl (90 kB)
Collecting google-resumable-media<0.6dev,>=0.5.0
  Using cached google_resumable_media-0.5.1-py2.py3-none-any.whl (38 kB)
Requirement already satisfied: py>=1.5.0 in /home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_BiqQueryIO_Read_Python/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<5.0,>=4.4.0->apache-beam==2.23.0.dev0) (1.8.1)
Collecting attrs>=17.4.0
  Using cached attrs-19.3.0-py2.py3-none-any.whl (39 kB)
Collecting packaging
  Using cached packaging-20.4-py2.py3-none-any.whl (37 kB)
Collecting atomicwrites>=1.0
  Using cached atomicwrites-1.4.0-py2.py3-none-any.whl (6.8 kB)
Requirement already satisfied: pluggy<1.0,>=0.12 in /home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_BiqQueryIO_Read_Python/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<5.0,>=4.4.0->apache-beam==2.23.0.dev0) (0.13.1)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in /home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_BiqQueryIO_Read_Python/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from pytest<5.0,>=4.4.0->apache-beam==2.23.0.dev0) (1.6.1)
Collecting wcwidth
  Using cached wcwidth-0.2.4-py2.py3-none-any.whl (30 kB)
Collecting more-itertools>=4.0.0; python_version > "2.7"
  Using cached more_itertools-8.3.0-py3-none-any.whl (44 kB)
Collecting execnet>=1.1
  Using cached execnet-1.7.1-py2.py3-none-any.whl (39 kB)
Collecting pytest-forked
  Using cached pytest_forked-1.1.3-py2.py3-none-any.whl (4.5 kB)
Collecting chardet<4,>=3.0.2
  Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached urllib3-1.25.9-py2.py3-none-any.whl (126 kB)
Collecting idna<3,>=2.5
  Using cached idna-2.9-py2.py3-none-any.whl (58 kB)
Collecting certifi>=2017.4.17
  Using cached certifi-2020.4.5.2-py2.py3-none-any.whl (157 kB)
Collecting docutils<0.16,>=0.10
  Using cached docutils-0.15.2-py3-none-any.whl (547 kB)
Collecting monotonic>=0.1
  Using cached monotonic-1.5-py2.py3-none-any.whl (5.3 kB)
Collecting googleapis-common-protos<2.0dev,>=1.6.0
  Using cached googleapis_common_protos-1.52.0-py2.py3-none-any.whl (100 kB)
Requirement already satisfied: zipp>=0.5 in /home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_BiqQueryIO_Read_Python/src/build/gradleenv/1329484227/lib/python3.7/site-packages (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.23.0.dev0) (3.1.0)
Collecting apipkg>=1.4
  Using cached apipkg-1.5-py2.py3-none-any.whl (4.9 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: filename=apache_beam-2.23.0.dev0-py3-none-any.whl size=2072843 sha256=fca5eb1f08aa6f69f11a5b2275b30f40772922f5c1068ae7320ecc135f30f88c
  Stored in directory: /home/jenkins/.cache/pip/wheels/6c/11/4a/60f026337786f37809c7e48ed3dfb82229207cb6eb97abbbf9
Successfully built apache-beam
ERROR: google-auth 1.16.1 has requirement rsa<4.1,>=3.1.4, but you'll have rsa 4.1 which is incompatible.
Installing collected packages: crcmod, dill, pytz, fastavro, chardet, urllib3, idna, certifi, requests, docopt, hdfs, httplib2, pbr, mock, numpy, pymongo, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, typing-extensions, avro-python3, pyarrow, jmespath, docutils, botocore, s3transfer, boto3, cachetools, monotonic, fasteners, google-apitools, googleapis-common-protos, google-auth, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, google-cloud-spanner, grpcio-gcp, google-cloud-dlp, google-cloud-language, google-cloud-videointelligence, google-cloud-vision, freezegun, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, tenacity, attrs, packaging, atomicwrites, wcwidth, more-itertools, pytest, apipkg, execnet, pytest-forked, pytest-xdist, pytest-timeout, apache-beam
Successfully installed apache-beam-2.23.0.dev0 apipkg-1.5 atomicwrites-1.4.0 attrs-19.3.0 avro-python3-1.9.2.1 boto3-1.14.0 botocore-1.17.0 cachetools-3.1.1 certifi-2020.4.5.2 chardet-3.0.4 crcmod-1.7 dill-0.3.1.1 docopt-0.6.2 docutils-0.15.2 execnet-1.7.1 fastavro-0.23.4 fasteners-0.15 freezegun-0.3.15 google-api-core-1.20.0 google-apitools-0.5.31 google-auth-1.16.1 google-cloud-bigquery-1.24.0 google-cloud-bigtable-1.0.0 google-cloud-core-1.3.0 google-cloud-datastore-1.7.4 google-cloud-dlp-0.13.0 google-cloud-language-1.3.0 google-cloud-pubsub-1.0.2 google-cloud-spanner-1.13.0 google-cloud-videointelligence-1.13.0 google-cloud-vision-0.42.0 google-resumable-media-0.5.1 googleapis-common-protos-1.52.0 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 hdfs-2.5.8 httplib2-0.17.4 idna-2.9 jmespath-0.10.0 mock-2.0.0 monotonic-1.5 more-itertools-8.3.0 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.18.5 oauth2client-3.0.0 packaging-20.4 pandas-0.24.2 parameterized-0.7.4 pbr-5.4.5 pyarrow-0.17.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pydot-1.4.1 pyhamcrest-1.10.1 pymongo-3.10.1 pyparsing-2.4.7 pytest-4.6.11 pytest-forked-1.1.3 pytest-timeout-1.3.4 pytest-xdist-1.32.0 python-dateutil-2.8.1 pytz-2020.1 pyyaml-5.3.1 requests-2.23.0 requests-mock-1.8.0 rsa-4.1 s3transfer-0.3.3 tenacity-5.1.5 typing-extensions-3.7.4.2 urllib3-1.25.9 wcwidth-0.2.4

> Task :sdks:python:apache_beam:testing:load_tests:run
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "/home/jenkins/jenkins-slave/workspace/beam_PerformanceTests_BiqQueryIO_Read_Python/src/sdks/python/build/apache-beam.tar.gz" to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.7_sdk:2.23.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-read-python-10gb0610220139.1591826710.979099/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-read-python-10gb0610220139.1591826710.979099/pipeline.pb in 0 seconds.
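[Editor's note] The `ERROR` printed during installation above is a genuine pip dependency conflict: google-auth 1.16.1 declares `rsa<4.1,>=3.1.4`, while Beam's own `rsa>=3.1.4` resolved to rsa 4.1, outside that upper bound. A minimal, hypothetical sketch of the comparison involved, using only the standard library (the helper names are illustrative, not from Beam or pip):

```python
# Sketch: reproduce the version check behind pip's message
# "google-auth 1.16.1 has requirement rsa<4.1,>=3.1.4, but you'll have rsa 4.1".
# Release strings are compared as integer tuples, which is how simple
# numeric specifiers like these are evaluated.

def version_tuple(v: str):
    """Turn '3.1.4' into (3, 1, 4) for ordered comparison."""
    return tuple(int(part) for part in v.split("."))

def satisfies_google_auth_pin(v: str) -> bool:
    """Check a candidate rsa version against rsa<4.1,>=3.1.4."""
    return version_tuple("3.1.4") <= version_tuple(v) < version_tuple("4.1")

# rsa 4.1 (what pip actually installed) violates the upper bound...
assert not satisfies_google_auth_pin("4.1")
# ...while e.g. rsa 4.0 would have satisfied both constraints.
assert satisfies_google_auth_pin("4.0")
```

Pinning `rsa<4.1` in the test environment (or upgrading google-auth once it allows rsa 4.1) would silence this warning; the install nevertheless proceeded here because pip 20.x's legacy resolver only reports, rather than rejects, such conflicts.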
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-read-python-10gb0610220139.1591826710.979099/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-read-python-10gb0610220139.1591826710.979099/dataflow_python_sdk.tar in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--input_dataset=beam_performance', '--input_table=bqio_read_10GB']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: ['--input_dataset=beam_performance', '--input_table=bqio_read_10GB']
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job createTime: '2020-06-10T22:05:13.097527Z' currentStateTime: '1970-01-01T00:00:00Z' id: '2020-06-10_15_05_11-10222654289997226399' location: 'us-central1' name: 'performance-tests-bqio-read-python-10gb0610220139' projectId: 'apache-beam-testing' stageStates: [] startTime: '2020-06-10T22:05:13.097527Z' steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-06-10_15_05_11-10222654289997226399]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-06-10_15_05_11-10222654289997226399
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-10_15_05_11-10222654289997226399?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-06-10_15_05_11-10222654289997226399 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:15.809Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:16.804Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:16.855Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:16.931Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:16.965Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.073Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.217Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.267Z: JOB_MESSAGE_DETAILED: Fusing consumer Count messages into Read from BigQuery
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.307Z: JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Count messages
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.350Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/KeyWithVoid into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.387Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial into Count/CombineGlobally(CountCombineFn)/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.423Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.455Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.495Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.532Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.567Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/UnKey into Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.603Z: JOB_MESSAGE_DETAILED: Unzipping flatten s16 for input s14.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.638Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.661Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.695Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.737Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.779Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.814Z: JOB_MESSAGE_DETAILED: Unzipping flatten s16-u31 for input s17-reify-value9-c29
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.856Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten assert_that/Group/Flatten/Unzipped-1, into producer assert_that/Group/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.894Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.932Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:17.970Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.010Z: JOB_MESSAGE_DETAILED: Fusing consumer Count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into Count/CombineGlobally(CountCombineFn)/DoOnce/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.044Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.083Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.107Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.146Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.190Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.234Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.281Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.460Z: JOB_MESSAGE_DEBUG: Executing wait step start41
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.544Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.581Z: JOB_MESSAGE_BASIC: Executing operation Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.604Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.643Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.689Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.702Z: JOB_MESSAGE_BASIC: Finished operation Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.762Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.800Z: JOB_MESSAGE_DEBUG: Value "Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.855Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:18.894Z: JOB_MESSAGE_BASIC: Executing operation Read from BigQuery+Count messages+Measure time+Count/CombineGlobally(CountCombineFn)/KeyWithVoid+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:19.746Z: JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_5607975880142083237" started. You can check its status with the bq tool: "bq show -j --project_id=apache-beam-testing dataflow_job_5607975880142083237".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:45.972Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 3 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:46.023Z: JOB_MESSAGE_DETAILED: Resized worker pool to 3, though goal was 5. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:50.295Z: JOB_MESSAGE_DETAILED: BigQuery export job progress: "dataflow_job_5607975880142083237" observed total of 60 exported files thus far.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:50.330Z: JOB_MESSAGE_BASIC: BigQuery export job finished: "dataflow_job_5607975880142083237"
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:51.549Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 4 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:51.589Z: JOB_MESSAGE_DETAILED: Resized worker pool to 4, though goal was 5. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:51.867Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:05:57.078Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:07:22.173Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:07:22.234Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-10T22:11:20.382Z: JOB_MESSAGE_BASIC: Finished operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
FATAL: command execution failed
java.io.IOException: Backing channel 'temporal-beam-jenkins2' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:214)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
	at com.sun.proxy.$Proxy163.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1150)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1142)
	at hudson.Launcher$ProcStarter.join(Launcher.java:470)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
	at hudson.model.Build$BuildExecution.build(Build.java:206)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
	at hudson.model.Run.execute(Run.java:1853)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:427)
Caused by: java.io.IOException: Pipe closed after 0 cycles
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:118)
	at org.apache.sshd.common.channel.ChannelPipedInputStream.read(ChannelPipedInputStream.java:101)
	at hudson.remoting.FlightRecorderInputStream.read(FlightRecorderInputStream.java:91)
	at hudson.remoting.ChunkedInputStream.readHeader(ChunkedInputStream.java:73)
	at hudson.remoting.ChunkedInputStream.readUntilBreak(ChunkedInputStream.java:103)
	at hudson.remoting.ChunkedCommandTransport.readBlock(ChunkedCommandTransport.java:39)
	at hudson.remoting.AbstractSynchronousByteArrayCommandTransport.read(AbstractSynchronousByteArrayCommandTransport.java:34)
	at hudson.remoting.SynchronousCommandTransport$ReaderThread.run(SynchronousCommandTransport.java:63)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: temporal-beam-jenkins2 is offline; cannot locate JDK 1.8 (latest)

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
