See <https://builds.apache.org/job/beam_python_mongoio_load_test/4/display/redirect>
Changes:

------------------------------------------
[...truncated 45.25 KB...]
Collecting fasteners>=0.14
  Using cached https://files.pythonhosted.org/packages/18/bd/55eb2d6397b9c0e263af9d091ebdb756b15756029b3cededf6461481bc63/fasteners-0.15-py2.py3-none-any.whl
Collecting google-api-core[grpc]<2.0.0dev,>=1.6.0
  Using cached https://files.pythonhosted.org/packages/29/3a/c528ef37f48d6ffba16f0f3c0426456ba21e0dd32be9c61a2ade93e07faa/google_api_core-1.14.3-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/de/3a/83/77a1e18e1a8757186df834b86ce6800120ac9c79cd8ca4091b/grpc_google_iam_v1-0.12.3-cp35-none-any.whl
Collecting google-resumable-media<0.5.0dev,>=0.3.1
  Using cached https://files.pythonhosted.org/packages/96/d7/b29a41b01b854480891dfc408211ffb0cc7a2a3d5f15a3b6740ec18c845b/google_resumable_media-0.4.1-py2.py3-none-any.whl
Collecting wcwidth
  Using cached https://files.pythonhosted.org/packages/7e/9f/526a6947247599b084ee5232e4f9190a38f398d7300d866af3ab571a5bfe/wcwidth-0.1.7-py2.py3-none-any.whl
Collecting pathlib2>=2.2.0; python_version < "3.6"
  Using cached https://files.pythonhosted.org/packages/e9/45/9c82d3666af4ef9f221cbb954e1d77ddbb513faf552aea6df5f37f1a4859/pathlib2-2.3.5-py2.py3-none-any.whl
Requirement already satisfied: py>=1.5.0 in https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.8.0)
Requirement already satisfied: importlib-metadata>=0.12; python_version < "3.8" in https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (1.2.0)
Collecting atomicwrites>=1.0
  Using cached https://files.pythonhosted.org/packages/52/90/6155aa926f43f2b2a22b01be7241be3bfd1ceaf7d0b3267213e8127d41f4/atomicwrites-1.3.0-py2.py3-none-any.whl
Requirement already satisfied: pluggy<1.0,>=0.12 in https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.13.1)
Collecting packaging
  Using cached https://files.pythonhosted.org/packages/cf/94/9672c2d4b126e74c4496c6b3c58a8b51d6419267be9e70660ba23374c875/packaging-19.2-py2.py3-none-any.whl
Requirement already satisfied: more-itertools>=4.0.0; python_version > "2.7" in https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages (from pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (8.0.2)
Collecting attrs>=17.4.0
  Using cached https://files.pythonhosted.org/packages/a2/db/4313ab3be961f7a763066401fb77f7748373b6094076ae2bda2806988af6/attrs-19.3.0-py2.py3-none-any.whl
Collecting execnet>=1.1
  Using cached https://files.pythonhosted.org/packages/d3/2e/c63af07fa471e0a02d05793c7a56a9f7d274a8489442a5dc4fb3b2b3c705/execnet-1.7.1-py2.py3-none-any.whl
Collecting pytest-forked
  Using cached https://files.pythonhosted.org/packages/03/1e/81235e1fcfed57a4e679d34794d60c01a1e9a29ef5b9844d797716111d80/pytest_forked-1.1.3-py2.py3-none-any.whl
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached https://files.pythonhosted.org/packages/b4/40/a9837291310ee1ccc242ceb6ebfd9eb21539649f193a7c8c86ba15b98539/urllib3-1.25.7-py2.py3-none-any.whl
Collecting certifi>=2017.4.17
  Using cached https://files.pythonhosted.org/packages/b9/63/df50cac98ea0d5b006c55a399c3bf1db9da7b5a24de7890bc9cfd5dd9e99/certifi-2019.11.28-py2.py3-none-any.whl
Collecting chardet<3.1.0,>=3.0.2
  Using cached https://files.pythonhosted.org/packages/bc/a9/01ffebfb562e4274b6487b4bb1ddec7ca55ec7510b22e4c51f14098443b8/chardet-3.0.4-py2.py3-none-any.whl
Collecting idna<2.9,>=2.5
  Using cached https://files.pythonhosted.org/packages/14/2c/cd551d81dbe15200be1cf41cd03869a46fe7226e7450af7a6545bfc474c9/idna-2.8-py2.py3-none-any.whl
Collecting monotonic>=0.1
  Using cached https://files.pythonhosted.org/packages/ac/aa/063eca6a416f397bd99552c534c6d11d57f58f2e94c14780f3bbf818c4cf/monotonic-1.5-py2.py3-none-any.whl
Processing /home/jenkins/.cache/pip/wheels/9e/3d/a2/1bec8bb7db80ab3216dbc33092bb7ccd0debfb8ba42b5668d5/googleapis_common_protos-1.6.0-cp35-none-any.whl
Collecting google-auth<2.0dev,>=0.4.0
  Using cached https://files.pythonhosted.org/packages/ec/11/1d90cbfa72a084b08498e8cea1fee199bc965cdac391d241f5ae6257073e/google_auth-1.7.2-py2.py3-none-any.whl
Requirement already satisfied: zipp>=0.5 in https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.19.0.dev0) (0.6.0)
Collecting apipkg>=1.4
  Using cached https://files.pythonhosted.org/packages/67/08/4815a09603fc800209431bec5b8bd2acf2f95abdfb558a44a42507fb94da/apipkg-1.5-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, future, urllib3, certifi, chardet, idna, requests, docopt, hdfs, httplib2, pbr, mock, numpy, pymongo, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, python-dateutil, pytz, avro-python3, pyarrow, cachetools, monotonic, fasteners, google-apitools, googleapis-common-protos, google-auth, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, tenacity, wcwidth, pathlib2, atomicwrites, packaging, attrs, pytest, apipkg, execnet, pytest-forked, pytest-xdist, apache-beam
  Running setup.py develop for apache-beam
Successfully installed apache-beam apipkg-1.5 atomicwrites-1.3.0 attrs-19.3.0 avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.11.28 chardet-3.0.4 crcmod-1.7 dill-0.3.1.1 docopt-0.6.2 execnet-1.7.1 fastavro-0.21.24 fasteners-0.15 future-0.18.2 google-api-core-1.14.3 google-apitools-0.5.28 google-auth-1.7.2 google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 google-cloud-core-1.1.0 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.4 oauth2client-3.0.0 packaging-19.2 pandas-0.24.2 parameterized-0.6.3 pathlib2-2.3.5 pbr-5.4.4 pyarrow-0.15.1 pyasn1-0.4.8 pyasn1-modules-0.2.7 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.5 pytest-4.6.7 pytest-forked-1.1.3 pytest-xdist-1.30.0 python-dateutil-2.8.1 pytz-2019.3 pyyaml-5.2 requests-2.22.0 requests-mock-1.7.0 rsa-4.0 tenacity-5.1.5 urllib3-1.25.7 wcwidth-0.1.7

> Task :sdks:python:test-suites:dataflow:py35:mongodbioIT
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py:79: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size))
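[For context: the first job in this test writes the documents. A minimal sketch of that write half, reconstructed from the step names in the log below ("Create/Read", "Create documents", "WriteToMongoDB") and the database/collection names that appear later in the output; the --mongo_uri flag name and the default batch_size value are illustrative guesses, not the test's actual flags.

import argparse
import apache_beam as beam
from apache_beam.io.mongodbio import WriteToMongoDB

def run_write(argv=None):
    parser = argparse.ArgumentParser()
    parser.add_argument('--mongo_uri', default='mongodb://localhost:27017')  # assumed flag name
    parser.add_argument('--num_documents', type=int, default=10000000)
    parser.add_argument('--batch_size', type=int, default=10000)  # illustrative default
    known_args, pipeline_args = parser.parse_known_args(argv)

    with beam.Pipeline(argv=pipeline_args) as p:
        # 10000000 small documents of the form {'number': i}, written in
        # batches of known_args.batch_size.
        _ = (p
             | 'Create' >> beam.Create(range(known_args.num_documents))
             | 'Create documents' >> beam.Map(lambda i: {'number': i})
             | 'WriteToMongoDB' >> WriteToMongoDB(
                 uri=known_args.mongo_uri,
                 db='beam_mongodbio_it_db',
                 coll='integration_test_1575667237',
                 batch_size=known_args.batch_size))
]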
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206212037-804024.1575667237.805009/pipeline.pb...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206212037-804024.1575667237.805009/pipeline.pb in 0 seconds.
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206212037-804024.1575667237.805009/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206212037-804024.1575667237.805009/dataflow_python_sdk.tar in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2019-12-06T21:20:40.467265Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-12-06_13_20_38-4196865902256457411'
 location: 'us-central1'
 name: 'beamapp-jenkins-1206212037-804024'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-12-06T21:20:40.467265Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2019-12-06_13_20_38-4196865902256457411]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-06_13_20_38-4196865902256457411?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2019-12-06_13_20_38-4196865902256457411 is in state JOB_STATE_PENDING
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:38.729Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-12-06_13_20_38-4196865902256457411. The number of workers will be between 1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:38.729Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-12-06_13_20_38-4196865902256457411.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:42.617Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:43.289Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:43.935Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:43.960Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:43.987Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.017Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.089Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.125Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.149Z: JOB_MESSAGE_DETAILED: Fusing consumer Create documents into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.173Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/ParDo(_GenerateObjectIdFn) into Create documents
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.197Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/AddRandomKeys into WriteToMongoDB/ParDo(_GenerateObjectIdFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.234Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into WriteToMongoDB/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.268Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Reify into WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.292Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write into WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.326Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.361Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.388Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/Reshuffle/RemoveRandomKeys into WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.424Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteToMongoDB/ParDo(_WriteMongoFn) into WriteToMongoDB/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.462Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.498Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.527Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.563Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
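[The chain of "Fusing consumer ..." messages above shows why WriteToMongoDB carries a Reshuffle inside: the random keys plus GroupByKey act as a fusion barrier, so the write ParDo is rebalanced across workers instead of being fused straight onto the Create step. A stripped-down sketch of that idea; Beam's real ReshufflePerKey also reifies and restores timestamps and windows, which this omits.

import random
import apache_beam as beam

class NaiveReshuffle(beam.PTransform):
    """Forces a shuffle boundary by keying elements randomly and regrouping."""
    def expand(self, pcoll):
        return (pcoll
                | 'AddRandomKeys' >> beam.Map(lambda x: (random.randint(0, 4095), x))
                | 'GroupByKey' >> beam.GroupByKey()
                | 'RemoveRandomKeys' >> beam.FlatMap(lambda kv: kv[1]))

In user code the same effect comes from applying beam.Reshuffle() directly.]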
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.718Z: JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.802Z: JOB_MESSAGE_BASIC: Executing operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.849Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.883Z: JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:44.915Z: JOB_MESSAGE_BASIC: Finished operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:45.002Z: JOB_MESSAGE_DEBUG: Value "WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:20:45.076Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+Create documents+WriteToMongoDB/ParDo(_GenerateObjectIdFn)+WriteToMongoDB/Reshuffle/AddRandomKeys+WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Reify+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2019-12-06_13_20_38-4196865902256457411 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:21:10.627Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:21:13.346Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:22:41.112Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:22:41.143Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:24:19.896Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 1 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:24:24.928Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 5 to 1.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:26:44.686Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:32:44.686Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:35:08.283Z: JOB_MESSAGE_BASIC: Finished operation Create/Read+Create documents+WriteToMongoDB/ParDo(_GenerateObjectIdFn)+WriteToMongoDB/Reshuffle/AddRandomKeys+WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Reify+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:35:08.418Z: JOB_MESSAGE_BASIC: Executing operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:35:08.469Z: JOB_MESSAGE_BASIC: Finished operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:35:08.553Z: JOB_MESSAGE_BASIC: Executing operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+WriteToMongoDB/Reshuffle/RemoveRandomKeys+WriteToMongoDB/ParDo(_WriteMongoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:36:20.764Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 1 to 6.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:36:26.771Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:36:26.803Z: JOB_MESSAGE_DETAILED: Resized worker pool to 5, though goal was 6. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:36:32.222Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 6 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:38:44.689Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:42:51.688Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 6 to 7.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:42:57.494Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 7 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:43:56.052Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 7 to 8.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:44:07.266Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 8 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:44:44.688Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:44:55.967Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 8 to 9.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:45:01.698Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 9 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:45:54.070Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 9 to 10.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:45:59.919Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:48:24.452Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 10 to 14.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:48:30.323Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 13 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:48:30.358Z: JOB_MESSAGE_DETAILED: Resized worker pool to 13, though goal was 14. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:48:35.795Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 14 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:49:22.785Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 14 to 16.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:49:28.598Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 16 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:50:44.689Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:51:55.176Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 16 to 21.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:52:01.466Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 21 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:53:51.083Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 21 to 29.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:53:57.184Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 28 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:53:57.217Z: JOB_MESSAGE_DETAILED: Resized worker pool to 28, though goal was 29. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:54:02.637Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 29 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:55:53.913Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 29 to 33.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:55:59.736Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 32 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:55:59.803Z: JOB_MESSAGE_DETAILED: Resized worker pool to 32, though goal was 33. This could be a quota issue.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:56:05.218Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 33 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:56:44.688Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:57:23.523Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 33 to 42.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:57:29.337Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 42 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:58:54.318Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 42 to 58.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T21:59:00.069Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 58 based on the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:00:25.086Z: JOB_MESSAGE_BASIC: Autoscaling: Resizing worker pool from 58 to 66.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:00:30.963Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 66 based on the rate of progress in the currently running step(s).
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 2/2)
INFO:oauth2client.transport:Refreshing due to a 401 (attempt 1/2)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:02:44.691Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.204Z: JOB_MESSAGE_BASIC: Finished operation WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read+WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+WriteToMongoDB/Reshuffle/RemoveRandomKeys+WriteToMongoDB/ParDo(_WriteMongoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.323Z: JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.447Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.501Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:04:24.532Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:07:39.691Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 66 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:07:39.796Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2019-12-06T22:07:39.865Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2019-12-06_13_20_38-4196865902256457411 is in state JOB_STATE_DONE
INFO:__main__:Writing 10000000 documents to mongodb finished in 2835.091 seconds
WARNING:apache_beam.options.pipeline_options:--region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1575667237
https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py:94: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206221756-641140.1575670676.641643/pipeline.pb...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206221756-641140.1575670676.641643/pipeline.pb in 2 seconds.
INFO:apache_beam.runners.portability.stager:Copying Beam SDK "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/build/apache-beam.tar.gz" to staging location.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206221756-641140.1575670676.641643/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to gs://temp-storage-for-perf-tests/loadtests/beamapp-jenkins-1206221756-641140.1575670676.641643/dataflow_python_sdk.tar in 0 seconds.
Traceback (most recent call last):
  File "/usr/lib/python3.5/runpy.py", line 184, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.5/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py", line 107, in <module>
    run()
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py", line 96, in run
    r, equal_to([number for number in range(known_args.num_documents)]))
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/testing/test_pipeline.py", line 112, in run
    else test_runner_api))
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/pipeline.py", line 416, in run
    self._options).run(False)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/pipeline.py", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 513, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/utils/retry.py", line 209, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 551, in create_job
    return self.submit_job_description(job)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/utils/retry.py", line 209, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 598, in submit_job_description
    response = self._client.projects_locations_jobs.Create(request)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py", line 629, in Create
    config, request, global_params=global_params)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpBadRequestError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'content-length': '229', 'x-frame-options': 'SAMEORIGIN', '-content-encoding': 'gzip', 'x-xss-protection': '0', 'vary': 'Origin, X-Origin, Referer', 'cache-control': 'private', 'server': 'ESF', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 06 Dec 2019 22:19:59 GMT', 'status': '400'}>, content <{
  "error": {
    "code": 400,
    "message": "(e17212cdce498236): The job graph is too large. Please try again with a smaller job graph, or split your job into two or more smaller jobs.",
    "status": "INVALID_ARGUMENT"
  }
}
>
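[The 400 above is the actual failure: the read-verification job is rejected at submission. A likely cause, judging from the traceback: at mongodbio_it_test.py:96 the assertion passes equal_to() a list of num_documents expected values (10000000 here), and that list is serialized into the assertion transform inside the pipeline graph, so the submitted job description grows linearly with the document count. Note the read job's pipeline.pb upload took 2 seconds versus 0 for the write job. A self-contained sketch of the pattern, with a small N:

import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

num_documents = 10  # the failing job used 10000000

with TestPipeline() as p:
    docs = p | 'Create' >> beam.Create([{'number': i} for i in range(num_documents)])
    numbers = docs | 'Map' >> beam.Map(lambda doc: doc['number'])
    # Pattern from the traceback: every expected element is baked into the
    # job graph, so graph size grows linearly with num_documents.
    assert_that(numbers, equal_to(list(range(num_documents))))

One possible workaround (an illustration, not the project's actual fix) is to assert on an aggregate instead, e.g. assert_that(numbers | beam.combiners.Count.Globally(), equal_to([num_documents]), label='CheckCount'), which keeps the expected side of the check constant-size regardless of the document count.]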
"<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py",> line 629, in Create config, request, global_params=global_params) File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 731, in _RunMethod return self.ProcessHttpResponse(method_config, http_response, request) File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 737, in ProcessHttpResponse self.__ProcessHttpResponse(method_config, http_response, request)) File "<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py",> line 604, in __ProcessHttpResponse http_response, method_config=method_config, request=request) apitools.base.py.exceptions.HttpBadRequestError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs?alt=json>: response: <{'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'content-length': '229', 'x-frame-options': 'SAMEORIGIN', '-content-encoding': 'gzip', 'x-xss-protection': '0', 'vary': 'Origin, X-Origin, Referer', 'cache-control': 'private', 'server': 'ESF', 'content-type': 'application/json; charset=UTF-8', 'date': 'Fri, 06 Dec 2019 22:19:59 GMT', 'status': '400'}>, content <{ "error": { "code": 400, "message": "(e17212cdce498236): The job graph is too large. Please try again with a smaller job graph, or split your job into two or more smaller jobs.", "status": "INVALID_ARGUMENT" } } > > Task :sdks:python:test-suites:dataflow:py35:mongodbioIT FAILED FAILURE: Build failed with an exception. * Where: Build file '<https://builds.apache.org/job/beam_python_mongoio_load_test/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle'> line: 115 * What went wrong: Execution failed for task ':sdks:python:test-suites:dataflow:py35:mongodbioIT'. > Process 'command 'sh'' finished with non-zero exit value 1 * Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights. * Get more help at https://help.gradle.org Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0. Use '--warning-mode all' to show the individual deprecation warnings. See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings BUILD FAILED in 1h 0m 14s 5 actionable tasks: 5 executed Publishing build scan... https://scans.gradle.com/s/iclrx53y73ayq Build step 'Invoke Gradle script' changed build result to FAILURE Build step 'Invoke Gradle script' marked build as failure --------------------------------------------------------------------- To unsubscribe, e-mail: [email protected] For additional commands, e-mail: [email protected]
