See <https://builds.apache.org/job/beam_PostCommit_Python37/1345/display/redirect?page=changes>
Changes: [iemejia] [BEAM-8956] Begin unifying contributor instructions into a single
------------------------------------------
[...truncated 3.26 MB...]
currentStateTime: '1970-01-01T00:00:00Z'
id: '2020-01-13_02_21_06-10599528324914403815'
location: 'us-central1'
name: 'beamapp-jenkins-0113095624-125184'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2020-01-13T10:21:07.230193Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: [2020-01-13_02_21_06-10599528324914403815]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_21_06-10599528324914403815?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-13_02_21_06-10599528324914403815 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:06.203Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-01-13_02_21_06-10599528324914403815.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:06.203Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-01-13_02_21_06-10599528324914403815. The number of workers will be between 1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:09.242Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:10.249Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:11.866Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:11.899Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step read from datastore/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:11.930Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:11.950Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.011Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.051Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.085Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/SplitQuery into read from datastore/UserQuery/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.120Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/AddRandomKeys into read from datastore/SplitQuery
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.146Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into read from datastore/Reshuffle/AddRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.179Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify into read from datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.206Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write into read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.241Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.268Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.298Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Reshuffle/RemoveRandomKeys into read from datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.327Z: JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Read into read from datastore/Reshuffle/RemoveRandomKeys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.355Z: JOB_MESSAGE_DETAILED: Fusing consumer To Keys into read from datastore/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.382Z: JOB_MESSAGE_DETAILED: Fusing consumer delete entities/Write Batch to Datastore into To Keys
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.417Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.447Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.476Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.505Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.641Z: JOB_MESSAGE_DEBUG: Executing wait step start13
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.745Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.790Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.824Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.859Z: JOB_MESSAGE_BASIC: Finished operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:12.927Z: JOB_MESSAGE_DEBUG: Value "read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:13.006Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/UserQuery/Read+read from datastore/SplitQuery+read from datastore/Reshuffle/AddRandomKeys+read from datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:39.061Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:21:40.741Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:23:02.886Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:23:02.904Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:26:39.928Z: JOB_MESSAGE_BASIC: Finished operation read from datastore/UserQuery/Read+read from datastore/SplitQuery+read from datastore/Reshuffle/AddRandomKeys+read from datastore/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Reify+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:26:40.063Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:26:40.114Z: JOB_MESSAGE_BASIC: Finished operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:26:40.188Z: JOB_MESSAGE_BASIC: Executing operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read from datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read from datastore/Reshuffle/RemoveRandomKeys+read from datastore/Read+To Keys+delete entities/Write Batch to Datastore
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:26:51.894Z: JOB_MESSAGE_BASIC: Finished operation read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/Read+read from datastore/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+read from datastore/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+read from datastore/Reshuffle/RemoveRandomKeys+read from datastore/Read+To Keys+delete entities/Write Batch to Datastore
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:26:51.966Z: JOB_MESSAGE_DEBUG: Executing success step success11
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:26:52.074Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:26:52.116Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:26:52.134Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:28:21.982Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:28:22.028Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-01-13T10:28:22.066Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-01-13_02_21_06-10599528324914403815 is in state JOB_STATE_DONE
apache_beam.io.gcp.datastore.v1new.datastore_write_it_pipeline: INFO: Querying for the entities to make sure there are none present.
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
root: DEBUG: Unhandled type_constraint: Union[]
apache_beam.runners.dataflow.dataflow_runner: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
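The fusing messages above trace Beam's Reshuffle transform wrapped around the Datastore read: AddRandomKeys pairs each element with a random key, GroupByKey forces a shuffle that redistributes the keyed elements across workers, and RemoveRandomKeys strips the keys again. A minimal pure-Python sketch of that key-juggling (illustrative only, not Beam's implementation; `num_keys` is a stand-in for the worker shards):

```python
import random
from collections import defaultdict

def reshuffle(elements, num_keys=4):
    """Toy model of Reshuffle: AddRandomKeys -> GroupByKey -> RemoveRandomKeys."""
    # AddRandomKeys: attach a random bucket key to each element.
    keyed = [(random.randrange(num_keys), e) for e in elements]
    # GroupByKey: collect values per key (a real runner shuffles here,
    # breaking fusion with the upstream read).
    groups = defaultdict(list)
    for k, v in keyed:
        groups[k].append(v)
    # RemoveRandomKeys: flatten back to bare elements.
    return [v for k in groups for v in groups[k]]

data = list(range(10))
out = reshuffle(data)
assert sorted(out) == data  # same elements; only the distribution changed
```

The point of the extra round trip is load balancing: the fused read stage cannot be rebalanced on its own, so the shuffle inserts a redistribution point.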
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/pipeline.pb in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/requirements.txt in 0 seconds.
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/setuptools-42.0.2.zip...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/setuptools-42.0.2.zip in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/PyHamcrest-1.10.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/PyHamcrest-1.10.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/PyHamcrest-1.9.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/PyHamcrest-1.9.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/mock-3.0.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/mock-3.0.5.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/setuptools-45.0.0.zip...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/setuptools-45.0.0.zip in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/setuptools-43.0.0.zip...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/setuptools-43.0.0.zip in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/six-1.13.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/six-1.13.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/setuptools-44.0.0.zip...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/setuptools-44.0.0.zip in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/funcsigs-1.0.2.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/funcsigs-1.0.2.tar.gz in 0 seconds.
apache_beam.runners.portability.stager: INFO: Copying Beam SDK "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/dataflow_python_sdk.tar in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0113095624-125184.1578909384.125323/beamapp-jenkins-0113095624-125184.1578909823.944199/beamapp-jenkins-0113095624-125184.1578910370.955679/beamapp-jenkins-0113095624-125184.1578910852.217725/beamapp-jenkins-0113095624-125184.1578911308.576770/dataflow-worker.jar...
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0113095624-125184.1578909384.125323%2Fbeamapp-jenkins-0113095624-125184.1578909823.944199%2Fbeamapp-jenkins-0113095624-125184.1578910370.955679%2Fbeamapp-jenkins-0113095624-125184.1578910852.217725%2Fbeamapp-jenkins-0113095624-125184.1578911308.576770%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2UorySAA6lv9P2iQUAm6hK8vj_cyFpTliqYwazsTw9czShCroswBb_2UhzVrlAxGnS5XomIojb4lnq3xz2vt5Q5WVdxFjJU_P_E6pRZziJIYWaIs9bA after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0113095624-125184.1578909384.125323%2Fbeamapp-jenkins-0113095624-125184.1578909823.944199%2Fbeamapp-jenkins-0113095624-125184.1578910370.955679%2Fbeamapp-jenkins-0113095624-125184.1578910852.217725%2Fbeamapp-jenkins-0113095624-125184.1578911308.576770%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2UorySAA6lv9P2iQUAm6hK8vj_cyFpTliqYwazsTw9czShCroswBb_2UhzVrlAxGnS5XomIojb4lnq3xz2vt5Q5WVdxFjJU_P_E6pRZziJIYWaIs9bA after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0113095624-125184.1578909384.125323%2Fbeamapp-jenkins-0113095624-125184.1578909823.944199%2Fbeamapp-jenkins-0113095624-125184.1578910370.955679%2Fbeamapp-jenkins-0113095624-125184.1578910852.217725%2Fbeamapp-jenkins-0113095624-125184.1578911308.576770%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2UorySAA6lv9P2iQUAm6hK8vj_cyFpTliqYwazsTw9czShCroswBb_2UhzVrlAxGnS5XomIojb4lnq3xz2vt5Q5WVdxFjJU_P_E6pRZziJIYWaIs9bA after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0113095624-125184.1578909384.125323%2Fbeamapp-jenkins-0113095624-125184.1578909823.944199%2Fbeamapp-jenkins-0113095624-125184.1578910370.955679%2Fbeamapp-jenkins-0113095624-125184.1578910852.217725%2Fbeamapp-jenkins-0113095624-125184.1578911308.576770%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2UorySAA6lv9P2iQUAm6hK8vj_cyFpTliqYwazsTw9czShCroswBb_2UhzVrlAxGnS5XomIojb4lnq3xz2vt5Q5WVdxFjJU_P_E6pRZziJIYWaIs9bA after exception The read operation timed out
--------------------- >> end captured logging << ---------------------
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_48_19-6640865796984099731?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:259: FutureWarning: _ReadFromBigQuery is experimental.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_02_24-3303396136286835419?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_10_26-16295524565482272378?project=apache-beam-testing
query=self.query, use_standard_sql=True, project=self.project))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_17_49-12291476769817970709?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release.
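The repeated "Caught socket error, retrying" pairs above come from the GCS client replaying a resumable-upload request against the same upload URL after a read timeout. The mechanism is plain retry-on-timeout; a minimal sketch under stated assumptions (this is not the actual apitools/GCS client code, and `upload_with_retries`, `flaky_request`, and the retry cap are illustrative names):

```python
import socket
import logging

log = logging.getLogger("retry_sketch")

def upload_with_retries(do_request, url, max_retries=5):
    """Call do_request(), retrying on socket.timeout and logging each
    attempt the way the DEBUG lines in the log above do."""
    last_exc = None
    for _ in range(max_retries):
        try:
            return do_request()
        except socket.timeout as exc:
            last_exc = exc
            log.debug("Caught socket error, retrying: %s", exc)
            log.debug("Retrying request to url %s after exception %s", url, exc)
    raise RuntimeError("upload failed after %d retries" % max_retries) from last_exc

# Simulated request that times out twice before succeeding.
attempts = {"n": 0}
def flaky_request():
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise socket.timeout("The read operation timed out")
    return "uploaded"

assert upload_with_retries(flaky_request, "https://example.invalid/upload") == "uploaded"
assert attempts["n"] == 3
```

Because a resumable upload is addressed by its `upload_id`, retrying the same URL resumes the same upload session rather than starting a new object.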
References to <pipeline>.options will not be supported
temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_48_14-16876477337795678843?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_07_43-8906704035203160114?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_15_54-13144798982165111845?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_23_34-7415612342994605606?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_48_17-6348910396401614201?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_00_21-15578160807866465466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_08_07-5029871930635857478?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_25_28-3209743750115824247?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1418: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_48_14-15192082687983936989?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:757: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_06_33-11366327799842284916?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_14_27-11494070647677078080?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_22_21-2836458003064744173?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_30_16-5691577685177469049?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_48_14-10570337417157168694?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_57_23-17604112606233680162?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_06_51-9825224965743057681?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_16_01-5194413498911485083?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_24_30-14809402726569028088?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:298: FutureWarning: MatchAll is experimental.
| 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309: FutureWarning: MatchAll is experimental.
| 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309: FutureWarning: ReadMatches is experimental.
| 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_48_13-10030710732367148412?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_56_36-10423357780774184232?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_05_24-7405628021195129543?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_13_34-4054143868905121633?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_21_01-3128479080609216044?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:155: FutureWarning: _ReadFromBigQuery is experimental.
query=self.query, use_standard_sql=True, project=self.project))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_28_59-5068866173925092774?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_48_15-16811799057674625938?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_56_58-10052243013444340918?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_04_20-4211503424889730029?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_12_15-3471588490649534197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_19_48-17063954361779496723?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
kms_key=transform.kms_key)) <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:740: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead. kms_key=transform.kms_key)) <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:75: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead. kms_key=kms_key)) Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_48_17-9856267588803391005?project=apache-beam-testing Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_01_56_36-2307665545397978562?project=apache-beam-testing Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_03_58-12377253958246190142?project=apache-beam-testing Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_13_05-15621321249688019181?project=apache-beam-testing Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-13_02_21_06-10599528324914403815?project=apache-beam-testing ---------------------------------------------------------------------- XML: nosetests-postCommitIT-df-py37.xml ---------------------------------------------------------------------- XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml> ---------------------------------------------------------------------- Ran 50 tests in 2976.068s FAILED (SKIP=9, errors=1) > Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED FAILURE: Build completed with 2 failures. 1: Task failed with an exception. 
----------- * Where: Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/py37/build.gradle'> line: 51 * What went wrong: Execution failed for task ':sdks:python:test-suites:direct:py37:postCommitIT'. > Process 'command 'sh'' finished with non-zero exit value 1 * Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights. ============================================================================== 2: Task failed with an exception. ----------- * Where: Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle'> line: 89 * What went wrong: Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'. > Process 'command 'sh'' finished with non-zero exit value 1 * Try: Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights. ============================================================================== * Get more help at https://help.gradle.org Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0. Use '--warning-mode all' to show the individual deprecation warnings. See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings BUILD FAILED in 50m 53s 85 actionable tasks: 64 executed, 21 from cache Publishing build scan... https://gradle.com/s/kx5gyw2dnaavu Build step 'Invoke Gradle script' changed build result to FAILURE Build step 'Invoke Gradle script' marked build as failure --------------------------------------------------------------------- To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org For additional commands, e-mail: builds-h...@beam.apache.org