See <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/633/display/redirect?page=changes>
Changes:
[david.moravek] [BEAM-10164] Flink Batch Runner: Memory efficient combine implementation
[aromanenko.dev] [BEAM-10096] Fix Spark runners numbering
[iemejia] [BEAM-9948] Fix mascot invalid directory link
[github] [BEAM-9646] Add Google Cloud vision integration transform (#11331)
------------------------------------------
[...truncated 5.27 MB...]
 name: u'beamapp-jenkins-0603191131-770165'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2020-06-03T19:11:42.119703Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_STREAMING, 2)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: [2020-06-03_12_11_41-16850643785747227192]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 2020-06-03_12_11_41-16850643785747227192
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_12_11_41-16850643785747227192?project=apache-beam-testing
WARNING:apache_beam.runners.dataflow.test_dataflow_runner:Waiting indefinitely for streaming job.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-06-03_12_11_41-16850643785747227192 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:41.073Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2020-06-03_12_11_41-16850643785747227192. The number of workers will be between 1 and 100.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:41.073Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2020-06-03_12_11_41-16850643785747227192.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:41.073Z: JOB_MESSAGE_WARNING: Autoscaling is enabled for Dataflow Streaming Engine. Workers will scale between 1 and 100 unless maxNumWorkers is specified.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:47.055Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-2 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:47.748Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:47.792Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:47.858Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:47.893Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:47.928Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:47.969Z: JOB_MESSAGE_DETAILED: Expanding SplittableProcessKeyed operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.002Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into streaming Read/Write steps
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.186Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.287Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.349Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.392Z: JOB_MESSAGE_DETAILED: Unzipping flatten s17 for input s15.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.424Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/WriteStream, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.474Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/WriteStream into assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.502Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.538Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.582Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.618Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.652Z: JOB_MESSAGE_DETAILED: Fusing consumer Key param into Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.692Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Key param
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.728Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.762Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.798Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/MergeBuckets into assert_that/Group/GroupByKey/ReadStream
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.837Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/MergeBuckets
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.870Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.895Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.932Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2623>) into Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:48.974Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/FlatMap(<lambda at core.py:2623>) into assert_that/Create/Impulse
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:49.002Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Create/Map(decode) into assert_that/Create/FlatMap(<lambda at core.py:2623>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:49.034Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Map(decode)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:49.062Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2623>)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:49.097Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:49.131Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/WriteStream into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:49.180Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:49.217Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:49.254Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:49.291Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:51.572Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:51.604Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:11:51.642Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:12:04.018Z: JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric descriptors and Stackdriver will not create new Dataflow custom metrics for this job. Each unique user-defined metric name (independent of the DoFn in which it is defined) produces a new metric descriptor. To delete old / unused metric descriptors see https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list and https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:12:16.584Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 so that the pipeline can catch up with its backlog and keep up with its input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:12:58.661Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:12:58.700Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
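(For context on the "Combiner lifting skipped" and "Fusing consumer" messages above: combiner lifting lets the runner pre-combine each key's values on workers before the shuffle, but it only applies when the reduction is expressed as a combiner; the bare GroupByKey steps in this graph have none, so it is skipped. A minimal, illustrative sketch of the difference, not code from this test suite, would be:)

import apache_beam as beam

with beam.Pipeline() as p:
    kvs = p | beam.Create([('a', 1), ('a', 2), ('b', 3)])

    # Eligible for combiner lifting: the runner can partially sum each key's
    # values on workers before shuffling.
    sums = kvs | 'SumPerKey' >> beam.CombinePerKey(sum)

    # Not eligible: a bare GroupByKey is "not followed by a combiner", so all
    # values for a key are shuffled as-is, as in the assert_that/Group step.
    grouped = kvs | 'Group' >> beam.GroupByKey()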
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:18:53.224Z: JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:18:53.281Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:18:53.318Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:18:53.383Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:18:53.422Z: JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:21:07.294Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on low average worker CPU utilization, and the pipeline having sufficiently low backlog and keeping up with input rate.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:21:07.334Z: JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-06-03T19:21:07.375Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 2020-06-03_12_11_41-16850643785747227192 is in state JOB_STATE_DONE
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

======================================================================
ERROR: test_flatten_pcollections (apache_beam.transforms.ptransform_test.PTransformTest)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/transforms/ptransform_test.py",> line 551, in test_flatten_pcollections
    assert_that(result, equal_to([0, 1, 2, 3, 4, 5, 6, 7]))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 547, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/testing/test_pipeline.py",> line 112, in run
    False if self.not_use_test_runner_api else test_runner_api))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 513, in run
    allow_proto_holders=True).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/pipeline.py",> line 526, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py",> line 57, in run_pipeline
    self).run_pipeline(pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py",> line 584, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 666, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 722, in create_job_description
    resources = self._stage_resources(job.proto_pipeline, job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 619, in _stage_resources
    resources, staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/portability/stager.py",> line 305, in stage_job_resources
    file_path, FileSystems.join(staging_location, staged_path))
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 985, in stage_artifact
    local_path_to_artifact, artifact_name)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/utils/retry.py",> line 236, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 583, in _gcs_file_copy
    self.stage_file(to_folder, to_name, f, total_size=total_size)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py",> line 644, in stage_file
    response = self._storage_client.objects.Insert(request, upload=upload)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py",> line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py",> line 715, in _RunMethod
    http_request, client=self.client)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py",> line 908, in InitializeUpload
    return self.StreamInChunks()
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py",> line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py",> line 957, in __StreamMedia
    response = send_func(self.stream.tell())
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py",> line 943, in CallSendChunk
    start, additional_headers=additional_headers)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py",> line 1120, in __SendChunk
    return self.__SendMediaRequest(request, end)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/transfer.py",> line 1033, in __SendMediaRequest
    retries=self.num_retries, check_response_func=CheckResponse)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 350, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/http_wrapper.py",> line 400, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/oauth2client/transport.py",> line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 2189, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1845, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/httplib2/__init__.py",> line 1786, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python2.7/httplib.py", line 1165, in getresponse
    response.begin()
  File "/usr/lib/python2.7/httplib.py", line 463, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python2.7/httplib.py", line 419, in _read_status
    line = self.fp.readline(_MAXLINE + 1)
  File "/usr/lib/python2.7/socket.py", line 480, in readline
    data = self._sock.recv(self._rbufsize)
  File "/usr/lib/python2.7/ssl.py", line 756, in recv
    return self.read(buflen)
  File "/usr/lib/python2.7/ssl.py", line 643, in read
    v = self._sslobj.read(len)
SSLError: ('The read operation timed out',)
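(The assertion at the top of this traceback uses Beam's standard testing matchers; a minimal sketch of that flatten-and-assert pattern, not the actual body of test_flatten_pcollections, is shown below. The error itself is raised later, while staging job resources to GCS.)

import apache_beam as beam
from apache_beam.testing.test_pipeline import TestPipeline
from apache_beam.testing.util import assert_that, equal_to

with TestPipeline() as p:
    first = p | 'CreateFirst' >> beam.Create([0, 1, 2, 3])
    second = p | 'CreateSecond' >> beam.Create([4, 5, 6, 7])
    result = (first, second) | beam.Flatten()
    # equal_to checks the final PCollection contents regardless of order;
    # the check runs when the pipeline finishes.
    assert_that(result, equal_to([0, 1, 2, 3, 4, 5, 6, 7]))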
-------------------- >> begin captured logging << --------------------
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/build/gradleenv/-194514014/bin/python',> '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.portability.stager: INFO: Copying Beam SDK "<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/build/apache-beam.tar.gz"> to staging location.
root: WARNING: Make sure that locally built Python SDK docker image has Python 2.7 interpreter.
root: INFO: Using Python SDK docker image: apache/beam_python2.7_sdk:2.23.0.dev. If the image is not available at local, we will try to pull from hub.docker.com
apache_beam.internal.gcp.auth: INFO: Setting socket default timeout to 60 seconds.
apache_beam.internal.gcp.auth: INFO: socket default timeout is 60.0 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/pipeline.pb...
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/pipeline.pb in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/requirements.txt in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/parameterized-0.7.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/parameterized-0.7.4.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/mock-2.0.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/six-1.15.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/six-1.15.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/funcsigs-1.0.2.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/funcsigs-1.0.2.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/pbr-5.4.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/pbr-5.4.5.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/PyHamcrest-1.10.1.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/dataflow_python_sdk.tar in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0603184307-930203.1591209787.930358/dataflow-worker.jar...
root: DEBUG: Caught socket error, retrying: ('The read operation timed out',)
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?uploadType=resumable&alt=json&upload_id=AAANsUnCTcOUqG6u1VOmGU2A5rDtK8DtSt9iQ29ua0uIDFKMFh7uvheHqmDAGUwWAqJDcM3FzrlnoKCzzErEq7Tm7quauPWgcQ&name=staging-it%2Fbeamapp-jenkins-0603184307-930203.1591209787.930358%2Fdataflow-worker.jar after exception ('The read operation timed out',)
root: DEBUG: Caught socket error, retrying: ('The read operation timed out',)
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?uploadType=resumable&alt=json&upload_id=AAANsUnCTcOUqG6u1VOmGU2A5rDtK8DtSt9iQ29ua0uIDFKMFh7uvheHqmDAGUwWAqJDcM3FzrlnoKCzzErEq7Tm7quauPWgcQ&name=staging-it%2Fbeamapp-jenkins-0603184307-930203.1591209787.930358%2Fdataflow-worker.jar after exception ('The read operation timed out',)
root: DEBUG: Caught socket error, retrying: ('The read operation timed out',)
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?uploadType=resumable&alt=json&upload_id=AAANsUnCTcOUqG6u1VOmGU2A5rDtK8DtSt9iQ29ua0uIDFKMFh7uvheHqmDAGUwWAqJDcM3FzrlnoKCzzErEq7Tm7quauPWgcQ&name=staging-it%2Fbeamapp-jenkins-0603184307-930203.1591209787.930358%2Fdataflow-worker.jar after exception ('The read operation timed out',)
root: DEBUG: Caught socket error, retrying: ('The read operation timed out',)
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?uploadType=resumable&alt=json&upload_id=AAANsUnCTcOUqG6u1VOmGU2A5rDtK8DtSt9iQ29ua0uIDFKMFh7uvheHqmDAGUwWAqJDcM3FzrlnoKCzzErEq7Tm7quauPWgcQ&name=staging-it%2Fbeamapp-jenkins-0603184307-930203.1591209787.930358%2Fdataflow-worker.jar after exception ('The read operation timed out',)
--------------------- >> end captured logging << ---------------------

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df-py27.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 27 tests in 2299.460s

FAILED (errors=1)
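(The failure is infrastructure-side rather than an assertion failure: the captured logging shows the resumable GCS upload of dataflow-worker.jar hitting repeated read timeouts, and the final timeout surfaces as the SSLError in the traceback above. The "Caught socket error, retrying" lines reflect the usual retry-with-backoff pattern; a simplified, illustrative sketch of that pattern, not Beam's actual retry.py implementation, is:)

import time

def call_with_backoff(send_chunk, num_retries=4, initial_delay_secs=1.0):
    """Call send_chunk(), retrying transient timeouts with exponential backoff."""
    delay = initial_delay_secs
    for attempt in range(num_retries + 1):
        try:
            return send_chunk()
        except (IOError, OSError):  # e.g. SSLError('The read operation timed out',)
            if attempt == num_retries:
                raise  # retries exhausted: the error propagates, as in this build
            time.sleep(delay)
            delay *= 2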
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_43_16-3239049900917232808?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_53_01-11024184377087336639?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_12_02_25-18392673333523175325?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_12_11_41-16850643785747227192?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_43_18-9902189538076888956?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_51_12-83246881943144010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_12_00_36-18223212446869281010?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_49_08-1162678646828341315?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_58_36-17730239330649012826?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_43_19-17952318589066468140?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_52_30-3403036669261239998?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_12_02_04-4746952935190198197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_43_16-12833961750303216790?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_52_19-5092755926342259690?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_12_01_46-12324582407698445289?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_43_19-14834390573586299425?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_52_00-8196231940259417620?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_12_01_24-4330565201943393268?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_43_19-13501649666145744309?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_52_15-10951366875983015482?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_43_18-8788183390959861921?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_11_52_45-13464610021375773001?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobs/us-central1/2020-06-03_12_02_25-14912076340817770656?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests FAILED

FAILURE: Build failed with an exception.

* Where:
Script '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow_V2/ws/src/sdks/python/test-suites/dataflow/common.gradle'> line: 173

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 20m 52s
64 actionable tasks: 46 executed, 18 from cache

Publishing build scan...
https://gradle.com/s/mexlbpqysq5wa

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
