See <https://builds.apache.org/job/beam_PostCommit_Python37/1474/display/redirect?page=changes>
Changes:

[lcwik] [BEAM-8298] Fully specify the necessary details to support side input


------------------------------------------
[...truncated 2.70 MB...]
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 57, in run_pipeline
    pipeline, options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 526, in run_pipeline
    self.dataflow_client.create_job(self.job), self)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 226, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 539, in create_job
    self.create_job_description(job)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 569, in create_job_description
    resources = self._stage_resources(job.options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 499, in _stage_resources
    staging_location=google_cloud_options.staging_location)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/portability/stager.py>", line 290, in stage_job_resources
    self.stage_artifact(dataflow_worker_jar, staged_path)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 829, in stage_artifact
    artifact_name)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 226, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 486, in _gcs_file_copy
    self.stage_file(to_folder, to_name, f, total_size=total_size)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 520, in stage_file
    response = self._storage_client.objects.Insert(request, upload=upload)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/storage/storage_v1_client.py>", line 1156, in Insert
    upload=upload, upload_config=upload_config)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 715, in _RunMethod
    http_request, client=self.client)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/transfer.py>", line 908, in InitializeUpload
    return self.StreamInChunks()
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/transfer.py>", line 1020, in StreamInChunks
    additional_headers=additional_headers)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/transfer.py>", line 957, in __StreamMedia
    response = send_func(self.stream.tell())
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/transfer.py>", line 943, in CallSendChunk
    start, additional_headers=additional_headers)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/transfer.py>", line 1120, in __SendChunk
    return self.__SendMediaRequest(request, end)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/transfer.py>", line 1033, in __SendMediaRequest
    retries=self.num_retries, check_response_func=CheckResponse)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/http_wrapper.py>", line 346, in MakeRequest
    check_response_func=check_response_func)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/http_wrapper.py>", line 396, in _MakeRequestNoRetry
    redirections=redirections, connection_type=connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/oauth2client/transport.py>", line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/oauth2client/transport.py>", line 169, in new_request
    redirections, connection_type)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/httplib2/__init__.py>", line 1924, in request
    cachekey,
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/httplib2/__init__.py>", line 1595, in _request
    conn, request_uri, method, body, headers
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/httplib2/__init__.py>", line 1533, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python3.7/http/client.py", line 1321, in getresponse
    response.begin()
  File "/usr/lib/python3.7/http/client.py", line 296, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.7/http/client.py", line 257, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.7/socket.py", line 589, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.7/ssl.py", line 1052, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.7/ssl.py", line 911, in read
    return self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.io.gcp.datastore.v1new.datastore_write_it_pipeline: INFO: Writing 1001 entities to apache-beam-testing
apache_beam.options.pipeline_options: WARNING: --region not set; will default to us-central1. Future releases of Beam will require the user to set --region explicitly, or else have a default set via the gcloud tool. https://cloud.google.com/compute/docs/regions-zones
apache_beam.runners.dataflow.dataflow_runner: WARNING: Typical end users should not use this worker jar feature. It can only be used when FnAPI is enabled.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/pipeline.pb...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/pipeline.pb in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/requirements.txt...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/requirements.txt in 0 seconds.
apache_beam.runners.portability.stager: INFO: Executing command: ['<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/build/gradleenv/-1734967052/bin/python>', '-m', 'pip', 'download', '--dest', '/tmp/dataflow-requirements-cache', '-r', 'postcommit_requirements.txt', '--exists-action', 'i', '--no-binary', ':all:']
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/setuptools-42.0.2.zip...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/setuptools-42.0.2.zip in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/PyHamcrest-1.10.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/PyHamcrest-1.10.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/PyHamcrest-1.9.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/PyHamcrest-1.9.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/six-1.14.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/six-1.14.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/mock-3.0.5.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/mock-3.0.5.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/setuptools-45.0.0.zip...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/setuptools-45.0.0.zip in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/setuptools-43.0.0.zip...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/setuptools-43.0.0.zip in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/six-1.13.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/six-1.13.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/mock-2.0.0.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/mock-2.0.0.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/setuptools-44.0.0.zip...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/setuptools-44.0.0.zip in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/pbr-5.4.4.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/pbr-5.4.4.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/funcsigs-1.0.2.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/funcsigs-1.0.2.tar.gz in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/PyHamcrest-1.10.1.tar.gz...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/PyHamcrest-1.10.1.tar.gz in 0 seconds.
apache_beam.runners.portability.stager: INFO: Copying Beam SDK "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/build/apache-beam.tar.gz>" to staging location.
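The captured logging above opens with a pipeline_options warning that --region is not set and will default to us-central1. For reference only (the project, bucket, and region names below are placeholders, not the values used by this job), setting the region and the staging/temp locations explicitly looks like this:

    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions()
    gcp_options = options.view_as(GoogleCloudOptions)
    gcp_options.project = 'my-project'                       # placeholder
    gcp_options.region = 'us-central1'                       # silences the --region warning
    gcp_options.staging_location = 'gs://my-bucket/staging'  # placeholder
    gcp_options.temp_location = 'gs://my-bucket/temp'        # placeholder

    # Equivalent pipeline flags:
    #   --project=my-project --region=us-central1
    #   --staging_location=gs://my-bucket/staging --temp_location=gs://my-bucket/temp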
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/dataflow_python_sdk.tar...
apache_beam.runners.dataflow.internal.apiclient: INFO: Completed GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/dataflow_python_sdk.tar in 0 seconds.
apache_beam.runners.dataflow.internal.apiclient: INFO: Starting GCS upload to gs://temp-storage-for-end-to-end-tests/staging-it/beamapp-jenkins-0128180542-460855.1580234742.460998/dataflow-worker.jar...
root: DEBUG: Caught socket error, retrying: [Errno 32] Broken pipe
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0128180542-460855.1580234742.460998%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2Ur5CcDqDHCgXnKtrMrsuLBu7tQPTg2VrX-1ogQj-gy-Wnq-UcNxgGV2uYnfyghM3mTtiuTP5n9w704RV-9SVDjrEPbErQ after exception [Errno 32] Broken pipe
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0128180542-460855.1580234742.460998%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2Ur5CcDqDHCgXnKtrMrsuLBu7tQPTg2VrX-1ogQj-gy-Wnq-UcNxgGV2uYnfyghM3mTtiuTP5n9w704RV-9SVDjrEPbErQ after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0128180542-460855.1580234742.460998%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2Ur5CcDqDHCgXnKtrMrsuLBu7tQPTg2VrX-1ogQj-gy-Wnq-UcNxgGV2uYnfyghM3mTtiuTP5n9w704RV-9SVDjrEPbErQ after exception The read operation timed out
root: DEBUG: Caught socket error, retrying: The read operation timed out
root: DEBUG: Retrying request to url https://www.googleapis.com/resumable/upload/storage/v1/b/temp-storage-for-end-to-end-tests/o?alt=json&name=staging-it%2Fbeamapp-jenkins-0128180542-460855.1580234742.460998%2Fdataflow-worker.jar&uploadType=resumable&upload_id=AEnB2Ur5CcDqDHCgXnKtrMrsuLBu7tQPTg2VrX-1ogQj-gy-Wnq-UcNxgGV2uYnfyghM3mTtiuTP5n9w704RV-9SVDjrEPbErQ after exception The read operation timed out
--------------------- >> end captured logging << ---------------------
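The DEBUG lines above show the resumable upload of dataflow-worker.jar being retried after a broken pipe and three read timeouts; the job never got past staging. If someone wanted to probe client-side timeout behaviour in isolation, httplib2 (the HTTP library in the traceback) accepts an explicit timeout in seconds. The following is a generic, unauthenticated sketch against an arbitrary public endpoint, not a reproduction of Beam's upload path:

    import socket
    import httplib2

    # Issue a single request with a generous socket timeout and report whether it
    # times out, mirroring the condition the retry loop above keeps logging.
    http = httplib2.Http(timeout=120)  # seconds
    try:
        response, _ = http.request('https://www.googleapis.com/discovery/v1/apis', 'GET')
        print('status:', response.status)
    except socket.timeout:
        print('The read operation timed out')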
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:155: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_09_57_00-4019303813277654210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_11_25-1809502175048210716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_20_29-17340296568354491070?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_29_01-5309368501090294627?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_09_57_01-12764461251425588600?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_17_37-10760896405788075506?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1418: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_09_57_02-6010665023703426564?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_09_13-6757273799957279297?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:75: FutureWarning: _ReadFromBigQuery is experimental.
  known_args.input_table))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_18_09-11173022889827468711?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_28_10-6616578152311560705?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:75: FutureWarning: _ReadFromBigQuery is experimental.
  known_args.input_table))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_09_56_58-17501684670251540967?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_16_21-11025142172759306399?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_25_58-13032251309310359569?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_33_29-17934145772668834934?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_09_57_01-5677298443267815219?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_06_06-5932641788608202393?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_14_19-218436196555764496?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_22_55-14104483892934641209?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_31_10-2184542613166639698?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:75: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_09_56_57-13463802715607097909?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_05_08-10185712087845945196?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_14_48-681407327848746702?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_24_29-5000333390711000706?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:757: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_33_48-16356778875835677516?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_41_50-16061299988765938107?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:753: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_09_57_00-4871876445652392535?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_10_48-14729884904610989283?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_19_43-2912115937765104176?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_28_23-5870492719456681215?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_36_37-5340213472683414831?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:259: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1605: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_09_56_58-12324698694586228967?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_06_48-2138873328285058157?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_16_50-2234219320512045984?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_26_21-6937468589070044809?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-01-28_10_34_57-11290067919431448721?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1421: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
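Many of the warnings in this run repeat "BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead." For reference, a minimal sketch of the suggested replacement; the table name and schema are placeholders, and running it requires a GCP project and credentials:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    with beam.Pipeline(options=PipelineOptions()) as p:
        _ = (
            p
            | 'CreateRows' >> beam.Create([{'name': 'example', 'value': 1}])
            | 'WriteToBQ' >> beam.io.WriteToBigQuery(
                table='my-project:my_dataset.my_table',  # placeholder
                schema='name:STRING,value:INTEGER',
                create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))

The companion BeamDeprecationWarning lines about <pipeline>.options are emitted mostly from Beam's own I/O code (bigquery.py, bigquery_file_loads.py) when it reads options back from the constructed pipeline.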
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:771: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:298: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:309: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 51 tests in 3202.061s

FAILED (SKIP=9, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 89

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 54m 49s

85 actionable tasks: 66 executed, 19 from cache

Publishing build scan...
https://gradle.com/s/dum47ynwrvwxq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
