See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1317/display/redirect?page=changes>
Changes:
[ttanay100] [BEAM-7437] Add streaming flag to BQ streaming inserts IT test
[ttanay100] Change default timeout to 5 mins
[kcweaver] [BEAM-7708] don't expect SQL shell bundled dependencies to be shadowed
[github] [SQL][Doc] fix broken gradle command.
[lcwik] Added new example on how to create a custom unbounded streaming source
------------------------------------------
[...truncated 926.13 KB...]
root: INFO: 2019-07-09T18:40:20.814Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseMonitorLoadJobs/Read
root: INFO: 2019-07-09T18:40:20.866Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
root: INFO: 2019-07-09T18:40:20.900Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:562>) into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-07-09T18:40:20.941Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) into WriteWithMultipleDests/BigQueryBatchFileLoads/WaitForLoadJobs/WaitForLoadJobs
root: INFO: 2019-07-09T18:40:20.982Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:562>) into WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseJobName/Read
root: INFO: 2019-07-09T18:40:21.020Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
root: INFO: 2019-07-09T18:40:21.053Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/Delete into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
root: INFO: 2019-07-09T18:40:21.088Z: JOB_MESSAGE_DETAILED: Fusing consumer WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
root: INFO: 2019-07-09T18:40:21.133Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-07-09T18:40:21.173Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-07-09T18:40:21.207Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-07-09T18:40:21.240Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-07-09T18:40:21.460Z: JOB_MESSAGE_DEBUG: Executing wait step start92
root: INFO: 2019-07-09T18:40:21.546Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:562>)
root: INFO: 2019-07-09T18:40:21.615Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDests/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-07-09T18:40:21.617Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-07-09T18:40:21.644Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-07-09T18:40:21.655Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ImpulseJobName/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:562>)
root: INFO: 2019-07-09T18:40:21.689Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/CreateFilePrefixView/Read+WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GenerateFilePrefix
root: INFO: 2019-07-09T18:40:21.721Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-07-09T18:40:21.762Z: JOB_MESSAGE_BASIC: Executing operation MakeSchemas/Read
root: INFO: 2019-07-09T18:40:21.782Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-07-09T18:40:21.782Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-07-09T18:40:21.794Z: JOB_MESSAGE_BASIC: Finished operation MakeSchemas/Read
root: INFO: 2019-07-09T18:40:21.817Z: JOB_MESSAGE_BASIC: Executing operation MakeTables/Read
root: INFO: 2019-07-09T18:40:21.840Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-07-09T18:40:21.859Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-07-09T18:40:21.865Z: JOB_MESSAGE_BASIC: Finished operation MakeTables/Read
root: INFO: 2019-07-09T18:40:21.896Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-07-09T18:40:21.917Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-07-09T18:40:21.941Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
root: INFO: 2019-07-09T18:40:21.944Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Create
root: INFO: 2019-07-09T18:40:21.983Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-07-09T18:40:22.005Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
root: INFO: 2019-07-09T18:40:22.009Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-07-09T18:40:22.029Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-07-09T18:40:22.050Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
root: INFO: 2019-07-09T18:40:22.077Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-07-09T18:40:22.102Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
root: INFO: 2019-07-09T18:40:22.118Z: JOB_MESSAGE_DEBUG: Value "MakeSchemas/Read.out" materialized.
root: INFO: 2019-07-09T18:40:22.164Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-07-09T18:40:22.206Z: JOB_MESSAGE_DEBUG: Value "MakeTables/Read.out" materialized.
root: INFO: 2019-07-09T18:40:22.252Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-07-09T18:40:22.304Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
root: INFO: 2019-07-09T18:40:22.352Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
root: INFO: 2019-07-09T18:40:22.397Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" materialized.
root: INFO: 2019-07-09T18:40:22.448Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" materialized.
root: INFO: 2019-07-09T18:40:22.494Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-07-09T18:40:22.537Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-07-09T18:40:22.540Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-07-09T18:40:22.575Z: JOB_MESSAGE_BASIC: Executing operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-07-09T18:40:22.578Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-07-09T18:40:22.621Z: JOB_MESSAGE_BASIC: Executing operation Create/Read+Map(<lambda at bigquery_file_loads_test.py:442>)+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-07-09T18:40:22.624Z: JOB_MESSAGE_BASIC: Finished operation WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0)
root: INFO: 2019-07-09T18:40:22.668Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDests/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-07-09T18:40:22.718Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-07-09T18:40:22.765Z: JOB_MESSAGE_DEBUG: Value "WriteWithMultipleDestsFreely/BigQueryBatchFileLoads/AppendDestination/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2019-07-09T18:41:36.519Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-07-09T18:42:19.251Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-07-09T18:42:19.294Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-07-09T18:43:22.005Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: Deleting dataset python_bq_file_loads_15626975876365 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_28_25-13301517086932140763?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_44_52-16667213219521077315?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_50_03-14923521608703385733?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_59_15-14561690750720580594?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Exception in thread Thread-8:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 157, in poll_for_job_completion
    response = runner.dataflow_client.get_job(job_id)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 663, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 689, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-09_11_44_52-16667213219521077315?alt=json>: response: <{'server': 'ESF', 'cache-control': 'private', 'status': '404', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'vary': 'Origin, X-Origin, Referer', 'date': 'Tue, 09 Jul 2019 18:45:33 GMT', 'transfer-encoding': 'chunked', 'x-xss-protection': '0', 'content-length': '279', '-content-encoding': 'gzip', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 404,
    "message": "(1c0aec717cacf2e): Information about job 2019-07-09_11_44_52-16667213219521077315 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_28_21-17551381255098425012?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_51_47-11099806145997981337?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_12_02_12-13023060553416462622?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_28_25-16852639389430599376?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_41_53-7083961327054885316?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_51_48-11470378243343085705?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_12_00_37-14969256733894577429?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_28_21-1170423379244642340?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_51_03-8851016594585164802?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_12_00_34-17519285677560762863?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_12_09_47-1999747741881463995?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_28_20-15482759113042183790?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_39_02-10226957033410328756?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_49_10-7980504513005716588?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_58_12-3512955636214119652?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_12_07_32-11418937270819952569?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_28_20-14417695227691129837?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_38_03-16965740915390394990?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_48_40-8649007938132474534?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_58_15-15991965543533037147?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_28_23-2714916816160510652?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Exception in thread Thread-42:
Traceback (most recent call last):
  File "/usr/lib/python3.5/threading.py", line 914, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.5/threading.py", line 862, in run
    self._target(*self._args, **self._kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_40_13-3690242499882260448?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_49_28-14135442298229708172?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_59_26-10112642609757754306?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_12_08_37-16319810518660316697?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 748, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-09_11_40_13-3690242499882260448/messages?alt=json&startTime=2019-07-09T18%3A43%3A22.005Z>: response: <{'server': 'ESF', 'cache-control': 'private', 'status': '404', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'vary': 'Origin, X-Origin, Referer', 'date': 'Tue, 09 Jul 2019 18:45:39 GMT', 'transfer-encoding': 'chunked', 'x-xss-protection': '0', 'content-length': '279', '-content-encoding': 'gzip', 'content-type': 'application/json; charset=UTF-8'}>, content <{
  "error": {
    "code": 404,
    "message": "(e726f0dd0bda812c): Information about job 2019-07-09_11_40_13-3690242499882260448 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_28_22-2444813701059379855?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_38_44-17453001228000981792?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_48_26-14142565160094825337?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_11_57_42-9675114617432515631?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_12_09_18-10335968294809850359?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-09_12_18_52-14219817083943125609?project=apache-beam-testing.
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 3621.792s

FAILED (SKIP=5, failures=2)

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 1m 21s
77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/pf4mxwttzgf46

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
