See <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/1291/display/redirect>
------------------------------------------
[...truncated 1.17 MB...]
 startTime: '2019-07-04T18:30:25.755386Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-07-04_11_30_24-13169797084724485907]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_30_24-13169797084724485907?project=apache-beam-testing
root: INFO: Job 2019-07-04_11_30_24-13169797084724485907 is in state JOB_STATE_RUNNING
root: INFO: 2019-07-04T18:30:24.723Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2019-07-04_11_30_24-13169797084724485907. The number of workers will be between 1 and 1000.
root: INFO: 2019-07-04T18:30:24.761Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2019-07-04_11_30_24-13169797084724485907.
root: INFO: 2019-07-04T18:30:27.546Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2019-07-04T18:30:28.234Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
root: INFO: 2019-07-04T18:30:28.784Z: JOB_MESSAGE_DETAILED: Expanding SplittableParDo operations into optimizable parts.
root: INFO: 2019-07-04T18:30:28.835Z: JOB_MESSAGE_DETAILED: Expanding CollectionToSingleton operations into optimizable parts.
root: INFO: 2019-07-04T18:30:28.928Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2019-07-04T18:30:28.969Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-07-04T18:30:29.007Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2019-07-04T18:30:29.041Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2019-07-04T18:30:29.086Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2019-07-04T18:30:29.141Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2019-07-04T18:30:29.187Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/FlatMap(<lambda at core.py:2257>) into Create/Impulse
root: INFO: 2019-07-04T18:30:29.224Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps) into Create/MaybeReshuffle/Reshuffle/AddRandomKeys
root: INFO: 2019-07-04T18:30:29.269Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps) into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow
root: INFO: 2019-07-04T18:30:29.317Z: JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
root: INFO: 2019-07-04T18:30:29.363Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into GroupByKey/Read
root: INFO: 2019-07-04T18:30:29.410Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read
root: INFO: 2019-07-04T18:30:29.459Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
root: INFO: 2019-07-04T18:30:29.507Z: JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
root: INFO: 2019-07-04T18:30:29.547Z: JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
root: INFO: 2019-07-04T18:30:29.594Z: JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Map(decode)
root: INFO: 2019-07-04T18:30:29.645Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/Map(decode) into Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
root: INFO: 2019-07-04T18:30:29.692Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
root: INFO: 2019-07-04T18:30:29.740Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/AddRandomKeys into Create/FlatMap(<lambda at core.py:2257>)
root: INFO: 2019-07-04T18:30:29.783Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
root: INFO: 2019-07-04T18:30:29.827Z: JOB_MESSAGE_DETAILED: Fusing consumer Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write into Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify
root: INFO: 2019-07-04T18:30:29.876Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2019-07-04T18:30:29.923Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2019-07-04T18:30:29.968Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2019-07-04T18:30:30.019Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-07-04T18:30:30.277Z: JOB_MESSAGE_DEBUG: Executing wait step start23
root: INFO: 2019-07-04T18:30:30.382Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
root: INFO: 2019-07-04T18:30:30.426Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-07-04T18:30:30.438Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2019-07-04T18:30:30.478Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
root: INFO: 2019-07-04T18:30:30.531Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
root: INFO: 2019-07-04T18:30:30.545Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Create
root: INFO: 2019-07-04T18:30:30.613Z: JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
root: INFO: 2019-07-04T18:30:30.662Z: JOB_MESSAGE_DEBUG: Value "Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Session" materialized.
root: INFO: 2019-07-04T18:30:30.764Z: JOB_MESSAGE_BASIC: Executing operation Create/Impulse+Create/FlatMap(<lambda at core.py:2257>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-07-04T18:31:11.061Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-07-04T18:31:41.606Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2019-07-04T18:31:42.121Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2019-07-04T18:34:29.083Z: JOB_MESSAGE_BASIC: Finished operation Create/Impulse+Create/FlatMap(<lambda at core.py:2257>)+Create/MaybeReshuffle/Reshuffle/AddRandomKeys+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Reify+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write
root: INFO: 2019-07-04T18:34:29.164Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-07-04T18:34:29.190Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Close
root: INFO: 2019-07-04T18:34:29.245Z: JOB_MESSAGE_BASIC: Executing operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-07-04T18:34:46.979Z: JOB_MESSAGE_BASIC: Finished operation Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/GroupByWindow+Create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)+Create/MaybeReshuffle/Reshuffle/RemoveRandomKeys+Create/Map(decode)+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
root: INFO: 2019-07-04T18:34:47.078Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
root: INFO: 2019-07-04T18:34:47.128Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
root: INFO: 2019-07-04T18:34:47.216Z: JOB_MESSAGE_BASIC: Executing operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
root: INFO: 2019-07-04T18:34:53.363Z: JOB_MESSAGE_BASIC: Finished operation GroupByKey/Read+GroupByKey/GroupByWindow+m_out
root: INFO: 2019-07-04T18:34:53.433Z: JOB_MESSAGE_DEBUG: Executing success step success21
root: INFO: 2019-07-04T18:34:53.551Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-07-04T18:34:53.589Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2019-07-04T18:34:53.611Z: JOB_MESSAGE_BASIC: Stopping worker pool...
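The JOB_MESSAGE_DETAILED entries above show the Dataflow service optimizing the submitted graph before execution: Create is expanded into Impulse + FlatMap + Reshuffle, adjacent ParDo/Read/Write steps are fused into single stages, and GroupByKey marks a stage boundary (Reify/Write on one side, Read/GroupByWindow on the other). As a purely illustrative, hypothetical sketch (the actual integration-test pipeline is not included in this log), a Beam pipeline with the same overall shape as the step names above looks roughly like this:

    # Hypothetical sketch only: a pipeline with the same shape as the step names
    # in the fusion messages above (Create -> Map -> GroupByKey -> output).
    import apache_beam as beam

    with beam.Pipeline() as p:  # DirectRunner by default; Dataflow options omitted
        (p
         | 'Create' >> beam.Create(['a', 'b', 'c'])             # expanded by the runner into Impulse + FlatMap + Reshuffle
         | 'metrics' >> beam.Map(lambda x: x)                   # fused into the same stage as Create/Map(decode)
         | 'map_to_common_key' >> beam.Map(lambda x: (None, x))
         | 'GroupByKey' >> beam.GroupByKey()                    # stage boundary: Reify/Write before, Read/GroupByWindow after
         | 'm_out' >> beam.Map(print))

This fusion is why the log above executes the whole chain Create/Map(decode)+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write as a single operation: everything between the reshuffle and the GroupByKey write runs as one fused worker stage.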
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_01_44-6678957920209072926?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_17_54-2704992314993134877?project=apache-beam-testing.
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_25_48-15786078503323935869?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_01_41-11728255460371138727?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_22_59-7339920861148378953?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_30_24-13169797084724485907?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_37_40-13578912758359448099?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Exception in thread Thread-3:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 748, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-04_11_30_24-13169797084724485907/messages?alt=json&startTime=2019-07-04T18%3A34%3A53.611Z>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Thu, 04 Jul 2019 18:35:46 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '280', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(3b8da452fdce7b6d): Information about job 2019-07-04_11_30_24-13169797084724485907 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_01_43-9371995569670187627?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_15_08-6484043619661715636?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_22_59-6072473762300075393?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_31_30-10133845275636128628?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_01_40-14879658465100866441?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_20_37-13464011391569257493?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_27_54-888462296695956200?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_35_56-8424777239698916116?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_01_41-12195850617633471794?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_10_02-6876983406584652205?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_17_19-9209815732002556658?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_25_14-18125976780799588679?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_34_39-5409210821595752778?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_41_45-9970760650813415313?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Exception in thread Thread-5:
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_01_41-3318278399457681869?project=apache-beam-testing.
Traceback (most recent call last):
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_08_54-16886020114119728762?project=apache-beam-testing.
  File "/usr/lib/python3.7/threading.py", line 917, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 865, in run
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_17_22-15982843010259896474?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_24_49-10630390354293609498?project=apache-beam-testing.
    self._target(*self._args, **self._kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 190, in poll_for_job_completion
    job_id, page_token=page_token, start_time=last_message_time)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/utils/retry.py>", line 197, in wrapper
    return fun(*args, **kwargs)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_33_16-10229734327404759502?project=apache-beam-testing.
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 748, in list_messages
    response = self._client.projects_locations_jobs_messages.List(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 553, in List
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967052/lib/python3.7/site-packages/apitools/base/py/base_api.py>", line 604, in __ProcessHttpResponse
    http_response, method_config=method_config, request=request)
apitools.base.py.exceptions.HttpNotFoundError: HttpError accessing <https://dataflow.googleapis.com/v1b3/projects/apache-beam-testing/locations/us-central1/jobs/2019-07-04_11_33_16-10229734327404759502/messages?alt=json&startTime=2019-07-04T18%3A35%3A10.040Z>: response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 'application/json; charset=UTF-8', 'date': 'Thu, 04 Jul 2019 18:35:54 GMT', 'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 'transfer-encoding': 'chunked', 'status': '404', 'content-length': '280', '-content-encoding': 'gzip'}>, content <{
  "error": {
    "code": 404,
    "message": "(228f6b69912b1850): Information about job 2019-07-04_11_33_16-10229734327404759502 could not be found in our system. Please double check the id is correct. If it is please contact customer support.",
    "status": "NOT_FOUND"
  }
}
>
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_01_44-9525469136593920665?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_11_00-2100227860010402864?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1140: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_20_16-1634498757204522169?project=apache-beam-testing.
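Several of the warnings captured above flag BigQuerySink as deprecated since 2.11.0 and name WriteToBigQuery as its replacement. A minimal, hypothetical migration sketch follows; the project, dataset, table, and schema values are placeholders and not taken from this build:

    # Hedged sketch of the replacement the warning suggests: use
    # beam.io.WriteToBigQuery instead of beam.io.Write(beam.io.BigQuerySink(...)).
    # Placeholder project/dataset/table/schema; actually running this requires
    # GCP credentials and an existing BigQuery dataset.
    import apache_beam as beam

    with beam.Pipeline() as p:
        (p
         | beam.Create([{'name': 'beam', 'value': 1}])
         | beam.io.WriteToBigQuery(
             table='my_table',
             dataset='my_dataset',
             project='my-project',
             schema='name:STRING,value:INTEGER',
             create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
             write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))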
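The repeated BeamDeprecationWarning about pipeline options ("References to <pipeline>.options will not be supported") is raised by code that reads options back off the pipeline object, as in the captured lines method_to_use = self._compute_method(p, p.options) and temp_location = p.options.view_as(GoogleCloudOptions).temp_location. The supported pattern is to build a PipelineOptions object up front and keep your own reference to it; a minimal sketch with placeholder values:

    # Sketch of the recommended pattern: keep a reference to the PipelineOptions
    # you constructed instead of reading them back through pipeline.options.
    # The project and temp_location values are placeholders, not from this build.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(project='my-project', temp_location='gs://my-bucket/tmp')
    temp_location = options.view_as(GoogleCloudOptions).temp_location  # no pipeline.options access

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([temp_location]) | beam.Map(print)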
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:557: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_28_20-5380626054495392675?project=apache-beam-testing.
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_01_45-9592933668786360235?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:687: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_10_36-3993402983541721185?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_18_23-17681088236519206806?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_26_05-4427988925293511502?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-07-04_11_34_52-4435258250994618140?project=apache-beam-testing.
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 42 tests in 2900.826s

FAILED (SKIP=5, failures=2)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 78

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 49m 24s

77 actionable tasks: 60 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/g34e73xrfqfvw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
