See <https://builds.apache.org/job/beam_PostCommit_Python37/1720/display/redirect?page=changes>
Changes:

[robertwb] Allow metrics update to be tolerant to uninitalized metric containers.


------------------------------------------
[...truncated 2.89 MB...]
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:42.153Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:42.187Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:42.205Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:42.240Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:42.273Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:42.310Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:49.380Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:49.445Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:49.516Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:49.571Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:49.665Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey1.out.0)/View.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:52.322Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:52.404Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/ToIsmRecordForMultimap.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:52.475Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:52.527Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:52.600Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(MapToVoidKey1.out.0)/View.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:52.675Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/PreFinalize/PreFinalize+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:55.719Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/PreFinalize/PreFinalize+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2+write/Write/WriteImpl/FinalizeWrite/MapToVoidKey2+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/CreateIsmShardKeyAndSortKey+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:55.790Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:55.847Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:55.918Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:58.441Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/GroupByKeyHashAndSortByKeyAndWindow/Read+write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/ToIsmRecordForMultimap
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:58.511Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/ToIsmRecordForMultimap.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:58.572Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:58.632Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/View
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:58.715Z: JOB_MESSAGE_DEBUG: Value "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(MapToVoidKey2.out.0)/View.out0" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:31:58.796Z: JOB_MESSAGE_BASIC: Executing operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:32:01.115Z: JOB_MESSAGE_BASIC: Finished operation write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:32:01.181Z: JOB_MESSAGE_DEBUG: Executing success step success118
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:32:01.297Z: JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:32:01.363Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:32:01.403Z: JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:33:19.855Z: JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:33:19.902Z: JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-02-25T09:33:19.946Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 2020-02-25_01_26_13-5753520243978791357 is in state JOB_STATE_DONE
apache_beam.testing.pipeline_verifiers: INFO: Wait 20 seconds...
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*-of-*' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/1582622756263/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.03523707389831543 seconds.
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 4.66047706040711 seconds before retrying _read_with_retry because we caught exception: OSError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*-of-*
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 234, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py>", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*-of-*' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/1582622756263/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.031828880310058594 seconds.
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 6.796291690337679 seconds before retrying _read_with_retry because we caught exception: OSError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*-of-*
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 234, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py>", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*-of-*' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/1582622756263/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.03608226776123047 seconds.
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 11.302702759218391 seconds before retrying _read_with_retry because we caught exception: OSError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*-of-*
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 234, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py>", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*-of-*' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/1582622756263/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.037833452224731445 seconds.
apache_beam.utils.retry: WARNING: Retry with exponential backoff: waiting for 29.344752704889714 seconds before retrying _read_with_retry because we caught exception: OSError: No such file or directory: gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*-of-*
 Traceback for above exception (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/utils/retry.py>", line 234, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/testing/pipeline_verifiers.py>", line 122, in _read_with_retry
    raise IOError('No such file or directory: %s' % self.file_path)
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*-of-*' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/1582622756263/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.04205131530761719 seconds.
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1582622756263/results*' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/1582622756263/results[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 0 files in 0.03615069389343262 seconds.
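The `_read_with_retry` failures above show the verifier waiting roughly 5, 7, 11, and then 29 seconds between attempts: exponential backoff with randomized jitter. A minimal standalone sketch of that pattern follows; `retry_with_exponential_backoff` and its parameters are hypothetical names for illustration, not Beam's actual `apache_beam.utils.retry` decorator.

```python
import random
import time


def retry_with_exponential_backoff(fn, retries=4, initial_delay=4.0,
                                   factor=2.0, fuzz=0.5):
    """Call fn(), retrying on IOError with exponentially growing, fuzzed waits.

    Hypothetical helper mirroring the delays visible in the log above;
    not the real Beam implementation.
    """
    delay = initial_delay
    for attempt in range(retries):
        try:
            return fn()
        except IOError:
            if attempt == retries - 1:
                raise  # Out of attempts: surface the last error.
            # Randomize the wait so parallel clients do not retry in lockstep.
            time.sleep(delay * (1 - fuzz * random.random()))
            delay *= factor
```

In this run every attempt found zero matching files, so after the final wait the verifier re-raised the `OSError` and the test was recorded as an error.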
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_18_17-7467271807849787203?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_32_37-17041271045933938597?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_41_30-13819076383607941556?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:818: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_49_13-6086438842955382621?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_57_26-6722777290814014813?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:818: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_18_11-3298716423340011089?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_35_50-10147966739868697699?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_43_31-3797892185498611228?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:788: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:823: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1463: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_18_13-2394669679868915361?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_30_25-9830559464050682199?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_37_42-16009248374968002886?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_44_22-3390861160343651017?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:87: FutureWarning: _ReadFromBigQuery is experimental.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_50_46-15573922486980235339?project=apache-beam-testing
  kms_key=kms_key)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_58_09-2465360849666045439?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:788: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:87: FutureWarning: _ReadFromBigQuery is experimental.
  kms_key=kms_key)
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:788: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:95: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:87: FutureWarning: _ReadFromBigQuery is experimental.
  kms_key=kms_key)
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:788: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:95: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_18_09-15017925416861973106?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_34_21-11448497255114866590?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_41_43-1142045193618392593?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:275: FutureWarning: _ReadFromBigQuery is experimental.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_49_15-2068574106615047652?project=apache-beam-testing
  query=self.query, use_standard_sql=True, project=self.project))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_56_45-17406877026635363381?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:303: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:316: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/fileio_test.py>:316: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_18_09-12744944373867115757?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_26_13-14744319246742302730?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_33_52-10249304423275518028?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_41_24-16992235529953687537?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:84: FutureWarning: _ReadFromBigQuery is experimental.
  (options.view_as(GoogleCloudOptions).project, known_args.input_table))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_48_55-14436863297767586164?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_56_07-574888172407035415?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:84: FutureWarning: _ReadFromBigQuery is experimental.
  (options.view_as(GoogleCloudOptions).project, known_args.input_table))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_18_08-270488343591637451?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_26_13-5753520243978791357?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_34_56-1141233167241786261?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_42_20-8739449604214273342?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:788: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:162: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_49_30-2224238934371694807?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_56_34-6528591030589338262?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_18_09-6311283581181336129?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/avroio.py>:205: UserWarning: Due to a known issue in avro-python3 package, it is recommended to use fastavro with Beam Avro IO on Python 3 until BEAM-6522 is addressed.
  "Due to a known issue in avro-python3 package, it is "
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_26_59-11097294893865098525?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_36_36-14620062183158323651?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:818: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_43_35-6578505346345180778?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:788: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_50_03-7557623367039490603?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:788: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_57_14-17014302851706253357?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:788: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_18_11-8800210246364955947?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_26_23-2735887090808197716?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_33_00-14976754603540168520?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_40_32-2062238241103923160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_48_01-3406647168670415003?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-25_01_55_13-12423560920358679217?project=apache-beam-testing
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 52 tests in 2841.295s

FAILED (SKIP=8, errors=1)

> Task :sdks:python:test-suites:dataflow:py37:postCommitIT FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>' line: 89

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 48m 48s
85 actionable tasks: 64 executed, 21 from cache

Publishing build scan...
https://gradle.com/s/y7h7iah5mddxm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
