See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/25/display/redirect?page=changes>
Changes:
[david.prieto.rivera] Missing contribution
[noreply] Merge pull request #15848 from [BEAM-13835] An any-type implementation
[Valentyn Tymofieiev] [BEAM-12920] Assume that bare generators types define simple generators.
[noreply] [release-2.36.0][website] Fix github release notes script, header for
[noreply] Use shell to run python for setupVirtualenv (#16796)
[Daniel Oliveira] [BEAM-13830] Properly shut down Debezium expansion service in IT script.
[noreply] Merge pull request #16659 from [BEAM-13774][Playground] Add user to
[noreply] [BEAM-13776][Playground] (#16731)
------------------------------------------
[...truncated 801.22 KB...]
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(
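
Most of the entries above and below are the same BeamDeprecationWarning about reading <pipeline>.options after the pipeline has been built; they come from Beam's own I/O code, so for these example tests they are informational. For pipeline-authoring code, a minimal sketch of the non-deprecated pattern is to construct the options object up front and read views from it directly (the flag values below are placeholders, not taken from this job):

    # Sketch only: build PipelineOptions first and read views from that object,
    # instead of reading <pipeline>.options after construction.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(['--project=my-project', '--temp_location=gs://my-bucket/tmp'])
    temp_location = options.view_as(GoogleCloudOptions).temp_location  # read from options, not p.options

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['a', 'b', 'c']) | beam.Map(print)

Keeping a reference to the same options instance avoids the deprecated p.options accessor entirely.
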
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2470: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2134: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:45: FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 'numeric_only=None') is deprecated; in a future version this will raise TypeError. Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
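
The FutureWarning above from flight_delays.py:45 is pandas deprecating the silent dropping of non-numeric ("nuisance") columns in reductions; per the warning text, the fix is to select the numeric columns first or to state numeric_only explicitly. A small sketch, with made-up column names rather than the ones used by the flight_delays example:

    # Sketch of the pandas fix suggested by the FutureWarning above.
    import pandas as pd

    df = pd.DataFrame({'carrier': ['AA', 'UA'],
                       'dep_delay': [5.0, 7.0],
                       'arr_delay': [3.0, 9.0]})

    # Deprecated behaviour: df.mean() would silently drop the non-numeric 'carrier' column.
    means = df[['dep_delay', 'arr_delay']].mean()  # select the valid columns first
    means_alt = df.mean(numeric_only=True)         # or make the intent explicit
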
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/pytest_postCommitExamples-direct-py37.xml> -
===== 22 passed, 1 skipped, 5184 deselected, 39 warnings in 185.55 seconds =====

> Task :sdks:python:test-suites:direct:py36:examples
INFO root:bigquery_tools.py:625 Job status: DONE
INFO apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:557 Started BigQuery job: <JobReference jobId: 'beam_bq_job_EXPORT_BQ_EXPORT_JOB_bbf10e78-e_1644452254_543' location: 'US' projectId: 'apache-beam-testing'>
 bq show -j --format=prettyjson --project_id=apache-beam-testing beam_bq_job_EXPORT_BQ_EXPORT_JOB_bbf10e78-e_1644452254_543
INFO root:bigquery_tools.py:625 Job status: RUNNING
INFO root:bigquery_tools.py:625 Job status: DONE
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in 0.04726052284240723 seconds.
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running (((((((((((((((((ref_PCollection_PCollection_6_split/Read)+(read table/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_read-table-_PassThroughThenCleanup-ParDo-PassThrough-ParDo-PassThrough-_16))+(ref_AppliedPTransform_assign-timestamp_23))+(ref_PCollection_PCollection_9/Write))+(ref_AppliedPTransform_set-schema-Map-lambda-at-core-py-2836-_25))+(ref_AppliedPTransform_daily-windows_26))+(ref_AppliedPTransform_BatchElements-daily-BatchElements-ParDo-_WindowAwareBatchingDoFn-_29))+(ref_AppliedPTransform_BatchElements-daily-Map-lambda-at-schemas-py-140-_30))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__33))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__35))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__38))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__39))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__40))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__43))+(ToPCollection(df) - gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/[ComputedExpression[apply_DataFrame_140315601137736]]:140315602071904/CoGroupByKey/CoGroupByKeyImpl/Flatten/Transcode/0))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__44))+(ToPCollection(df) - gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/[ComputedExpression[apply_DataFrame_140315601137736]]:140315602071904/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Write)
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((((((((ToPCollection(df) -
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/[ComputedExpression[apply_DataFrame_140315601137736]]:140315602071904/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__46))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__47))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__48))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__50))+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__54))+(WriteToPandas(df) - gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/WriteToFiles/Flatten/Write/0))+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__55))+(WriteToPandas(df) - gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/WriteToFiles/GroupRecordsByDestinationAndShard/Write)
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((WriteToPandas(df) - gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/WriteToFiles/GroupRecordsByDestinationAndShard/Read)+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__57))+(WriteToPandas(df) - gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/WriteToFiles/Flatten/Write/1)
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((WriteToPandas(df) - gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/WriteToFiles/Flatten/Read)+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__59))+(WriteToPandas(df) - gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/WriteToFiles/GroupTempFilesByDestination/Write)
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running (WriteToPandas(df) - gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad/output.csv/WriteToFiles/GroupTempFilesByDestination/Read)+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__61)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp855fe0ae-f1d8-4859-ae5c-6a961bb583d7/7974773739052588114_6c718c2f-4864-44b6-aa4f-a67f05f25524 to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00000-of-00001.
Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp855fe0ae-f1d8-4859-ae5c-6a961bb583d7/7974773739052588114_6c718c2f-4864-44b6-aa4f-a67f05f25524', shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window [1356220800.0, 1356307200.0)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 0.03866839408874512 seconds.
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: []
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp855fe0ae-f1d8-4859-ae5c-6a961bb583d7/8486159956238836654_c412816a-2b34-4930-8dc5-27c18ff2870d to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00000-of-00001. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp855fe0ae-f1d8-4859-ae5c-6a961bb583d7/8486159956238836654_c412816a-2b34-4930-8dc5-27c18ff2870d', shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window [1356393600.0, 1356480000.0)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 0.04186058044433594 seconds.
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: []
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file gs://temp-storage-for-end-to-end-tests/temp-it/.temp855fe0ae-f1d8-4859-ae5c-6a961bb583d7/4273148420993145938_ccc91a38-f608-4467-a596-20adeab0e59a to dir: gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-152b51ef-c338-45bd-a90f-b47c84164dad as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00000-of-00001. Res: FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.temp855fe0ae-f1d8-4859-ae5c-6a961bb583d7/4273148420993145938_ccc91a38-f608-4467-a596-20adeab0e59a', shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0), pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window [1356307200.0, 1356393600.0)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 0.03099679946899414 seconds.
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: []
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running (((ref_AppliedPTransform_read-table-_PassThroughThenCleanup-Create-Impulse_18)+(ref_AppliedPTransform_read-table-_PassThroughThenCleanup-Create-FlatMap-lambda-at-core-py-3228-_19))+(ref_AppliedPTransform_read-table-_PassThroughThenCleanup-Create-Map-decode-_21))+(ref_AppliedPTransform_read-table-_PassThroughThenCleanup-ParDo-RemoveExtractedFiles-_22)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in 0.09845590591430664 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in 0.03463912010192871 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in 0.07831072807312012 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in 0.03490090370178223 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 3 files in 0.04360628128051758 seconds.
PASSED [ 95%]

apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
-------------------------------- live log call ---------------------------------
INFO root:pipeline.py:188 Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[set_column_DataFrame_140316614100080], ComputedExpression[set_index_DataFrame_140315597801232], ComputedExpression[pre_combine_sum_DataFrame_140315597324240]]:140316616010664]> for Stage[inputs={PlaceholderExpression[placeholder_DataFrame_140315597772952]}, partitioning=Arbitrary, ops=[ComputedExpression[set_column_DataFrame_140316614100080], ComputedExpression[set_index_DataFrame_140315597801232], ComputedExpression[pre_combine_sum_DataFrame_140315597324240]], outputs={ComputedExpression[pre_combine_sum_DataFrame_140315597324240], PlaceholderExpression[placeholder_DataFrame_140315597772952]}]
INFO root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[post_combine_sum_DataFrame_140315597801064]]:140315597772672]> for Stage[inputs={ComputedExpression[pre_combine_sum_DataFrame_140315597324240]}, partitioning=Index, ops=[ComputedExpression[post_combine_sum_DataFrame_140315597801064]], outputs={ComputedExpression[post_combine_sum_DataFrame_140315597801064]}]
INFO apache_beam.io.fileio:fileio.py:555 Added temporary directory /tmp/.temp87424769-fef3-4b76-8e05-d199e85c1deb
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.6 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.6_sdk:2.37.0.dev
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function annotate_downstream_side_inputs at 0x7f9dc9bd4268> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function fix_side_input_pcoll_coders at 0x7f9dc9bd4378> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function pack_combiners at 0x7f9dc9bd4840> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function lift_combiners at 0x7f9dc9bd48c8> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function expand_sdf at 0x7f9dc9bd4a60> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function expand_gbk at 0x7f9dc9bd4ae8> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sink_flattens at 0x7f9dc9bd4bf8> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function greedily_fuse at 0x7f9dc9bd4c80> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function read_to_impulse at 0x7f9dc9bd4d08> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function impulse_to_input at 0x7f9dc9bd4d90> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function sort_stages at 0x7f9dc9bd1048> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function setup_timer_mapping at 0x7f9dc9bd4f28> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:678 ==================== <function populate_data_channel_coders at 0x7f9dc9bd10d0> ====================
INFO apache_beam.runners.worker.statecache:statecache.py:172 Creating state cache with size 100
INFO apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:894 Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f9dc54df160> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((((ref_AppliedPTransform_Read-Read-Impulse_4)+(ref_AppliedPTransform_Read-Read-Map-lambda-at-iobase-py-898-_5))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((((((((((((((ref_PCollection_PCollection_2_split/Read)+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Split_8))+(ref_AppliedPTransform_ToRows_9))+(ref_AppliedPTransform_BatchElements-words-BatchElements-ParDo-_GlobalWindowsBatchingDoFn-_12))+(ref_AppliedPTransform_BatchElements-words-Map-lambda-at-schemas-py-140-_13))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-set_column_DataFr_16))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-set_column_DataFr_18))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-post_combine_sum__21))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-post_combine_sum__22))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-post_combine_sum__23))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-post_combine_sum__26))+(ToPCollection(df) - /tmp/tmpmgidok1p.result/[ComputedExpression[post_combine_sum_DataFrame_140315597801064]]:140315597772672/CoGroupByKey/CoGroupByKeyImpl/Flatten/Transcode/0))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-post_combine_sum__27))+(ToPCollection(df) - /tmp/tmpmgidok1p.result/[ComputedExpression[post_combine_sum_DataFrame_140315597801064]]:140315597772672/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Write)
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((((((((((((ToPCollection(df) - /tmp/tmpmgidok1p.result/[ComputedExpression[post_combine_sum_DataFrame_140315597801064]]:140315597772672/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-post_combine_sum__29))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-post_combine_sum__30))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-post_combine_sum__31))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmpmgidok1p-result-ComputedExpression-post_combine_sum__33))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpmgidok1p-result-WriteToFiles-ParDo-_WriteUnshardedRe_37))+(ref_AppliedPTransform_Unbatch-post_combine_sum_DataFrame_140315597801064-with-indexes-ParDo-_Unbatch_46))+(WriteToPandas(df) - /tmp/tmpmgidok1p.result/WriteToFiles/Flatten/Write/0))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpmgidok1p-result-WriteToFiles-ParDo-_AppendShardedDes_38))+(WriteToPandas(df) - /tmp/tmpmgidok1p.result/WriteToFiles/GroupRecordsByDestinationAndShard/Write))+(ref_AppliedPTransform_Filter-lambda-at-wordcount-py-80-_47))+(ref_AppliedPTransform_Map-lambda-at-wordcount-py-81-_48))+(ref_AppliedPTransform_Map-print-_49)
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((WriteToPandas(df) - /tmp/tmpmgidok1p.result/WriteToFiles/GroupRecordsByDestinationAndShard/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpmgidok1p-result-WriteToFiles-ParDo-_WriteShardedReco_40))+(WriteToPandas(df) - /tmp/tmpmgidok1p.result/WriteToFiles/Flatten/Write/1)
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running ((WriteToPandas(df) - /tmp/tmpmgidok1p.result/WriteToFiles/Flatten/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpmgidok1p-result-WriteToFiles-Map-lambda-at-fileio-py_42))+(WriteToPandas(df) - /tmp/tmpmgidok1p.result/WriteToFiles/GroupTempFilesByDestination/Write)
INFO apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621 Running (WriteToPandas(df) - /tmp/tmpmgidok1p.result/WriteToFiles/GroupTempFilesByDestination/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmpmgidok1p-result-WriteToFiles-ParDo-_MoveTempFilesInt_44)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file /tmp/.temp87424769-fef3-4b76-8e05-d199e85c1deb/5562587925071239802_f67e5baf-08a8-44c8-a087-0fd4484eaccb to dir: /tmp as tmpmgidok1p.result-00000-of-00001. Res: FileResult(file_name='/tmp/.temp87424769-fef3-4b76-8e05-d199e85c1deb/5562587925071239802_f67e5baf-08a8-44c8-a087-0fd4484eaccb', shard_index=-1, total_shards=0, window=GlobalWindow, pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window GlobalWindow
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: []
PASSED [100%]

=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf')) # pylint: disable=anomalous-backslash-in-string
apache_beam/io/filesystems_test.py:62
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'), # pylint: disable=anomalous-backslash-in-string
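
The two DeprecationWarning entries just above from filesystems_test.py are Python's standard complaint about unrecognized backslash escapes (\c, \d) inside ordinary string literals; raw strings or doubled backslashes avoid it. A tiny sketch with a made-up Windows-style path:

    # Sketch: '\c' is not a recognized escape, so Python emits a DeprecationWarning.
    warns  = 'c:\\abc\cdf'    # triggers the warning seen above
    raw_ok = r'c:\abc\cdf'    # raw string: backslashes stay literal
    esc_ok = 'c:\\abc\\cdf'   # every backslash escaped explicitly
    assert warns == raw_ok == esc_ok
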
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
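
The PendingDeprecationWarning above spells out the replacement for Client.dataset(): refer to tables by a fully qualified string or build a DatasetReference explicitly. A minimal sketch, with placeholder project, dataset, and table names (a real call needs GCP credentials):

    # Sketch of the migration suggested by the PendingDeprecationWarning above.
    from google.cloud import bigquery

    # Deprecated: client.dataset('my_dataset').table('my_table')
    dataset_ref = bigquery.DatasetReference('my-project', 'my_dataset')
    table_ref = dataset_ref.table('my_table')

    # Or address the table with a fully qualified string id:
    client = bigquery.Client(project='my-project')
    table = client.get_table('my-project.my_dataset.my_table')
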
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439: BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2470: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2134: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/pytest_postCommitExamples-direct-py36.xml> -
===== 22 passed, 1 skipped, 5184 deselected, 38 warnings in 191.08 seconds =====

FAILURE: Build failed with an exception.

* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/test-suites/direct/common.gradle'> line: 92

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 4m 47s
21 actionable tasks: 15 executed, 4 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/cv5rhqvpgxns4

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
