See
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/20/display/redirect?page=changes>
Changes:
[mmack] [BEAM-13246] Add support for S3 Bucket Key at the object level (AWS Sdk
[Pablo Estrada] Output successful rows from BQ Streaming Inserts
[schapman] BEAM-13439 Type annotation for ptransform_fn
[noreply] [BEAM-13606] Fail bundles with failed BigTable mutations (#16751)
------------------------------------------
[...truncated 800.06 KB...]
sink=lambda _: _WriteToPandasFileSink(
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/fileio.py>:550:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file:
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/pytest_postCommitExamples-direct-py37.xml>
-
===== 22 passed, 1 skipped, 5178 deselected, 39 warnings in 236.54 seconds =====
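
Note: the BeamDeprecationWarning entries above come from reading options back off the pipeline object (`p.options`). A minimal sketch of the non-deprecated pattern, keeping hold of the PipelineOptions instance instead (the bucket path is a placeholder, not from this run):

    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    # Keep a reference to the options rather than reading <pipeline>.options later.
    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])  # placeholder bucket
    temp_location = options.view_as(GoogleCloudOptions).temp_location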
> Task :sdks:python:test-suites:direct:py36:examples
INFO apache_beam.io.filebasedsink:filebasedsink.py:348 Renamed 1 shards in
0.10 seconds.
PASSED [ 91%]
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
-------------------------------- live log call ---------------------------------
INFO root:transforms.py:182 Computing dataframe stage
<ComputeStage(PTransform)
label=[[ComputedExpression[move_grouped_columns_to_index_DataFrame_140184668457392]]:140184671521760]>
for
Stage[inputs={PlaceholderExpression[placeholder_DataFrame_140184667637520]},
partitioning=Arbitrary,
ops=[ComputedExpression[move_grouped_columns_to_index_DataFrame_140184668457392]],
outputs={ComputedExpression[move_grouped_columns_to_index_DataFrame_140184668457392],
PlaceholderExpression[placeholder_DataFrame_140184667637520]}]
INFO root:transforms.py:182 Computing dataframe stage
<ComputeStage(PTransform)
label=[[ComputedExpression[apply_DataFrame_140184671520192]]:140185556580504]>
for
Stage[inputs={ComputedExpression[move_grouped_columns_to_index_DataFrame_140184668457392]},
partitioning=Index['airline'],
ops=[ComputedExpression[apply_DataFrame_140184671520192]],
outputs={ComputedExpression[apply_DataFrame_140184671520192]}]
INFO apache_beam.io.fileio:fileio.py:555 Added temporary directory
gs://temp-storage-for-end-to-end-tests/temp-it/.tempe58639c7-1045-4fdc-9f74-1b69234c89d2
WARNING apache_beam.options.pipeline_options:pipeline_options.py:309
Discarding unparseable args: ['--sleep_secs=20',
'--kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test']
WARNING root:environments.py:374 Make sure that locally built Python SDK
docker image has Python 3.6 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is
apache/beam_python3.6_sdk:2.37.0.dev
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function annotate_downstream_side_inputs at
0x7f7f4ddb72f0> ====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function fix_side_input_pcoll_coders at 0x7f7f4ddb7400>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function pack_combiners at 0x7f7f4ddb78c8>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function lift_combiners at 0x7f7f4ddb7950>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function expand_sdf at 0x7f7f4ddb7ae8>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function expand_gbk at 0x7f7f4ddb7b70>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function sink_flattens at 0x7f7f4ddb7c80>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function greedily_fuse at 0x7f7f4ddb7d08>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function read_to_impulse at 0x7f7f4ddb7d90>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function impulse_to_input at 0x7f7f4ddb7e18>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function sort_stages at 0x7f7f4ddb50d0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function setup_timer_mapping at 0x7f7f4ddb5048>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function populate_data_channel_coders at 0x7f7f4ddb5158>
====================
INFO apache_beam.runners.worker.statecache:statecache.py:172 Creating state
cache with size 100
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:894
Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler
object at 0x7f7f7e03ce10> for environment
ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running
((((ref_AppliedPTransform_read-table-Read-Impulse_10)+(ref_AppliedPTransform_read-table-Read-Map-lambda-at-iobase-py-898-_11))+(read
table/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(read
table/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_6_split/Write)
INFO apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:557 Started
BigQuery job: <JobReference
location: 'US'
projectId:
'apache-beam-testing'>
bq show -j
--format=prettyjson --project_id=apache-beam-testing None
INFO apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:439 Using location
'US' from table <TableReference
datasetId:
'airline_ontime_data'
projectId:
'bigquery-samples'
tableId:
'flights'> referenced by query
SELECT
date,
airline,
departure_airport,
arrival_airport,
departure_delay,
arrival_delay
FROM
`bigquery-samples.airline_ontime_data.flights`
WHERE date
>= '2012-12-23' AND date <= '2012-12-25'
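
For reference, a hedged sketch of the kind of read that produces the export jobs above. The table and query text are taken from the log; the pipeline scaffolding around them is assumed:

    import apache_beam as beam

    with beam.Pipeline() as p:
        flights = p | 'read table' >> beam.io.ReadFromBigQuery(
            query='''
              SELECT date, airline, departure_airport, arrival_airport,
                     departure_delay, arrival_delay
              FROM `bigquery-samples.airline_ontime_data.flights`
              WHERE date >= '2012-12-23' AND date <= '2012-12-25'
            ''',
            use_standard_sql=True)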
WARNING apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:886 Dataset
apache-beam-testing:beam_temp_dataset_f025373bf8164d378da909fe88198221 does not
exist so we will create it as temporary with location=US
INFO apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:557 Started
BigQuery job: <JobReference
jobId:
'beam_bq_job_QUERY_BQ_EXPORT_JOB_2fe9caa0-c_1644343832_782'
location: 'US'
projectId:
'apache-beam-testing'>
bq show -j
--format=prettyjson --project_id=apache-beam-testing
beam_bq_job_QUERY_BQ_EXPORT_JOB_2fe9caa0-c_1644343832_782
INFO root:bigquery_tools.py:625 Job status: RUNNING
INFO root:bigquery_tools.py:625 Job status: DONE
INFO apache_beam.io.gcp.bigquery_tools:bigquery_tools.py:557 Started
BigQuery job: <JobReference
jobId:
'beam_bq_job_EXPORT_BQ_EXPORT_JOB_2fe9caa0-c_1644343838_536'
location: 'US'
projectId:
'apache-beam-testing'>
bq show -j
--format=prettyjson --project_id=apache-beam-testing
beam_bq_job_EXPORT_BQ_EXPORT_JOB_2fe9caa0-c_1644343838_536
INFO root:bigquery_tools.py:625 Job status: RUNNING
INFO root:bigquery_tools.py:625 Job status: DONE
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in
0.03880000114440918 seconds.
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running (((((((((((((((((ref_PCollection_PCollection_6_split/Read)+(read
table/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_read-table-_PassThroughThenCleanup-ParDo-PassThrough-ParDo-PassThrough-_16))+(ref_PCollection_PCollection_9/Write))+(ref_AppliedPTransform_assign-timestamp_23))+(ref_AppliedPTransform_set-schema-Map-lambda-at-core-py-2836-_25))+(ref_AppliedPTransform_daily-windows_26))+(ref_AppliedPTransform_BatchElements-daily-BatchElements-ParDo-_WindowAwareBatchingDoFn-_29))+(ref_AppliedPTransform_BatchElements-daily-Map-lambda-at-schemas-py-140-_30))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__33))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__35))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__38))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__39))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__40))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__43))+(ToPCollection(df)
-
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/[ComputedExpression[apply_DataFrame_140184671520192]]:140185556580504/CoGroupByKey/CoGroupByKeyImpl/Flatten/Transcode/0))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__44))+(ToPCollection(df)
-
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/[ComputedExpression[apply_DataFrame_140184671520192]]:140185556580504/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running
((((ref_AppliedPTransform_read-table-FilesToRemoveImpulse-Impulse_4)+(ref_AppliedPTransform_read-table-FilesToRemoveImpulse-FlatMap-lambda-at-core-py-3228-_5))+(ref_AppliedPTransform_read-table-FilesToRemoveImpulse-Map-decode-_7))+(ref_AppliedPTransform_read-table-MapFilesToRemove_8))+(ref_PCollection_PCollection_4/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running
(((ref_AppliedPTransform_read-table-_PassThroughThenCleanup-Create-Impulse_18)+(ref_AppliedPTransform_read-table-_PassThroughThenCleanup-Create-FlatMap-lambda-at-core-py-3228-_19))+(ref_AppliedPTransform_read-table-_PassThroughThenCleanup-Create-Map-decode-_21))+(ref_AppliedPTransform_read-table-_PassThroughThenCleanup-ParDo-RemoveExtractedFiles-_22)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in
0.03973245620727539 seconds.
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running ((((((((ToPCollection(df) -
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/[ComputedExpression[apply_DataFrame_140184671520192]]:140185556580504/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__46))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__47))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__48))+(ref_AppliedPTransform_ToPCollection-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__50))+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__54))+(WriteToPandas(df)
-
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/WriteToFiles/Flatten/Write/0))+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__55))+(WriteToPandas(df)
-
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/WriteToFiles/GroupRecordsByDestinationAndShard/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running ((WriteToPandas(df) -
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/WriteToFiles/GroupRecordsByDestinationAndShard/Read)+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__57))+(WriteToPandas(df)
-
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/WriteToFiles/Flatten/Write/1)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running ((WriteToPandas(df) -
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/WriteToFiles/Flatten/Read)+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__59))+(WriteToPandas(df)
-
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/WriteToFiles/GroupTempFilesByDestination/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running (WriteToPandas(df) -
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c/output.csv/WriteToFiles/GroupTempFilesByDestination/Read)+(ref_AppliedPTransform_WriteToPandas-df---gs-temp-storage-for-end-to-end-tests-temp-it-flight_delays__61)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file
gs://temp-storage-for-end-to-end-tests/temp-it/.tempe58639c7-1045-4fdc-9f74-1b69234c89d2/4273148420993145938_3f75d6b1-3a6a-496e-826d-066a631c5db4
to dir:
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c
as output.csv-2012-12-24T00:00:00-2012-12-25T00:00:00-00000-of-00001. Res:
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.tempe58639c7-1045-4fdc-9f74-1b69234c89d2/4273148420993145938_3f75d6b1-3a6a-496e-826d-066a631c5db4',
shard_index=-1, total_shards=0, window=[1356307200.0, 1356393600.0),
pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files
for destination None and window [1356307200.0, 1356393600.0)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in
0.04448843002319336 seconds.
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in
the temporary folder: []
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file
gs://temp-storage-for-end-to-end-tests/temp-it/.tempe58639c7-1045-4fdc-9f74-1b69234c89d2/8486159956238836654_9e138ac0-96ec-477d-b772-db95ef70d26f
to dir:
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c
as output.csv-2012-12-25T00:00:00-2012-12-26T00:00:00-00000-of-00001. Res:
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.tempe58639c7-1045-4fdc-9f74-1b69234c89d2/8486159956238836654_9e138ac0-96ec-477d-b772-db95ef70d26f',
shard_index=-1, total_shards=0, window=[1356393600.0, 1356480000.0),
pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files
for destination None and window [1356393600.0, 1356480000.0)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in
0.03538680076599121 seconds.
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in
the temporary folder: []
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file
gs://temp-storage-for-end-to-end-tests/temp-it/.tempe58639c7-1045-4fdc-9f74-1b69234c89d2/7974773739052588114_0d264aaf-ab8f-4a29-a6df-4968048362b6
to dir:
gs://temp-storage-for-end-to-end-tests/temp-it/flight_delays_it-9c45ba2f-2ec0-4322-983a-6106e1837e6c
as output.csv-2012-12-23T00:00:00-2012-12-24T00:00:00-00000-of-00001. Res:
FileResult(file_name='gs://temp-storage-for-end-to-end-tests/temp-it/.tempe58639c7-1045-4fdc-9f74-1b69234c89d2/7974773739052588114_0d264aaf-ab8f-4a29-a6df-4968048362b6',
shard_index=-1, total_shards=0, window=[1356220800.0, 1356307200.0),
pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files
for destination None and window [1356220800.0, 1356307200.0)
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in
0.033559560775756836 seconds.
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in
the temporary folder: []
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in
0.035944223403930664 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in
0.03842592239379883 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 1 files in
0.038726806640625 seconds.
INFO apache_beam.io.gcp.gcsio:gcsio.py:559 Starting the size estimation of
the input
INFO apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 3 files in
0.031705617904663086 seconds.
PASSED [ 95%]
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
-------------------------------- live log call ---------------------------------
INFO root:pipeline.py:188 Missing pipeline option (runner). Executing
pipeline using the default runner: DirectRunner.
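
Note: the INFO line above means the runner option was omitted and Beam fell back to DirectRunner. A minimal sketch of pinning it explicitly (the trivial pipeline body is illustrative only):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    # Passing --runner explicitly avoids relying on the default-runner fallback.
    with beam.Pipeline(options=PipelineOptions(['--runner=DirectRunner'])) as p:
        _ = p | beam.Create([1, 2, 3]) | beam.Map(print)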
INFO root:transforms.py:182 Computing dataframe stage
<ComputeStage(PTransform)
label=[[ComputedExpression[set_column_DataFrame_140184300578576],
ComputedExpression[set_index_DataFrame_140184670703064],
ComputedExpression[pre_combine_sum_DataFrame_140185558535696]]:140184667797824]>
for
Stage[inputs={PlaceholderExpression[placeholder_DataFrame_140184672062488]},
partitioning=Arbitrary,
ops=[ComputedExpression[set_column_DataFrame_140184300578576],
ComputedExpression[set_index_DataFrame_140184670703064],
ComputedExpression[pre_combine_sum_DataFrame_140185558535696]],
outputs={PlaceholderExpression[placeholder_DataFrame_140184672062488],
ComputedExpression[pre_combine_sum_DataFrame_140185558535696]}]
INFO root:transforms.py:182 Computing dataframe stage
<ComputeStage(PTransform)
label=[[ComputedExpression[post_combine_sum_DataFrame_140184668836304]]:140184667797152]>
for
Stage[inputs={ComputedExpression[pre_combine_sum_DataFrame_140185558535696]},
partitioning=Index,
ops=[ComputedExpression[post_combine_sum_DataFrame_140184668836304]],
outputs={ComputedExpression[post_combine_sum_DataFrame_140184668836304]}]
INFO apache_beam.io.fileio:fileio.py:555 Added temporary directory
/tmp/.tempb157fa07-d54e-40f6-848c-8c677d3e1111
WARNING root:environments.py:374 Make sure that locally built Python SDK
docker image has Python 3.6 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is
apache/beam_python3.6_sdk:2.37.0.dev
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function annotate_downstream_side_inputs at
0x7f7f4ddb72f0> ====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function fix_side_input_pcoll_coders at 0x7f7f4ddb7400>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function pack_combiners at 0x7f7f4ddb78c8>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function lift_combiners at 0x7f7f4ddb7950>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function expand_sdf at 0x7f7f4ddb7ae8>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function expand_gbk at 0x7f7f4ddb7b70>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function sink_flattens at 0x7f7f4ddb7c80>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function greedily_fuse at 0x7f7f4ddb7d08>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function read_to_impulse at 0x7f7f4ddb7d90>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function impulse_to_input at 0x7f7f4ddb7e18>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function sort_stages at 0x7f7f4ddb50d0>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function setup_timer_mapping at 0x7f7f4ddb5048>
====================
INFO
apache_beam.runners.portability.fn_api_runner.translations:translations.py:678
==================== <function populate_data_channel_coders at 0x7f7f4ddb5158>
====================
INFO apache_beam.runners.worker.statecache:statecache.py:172 Creating state
cache with size 100
INFO
apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:894
Created Worker handler
<apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler
object at 0x7f7f4950f4a8> for environment
ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running
((((ref_AppliedPTransform_Read-Read-Impulse_4)+(ref_AppliedPTransform_Read-Read-Map-lambda-at-iobase-py-898-_5))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_2_split/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running
((((((((((((((ref_PCollection_PCollection_2_split/Read)+(Read/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_Split_8))+(ref_AppliedPTransform_ToRows_9))+(ref_AppliedPTransform_BatchElements-words-BatchElements-ParDo-_GlobalWindowsBatchingDoFn-_12))+(ref_AppliedPTransform_BatchElements-words-Map-lambda-at-schemas-py-140-_13))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-set_column_DataFr_16))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-set_column_DataFr_18))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-post_combine_sum__21))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-post_combine_sum__22))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-post_combine_sum__23))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-post_combine_sum__26))+(ToPCollection(df)
-
/tmp/tmp3sgsjuww.result/[ComputedExpression[post_combine_sum_DataFrame_140184668836304]]:140184667797152/CoGroupByKey/CoGroupByKeyImpl/Flatten/Transcode/0))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-post_combine_sum__27))+(ToPCollection(df)
-
/tmp/tmp3sgsjuww.result/[ComputedExpression[post_combine_sum_DataFrame_140184668836304]]:140184667797152/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running ((((((((((((ToPCollection(df) -
/tmp/tmp3sgsjuww.result/[ComputedExpression[post_combine_sum_DataFrame_140184668836304]]:140184667797152/CoGroupByKey/CoGroupByKeyImpl/GroupByKey/Read)+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-post_combine_sum__29))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-post_combine_sum__30))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-post_combine_sum__31))+(ref_AppliedPTransform_ToPCollection-df---tmp-tmp3sgsjuww-result-ComputedExpression-post_combine_sum__33))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3sgsjuww-result-WriteToFiles-ParDo-_WriteUnshardedRe_37))+(ref_AppliedPTransform_Unbatch-post_combine_sum_DataFrame_140184668836304-with-indexes-ParDo-_Unbatch_46))+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3sgsjuww-result-WriteToFiles-ParDo-_AppendShardedDes_38))+(WriteToPandas(df)
- /tmp/tmp3sgsjuww.result/WriteToFiles/Flatten/Write/0))+(WriteToPandas(df) -
/tmp/tmp3sgsjuww.result/WriteToFiles/GroupRecordsByDestinationAndShard/Write))+(ref_AppliedPTransform_Filter-lambda-at-wordcount-py-80-_47))+(ref_AppliedPTransform_Map-lambda-at-wordcount-py-81-_48))+(ref_AppliedPTransform_Map-print-_49)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running ((WriteToPandas(df) -
/tmp/tmp3sgsjuww.result/WriteToFiles/GroupRecordsByDestinationAndShard/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3sgsjuww-result-WriteToFiles-ParDo-_WriteShardedReco_40))+(WriteToPandas(df)
- /tmp/tmp3sgsjuww.result/WriteToFiles/Flatten/Write/1)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running ((WriteToPandas(df) -
/tmp/tmp3sgsjuww.result/WriteToFiles/Flatten/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3sgsjuww-result-WriteToFiles-Map-lambda-at-fileio-py_42))+(WriteToPandas(df)
- /tmp/tmp3sgsjuww.result/WriteToFiles/GroupTempFilesByDestination/Write)
INFO
apache_beam.runners.portability.fn_api_runner.fn_runner:fn_runner.py:621
Running (WriteToPandas(df) -
/tmp/tmp3sgsjuww.result/WriteToFiles/GroupTempFilesByDestination/Read)+(ref_AppliedPTransform_WriteToPandas-df---tmp-tmp3sgsjuww-result-WriteToFiles-ParDo-_MoveTempFilesInt_44)
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file
/tmp/.tempb157fa07-d54e-40f6-848c-8c677d3e1111/5562587401826285902_44ffc710-ab00-40cf-bcdb-6a084390d5fd
to dir: /tmp as tmp3sgsjuww.result-00000-of-00001. Res:
FileResult(file_name='/tmp/.tempb157fa07-d54e-40f6-848c-8c677d3e1111/5562587401826285902_44ffc710-ab00-40cf-bcdb-6a084390d5fd',
shard_index=-1, total_shards=0, window=GlobalWindow, pane=None,
destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files
for destination None and window GlobalWindow
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in
the temporary folder: []
PASSED [100%]
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54:
DeprecationWarning: invalid escape sequence \c
self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf')) # pylint:
disable=anomalous-backslash-in-string
apache_beam/io/filesystems_test.py:62
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62:
DeprecationWarning: invalid escape sequence \d
self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'), #
pylint: disable=anomalous-backslash-in-string
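Note: both DeprecationWarnings above are the usual invalid-escape-sequence issue; raw strings silence them while keeping the backslashes literal. A sketch, assuming the tests intend the Windows-style paths as written:

    from apache_beam.io.filesystems import FileSystems

    # Raw strings keep '\c' and '\d' as literal characters, not escape sequences.
    FileSystems.get_scheme(r'c:\abc\cdf')      # no scheme on a local path, so None
    FileSystems.get_filesystem(r'c:\abc\def')  # resolves to the local filesystem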
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
PendingDeprecationWarning: Client.dataset is deprecated and will be removed in
a future version. Use a string like 'my_project.my_dataset' or a
cloud.google.bigquery.DatasetReference object, instead.
dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2138:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2144:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2437:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2439:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2470:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
| _PassThroughThenCleanup(files_to_remove_pcoll))
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2134:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
self.table_reference.projectId = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100:
PendingDeprecationWarning: Client.dataset is deprecated and will be removed in
a future version. Use a string like 'my_project.my_dataset' or a
cloud.google.bigquery.DatasetReference object, instead.
table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/dataframe/io.py>:632:
FutureWarning: WriteToFiles is experimental.
sink=lambda _: _WriteToPandasFileSink(
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/apache_beam/io/fileio.py>:550:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
p.options.view_as(GoogleCloudOptions).temp_location or
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file:
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/pytest_postCommitExamples-direct-py36.xml>
-
===== 22 passed, 1 skipped, 5178 deselected, 38 warnings in 235.97 seconds =====
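
Note: the FutureWarning entries in the summary above come from the DataFrame API's to_csv, which is backed by the experimental WriteToFiles transform. A minimal sketch of the pattern that triggers it (both paths are placeholders, not from this run):

    import apache_beam as beam
    from apache_beam.dataframe.io import read_csv

    with beam.Pipeline() as p:
        # Deferred DataFrame read; operations mirror the pandas API.
        df = p | read_csv('/tmp/input*.csv')  # placeholder input glob
        # to_csv writes via the experimental WriteToFiles sink, hence the warning.
        df.to_csv('/tmp/output.csv')          # placeholder output path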
FAILURE: Build failed with an exception.
* Where:
Script
'<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Direct/ws/src/sdks/python/test-suites/direct/common.gradle>'
line: 92
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py38:examples'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
* Get more help at https://help.gradle.org
BUILD FAILED in 5m 14s
21 actionable tasks: 15 executed, 4 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/42x6nqjc7m4lw
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]