See <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/305/display/redirect>
Changes:
------------------------------------------
[...truncated 2.24 MB...]
job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2480: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  pipeline_options=pcoll.pipeline.options,
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2150: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
  table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/dataframe/io.py>:629: FutureWarning: WriteToFiles is experimental.
  return pcoll | fileio.WriteToFiles(
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  p.options.view_as(GoogleCloudOptions).temp_location or
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/pytest_postCommitExamples-spark-py39.xml> -
= 1 failed, 20 passed, 3 skipped, 5354 deselected, 36 warnings in 252.59 seconds =
> Task :sdks:python:test-suites:portable:py39:sparkExamples FAILED
> Task :sdks:python:test-suites:portable:py37:sparkExamples
INFO apache_beam.io.filebasedsink:filebasedsink.py:303 Starting finalize_write threads with num_shards: 1 (skipped: 0), batches: 1, num_threads: 1
INFO apache_beam.io.filebasedsink:filebasedsink.py:348 Renamed 1 shards in 0.10 seconds.
PASSED [ 91%]
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
SKIPPED [ 95%]
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
-------------------------------- live log call ---------------------------------
INFO root:pipeline.py:188 Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[set_column_DataFrame_139752196446608], ComputedExpression[set_index_DataFrame_139751201723408], ComputedExpression[pre_combine_sum_DataFrame_139751205013264]]:139751202268496]> for Stage[inputs={PlaceholderExpression[placeholder_DataFrame_139753246412688]}, partitioning=Arbitrary, ops=[ComputedExpression[set_column_DataFrame_139752196446608], ComputedExpression[set_index_DataFrame_139751201723408], ComputedExpression[pre_combine_sum_DataFrame_139751205013264]], outputs={ComputedExpression[pre_combine_sum_DataFrame_139751205013264], PlaceholderExpression[placeholder_DataFrame_139753246412688]}]
INFO root:transforms.py:182 Computing dataframe stage <ComputeStage(PTransform) label=[[ComputedExpression[post_combine_sum_DataFrame_139752624686416]]:139751202269520]> for Stage[inputs={ComputedExpression[pre_combine_sum_DataFrame_139751205013264]}, partitioning=Index, ops=[ComputedExpression[post_combine_sum_DataFrame_139752624686416]], outputs={ComputedExpression[post_combine_sum_DataFrame_139752624686416]}]
INFO apache_beam.io.fileio:fileio.py:555 Added temporary directory /tmp/.temp9d13f778-ccb1-4b4b-8df8-07bab89f4a01
WARNING root:environments.py:374 Make sure that locally built Python SDK docker image has Python 3.7 interpreter.
INFO root:environments.py:380 Default Python SDK image for environment is apache/beam_python3.7_sdk:2.39.0.dev
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function annotate_downstream_side_inputs at 0x7f1ac299f290> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function fix_side_input_pcoll_coders at 0x7f1ac299f3b0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function pack_combiners at 0x7f1ac299f8c0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function lift_combiners at 0x7f1ac299f950> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function expand_sdf at 0x7f1ac299fb00> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function expand_gbk at 0x7f1ac299fb90> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sink_flattens at 0x7f1ac299fcb0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function greedily_fuse at 0x7f1ac299fd40> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function read_to_impulse at 0x7f1ac299fdd0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function impulse_to_input at 0x7f1ac299fe60> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function sort_stages at 0x7f1ac29a00e0> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function add_impulse_to_dangling_transforms at 0x7f1ac29a0200> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function setup_timer_mapping at 0x7f1ac29a0050> ====================
INFO apache_beam.runners.portability.fn_api_runner.translations:translations.py:714 ==================== <function populate_data_channel_coders at 0x7f1ac29a0170> ====================
INFO apache_beam.runners.worker.statecache:statecache.py:172 Creating state cache with size 100
INFO apache_beam.runners.portability.fn_api_runner.worker_handlers:worker_handlers.py:894 Created Worker handler <apache_beam.runners.portability.fn_api_runner.worker_handlers.EmbeddedWorkerHandler object at 0x7f1a5ccbd750> for environment ref_Environment_default_environment_1 (beam:env:embedded_python:v1, b'')
INFO apache_beam.io.fileio:fileio.py:642 Moving temporary file /tmp/.temp9d13f778-ccb1-4b4b-8df8-07bab89f4a01/3126093549790666621_f71c9329-1ccc-4bff-910e-cb8fed433092 to dir: /tmp as tmp02xpcfka.result-00000-of-00001. Res: FileResult(file_name='/tmp/.temp9d13f778-ccb1-4b4b-8df8-07bab89f4a01/3126093549790666621_f71c9329-1ccc-4bff-910e-cb8fed433092', shard_index=-1, total_shards=0, window=GlobalWindow, pane=None, destination=None)
INFO apache_beam.io.fileio:fileio.py:665 Checking orphaned temporary files for destination None and window GlobalWindow
INFO apache_beam.io.fileio:fileio.py:678 Some files may be left orphaned in the temporary folder: []
PASSED [100%]
=================================== FAILURES ===================================
_______________ CodersIT.test_coders_output_files_on_small_input _______________
self = <apache_beam.examples.cookbook.coders_it_test.CodersIT testMethod=test_coders_output_files_on_small_input>
@pytest.mark.no_xdist
@pytest.mark.examples_postcommit
def test_coders_output_files_on_small_input(self):
test_pipeline = TestPipeline(is_integration_test=True)
# Setup the files with expected content.
OUTPUT_FILE_DIR = \
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output'
output = '/'.join([OUTPUT_FILE_DIR, str(uuid.uuid4()), 'result'])
INPUT_FILE_DIR = \
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/input'
input = '/'.join([INPUT_FILE_DIR, str(uuid.uuid4()), 'input.txt'])
create_content_input_file(
input, '\n'.join(map(json.dumps, self.SAMPLE_RECORDS)))
extra_opts = {'input': input, 'output': output}
> coders.run(test_pipeline.get_full_options_as_args(**extra_opts))
apache_beam/examples/cookbook/coders_it_test.py:93:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
apache_beam/examples/cookbook/coders.py:91: in run
| 'write' >> WriteToText(known_args.output, coder=JsonCoder()))
apache_beam/transforms/ptransform.py:1092: in __ror__
return self.transform.__ror__(pvalueish, self.label)
apache_beam/transforms/ptransform.py:614: in __ror__
result = p.apply(self, pvalueish, label)
apache_beam/pipeline.py:662: in apply
return self.apply(transform, pvalueish)
apache_beam/pipeline.py:708: in apply
pvalueish_result = self.runner.apply(transform, pvalueish, self._options)
apache_beam/runners/runner.py:185: in apply
return m(transform, input, options)
apache_beam/runners/runner.py:215: in apply_PTransform
return transform.expand(input)
apache_beam/io/textio.py:690: in expand
self._source.output_type_hint())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <apache_beam.io.textio._TextSource object at 0x7f1ad67b6290>
def output_type_hint(self):
try:
> return self._coder.to_type_hint()
E AttributeError: 'JsonCoder' object has no attribute 'to_type_hint'
apache_beam/io/textio.py:409: AttributeError
------------------------------ Captured log call -------------------------------
INFO root:coders_it_test.py:48 Creating file: gs://temp-storage-for-end-to-end-tests/py-it-cloud/input/bab72757-977f-48dc-bf2c-d6b6ccc7ce04/input.txt
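Note on the failure: both the py39 and py37 suites fail at the same point. The coders cookbook example constructs WriteToText with its own JsonCoder, and textio then calls to_type_hint() on that coder, which the example class does not define. Below is a minimal sketch of a JSON-lines coder that would satisfy that call; the class body and the choice of Any as the hint are illustrative assumptions, not necessarily the upstream fix.

import json
from typing import Any


class JsonCoder(object):
  """Illustrative JSON-lines coder for the cookbook example (sketch only)."""

  def encode(self, x):
    # Serialize a Python object to a UTF-8 JSON byte string, one record per line.
    return json.dumps(x).encode('utf-8')

  def decode(self, x):
    # Parse a UTF-8 JSON byte string back into a Python object.
    return json.loads(x)

  def to_type_hint(self):
    # textio queries this to infer the element type; Any is the loosest safe
    # answer for arbitrary JSON values (assumption, not the actual upstream change).
    return Any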
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54: DeprecationWarning: invalid escape sequence \c
  self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: disable=anomalous-backslash-in-string
apache_beam/io/filesystems_test.py:62
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62: DeprecationWarning: invalid escape sequence \d
  self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # pylint: disable=anomalous-backslash-in-string
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
  dataset_ref = client.dataset(unique_dataset_name, project=project)
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2154: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  is_streaming_pipeline = p.options.view_as(StandardOptions).streaming
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2160: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2454: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2456: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2487: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  | _PassThroughThenCleanup(files_to_remove_pcoll))
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2150: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100: PendingDeprecationWarning: Client.dataset is deprecated and will be removed in a future version. Use a string like 'my_project.my_dataset' or a cloud.google.bigquery.DatasetReference object, instead.
  table_ref = client.dataset(dataset_id).table(table_id)
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/dataframe/io.py>:632: FutureWarning: WriteToFiles is experimental.
  sink=lambda _: _WriteToPandasFileSink(
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/fileio.py>:550: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  p.options.view_as(GoogleCloudOptions).temp_location or
-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/pytest_postCommitExamples-spark-py37.xml> -
= 1 failed, 20 passed, 3 skipped, 5354 deselected, 33 warnings in 261.18 seconds =
> Task :sdks:python:test-suites:portable:py37:sparkExamples FAILED
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 271
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py39:sparkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Script '<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/test-suites/portable/common.gradle>' line: 271
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py37:sparkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 8.0.
You can use '--warning-mode all' to show the individual deprecation warnings and determine if they come from your own scripts or plugins.
See https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 5m 42s
80 actionable tasks: 51 executed, 27 from cache, 2 up-to-date
Publishing build scan...
https://gradle.com/s/44xlxgtfmcmlk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]