See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/287/display/redirect>

Changes:


------------------------------------------
[...truncated 3.36 MB...]
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/apitools/base/py/base_api.py";,>
 line 737, in ProcessHttpResponse'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
self.__ProcessHttpResponse(method_config, http_response, request))'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/apitools/base/py/base_api.py";,>
 line 604, in __ProcessHttpResponse'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
http_response, method_config=method_config, request=request)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 
b"RuntimeError: apitools.base.py.exceptions.HttpForbiddenError: HttpError 
accessing 
<https://bigquery.googleapis.com/bigquery/v2/projects/apache-beam-testing/jobs?alt=json>:
 response: <{'vary': 'Origin, X-Origin, Referer', 'content-type': 
'application/json; charset=UTF-8', 'date': 'Wed, 20 Apr 2022 07:32:53 GMT', 
'server': 'ESF', 'cache-control': 'private', 'x-xss-protection': '0', 
'x-frame-options': 'SAMEORIGIN', 'x-content-type-options': 'nosniff', 
'transfer-encoding': 'chunked', 'status': '403', 'content-length': '528', 
'-content-encoding': 'gzip'}>, content <{"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  
"error": {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
"code": 403,'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
"message": "Access Denied: Table bigquery-samples:airline_ontime_data.flights: 
User does not have permission to query table 
bigquery-samples:airline_ontime_data.flights.",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
"errors": ['
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      {'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        
"message": "Access Denied: Table bigquery-samples:airline_ontime_data.flights: 
User does not have permission to query table 
bigquery-samples:airline_ontime_data.flights.",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        
"domain": "global",'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'        
"reason": "accessDenied"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    ],'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
"status": "PERMISSION_DENIED"'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'}'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b"> 
[while running 'read 
table/Read/SDFBoundedSourceReader/ParDo(SDFBoundedSourceDoFn)/SplitAndSize0']"
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b''
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
java.util.concurrent.CompletableFuture.reportGet(CompletableFuture.java:357)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
java.util.concurrent.CompletableFuture.get(CompletableFuture.java:1908)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.beam.sdk.util.MoreFutures.get(MoreFutures.java:60)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.beam.runners.fnexecution.control.SdkHarnessClient$BundleProcessor$ActiveBundle.close(SdkHarnessClient.java:504)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.beam.runners.fnexecution.control.DefaultJobBundleFactory$SimpleStageBundleFactory$1.close(DefaultJobBundleFactory.java:555)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.$closeResource(SparkExecutableStageFunction.java:210)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.processElements(SparkExecutableStageFunction.java:234)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:150)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.beam.runners.spark.translation.SparkExecutableStageFunction.call(SparkExecutableStageFunction.java:81)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.api.java.JavaRDDLike$$anonfun$fn$4$1.apply(JavaRDDLike.scala:153)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:823)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.iterator(RDD.scala:310)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.iterator(RDD.scala:310)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:359)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:357)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1165)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:357)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.iterator(RDD.scala:308)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.iterator(RDD.scala:310)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:346)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.rdd.RDD.iterator(RDD.scala:310)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.scheduler.Task.run(Task.scala:123)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:411)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\tat 
org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:417)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'\t... 3 
more'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'Caused 
by: java.lang.RuntimeException: Error received from SDK harness for instruction 
1: Traceback (most recent call last):'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/common.py";,>
 line 1198, in process'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
return self.do_fn_invoker.invoke_process(windowed_value)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/common.py";,>
 line 537, in invoke_process'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
windowed_value, self.process_method(windowed_value.value))'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/common.py";,>
 line 1334, in process_outputs'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    for 
result in results:'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/bundle_processor.py";,>
 line 1447, in process'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
element, restriction):'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/transforms/core.py";,>
 line 328, in split_and_size'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    for 
part in self.split(element, restriction):'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/iobase.py";,>
 line 1627, in split'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
estimated_size = restriction.source().estimate_size()'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py";,>
 line 772, in estimate_size'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
self.bigquery_job_labels))'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/utils/retry.py";,>
 line 253, in wrapper'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
return fun(*args, **kwargs)'
INFO     apache_beam.io.gcp.gcsio:gcsio.py:575 Finished listing 0 files in 
0.034230709075927734 seconds.
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_tools.py";,>
 line 605, in _start_query_job'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
return self._start_job(request)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_tools.py";,>
 line 551, in _start_job'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
response = self.client.jobs.Insert(request, upload=upload)'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/internal/clients/bigquery/bigquery_v2_client.py";,>
 line 345, in Insert'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    
upload=upload, upload_config=upload_config)'
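
The failure above comes from the BigQuery source's size estimation: the pipeline tries to start a query job against the public table bigquery-samples:airline_ontime_data.flights under the apache-beam-testing project, and the job is rejected with 403 PERMISSION_DENIED because the credentials in use are not allowed to query that table. A minimal sketch of the kind of read that exercises this code path (pipeline options, project, and bucket names are placeholders, not taken from the failing test):

    # Hypothetical sketch only; assumes credentials with permission to query the source table.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    opts = PipelineOptions(
        project='my-gcp-project',            # placeholder project
        temp_location='gs://my-bucket/tmp')  # placeholder bucket for BigQuery temp data

    with beam.Pipeline(options=opts) as p:
        _ = (
            p
            | 'read table' >> beam.io.ReadFromBigQuery(
                query='SELECT * FROM `bigquery-samples.airline_ontime_data.flights`',
                use_standard_sql=True)
            | beam.Map(print))
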
---------------------------- Captured log teardown -----------------------------
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'      }'
INFO     apache_beam.utils.subprocess_server:subprocess_server.py:125 b'    ],'
=============================== warnings summary ===============================
apache_beam/io/filesystems_test.py:54
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:54:
 DeprecationWarning: invalid escape sequence \c
    self.assertIsNone(FileSystems.get_scheme('c:\\abc\cdf'))  # pylint: 
disable=anomalous-backslash-in-string

apache_beam/io/filesystems_test.py:62
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/filesystems_test.py>:62:
 DeprecationWarning: invalid escape sequence \d
    self.assertTrue(isinstance(FileSystems.get_filesystem('c:\\abc\def'),  # 
pylint: disable=anomalous-backslash-in-string
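
The two DeprecationWarnings above are raised because the test strings contain backslash sequences that are not valid escape sequences; writing them as raw strings (or doubling the backslashes) yields the same characters without the warning. A small illustration (not the test file itself):

    # Illustration of the raw-string fix for the invalid '\c' / '\d' escape warnings.
    path = 'c:\\abc\cdf'    # '\c' is not a valid escape -> DeprecationWarning
    path = r'c:\abc\cdf'    # raw string: same value, no warning
    path = 'c:\\abc\\cdf'   # doubled backslashes: also fine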

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:63:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    dataset_ref = client.dataset(unique_dataset_name, project=project)
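
The PendingDeprecationWarning above already names the replacement: pass a fully qualified "project.dataset" string or a DatasetReference instead of calling Client.dataset(). A sketch of the suggested forms with placeholder names (not the test helper itself):

    # Sketch of the recommended replacement for Client.dataset(); names are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client(project='my-project')

    # Deprecated:
    #   dataset_ref = client.dataset('my_dataset', project='my-project')

    # Recommended:
    dataset_ref = bigquery.DatasetReference('my-project', 'my_dataset')
    dataset = client.get_dataset('my-project.my_dataset')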

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2143:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    is_streaming_pipeline = p.options.view_as(StandardOptions).streaming

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2149:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    experiments = p.options.view_as(DebugOptions).experiments or []

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1128:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    temp_location = p.options.view_as(GoogleCloudOptions).temp_location

apache_beam/examples/complete/game/hourly_team_score_it_test.py::HourlyTeamScoreIT::test_hourly_team_score_output_checksum_on_small_input
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:1130:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).job_name or 'AUTOMATIC_JOB_NAME')

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2443:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    temp_location = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2445:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    job_name = pcoll.pipeline.options.view_as(GoogleCloudOptions).job_name

apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_side_input_it_test.py::BigQuerySideInputIT::test_bigquery_side_input_it
apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2476:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    | _PassThroughThenCleanup(files_to_remove_pcoll))

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:2139:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    self.table_reference.projectId = pcoll.pipeline.options.view_as(

apache_beam/examples/cookbook/bigquery_tornadoes_it_test.py::BigqueryTornadoesIT::test_bigquery_tornadoes_it
apache_beam/examples/cookbook/filters_test.py::FiltersTest::test_filters_output_bigquery_matcher
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>:100:
 PendingDeprecationWarning: Client.dataset is deprecated and will be removed in 
a future version. Use a string like 'my_project.my_dataset' or a 
cloud.google.bigquery.DatasetReference object, instead.
    table_ref = client.dataset(dataset_id).table(table_id)

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/examples/dataframe/flight_delays.py>:47:
 FutureWarning: Dropping of nuisance columns in DataFrame reductions (with 
'numeric_only=None') is deprecated; in a future version this will raise 
TypeError.  Select only valid columns before calling the reduction.
    return airline_df[at_top_airports].mean()
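
The FutureWarning above is pandas announcing that reductions will stop silently dropping non-numeric ("nuisance") columns; selecting the numeric columns first, or passing numeric_only=True, avoids it. A small illustration with made-up data (the example's real DataFrame is read from BigQuery):

    # Illustration of avoiding the nuisance-column FutureWarning; the data is made up.
    import pandas as pd

    df = pd.DataFrame({'airline': ['AA', 'DL'],
                       'departure_delay': [5.0, 12.0],
                       'arrival_delay': [3.0, 9.0]})

    means = df.mean(numeric_only=True)                       # reduce numeric columns only
    means = df[['departure_delay', 'arrival_delay']].mean()  # or select columns explicitly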

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/dataframe/io.py>:632:
 FutureWarning: WriteToFiles is experimental.
    sink=lambda _: _WriteToPandasFileSink(

apache_beam/examples/dataframe/flight_delays_it_test.py::FlightDelaysTest::test_flight_delays
apache_beam/examples/dataframe/wordcount_test.py::WordCountTest::test_basics
  
<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/io/fileio.py>:550:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
    p.options.view_as(GoogleCloudOptions).temp_location or

-- Docs: https://docs.pytest.org/en/latest/warnings.html
- generated xml file: <https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/pytest_postCommitExamples-spark-py37.xml> -
= 2 failed, 20 passed, 1 skipped, 5295 deselected, 39 warnings in 301.98 seconds =

Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python3.7/threading.py", line 926, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.7/threading.py", line 870, in run
    self._target(*self._args, **self._kwargs)
  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py";,>
 line 651, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py";,>
 line 634, in _read_inputs
    for elements in elements_iterator:
  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py";,>
 line 426, in __next__
    return self._next()
  File 
"<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/build/gradleenv/2022703441/lib/python3.7/site-packages/grpc/_channel.py";,>
 line 826, in _next
    raise self
grpc._channel._MultiThreadedRendezvous: <_MultiThreadedRendezvous of RPC that 
terminated with:
        status = StatusCode.CANCELLED
        details = "Multiplexer hanging up"
        debug_error_string = 
"{"created":"@1650439921.844874000","description":"Error received from peer 
ipv6:[::1]:42459","file":"src/core/lib/surface/call.cc","file_line":903,"grpc_message":"Multiplexer
 hanging up","grpc_status":1}"
>


> Task :sdks:python:test-suites:portable:py37:sparkExamples FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Script 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/test-suites/portable/common.gradle>'
 line: 271

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py39:sparkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Script 
'<https://ci-beam.apache.org/job/beam_PostCommit_Python_Examples_Spark/ws/src/sdks/python/test-suites/portable/common.gradle>'
 line: 271

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py37:sparkExamples'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
> Run with --stacktrace option to get the stack trace.
> Run with --info or --debug option to get more log output.
> Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 8.0.

You can use '--warning-mode all' to show the individual deprecation warnings 
and determine if they come from your own scripts or plugins.

See 
https://docs.gradle.org/7.4/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 6m 27s
79 actionable tasks: 50 executed, 27 from cache, 2 up-to-date

Publishing build scan...
https://gradle.com/s/xdmmg6cnw4hl2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
