See 
<https://builds.apache.org/job/beam_PostCommit_Python2/1192/display/redirect?page=changes>

Changes:

[aaltay] [BEAM-8811] Upgrade Beam pipeline diagrams in docs (#10200)


------------------------------------------
[...truncated 1.43 MB...]
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:84:
 UserWarning: You are using Apache Beam with Python 2. New releases of Apache 
Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
No handlers could be found for logger 
"apache_beam.runners.portability.fn_api_runner_transforms"
19/12/11 18:04:26 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: 
ArtifactStagingService started on localhost:44113
19/12/11 18:04:26 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: Java 
ExpansionService started on localhost:46859
19/12/11 18:04:26 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobServerDriver: JobService 
started on localhost:57337
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--parallelism=2', '--shutdown_sources_on_final_watermark']
19/12/11 18:04:28 INFO org.apache.beam.runners.spark.SparkJobInvoker: Invoking 
job BeamApp-jenkins-1211180428-adb14121_093d29fb-cae8-4d37-ae01-babdf615c028
19/12/11 18:04:29 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.JobInvocation: Starting job 
invocation 
BeamApp-jenkins-1211180428-adb14121_093d29fb-cae8-4d37-ae01-babdf615c028
INFO:root:Waiting until the pipeline has finished because the environment 
"LOOPBACK" has started a component necessary for the execution.
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STOPPED
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
STARTING
INFO:apache_beam.runners.portability.portable_runner:Job state changed to 
RUNNING
19/12/11 18:04:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
PipelineOptions.filesToStage was not specified. Defaulting to files from the 
classpath
19/12/11 18:04:29 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Will 
stage 1 files. (Enable logging at DEBUG level to see which files will be 
staged.)
19/12/11 18:04:29 INFO 
org.apache.beam.runners.spark.translation.SparkContextFactory: Creating a brand 
new Spark Context.
19/12/11 18:04:30 WARN org.apache.hadoop.util.NativeCodeLoader: Unable to load 
native-hadoop library for your platform... using builtin-java classes where 
applicable
19/12/11 18:04:31 INFO org.apache.beam.runners.spark.SparkPipelineRunner: 
Running job 
BeamApp-jenkins-1211180428-adb14121_093d29fb-cae8-4d37-ae01-babdf615c028 on 
Spark master local[4]
19/12/11 18:04:31 INFO 
org.apache.beam.runners.spark.aggregators.AggregatorsAccumulator: Instantiated 
aggregators accumulator: 
19/12/11 18:04:31 INFO 
org.apache.beam.runners.spark.metrics.MetricsAccumulator: Instantiated metrics 
accumulator: MetricQueryResults()
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 0
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure control channel 
for localhost:39773.
INFO:apache_beam.runners.worker.sdk_worker:Control channel established.
INFO:apache_beam.runners.worker.sdk_worker:Initializing SDKHarness with 
unbounded number of workers.
19/12/11 18:04:34 INFO 
org.apache.beam.runners.fnexecution.control.FnApiControlClientPoolService: Beam 
Fn Control client connected with id 1-1
INFO:apache_beam.runners.worker.sdk_worker:Creating insecure state channel for 
localhost:40609.
INFO:apache_beam.runners.worker.sdk_worker:State channel established.
INFO:apache_beam.runners.worker.data_plane:Creating client data channel for 
localhost:36499
19/12/11 18:04:34 INFO 
org.apache.beam.runners.fnexecution.data.GrpcDataService: Beam Fn Data client 
connected.
19/12/11 18:04:34 WARN 
org.apache.beam.runners.spark.translation.GroupNonMergingWindowsFunctions: 
Either coder LengthPrefixCoder(ByteArrayCoder) or GlobalWindow$Coder is not 
consistent with equals. That might cause issues on some runners.

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
INFO:apache_beam.runners.portability.fn_api_runner:Running 
(WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_27)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_28)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_29)))
INFO:__main__:Writing 100000 documents to mongodb finished in 73.316 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the 
default runner: DirectRunner.
INFO:__main__:Reading from mongodb 
beam_mongodbio_it_db:integration_test_1576087438
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:85:
 FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function annotate_downstream_side_inputs at 0x7fab4ecc2e60> 
====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function fix_side_input_pcoll_coders at 0x7fab4ecc2f50> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function lift_combiners at 0x7fab4ecec050> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function expand_sdf at 0x7fab4ecec0c8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function expand_gbk at 0x7fab4ecec140> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function sink_flattens at 0x7fab4ecec230> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function greedily_fuse at 0x7fab4ecec2a8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function read_to_impulse at 0x7fab4ecec320> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function impulse_to_input at 0x7fab4ecec398> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function inject_timer_pcollections at 0x7fab4ecec500> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function sort_stages at 0x7fab4ecec578> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:====================
 <function window_pcollection_coders at 0x7fab4ecec5f0> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 
0x7fab468205d0> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running 
((ref_AppliedPTransform_assert_that/Create/Impulse_10)+((ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2530>)_11)+((ref_AppliedPTransform_assert_that/Create/Map(decode)_13)+(ref_AppliedPTransform_assert_that/Group/pair_with_0_17))))+((assert_that/Group/Flatten/Transcode/0)+(assert_that/Group/Flatten/Write/0))
INFO:apache_beam.runners.portability.fn_api_runner:Running 
((ref_AppliedPTransform_ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/Impulse_5)+(ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+((ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction)+(ref_PCollection_PCollection_1_split/Write))
INFO:apache_beam.runners.portability.fn_api_runner:Running 
(((ref_PCollection_PCollection_1_split/Read)+((ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+((ref_AppliedPTransform_Map_7)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_14)+(ref_AppliedPTransform_assert_that/ToVoidKey_15)))))+(ref_AppliedPTransform_assert_that/Group/pair_with_1_18))+((assert_that/Group/Flatten/Transcode/1)+(assert_that/Group/Flatten/Write/1))

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
WARNING:apache_beam.io.filebasedsink:Deleting 4 existing files in target path 
matching: -*-of-%(num_shards)05d
19/12/11 18:05:06 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
BeamApp-jenkins-1211180428-adb14121_093d29fb-cae8-4d37-ae01-babdf615c028: 
Pipeline translated successfully. Computing outputs

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
INFO:apache_beam.runners.portability.fn_api_runner:Running 
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running 
(assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_24)+((ref_AppliedPTransform_assert_that/Unkey_25)+(ref_AppliedPTransform_assert_that/Match_26)))
INFO:__main__:Read 100000 documents from mongodb finished in 22.242 seconds

> Task :sdks:python:test-suites:portable:py2:portableWordCountSparkRunnerBatch
INFO:apache_beam.io.filebasedsink:Starting finalize_write threads with 
num_shards: 4 (skipped: 0), batches: 4, num_threads: 4
INFO:apache_beam.io.filebasedsink:Renamed 4 shards in 0.12 seconds.
19/12/11 18:05:36 INFO org.apache.beam.runners.spark.SparkPipelineRunner: Job 
BeamApp-jenkins-1211180428-adb14121_093d29fb-cae8-4d37-ae01-babdf615c028 
finished.
19/12/11 18:05:36 WARN 
org.apache.beam.runners.spark.SparkPipelineResult$BatchMode: Collecting 
monitoring infos is not implemented yet in Spark portable runner.
19/12/11 18:05:36 INFO 
org.apache.beam.runners.fnexecution.artifact.AbstractArtifactRetrievalService: 
Manifest at 
/tmp/beam-tempF1TcBs/artifactsQydgGt/job_a9f1f325-3773-40fe-8a59-bd3cc3d01800/MANIFEST
 has 1 artifact locations
19/12/11 18:05:36 INFO 
org.apache.beam.runners.fnexecution.artifact.BeamFileSystemArtifactStagingService:
 Removed dir 
/tmp/beam-tempF1TcBs/artifactsQydgGt/job_a9f1f325-3773-40fe-8a59-bd3cc3d01800/
INFO:apache_beam.runners.portability.portable_runner:Job state changed to DONE
19/12/11 18:05:36 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Getting 
job metrics for 
BeamApp-jenkins-1211180428-adb14121_093d29fb-cae8-4d37-ae01-babdf615c028
19/12/11 18:05:36 INFO 
org.apache.beam.runners.fnexecution.jobsubmission.InMemoryJobService: Finished 
getting job metrics for 
BeamApp-jenkins-1211180428-adb14121_093d29fb-cae8-4d37-ae01-babdf615c028
ERROR:apache_beam.runners.worker.data_plane:Failed to read inputs in the data 
plane.
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "Socket closed"
        debug_error_string = "{"created":"@1576087536.819925806","description":"Error received from peer ipv4:127.0.0.1:36499","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>
Exception in thread read_grpc_client_inputs:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 286, in <lambda>
    target=lambda: self._read_inputs(elements_iterator),
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/data_plane.py>", line 272, in _read_inputs
    for elements in elements_iterator:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "Socket closed"
        debug_error_string = "{"created":"@1576087536.819925806","description":"Error received from peer ipv4:127.0.0.1:36499","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread read_state:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 532, in pull_responses
    for response in responses:
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "Socket closed"
        debug_error_string = "{"created":"@1576087536.819985493","description":"Error received from peer ipv4:127.0.0.1:40609","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

Exception in thread run_worker_1-1:
Traceback (most recent call last):
  File "/usr/lib/python2.7/threading.py", line 801, in __bootstrap_inner
    self.run()
  File "/usr/lib/python2.7/threading.py", line 754, in run
    self.__target(*self.__args, **self.__kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py>", line 112, in run
    for work_request in control_stub.Control(get_responses()):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 395, in next
    return self._next()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py>", line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
        status = StatusCode.UNAVAILABLE
        details = "Socket closed"
        debug_error_string = "{"created":"@1576087536.820437907","description":"Error received from peer ipv4:127.0.0.1:39773","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>


> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it 
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) 
... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:739:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it 
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it 
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:652:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_user_score_it 
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1220:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it 
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
 ... ok
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... 
ok
test_copy_batch 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) 
... ok
test_copy_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:296:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:307:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_bqfl_streaming 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: 
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: 
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... 
ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_streaming_data_only 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql_kms_key_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... 
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_job_python_from_python_it 
(apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_metrics_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3356.298s

OK (SKIP=4)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
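
For reference, a hypothetical local reproduction of the failing task using the flags suggested above (this assumes a checkout of the Beam repository and its standard Gradle wrapper; the exact invocation is not part of this log):

  # Re-run only the failing task with a full stack trace and verbose output
  ./gradlew :sdks:python:test-suites:direct:py2:directRunnerIT --stacktrace --info

  # Optionally request a build scan for more insight
  ./gradlew :sdks:python:test-suites:direct:py2:directRunnerIT --scan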

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 56m 59s
120 actionable tasks: 95 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/bkjzhvs6way4o

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
