See <https://builds.apache.org/job/beam_PostCommit_Python2/1109/display/redirect>
Changes:

------------------------------------------
[...truncated 1.79 MB...]
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/worker/sdk_worker.py", line 111, in run
    for work_request in control_stub.Control(get_responses()):
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 395, in next
    return self._next()
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/1866363813/local/lib/python2.7/site-packages/grpc/_channel.py", line 561, in _next
    raise self
_Rendezvous: <_Rendezvous of RPC that terminated with:
	status = StatusCode.UNAVAILABLE
	details = "Socket closed"
	debug_error_string = "{"created":"@1575137434.871119146","description":"Error received from peer ipv4:127.0.0.1:40797","file":"src/core/lib/surface/call.cc","file_line":1055,"grpc_message":"Socket closed","grpc_status":14}"
>

> Task :sdks:python:test-suites:portable:py2:postCommitPy2

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
INFO:apache_beam.runners.portability.fn_api_runner:Running ((WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_15)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_16))
INFO:__main__:Writing 100000 documents to mongodb finished in 52.571 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1575137386
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py:85: FutureWarning: ReadFromMongoDB is experimental.
  | 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7fa37bf5f848> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7fa37bf5f938> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7fa37bf5f9b0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7fa37bf5fa28> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7fa37bf5faa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7fa37bf5fb90> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7fa37bf5fc08> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7fa37bf5fc80> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7fa37bf5fcf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7fa37bf5fe60> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7fa37bf5fed8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7fa37bf5ff50> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fa370eed310> for environment urn: "beam:env:embedded_python:v1"
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/1)))+(assert_that/Group/Flatten/Write/1)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/0))))))+(assert_that/Group/Flatten/Write/0)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:__main__:Read 100000 documents from mongodb finished in 21.354 seconds
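The mongodbio IT above exercises the experimental ReadFromMongoDB transform; the `| 'Map' >> beam.Map(lambda doc: doc['number'])` fragment in the FutureWarning is its projection step. As a rough, hedged sketch only (the connection URI and the final assertion are placeholders, not the test's actual code; the db/collection names are the ones printed in the log), the read side of that pipeline has this shape:

```python
# Hedged sketch of a ReadFromMongoDB pipeline of the shape the IT runs.
import apache_beam as beam
from apache_beam.io.mongodbio import ReadFromMongoDB

with beam.Pipeline() as p:
    numbers = (
        p
        | 'ReadFromMongoDB' >> ReadFromMongoDB(
            uri='mongodb://localhost:27017',        # placeholder connection string
            db='beam_mongodbio_it_db',              # database name from the log
            coll='integration_test_1575137386')     # collection name from the log
        | 'Map' >> beam.Map(lambda doc: doc['number']))
    # The IT then asserts on the projected values (e.g. with assert_that/equal_to).
```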
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py:651: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py:1220: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py:726: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: BEAM-8842: Disabled due to reliance on old retry behavior.
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
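The BeamDeprecationWarning repeated above ("BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.") is raised from inside dataflow_runner.py rather than from the tests themselves; in user code the recommended replacement is the WriteToBigQuery transform. A minimal, illustrative sketch of that swap (the table spec and schema are placeholders, not values from this run):

```python
# Illustrative only: migrate from the deprecated BigQuerySink to WriteToBigQuery.
# 'my-project:my_dataset.my_table' and the schema string are placeholders.
import apache_beam as beam

with beam.Pipeline() as p:
    rows = p | 'MakeRows' >> beam.Create([{'name': 'a', 'score': 1}])

    # Deprecated since Beam 2.11.0:
    #   rows | beam.io.Write(beam.io.BigQuerySink(
    #       'my-project:my_dataset.my_table', schema='name:STRING,score:INTEGER'))

    # Recommended replacement:
    rows | 'WriteToBQ' >> beam.io.WriteToBigQuery(
        'my-project:my_dataset.my_table',
        schema='name:STRING,score:INTEGER',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)
```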
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py:296: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py:307: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py:307: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
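The other warning that recurs in this run ("options is deprecated since First stable release. References to <pipeline>.options will not be supported") is raised from the SDK and test files whose paths appear above (bigquery.py, bigquery_file_loads.py, bigquery_test.py). For user code, the pattern the message points toward is to keep hold of the PipelineOptions object and query it directly rather than reading it back through pipeline.options. A hedged sketch, with a placeholder flag value:

```python
# Illustrative only: read settings from your own PipelineOptions object
# instead of going through <pipeline>.options (the deprecated access path).
import apache_beam as beam
from apache_beam.options.pipeline_options import (
    GoogleCloudOptions, PipelineOptions, StandardOptions)

options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])  # placeholder flag

# Deprecated access path that triggers the warning:
#   p = beam.Pipeline(options=options)
#   temp_location = p.options.view_as(GoogleCloudOptions).temp_location

# Preferred: query the options object that was constructed explicitly.
temp_location = options.view_as(GoogleCloudOptions).temp_location
streaming = options.view_as(StandardOptions).streaming

p = beam.Pipeline(options=options)
```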
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_53-2043035810453986453?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_09_03-350776916843078518?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_17_52-8999207406051685529?project=apache-beam-testing
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ERROR

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py", line 812, in run
    test(orig)
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py", line 45, in __call__
    return self.run(*arg, **kwarg)
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py", line 133, in run
    self.runTest(result)
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py", line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py", line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py", line 436, in __exit__
    self.run().wait_until_finish()
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py", line 416, in run
    self._options).run(False)
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 74, in run_pipeline
    self.wait_until_in_state(PipelineState.CANCELLED)
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py", line 94, in wait_until_in_state
    job_state = self.result.state
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 1404, in state
    self._update_job()
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py", line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py", line 209, in wrapper
    return fun(*args, **kwargs)
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py", line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py", line 661, in Get
    config, request, global_params=global_params)
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 717, in _RunMethod
    http = self.__client.http
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py", line 324, in http
    @property
  File "https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py", line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 45 tests in 5466.492s

FAILED (SKIP=5, errors=1)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_02_02-10625095481526485865?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_16_25-17212469883613234769?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_24_08-9159443122461388847?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_30_43-5856499177835473125?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_37_33-14637036451842972407?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_44_56-8085010551727131545?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_51_46-4765860714345659598?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_55-17953381351743751734?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_19_14-2835426087498011637?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_27_17-1743468660108317247?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_34_20-4249396903678764432?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_41_15-745332379698211949?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_59-16073022993848135517?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_14_09-7396233147966540101?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_20_38-7475828128619221259?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_27_06-12515541307774503160?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_33_33-4286484303844675999?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_55-14920928390767598351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_21_11-6883838598776258803?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_28_23-10756316836391834194?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_35_19-1179468522304054868?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_56-15908369351712971049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_09_31-3764771809782010850?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_17_34-10840824016121511468?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_24_44-7678920390995232696?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_31_40-9407024375479433680?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_58-14958847216808086248?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_09_52-2762275398982537?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_17_12-1177038203876575065?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_25_05-10052277054454569747?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_31_44-8231559464571296074?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_39_39-8935015611729525066?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_01_55-15980942358089003296?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_10_35-7848820637959990487?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_19_51-16902147212322759700?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-11-30_10_37_36-5900831866044888758?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file 'https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 17s

120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/cieglzpvgf3iu

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
