See <https://builds.apache.org/job/beam_PostCommit_Python2/1811/display/redirect?page=changes>

Changes:

[iemejia] [BEAM-9329] Support request of schemas by version on KafkaIO + CSR


------------------------------------------
[...truncated 2.78 MB...]
Requirement already satisfied: scandir; python_version < "3.5" in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from pathlib2>=2.2.0; python_version < "3.6"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.10.0)
Requirement already satisfied: apipkg>=1.4 in <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/45127155/lib/python2.7/site-packages> (from execnet>=1.1->pytest-xdist<2,>=1.29.0->apache-beam==2.20.0.dev0) (1.5)
Installing collected packages: apache-beam
  Attempting uninstall: apache-beam
    Found existing installation: apache-beam 2.20.0.dev0
    Uninstalling apache-beam-2.20.0.dev0:
      Successfully uninstalled apache-beam-2.20.0.dev0
  Running setup.py develop for apache-beam
Successfully installed apache-beam
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/__init__.py>:82: UserWarning: You are using Apache Beam with Python 2. New releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Writing 100000 documents to mongodb
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:82: FutureWarning: WriteToMongoDB is experimental.
  known_args.batch_size))
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f99880f00c8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f99880f01b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f99880f0230> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f99880f02a8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f99880f0320> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f99880f0410> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f99880f0488> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f99880f0500> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f99880f0578> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f99880f06e0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f99880f0758> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f99880f07d0> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f998810fcd0> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_Create/Impulse_3)+((ref_AppliedPTransform_Create/FlatMap(<lambda at core.py:2637>)_4)+((ref_AppliedPTransform_Create/Map(decode)_6)+((ref_AppliedPTransform_Create documents_7)+((ref_AppliedPTransform_WriteToMongoDB/ParDo(_GenerateObjectIdFn)_9)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/AddRandomKeys_11)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_13)+(WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write)))))))
INFO:apache_beam.runners.portability.fn_api_runner:Running (WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_18)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_19)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_20)))

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok

> Task :sdks:python:test-suites:direct:py2:mongodbioIT
INFO:__main__:Writing 100000 documents to mongodb finished in 36.216 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the default runner: DirectRunner.
INFO:__main__:Reading from mongodb beam_mongodbio_it_db:integration_test_1582714826
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:101: FutureWarning: ReadFromMongoDB is experimental.
  | 'Combine' >> beam.CombineGlobally(sum))
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f99880f00c8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f99880f01b8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f99880f0230> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f99880f02a8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f99880f0320> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f99880f0410> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f99880f0488> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f99880f0500> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f99880f0578> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f99880f06e0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f99880f0758> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f99880f07d0> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f9966e96e10> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/Impulse_5)+(ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+((ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction)+(ref_PCollection_PCollection_1_split/Write))
INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_1_split/Read)+(((ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+((ref_AppliedPTransform_Map_7)+((ref_AppliedPTransform_Combine/KeyWithVoid_9)+(Combine/CombinePerKey/Precombine))))+(Combine/CombinePerKey/Group/Write))
INFO:apache_beam.runners.portability.fn_api_runner:Running ((Combine/CombinePerKey/Group/Read)+(Combine/CombinePerKey/Merge))+((Combine/CombinePerKey/ExtractOutputs)+((ref_AppliedPTransform_Combine/UnKey_17)+(ref_PCollection_PCollection_9/Write)))
INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_assert_that/Create/Impulse_26)+((ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2637>)_27)+((ref_AppliedPTransform_assert_that/Create/Map(decode)_29)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_33)+((assert_that/Group/Flatten/Transcode/0)+(assert_that/Group/Flatten/Write/0)))))
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_Combine/DoOnce/Impulse_19)+((ref_AppliedPTransform_Combine/DoOnce/FlatMap(<lambda at core.py:2637>)_20)+((ref_AppliedPTransform_Combine/DoOnce/Map(decode)_22)+((ref_AppliedPTransform_Combine/InjectDefault_23)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30)+((ref_AppliedPTransform_assert_that/ToVoidKey_31)+(ref_AppliedPTransform_assert_that/Group/pair_with_1_34)))))))+((assert_that/Group/Flatten/Transcode/1)+(assert_that/Group/Flatten/Write/1))
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40))+((ref_AppliedPTransform_assert_that/Unkey_41)+(ref_AppliedPTransform_assert_that/Match_42))
INFO:__main__:Read 100000 documents from mongodb finished in 4.688 seconds
mongoioit27184
mongoioit27184

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:87: FutureWarning: _ReadFromBigQuery is experimental.
  kms_key=kms_key)
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:788: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:818: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:84: FutureWarning: _ReadFromBigQuery is experimental.
  (options.view_as(GoogleCloudOptions).project, known_args.input_table))
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:275: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:162: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:823: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:818: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1463: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:95: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:303: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:316: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:316: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_job_python_from_python_it (apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 52 tests in 3531.913s

OK (SKIP=7)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_02_54_46-14248785908124612548?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_09_08-1079474573065222466?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_15_57-6002963457576054284?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_23_08-13544500137930054607?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_02_54_42-4400479445270159900?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_11_27-16995085645909935706?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_18_33-4205974244209600897?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_25_10-3444116717724883276?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_31_36-4105599544782101490?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_02_54_40-15208863093260886688?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_13_34-1793894851663506834?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_30_26-13911415375681004287?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_02_54_43-2556643281579720700?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_06_35-204725646880938174?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_13_26-6246233407413768567?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_19_57-18314327646135871809?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_25_55-3519054774754303404?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_32_40-4972140297607227876?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_02_54_39-15892888517967731743?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_01_53-2184129274607646274?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_09_04-8248895207902639876?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_17_45-3391424816496365245?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_24_38-14332342665444516559?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_31_38-17362074980584220198?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_02_54_39-2113927375948685409?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_01_06-4253851390149508338?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_09_42-9461482410214720609?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_16_40-3928694143299605299?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_22_52-11753667525003684678?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_29_24-17170648559863598372?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_02_54_40-2102329746166808130?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_02_03-13469384430675825816?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_09_33-12021315769956787017?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_16_17-6264186001292256234?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_22_59-14048956024351690668?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_29_40-17949809732660534351?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_02_54_39-17495771610695437980?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_02_52-3866251781646084189?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_12_10-3285384165141360427?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_19_48-6941450521441968340?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_26_06-2975860280318979364?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_32_55-8695833462708492719?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_39_38-16159825566732610583?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-26_03_46_12-12125312227500547942?project=apache-beam-testing

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:container:py2:docker'.
> Process 'command 'docker'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 81

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 0m 17s
122 actionable tasks: 97 executed, 22 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/xen7n6ehe6aik

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
