See <https://builds.apache.org/job/beam_PostCommit_Python2/460/display/redirect>
------------------------------------------
[...truncated 388.90 KB...]
    job_service = self.create_job_service(options)
  File "apache_beam/runners/portability/portable_runner.py", line 161, in create_job_service
    return server.start()
  File "apache_beam/runners/portability/job_server.py", line 84, in start
    self._endpoint = self._job_server.start()
  File "apache_beam/runners/portability/job_server.py", line 124, in start
    self._process.poll())
RuntimeError: Job service failed to start up with error 125
> Task :sdks:python:test-suites:portable:py2:crossLanguagePythonJavaFlink FAILED
> Task :sdks:python:test-suites:direct:py2:mongodbioIT
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:68:
FutureWarning: WriteToMongoDB is experimental.
known_args.batch_size)
INFO:root:==================== <function annotate_downstream_side_inputs at
0x7ff8bbaf5d70> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at
0x7ff8bbaf5e60> ====================
INFO:root:==================== <function lift_combiners at 0x7ff8bbaf5ed8>
====================
INFO:root:==================== <function expand_sdf at 0x7ff8bbaf5f50>
====================
INFO:root:==================== <function expand_gbk at 0x7ff8ba78c050>
====================
INFO:root:==================== <function sink_flattens at 0x7ff8ba78c140>
====================
INFO:root:==================== <function greedily_fuse at 0x7ff8ba78c1b8>
====================
INFO:root:==================== <function read_to_impulse at 0x7ff8ba78c230>
====================
INFO:root:==================== <function impulse_to_input at 0x7ff8ba78c2a8>
====================
INFO:root:==================== <function inject_timer_pcollections at
0x7ff8ba78c410> ====================
INFO:root:==================== <function sort_stages at 0x7ff8ba78c488>
====================
INFO:root:==================== <function window_pcollection_coders at
0x7ff8ba78c500> ====================
INFO:root:Running (ref_AppliedPTransform_Create
documents/Read_3)+((ref_AppliedPTransform_WriteToMongoDB/ParDo(_GenerateObjectIdFn)_5)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/AddRandomKeys_7)+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9)+(WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Write))))
INFO:root:Running
((WriteToMongoDB/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_WriteToMongoDB/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+((ref_AppliedPTransform_WriteToMongoDB/Reshuffle/RemoveRandomKeys_15)+(ref_AppliedPTransform_WriteToMongoDB/ParDo(_WriteMongoFn)_16))
INFO:root:Writing 100000 documents to mongodb finished in 54.708 seconds
INFO:root:Missing pipeline option (runner). Executing pipeline using the
default runner: DirectRunner.
INFO:root:Reading from mongodb beam_mongodbio_it_db:integration_test_1568354934
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/mongodbio_it_test.py>:83:
FutureWarning: ReadFromMongoDB is experimental.
| 'Map' >> beam.Map(lambda doc: doc['number'])
INFO:root:==================== <function annotate_downstream_side_inputs at
0x7ff8bbaf5d70> ====================
INFO:root:==================== <function fix_side_input_pcoll_coders at
0x7ff8bbaf5e60> ====================
INFO:root:==================== <function lift_combiners at 0x7ff8bbaf5ed8>
====================
INFO:root:==================== <function expand_sdf at 0x7ff8bbaf5f50>
====================
INFO:root:==================== <function expand_gbk at 0x7ff8ba78c050>
====================
INFO:root:==================== <function sink_flattens at 0x7ff8ba78c140>
====================
INFO:root:==================== <function greedily_fuse at 0x7ff8ba78c1b8>
====================
INFO:root:==================== <function read_to_impulse at 0x7ff8ba78c230>
====================
INFO:root:==================== <function impulse_to_input at 0x7ff8ba78c2a8>
====================
INFO:root:==================== <function inject_timer_pcollections at
0x7ff8ba78c410> ====================
INFO:root:==================== <function sort_stages at 0x7ff8ba78c488>
====================
INFO:root:==================== <function window_pcollection_coders at
0x7ff8ba78c500> ====================
INFO:root:Running
((ref_AppliedPTransform_assert_that/Create/Read_7)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_11)+(assert_that/Group/Flatten/Transcode/1)))+(assert_that/Group/Flatten/Write/1)
INFO:root:Running
((ref_AppliedPTransform_ReadFromMongoDB/Read_3)+((ref_AppliedPTransform_Map_4)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_8)+((ref_AppliedPTransform_assert_that/ToVoidKey_9)+((ref_AppliedPTransform_assert_that/Group/pair_with_1_12)+(assert_that/Group/Flatten/Transcode/0))))))+(assert_that/Group/Flatten/Write/0)
INFO:root:Running
(assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:root:Running
(assert_that/Group/GroupByKey/Read)+((ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_18)+((ref_AppliedPTransform_assert_that/Unkey_19)+(ref_AppliedPTransform_assert_that/Match_20)))
INFO:root:Read 100000 documents from mongodb finished in 21.251 seconds
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
... ok
test_autocomplete_it
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
... ok
test_leader_board_it
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT)
... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:696:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
test_game_stats_it
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:577:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
test_streaming_wordcount_it
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1145:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1142:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_hourly_team_score_it
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
... ok
test_value_provider_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_bigquery_read_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ...
ok
test_copy_batch
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
... ok
test_copy_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
FutureWarning: MatchAll is experimental.
| 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: MatchAll is experimental.
| 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
FutureWarning: ReadMatches is experimental.
| 'Checksums' >> beam.Map(compute_hash))
test_bqfl_streaming
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP:
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP:
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ...
ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=kms_key))
test_streaming_data_only
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_legacy_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql_kms_key_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_job_python_from_python_it
(apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ...
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_metrics_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
======================================================================
ERROR: Failure: SyntaxError (invalid syntax (external_test_py37.py, line 46))
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/loader.py>", line 418, in loadTestsFromName
    addr.filename, addr.module)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/importer.py>", line 47, in importFromPath
    return self.importFromDir(dir_path, fqname)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/importer.py>", line 94, in importFromDir
    mod = load_module(part_fqname, fh, filename, desc)
SyntaxError: invalid syntax (external_test_py37.py, line 46)
-------------------- >> begin captured logging << --------------------
root: INFO: Generating grammar tables from
/usr/lib/python2.7/lib2to3/Grammar.txt
root: INFO: Generating grammar tables from
/usr/lib/python2.7/lib2to3/PatternGrammar.txt
root: WARNING: Using deprecated Datastore client.
This client will be removed in Beam 3.0 (next Beam major release).
Please migrate to apache_beam.io.gcp.datastore.v1new.datastoreio.
root: WARNING: python-snappy is not installed; some tests will be skipped.
root: WARNING: Tensorflow is not installed, so skipping some tests.
root: WARNING: Make sure that locally built Python SDK docker image has Python
2.7 interpreter.
root: INFO: Using Python SDK docker image: apachebeam/python2.7_sdk:2.17.0.dev.
If the image is not available at local, we will try to pull from hub.docker.com
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 46 tests in 4307.995s
FAILED (SKIP=4, errors=1)
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_03_50-1351061783712333134?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_13_13-18441296790191577560?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_23_51-16336275295154255543?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_31_09-13749852990169736164?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_39_36-3161595699250974138?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_48_18-4558755323951107813?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_57_09-12772705003669669912?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-13_00_07_13-13768446811453394707?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_03_56-18016400556909569914?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_19_25-16309436915387642400?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_29_17-11668239701757944644?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_38_47-724768986217439037?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_03_51-11753717271130494772?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_25_50-7984037270899999914?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_33_59-9149234292047235796?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_43_34-831530087079499772?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_03_56-9268081518660041743?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_17_35-11277237347692874694?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_25_40-13901030204560655237?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_34_44-5541833000197126940?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_43_42-477345147746150409?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_03_51-7266694085166478517?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_23_45-732810322473724878?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_31_50-4169383565729562790?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_40_21-10825295688707738364?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_03_51-17271694157886421725?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_13_36-9701553601580592227?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_22_39-3317944077099512418?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_32_08-2946024867206420950?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_42_15-6705078672842557513?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_03_52-16952203148626959563?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_13_35-15033290747476722471?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_23_08-2182703045192182190?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_30_52-10113077328586814618?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_40_00-14160073496277376756?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_48_36-4642134813003616467?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_03_51-11323012476258987723?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_13_57-13085122591224366885?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_25_28-16494896649222535767?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-09-12_23_43_35-2854967870572105945?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 131
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py2:crossLanguagePortableWordCount'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/portable/py2/build.gradle>' line: 105
* What went wrong:
Execution failed for task
':sdks:python:test-suites:portable:py2:crossLanguagePythonJavaFlink'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 12m 49s
111 actionable tasks: 86 executed, 22 from cache, 3 up-to-date
Publishing build scan...
https://gradle.com/s/hc5li4lkkweso
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure