See <https://builds.apache.org/job/beam_PostCommit_Python2/1648/display/redirect?page=changes>

Changes:

[robinyqiu] Support all ZetaSQL TIMESTAMP functions


------------------------------------------
[...truncated 2.78 MB...]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7efc3873baa0> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7efc3873bb18> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7efc3873bb90> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7efc3873bcf8> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7efc3873bd70> ====================
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7efc3873bde8> ====================
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7efc29454f90> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_AppliedPTransform_assert_that/Create/Impulse_26)+((ref_AppliedPTransform_assert_that/Create/FlatMap(<lambda at core.py:2646>)_27)+((ref_AppliedPTransform_assert_that/Create/Map(decode)_29)+((ref_AppliedPTransform_assert_that/Group/pair_with_0_33)+((assert_that/Group/Flatten/Transcode/0)+(assert_that/Group/Flatten/Write/0)))))
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/Impulse_5)+(ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+((ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction)+(ref_PCollection_PCollection_1_split/Write))
INFO:apache_beam.runners.portability.fn_api_runner:Running (ref_PCollection_PCollection_1_split/Read)+(((ReadFromMongoDB/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+((ref_AppliedPTransform_Map_7)+((ref_AppliedPTransform_Combine/KeyWithVoid_9)+(Combine/CombinePerKey/Precombine))))+(Combine/CombinePerKey/Group/Write))
INFO:apache_beam.runners.portability.fn_api_runner:Running ((Combine/CombinePerKey/Group/Read)+(Combine/CombinePerKey/Merge))+((Combine/CombinePerKey/ExtractOutputs)+((ref_AppliedPTransform_Combine/UnKey_17)+(ref_PCollection_PCollection_9/Write)))
INFO:apache_beam.runners.portability.fn_api_runner:Running ((ref_AppliedPTransform_Combine/DoOnce/Impulse_19)+((ref_AppliedPTransform_Combine/DoOnce/FlatMap(<lambda at core.py:2646>)_20)+((ref_AppliedPTransform_Combine/DoOnce/Map(decode)_22)+((ref_AppliedPTransform_Combine/InjectDefault_23)+((ref_AppliedPTransform_assert_that/WindowInto(WindowIntoFn)_30)+((ref_AppliedPTransform_assert_that/ToVoidKey_31)+(ref_AppliedPTransform_assert_that/Group/pair_with_1_34)))))))+((assert_that/Group/Flatten/Transcode/1)+(assert_that/Group/Flatten/Write/1))
INFO:apache_beam.runners.portability.fn_api_runner:Running (assert_that/Group/Flatten/Read)+(assert_that/Group/GroupByKey/Write)
INFO:apache_beam.runners.portability.fn_api_runner:Running ((assert_that/Group/GroupByKey/Read)+(ref_AppliedPTransform_assert_that/Group/Map(_merge_tagged_vals_under_key)_40))+((ref_AppliedPTransform_assert_that/Unkey_41)+(ref_AppliedPTransform_assert_that/Match_42))
INFO:__main__:Read 100000 documents from mongodb finished in 4.964 seconds
mongoioit27077
mongoioit27077

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... SKIP: GCP dependencies are not installed
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:87: FutureWarning: _ReadFromBigQuery is experimental.
  kms_key=kms_key)
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py>:84: FutureWarning: _ReadFromBigQuery is experimental.
  (options.view_as(GoogleCloudOptions).project, known_args.input_table))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:275: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py>:162: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:823: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1466: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1463: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:766: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:95: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) ... ok
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:303: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:316: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/fileio_test.py>:316: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_basic_execution (apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... SKIP: The "TestDataflowRunner", does not support the TestStream transform. Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 52 tests in 3703.849s

OK (SKIP=7)
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_48_21-10357865293852203581?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_57_31-3813119967569854895?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_07_03-6625791177474265692?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_14_35-8562966622701150650?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_21_34-14443926485161518598?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_28_29-7129952707812555361?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_35_58-10609292729826198599?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_42_45-2561488142612050529?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_48_28-17492393239953823880?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_02_55-14662720940327455068?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_11_16-3188008352965625922?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_20_16-16183203055478100212?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_26_50-3950905946819649649?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_48_22-3101084595879517298?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_09_04-13601895837644984783?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_26_39-1654974266543299882?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_48_25-13049122898625290045?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_00_45-2529960944946694437?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_07_44-3874352327759153952?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_14_29-9401300981581950981?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_20_55-12810796893535242147?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_27_42-7231660576506843580?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_48_22-16873989996142205177?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_05_30-6756410584474414197?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_12_54-5118505505397446140?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_19_44-8044974193681521574?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_26_50-3964249277307995178?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_48_23-11822242149372033622?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_56_57-14254824312503556566?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_06_14-2805712292462434354?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_13_36-3169405973899305072?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_20_01-9103555589485965304?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_26_40-6991246326310010225?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_48_20-13489540811440933903?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_56_37-472612825130617492?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_05_43-6074036985969849646?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_12_51-2614976212569934326?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_20_03-5691507837325678020?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_48_24-7096237094540504776?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_08_57_04-4093048697870661210?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_04_24-11984589298572638687?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_11_29-18116843604858807055?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_19_25-15634416748163890483?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-07_09_26_42-186065978162542562?project=apache-beam-testing

FAILURE: Build completed with 4 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:java:container:dockerPrepare'.
> Could not copy file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/java/container/build/target/beam-vendor-grpc-1_26_0-0.1.jar>' to '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/java/container/build/docker/target/beam-vendor-grpc-1_26_0-0.1.jar>'.

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:container:buildDarwinAmd64'.
> Build failed due to return code 2 of: 
  Command:
   /usr/bin/go build -o ./build/target/launcher/darwin_amd64/boot github.com/apache/beam/sdks/python/boot
  Env:
   GOEXE=
   GOPATH=<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/container/.gogradle/project_gopath>
   GOROOT=/usr/lib/go-1.12
   GOOS=darwin
   GOARCH=amd64

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:container:buildLinuxAmd64'.
> Failed to create directory '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/container/build/target/launcher/linux_amd64>'

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

4: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 81

* What went wrong:
Execution failed for task 
':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 3m 28s
117 actionable tasks: 94 executed, 20 from cache, 3 up-to-date

Publishing build scan...
https://gradle.com/s/pbn73xfezg33e


FAILURE: Build failed with an exception.

* What went wrong:
Could not add entry ':sdks:java:container:dockerPrepare' to cache executionHistory.bin (<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/.gradle/5.2.1/executionHistory/executionHistory.bin>).
> java.io.IOException: No space left on device

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 3m 30s
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
