See 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/5746/display/redirect>

Changes:


------------------------------------------
[...truncated 81.88 KB...]
Collecting certifi>=2017.4.17
  Using cached certifi-2019.11.28-py2.py3-none-any.whl (156 kB)
Collecting chardet<3.1.0,>=3.0.2
  Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting idna<2.9,>=2.5
  Using cached idna-2.8-py2.py3-none-any.whl (58 kB)
Collecting urllib3!=1.25.0,!=1.25.1,<1.26,>=1.21.1
  Using cached urllib3-1.25.8-py2.py3-none-any.whl (125 kB)
Collecting google-auth<2.0dev,>=0.4.0
  Using cached google_auth-1.11.0-py2.py3-none-any.whl (76 kB)
Requirement already satisfied: contextlib2; python_version < "3" in 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages>
 (from importlib-metadata>=0.12; python_version < 
"3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (0.6.0.post1)
Requirement already satisfied: zipp>=0.5 in 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages>
 (from importlib-metadata>=0.12; python_version < 
"3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.1.0)
Requirement already satisfied: configparser>=3.5; python_version < "3" in 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages>
 (from importlib-metadata>=0.12; python_version < 
"3.8"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (4.0.2)
Requirement already satisfied: scandir; python_version < "3.5" in 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-194514014/lib/python2.7/site-packages>
 (from pathlib2>=2.2.0; python_version < 
"3.6"->pytest<5.0,>=4.4.0->apache-beam==2.20.0.dev0) (1.10.0)
Collecting apipkg>=1.4
  Using cached apipkg-1.5-py2.py3-none-any.whl (4.9 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: 
filename=apache_beam-2.20.0.dev0-py2-none-any.whl size=1890975 
sha256=03c3720643b8729115d372ad86d6d68874107cb84270e7f4896e55c338cae07e
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/46/9d/1f/7a9661a8fcb0ed7d52d619d5385fcdc5cb78d6b0eed8782ad6
Successfully built apache-beam
Installing collected packages: crcmod, dill, fastavro, docopt, certifi, 
chardet, idna, urllib3, requests, hdfs, httplib2, pbr, funcsigs, mock, numpy, 
pymongo, pyasn1, rsa, pyasn1-modules, oauth2client, pyparsing, pydot, 
python-dateutil, pytz, avro, pyvcf, pyarrow, typing-extensions, cachetools, 
monotonic, fasteners, google-apitools, google-auth, googleapis-common-protos, 
google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, 
google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, 
google-cloud-bigtable, google-cloud-spanner, grpcio-gcp, 
proto-google-cloud-datastore-v1, googledatastore, freezegun, nose, 
nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, 
tenacity, atomicwrites, packaging, attrs, wcwidth, more-itertools, pytest, 
pytest-forked, apipkg, execnet, pytest-xdist, pytest-timeout, apache-beam

> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:test-suites:dataflow:py2:installGcpTest
Successfully installed apache-beam-2.20.0.dev0 apipkg-1.5 atomicwrites-1.3.0 
attrs-19.3.0 avro-1.9.2 cachetools-3.1.1 certifi-2019.11.28 chardet-3.0.4 
crcmod-1.7 dill-0.3.1.1 docopt-0.6.2 execnet-1.7.1 fastavro-0.21.24 
fasteners-0.15 freezegun-0.3.14 funcsigs-1.0.2 google-api-core-1.16.0 
google-apitools-0.5.28 google-auth-1.11.0 google-cloud-bigquery-1.17.1 
google-cloud-bigtable-1.0.0 google-cloud-core-1.3.0 
google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 
google-cloud-spanner-1.13.0 google-resumable-media-0.4.1 
googleapis-common-protos-1.51.0 googledatastore-7.0.2 grpc-google-iam-v1-0.12.3 
grpcio-gcp-0.2.2 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 
more-itertools-5.0.0 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.16.6 
oauth2client-3.0.0 packaging-20.1 pandas-0.24.2 parameterized-0.7.1 pbr-5.4.4 
proto-google-cloud-datastore-v1-0.90.4 pyarrow-0.15.1 pyasn1-0.4.8 
pyasn1-modules-0.2.8 pydot-1.4.1 pyhamcrest-1.10.1 pymongo-3.10.1 
pyparsing-2.4.6 pytest-4.6.9 pytest-forked-1.1.3 pytest-timeout-1.3.4 
pytest-xdist-1.31.0 python-dateutil-2.8.1 pytz-2019.3 pyvcf-0.6.8 pyyaml-5.3 
requests-2.22.0 requests-mock-1.7.0 rsa-4.0 tenacity-5.1.5 
typing-extensions-3.7.4.1 urllib3-1.25.8 wcwidth-0.1.8

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>
>>>  --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 
>>> --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.20.0-SNAPSHOT.jar>
>>>  
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 
>>> --attr=ValidatesRunner
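
For context, the flags echoed above are ordinary Beam pipeline options. A minimal
sketch of how the Python SDK parses such flags, assuming only that apache-beam is
installed; the values are copied from the log, the subset is illustrative, and
nothing here contacts GCP:

    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    # Illustrative subset of the options echoed by the test harness above.
    options = PipelineOptions([
        '--runner=TestDataflowRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
        '--num_workers=1',
    ])

    # Each option category is exposed through a typed view.
    print(options.view_as(GoogleCloudOptions).project)  # apache-beam-testing
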
setup.py:245: UserWarning: You are using Apache Beam with Python 2. New 
releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/setuptools/dist.py>:476:
 UserWarning: Normalizing '2.20.0.dev' to '2.20.0.dev0'
  normalized_version,
running nosetests
running egg_info
Skipping proto regeneration: all files up to date
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/__init__.py>:82:
 UserWarning: You are using Apache Beam with Python 2. New releases of Apache 
Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/transforms/trigger_test.py>:541:
 YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as 
the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full 
details.
  for spec in yaml.load_all(open(transcript_filename)):
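
The YAMLLoadWarning above is PyYAML 5.x deprecating yaml.load_all() calls that do
not pass an explicit Loader. A minimal sketch of the non-deprecated form, assuming
PyYAML >= 5.1; the file name is a hypothetical stand-in for trigger_test.py's
transcript_filename:

    import yaml

    # Hypothetical stand-in for the transcript file read by trigger_test.py.
    transcript_filename = 'trigger_transcripts.yaml'

    # SafeLoader only constructs plain Python types, which is why PyYAML
    # recommends it over the legacy default Loader.
    with open(transcript_filename) as f:
        for spec in yaml.load_all(f, Loader=yaml.SafeLoader):
            print(spec)  # one parsed YAML document per iteration
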
test_gbk_many_values 
(apache_beam.runners.portability.fn_api_runner_test.FnApiBasedStateBackedCoderTest)
 ... ok
Test TimestampCombiner with EARLIEST. ... ok
Test TimestampCombiner with LATEST. ... ok
test_dofn_lifecycle 
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_flatten_same_pcollections 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_impulse (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_one_single_pcollection 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_pcollections 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_a_flattened_pcollection 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
Test a GBK sideinput, with multiple triggering. ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_reshuffle_preserves_timestamps 
(apache_beam.transforms.util_test.ReshuffleTest) ... ok
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerBatchTests-df.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 30 tests in 1849.964s

OK
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_12_42-8084550129129723591?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_18_37-3955831569906240889?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_25_18-7927871829559101608?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_31_13-3959663437105856365?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_37_11-9124849881379402145?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_12_45-7018054198270937428?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_19_37-9202869397397609206?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_25_45-17357824683042848741?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_12_44-10172560910420875479?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_18_48-5085863116918561181?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_25_31-16006580896569062904?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_12_45-18281249565766148556?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_18_55-4473849898523559544?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_25_38-611006150094104590?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_12_44-10076230678275929091?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_19_22-18036846010688440642?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_25_20-6049568707568917080?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_12_45-15292900363082691868?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_19_23-16650782549674044890?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_25_21-187447640025552897?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_12_43-13364356687252618829?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_19_04-9591300618860548505?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_25_10-9370037544490605724?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_12_45-3107737885714992109?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_19_13-9438322217608783238?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_25_49-3839091103289873951?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:validatesRunnerStreamingTests
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build/apache-beam.tar.gz>
>>>  --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 --streaming 
>>> --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.20.0-SNAPSHOT.jar>
>>>  
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 
>>> --attr=ValidatesRunner,!sickbay-streaming
setup.py:245: UserWarning: You are using Apache Beam with Python 2. New 
releases of Apache Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/setuptools/dist.py>:476:
 UserWarning: Normalizing '2.20.0.dev' to '2.20.0.dev0'
  normalized_version,
running nosetests
running egg_info
Skipping proto regeneration: all files up to date
writing requirements to apache_beam.egg-info/requires.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/__init__.py>:82:
 UserWarning: You are using Apache Beam with Python 2. New releases of Apache 
Beam will soon support Python 3 only.
  'You are using Apache Beam with Python 2. '
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/apache_beam/transforms/trigger_test.py>:541:
 YAMLLoadWarning: calling yaml.load_all() without Loader=... is deprecated, as 
the default Loader is unsafe. Please read https://msg.pyyaml.org/load for full 
details.
  for spec in yaml.load_all(open(transcript_filename)):
test_gbk_many_values 
(apache_beam.runners.portability.fn_api_runner_test.FnApiBasedStateBackedCoderTest)
 ... ok
Test TimestampCombiner with EARLIEST. ... ok
Test TimestampCombiner with LATEST. ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_flatten_one_single_pcollection 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_pcollections 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_impulse (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_a_flattened_pcollection 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle 
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
Test a GBK sideinput, with multiple triggering. ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_element_param (apache_beam.pipeline_test.DoFnTest) ... ok
test_key_param (apache_beam.pipeline_test.DoFnTest) ... ok

----------------------------------------------------------------------
XML: nosetests-validatesRunnerStreamingTests-df.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 26 tests in 1719.660s

OK
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_43_33-3319934334279908842?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_50_38-129336636903823667?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_57_41-9352913961361309553?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_11_05_18-2361883950232364997?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_43_34-5263806131771259768?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_49_49-9302135639509024415?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_56_38-15428242537586289732?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_43_33-813065369988798556?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_50_27-11856233669947500287?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_43_37-2906628317434251676?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_50_40-1779201195442402751?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_57_18-17350684811427723309?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_43_31-8007897470081364368?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_51_04-14220502400983783803?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_43_34-7697044324022972678?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_50_20-828062886641220974?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_43_32-479314723199369769?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_50_12-12188437166410282308?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_57_13-12191244917508139565?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_43_35-7394209916480950180?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_50_24-17776591682252887906?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2020-02-12_10_57_31-17688346293896495751?project=apache-beam-testing

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py37:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py36:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py35:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 1m 13s
68 actionable tasks: 51 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/ipcplu7622qba

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

