See
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/946/display/redirect?page=changes>
Changes:
[kcweaver] [BEAM-7131] cache executable stage output to prevent re-computation
[kcweaver] Spark portable runner: cache all re-used RDDs by default
[kcweaver] refactor to use pushDataset everywhere
------------------------------------------
[...truncated 769.79 KB...]
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYE5OLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqYIfgYGhvi0xJycpMTk7HiwlkJ8pgVkJmfnpEJNYQGawlpbyJZUnKQHACYlLgM=",
"component_encodings": []
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "write/Write/WriteImpl/FinalizeWrite.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s7"
},
"serialized_fn":
"eNrNV/l/27YVpyTnKNN2abJmzbpDTeNN7ioxbprO9ZJsnXK5WhSPdhOuXcZCJCQwBgk+ArTiLuqRTI7b3UfPnd3V3fd9/wv7j/YAWfK02l1+6qefj00ID3gPwPd93xfk06VKQFISMOq3KIlrKiOJbIsslrVAZNSuE85Ji9MLGUlTmp0QpxIbrKlnoNCDYsXbZVmWr1ZS6rMoURJK48FwwNhrIcVoRIlM2nPnFtF8RpttmMBI25o92F7xbsZQIldprkxACTuaJnyUbJh2NvNVuKnl7UB7momASgn22JKRwL8WkdT221FCePQE9btZpKgNu7yS3m07gZvZrrx1EW6p/K+ropdVJGx/EduFKFmy4Vbc4Nt6sLvi3YbOkpEs9BMSU19jRBTc5h1Ee3Wysj6Ux1OHjoRV0UYTdnxjltoGe7x94yE6XLSGcfaaweo9m3q+3bNxcMMItzcsbxuaAhHSDPaNHcPY5HpjL4oFlUVJp667NrwDj3NHD/YPIPQZJTrAO5vebhMtTjOENBKJwRvu9CbQTHIl4F1mRjvi1E+JYj5ObEeX4d1jS4sU0UtkbZnwHOdlYjnSqy4ooqLgvDbOD23wHtzJe3tQrpiDGA+4yzuHvzty1nEUjdOqRMKQDq0iRlWahFUlBg2VSjrpSjVS1YCLPHQGvHGmjxyZeWBm5v5DDx6emXbwKDlHTh4w6A32ZI51dxBGnNd8/bR9Lkho7DYcNAyRKoPJPrzPhffnLW+/xklzP8FZGYk4gukntIstErKytoGLyanM2xqXKabPd08PPlBhd3k4w4J72d1MB7wJO3EUr++lajaneeeknEQJ1PLWOCsTRTMkci1XEbcfyjp5TBM1z0lAmeAGSgeXOsT0c7pRMO19tAeHL8L9FW9KHwgRP1Q1VeBc2HjOxSl3Tq0XibHAkbGlU4OZ7T+SpFGwxGm4gIHmdDna8EAPPlgx9AiJIjCzmeNo+gmcYsODuLHZHnxoUErdKAlF148RWo0oVuXRrYRI79+ogLRNzRDuv87bhmOG0ssR7epgx8dyHGSUKKzZPAk0Q234cIUd9HbifF0XWo/gI314yIWPVhqFhoX/pcbe+i1rlnXFstYK1rWitQD1Zh9OTBkvxCrTqggn+3DKu4QWh4mYOpdoshQlcthWJSfL1OmKbEni0aijT+bPC6nqIo4j5c+vKCaSw/55mkXtFUdmgSPDJc1tbXf+Cw9ngGktXYHTZgtHOYlbITkOZ87WC3UL5rzbNREzEftZnihNsNEuHzbiaqBZL1ForMLHphScdaE5hlSHKp8ohaw6Z5Zp5RFXeBSYN/DisB6Fj6+C68LCmGsUpyJTfizCnGMxLXp7tHi8jhbwSB/Ou3DBhPfRN1C+D94qfMKFR9mZ5maJCyh24DGGacMMFTFDpcaORr1+VWHtFKxLRZ2n0OTpWsHqYbdoyWnrCg6VrLBoqQnrmmWyqLbpMZwUlqylopU1dDecsPYvqO1WuM0ajasdw5GNTsl02iVrHzZXMYaFtPhkpdkoGnBC2iaoOHBRc8h7Ei3zdcE5NZwri3ZZYqGVJ8NyN1KsHOP1WlaM4EhCy5RTXddlEuhLjYZlIssEHZIOpwq9dfJq5VNRJlVZdcVwvizTJBC51gjtgxEPTMoD95pnDT6lwB8UKY+kgseN9mhWKCG4BOJt133Jo4BCy2gf5hcC71b8dTJO1cqogiE0w5wmQI1iGTU/mWUigzabVNDxiiY2MAPFkGaRWV1fM3DpOiyx0+aa8zenKT/7WqG+2yreUdhe2FvYU9hZKBVKRYinkKiJC4I9yngTXxZSBeBC1gfpgmKP9SDfQlGW2THZh64Ll1dhpQdPKPi0C1fM8QY467tMyyv0vDs12ZGls1ptfCM3s8OdzS5Pw5N5ix1lx/Lr8NRIVKdvSFSfZiiWz1TYDNMKeLUH1yo6Eju+TueB4BQaC/WJK5pQn0Gd6U8xlJbVN1taro9Ly9rZfxfYnIb/WReeQ/jXNPy
fVfA5Fz7P/j9mXxhh9sURZvfdEGZf0ph9eYjZV3rw1c0w20Skv4bgPW/Ae+HNBu/FcfBeQl1mc+xhhlL7MkL4igtfRwhfarK3po59Q+sYe8to1zcVfKvCHmeEtVjAQkaZ0ZlvM8YihlryHfbiG2nJq5tryXc1mb/nwvcxE69qMv9AwQ9deK0PP3Lhx1pLfrKFlvzUaMnPXPj5KvyiB79U8CsXfn0DRfCbURH8lr7hp9YFsyYuZ8PvkPS/78EfBq9KiGse55zoVOjrlcIfG4XBp1ck/eGt86c1I+/4tt/p0Ax3/OetVlufYp8YeC6ud+EvuOpfza2tsZSKxKmP3wQtfNPN4G+45M4RMhj+71uFH8ywT5s3tcGh8EvvHxj8n3lLwb9q/wHbcML3",
"user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
}
}
],
"type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
createTime: '2019-05-24T09:00:41.168661Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2019-05-24_02_00_39-11749527633479459098'
location: 'us-central1'
name: 'beamapp-jenkins-0524090009-967454'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2019-05-24T09:00:41.168661Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-05-24_02_00_39-11749527633479459098]
root: INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_00_39-11749527633479459098?project=apache-beam-testing
apache_beam.io.filesystem: DEBUG: Listing files in
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1558688409381/results'
apache_beam.io.filesystem: DEBUG: translate_pattern:
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1558688409381/results*'
->
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1558688409381\\/results[^/\\\\]*'
root: DEBUG: Connecting using Google Application Default Credentials.
root: INFO: Starting the size estimation of the input
oauth2client.transport: INFO: Attempting refresh to obtain initial access_token
root: INFO: Finished listing 0 files in 0.04992985725402832 seconds.
root: DEBUG: Connecting using Google Application Default Credentials.
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_01_52_49-2501437662600718440?project=apache-beam-testing.
method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:218:
FutureWarning: MatchAll is experimental.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_08_08-8376529085179153798?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_15_06-13111601076091321681?project=apache-beam-testing.
| 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229:
FutureWarning: MatchAll is experimental.
| 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229:
FutureWarning: ReadMatches is experimental.
| 'Checksums' >> beam.Map(compute_hash))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_01_52_45-15876899584780592640?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:674:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_13_42-16605798515260242964?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_22_05-13748551983172834549?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_01_52_48-9485470735890227900?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_06_21-4139411312472756536?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_13_38-1191106425800506423?project=apache-beam-testing.
method_to_use = self._compute_method(p, p.options)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_01_52_41-5345433574387211745?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_11_27-1850300773016918281?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_01_52_39-9296830610550355929?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_00_45-9595396450690143635?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_08_44-14080142987778334744?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_17_04-3447904119454125766?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_25_08-17566674021224553758?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_32_26-15713847968120274736?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_01_52_38-14565826989590630758?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:674:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_00_39-11749527633479459098?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_01_19-15078886216425642652?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_10_39-18164418773935371762?project=apache-beam-testing.
or p.options.view_as(GoogleCloudOptions).temp_location)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_01_52_42-6484463505793188746?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
method_to_use = self._compute_method(p, p.options)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_02_07-853107455319244532?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
BeamDeprecationWarning: options is deprecated since First stable release.
References to <pipeline>.options will not be supported
or p.options.view_as(GoogleCloudOptions).temp_location)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_03_35-4226404753705723256?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_11_44-1666387373330766643?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_01_52_44-1750162715661193855?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_01_02-12492250930586466602?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:674:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use
WriteToBigQuery instead.
kms_key=kms_key))
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_07_51-10773970024426235673?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_16_06-13910272394679492715?project=apache-beam-testing.
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 35 tests in 2875.023s
FAILED (SKIP=4, errors=1, failures=1)
> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py35:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options:
>>> --runner=TestDataflowRunner --project=apache-beam-testing
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output
>>> --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build/apache-beam.tar.gz>
>>> --requirements_file=postcommit_requirements.txt --num_workers=1
>>> --sleep_secs=20
>>> --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.14.0-SNAPSHOT.jar>
>>>
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>> test options: --nocapture --processes=8 --process-timeout=4500
>>> --attr=ValidatesRunner
running nosetests
running egg_info
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing apache_beam.egg-info/PKG-INFO
writing entry points to apache_beam.egg-info/entry_points.txt
writing top-level names to apache_beam.egg-info/top_level.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet
fully supported. You may encounter buggy behavior or missing features.
'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:472:
UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84:
UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully
supported. You may encounter buggy behavior or missing features.
'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59:
UserWarning: Datastore IO will support Python 3 after replacing
googledatastore by google-cloud-datastore, see: BEAM-4543.
warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47:
UserWarning: VCF IO will support Python 3 after migration to Nucleus, see:
BEAM-5628.
warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_40_18-11522264541189786651?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_46_56-9614097307799459215?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_40_19-13295233176211781327?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_48_27-15550577283918483237?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_40_19-10273537675951595805?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_48_27-9915343433343395176?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_40_19-16270932490977551291?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_48_36-8118007533673179526?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_40_19-1378977398264619843?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_48_32-12329684132759035587?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_40_18-16628584715040551235?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_46_56-5457278177916721573?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_54_13-3326839625238168078?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_40_19-6394729929977813952?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_48_11-11807840105334386253?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_40_19-6332251451664203441?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-24_02_47_21-12102495197759485479?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_dofn_lifecycle
(apache_beam.transforms.dofn_lifecycle_test.DoFnLifecycleTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ...
ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest)
... ok
test_flatten_multiple_pcollections_having_multiple_consumers
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_multiple_empty_outputs
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_return
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_list_and_as_dict_side_inputs
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ...
ok
test_as_singleton_with_different_defaults
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_flattened_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_empty_singleton_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 17 tests in 1270.687s
OK
FAILURE: Build completed with 3 failures.
1: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py37/build.gradle>'
line: 48
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py37:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>'
line: 48
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
3: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>'
line: 48
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 11m 6s
78 actionable tasks: 62 executed, 16 from cache
Publishing build scan...
https://gradle.com/s/pi5eswdq2akm2
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure