See 
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/834/display/redirect?page=changes>

Changes:

[ankurgoenka] Updating python containers to beam-master-20190509

------------------------------------------
[...truncated 413.26 KB...]
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "assert_that/Match.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s11"
        },
        "serialized_fn": 
"eNq9VW1XG0UUnk0CbbdgBbRQW5VWqxu1WbW+1oKWAIWmTXFBMlVxnd2dZBf27c7OFnIk52g94fA7/Ct+8It/xn/g3UkoRaUfPTk7u/fe5z535r5Mfi4bLkuZ63Pb4SyqScHirJ2IKKu5ieB6nYUhc0LeEixNuVhMlmMdSPUX0HpQMmiZEGK3Yyi7XhCGNbtYddsVnElut/PYlUGCDhXjhD1MmGfLbsp1GKFnkaKeeHwDZRjtwxkLzhoNrUHwKTWm6uOHhOwT8qtGOhpZh3PNPuhVqqHXHpzvwxjN8NP0k4ib2zzeCeLs6H0jC9ljbu4mYifDI3KzOKG9lmSynkRRIO21rvST+Ka9yUXQ7pqZcM3M28nMVOnNZ/JiHufFLPJSS7swrrZ+O2SR47F5eOHBX5U6gQu0hFpMyYt9mKhKmLRg6sThO1zaTEqhw0uKwMmDUOJu4WV6BkU0F1a4eADTFsyccA2iNBHSjhIvDzF3l+hldHhO9eCVPly24IqKYyOJK20bXj2A1yx4nY4WSg45C2G2+V/1czkKcNWvGP6wIiONSazIn5KQQ1WRnka680RqR2Kp+B4Uq1cm+yWyXyY7ZSJWiSwRTxtq2iVyERFPNNKKfyIViRidiD+Ipml79wr3xa0F0quQ7gTZ18h2hexXCkatBQzRIwr9W4EekA7645j0EcIoPi10Fr+TU0CxRqhHsKGuNek0ZmKZBSH3ZlmWcSFvzV4Xs3NzuMIbB/CmQSuICINMwnWVtgzLwD14i06hsICZv6PclvZcnhYdD2/Tc2gpWnpJiESAodwEj5LHHKpUR2GThfnQ+o6EdwcI5sqiHu/RcRT4XspdjGOryDfohaeR7SMT1BRyqB16m6qReMgjHkt4X8IHNP1fZoRn2MgdM5dBWAzIh/5sI69fI9rYSFkbU7+yNl0a18bxPaHWK9oornBTdejTQ33Uh49xdD6x4FN/xr9EZ/7Z5oNAtSIQfNaHzy245WNbf2HBbX+26V/dgjlDVWAdRybkBqvC/LOyU4UveR++suBOHxZ6UKfni2EoriTbD2KZweLJWxENSl/zOA4Wk4nI9NWHRX1XCrUOS3glLjd7cNdQVEGc5lLxZbDSpGOoSnJ5rFtt5gdwz8GyNiy434cHFuC99rAHawN/m4lOhtko7tav/fv+ol8EsDDAuuGvNH3lv+HkEr6xYPNfu28pOEX4o2P4t47qi1QkLs8y+M7fzJ0t+L4HW1vww3P/BFpB7CW7mHEdbOT9sQfMUBXbVQbco3Oa/wCh3w0Th4UDHsyWiywenVAN7+ZRHrJiaoqLjQNvaKpUUgSdDhdI3j6NfAjRF3mb5aHcGIrQQXqfThYkQYS9wqLUdpPICWIuIEB+la8gs72BI2wf5o6Endrfu3EvPQ==",
        "user_name": "assert_that/Match"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
 createTime: '2019-05-13T15:59:59.421942Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2019-05-13_08_59_58-9042096771646816445'
 location: 'us-central1'
 name: 'beamapp-jenkins-0513155931-082851'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2019-05-13T15:59:59.421942Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2019-05-13_08_59_58-9042096771646816445]
root: INFO: To access the Dataflow monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_59_58-9042096771646816445?project=apache-beam-testing
root: INFO: Job 2019-05-13_08_59_58-9042096771646816445 is in state 
JOB_STATE_RUNNING
root: INFO: 2019-05-13T15:59:58.198Z: JOB_MESSAGE_DETAILED: Autoscaling is 
enabled for job 2019-05-13_08_59_58-9042096771646816445. The number of workers 
will be between 1 and 1000.
root: INFO: 2019-05-13T15:59:58.253Z: JOB_MESSAGE_DETAILED: Autoscaling was 
automatically enabled for job 2019-05-13_08_59_58-9042096771646816445.
root: INFO: 2019-05-13T16:00:01.483Z: JOB_MESSAGE_DETAILED: Checking 
permissions granted to controller Service Account.
root: INFO: 2019-05-13T16:00:02.395Z: JOB_MESSAGE_BASIC: Worker configuration: 
n1-standard-1 in us-central1-a.
root: INFO: 2019-05-13T16:00:02.972Z: JOB_MESSAGE_DETAILED: Expanding 
CoGroupByKey operations into optimizable parts.
root: INFO: 2019-05-13T16:00:03.025Z: JOB_MESSAGE_DEBUG: Combiner lifting 
skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a 
combiner.
root: INFO: 2019-05-13T16:00:03.073Z: JOB_MESSAGE_DETAILED: Expanding 
GroupByKey operations into optimizable parts.
root: INFO: 2019-05-13T16:00:03.110Z: JOB_MESSAGE_DETAILED: Lifting 
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2019-05-13T16:00:03.201Z: JOB_MESSAGE_DEBUG: Annotating graph with 
Autotuner information.
root: INFO: 2019-05-13T16:00:03.252Z: JOB_MESSAGE_DETAILED: Fusing adjacent 
ParDo, Read, Write, and Flatten operations
root: INFO: 2019-05-13T16:00:03.300Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s8 for input s6.out
root: INFO: 2019-05-13T16:00:03.348Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of assert_that/Group/GroupByKey/Reify, through flatten 
assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2019-05-13T16:00:03.395Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2019-05-13T16:00:03.443Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Match into assert_that/Unkey
root: INFO: 2019-05-13T16:00:03.484Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/GroupByWindow into 
assert_that/Group/GroupByKey/Read
root: INFO: 2019-05-13T16:00:03.533Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/Map(_merge_tagged_vals_under_key) into 
assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2019-05-13T16:00:03.575Z: JOB_MESSAGE_DETAILED: Unzipping flatten 
s8-u13 for input s9-reify-value0-c11
root: INFO: 2019-05-13T16:00:03.625Z: JOB_MESSAGE_DETAILED: Fusing unzipped 
copy of assert_that/Group/GroupByKey/Write, through flatten 
assert_that/Group/Flatten/Unzipped-1, into producer 
assert_that/Group/GroupByKey/Reify
root: INFO: 2019-05-13T16:00:03.676Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2019-05-13T16:00:03.707Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2019-05-13T16:00:03.755Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2019-05-13T16:00:03.796Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2019-05-13T16:00:03.843Z: JOB_MESSAGE_DETAILED: Fusing consumer 
ExternalTransform(simple)/Map(<lambda at external_test_it.py:42>) into 
Create/Read
root: INFO: 2019-05-13T16:00:03.884Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/WindowInto(WindowIntoFn) into ExternalTransform(simple)/Map(<lambda 
at external_test_it.py:42>)
root: INFO: 2019-05-13T16:00:03.981Z: JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2019-05-13T16:00:04.026Z: JOB_MESSAGE_DEBUG: Workflow config is 
missing a default resource spec.
root: INFO: 2019-05-13T16:00:04.068Z: JOB_MESSAGE_DEBUG: Adding StepResource 
setup and teardown to workflow graph.
root: INFO: 2019-05-13T16:00:04.111Z: JOB_MESSAGE_DEBUG: Adding workflow start 
and stop steps.
root: INFO: 2019-05-13T16:00:04.154Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2019-05-13T16:00:04.330Z: JOB_MESSAGE_DEBUG: Executing wait step 
start21
root: INFO: 2019-05-13T16:00:04.452Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/GroupByKey/Create
root: INFO: 2019-05-13T16:00:04.508Z: JOB_MESSAGE_DEBUG: Starting worker pool 
setup.
root: INFO: 2019-05-13T16:00:04.545Z: JOB_MESSAGE_BASIC: Starting 1 workers in 
us-central1-a...
root: INFO: 2019-05-13T16:00:04.665Z: JOB_MESSAGE_DEBUG: Value 
"assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2019-05-13T16:00:04.760Z: JOB_MESSAGE_BASIC: Executing operation 
Create/Read+ExternalTransform(simple)/Map(<lambda at 
external_test_it.py:42>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-05-13T16:00:04.831Z: JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2019-05-13T16:01:54.628Z: JOB_MESSAGE_ERROR: Startup of the worker 
pool in zone us-central1-a failed to bring up any of the desired 1 workers. 
QUOTA_EXCEEDED: Quota 'DISKS_TOTAL_GB' exceeded.  Limit: 200000.0 in region 
us-central1.
root: INFO: 2019-05-13T16:01:54.684Z: JOB_MESSAGE_ERROR: Workflow failed.
root: INFO: 2019-05-13T16:01:54.859Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2019-05-13T16:01:54.945Z: JOB_MESSAGE_DEBUG: Starting worker pool 
teardown.
root: INFO: 2019-05-13T16:01:54.995Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2019-05-13T16:02:12.095Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2019-05-13T16:02:12.140Z: JOB_MESSAGE_DEBUG: Tearing down pending 
resources...
root: INFO: Job 2019-05-13_08_59_58-9042096771646816445 is in state 
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
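The QUOTA_EXCEEDED error captured above means the shared DISKS_TOTAL_GB quota in us-central1 (limit 200000.0) was already exhausted when the worker pool tried to start, so the job never got its single worker. For context, a hedged sketch of the pipeline options that bound a Dataflow job's disk footprint is below; the project and bucket names are placeholders, not values from this build, and the concrete numbers would have to be tuned to the quota actually available.

    # Hedged sketch: bounding a Dataflow job's contribution to the regional
    # DISKS_TOTAL_GB quota via worker count and per-worker disk size.
    # Project and bucket names are placeholders, not values from this build.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=my-gcp-project',            # placeholder
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',  # placeholder
        '--num_workers=1',
        '--max_num_workers=10',  # cap autoscaling instead of the default 1..1000 range
        '--disk_size_gb=30',     # smaller persistent disk per worker
    ])

    with beam.Pipeline(options=options) as p:
        _ = p | 'Create' >> beam.Create([1, 2, 3]) | 'Print' >> beam.Map(print)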
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_34_16-3835868220861072055?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
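These BeamDeprecationWarning messages come from SDK code that still reads <pipeline>.options after construction; the supported pattern is to build a PipelineOptions object up front, pass it to the Pipeline constructor, and read settings from that object. A minimal hedged sketch follows; the temp_location value is a placeholder.

    # Hedged sketch of the supported pattern: keep a PipelineOptions object and
    # read settings from it, instead of reaching back into <pipeline>.options.
    import apache_beam as beam
    from apache_beam.options.pipeline_options import GoogleCloudOptions, PipelineOptions

    options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])  # placeholder bucket
    temp_location = options.view_as(GoogleCloudOptions).temp_location

    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create(['a', 'b']) | beam.Map(print)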
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_49_32-963062249730058563?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_58_01-4282301520795299243?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_34_22-701698759904749278?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:665:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
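The repeated BigQuerySink deprecation warnings point at WriteToBigQuery as the replacement. A minimal hedged sketch of that transform is below, with placeholder table and schema values (the kms_key argument shown in the warning's source line is omitted here).

    # Hedged sketch: writing rows with beam.io.WriteToBigQuery, the replacement
    # the deprecation warning recommends for BigQuerySink.
    # Table spec and schema are placeholders, not values from this build.
    import apache_beam as beam

    with beam.Pipeline() as p:
        _ = (p
             | 'MakeRows' >> beam.Create([{'name': 'a', 'value': 1}])
             | 'WriteToBQ' >> beam.io.WriteToBigQuery(
                 table='my-project:my_dataset.my_table',  # placeholder
                 schema='name:STRING,value:INTEGER',
                 create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))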
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_54_05-5276132350595889811?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_34_14-11604067314396649479?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_47_50-13635209755392245409?project=apache-beam-testing.
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_56_04-7620320006645490423?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_03_49-16935546512664604797?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_34_13-8873890979448682938?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:218:
 FutureWarning: MatchAll is experimental.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_52_54-3972904642825230981?project=apache-beam-testing.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_59_58-9042096771646816445?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/fileio_test.py>:229:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
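The FutureWarnings above only flag that fileio.MatchAll and fileio.ReadMatches were still experimental at this point; the snippets come from fileio_test.py. For reference, a hedged sketch of the same match-then-read pattern outside the test is below; the glob pattern is a placeholder.

    # Hedged sketch of the experimental fileio match/read pattern the warnings
    # refer to. The file pattern is a placeholder, not a path from this build.
    import apache_beam as beam
    from apache_beam.io import fileio

    with beam.Pipeline() as p:
        _ = (p
             | 'Match' >> fileio.MatchFiles('gs://my-bucket/files/*.txt')  # placeholder
             | 'Read' >> fileio.ReadMatches()
             | 'Contents' >> beam.Map(lambda readable_file: readable_file.read_utf8()))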
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_34_16-14464128961318203646?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_43_13-5801107192312249139?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_49_57-13180217369528037590?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_58_17-15063796710250567374?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_06_31-6452211871774874183?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_14_50-14289544635830352663?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_34_13-1275882985566553139?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:665:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_41_25-9008119140678577594?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_50_38-5774165889312817910?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_58_18-6157626142326393623?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_34_21-16828617756122606595?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:665:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_43_14-1023621483809239122?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=kms_key))
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_51_10-9479841951542846262?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_58_45-9258398683397821560?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_34_14-13420293439817496156?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_43_52-16761824070911315354?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1137:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  method_to_use = self._compute_method(p, p.options)
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_08_54_02-14716613981316152224?project=apache-beam-testing.
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:545:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  or p.options.view_as(GoogleCloudOptions).temp_location)

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 35 tests in 2931.791s

FAILED (SKIP=4, errors=1)

> Task :beam-sdks-python-test-suites-dataflow-py35:postCommitIT FAILED

> Task :beam-sdks-python-test-suites-dataflow-py35:validatesRunnerBatchTests
>>> RUNNING integration tests with pipeline options: 
>>> --runner=TestDataflowRunner --project=apache-beam-testing 
>>> --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it 
>>> --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it 
>>> --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output 
>>> --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build/apache-beam.tar.gz>
>>>  --requirements_file=postcommit_requirements.txt --num_workers=1 
>>> --sleep_secs=20 
>>> --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.14.0-SNAPSHOT.jar>
>>>  
>>> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>  
>>> --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 
>>> --attr=ValidatesRunner
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing requirements to apache_beam.egg-info/requires.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
setup.py:176: UserWarning: Python 3 support for the Apache Beam SDK is not yet 
fully supported. You may encounter buggy behavior or missing features.
  'Python 3 support for the Apache Beam SDK is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/build/gradleenv/-1709362674/lib/python3.5/site-packages/setuptools/dist.py>:472:
 UserWarning: Normalizing '2.14.0.dev' to '2.14.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/__init__.py>:84:
 UserWarning: Running the Apache Beam SDK on Python 3 is not yet fully 
supported. You may encounter buggy behavior or missing features.
  'Running the Apache Beam SDK on Python 3 is not yet fully supported. '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/gcp/datastore/v1/datastoreio.py>:59:
 UserWarning: Datastore IO will support Python 3 after replacing 
googledatastore by google-cloud-datastore, see: BEAM-4543.
  warnings.warn('Datastore IO will support Python 3 after replacing '
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/apache_beam/io/vcfio.py>:47:
 UserWarning: VCF IO will support Python 3 after migration to Nucleus, see: 
BEAM-5628.
  warnings.warn("VCF IO will support Python 3 after migration to Nucleus, "
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_22_59-15327146404106377241?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_31_03-10572939958634838528?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_23_03-2890697535771546384?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_31_37-16590649220020918009?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_22_58-3960003295634522759?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_31_33-5107096194359856324?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_22_58-9486109885776107491?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_31_04-3706667783819179019?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_22_58-11336643678057618439?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_30_53-10603786511722403204?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_22_59-8351550214991927495?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_31_33-13358366209987651196?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_22_59-16226937276878481298?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_31_08-11568547311574187369?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_22_59-11487721115709331331?project=apache-beam-testing.
Found: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-05-13_09_31_08-8737875265284401873?project=apache-beam-testing.
test_read_metrics (apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_par_do_with_multiple_outputs_and_using_yield 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_flatten_multiple_pcollections_having_multiple_consumers 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_as_dict_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_as_list_and_as_dict_side_inputs 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_par_do_with_multiple_outputs_and_using_return 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_undeclared_outputs (apache_beam.transforms.ptransform_test.PTransformTest) 
... ok
test_multiple_empty_outputs 
(apache_beam.transforms.ptransform_test.PTransformTest) ... ok
test_empty_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_default_value_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_without_unique_labels 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_singleton_with_different_defaults 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_multi_valued_singleton_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_as_list_twice (apache_beam.transforms.sideinputs_test.SideInputsTest) ... 
ok
test_flattened_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok
test_iterable_side_input 
(apache_beam.transforms.sideinputs_test.SideInputsTest) ... ok

----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 1006.523s

OK

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py36/build.gradle>' line: 47

* What went wrong:
Execution failed for task 
':beam-sdks-python-test-suites-dataflow-py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python3_Verify/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 48

* What went wrong:
Execution failed for task 
':beam-sdks-python-test-suites-dataflow-py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 7m 8s
71 actionable tasks: 54 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/fmk3dino7q726

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
