See 
<https://builds.apache.org/job/beam_PostCommit_Python2/1118/display/redirect?page=changes>

Changes:

[sunjincheng121] [BEAM-8733] Handle the registration request synchronously in the Python


------------------------------------------
[...truncated 2.08 MB...]
          "output_name": "out", 
          "step_name": "s18"
        }, 
        "serialized_fn": "<string of 1340 bytes>", 
        "user_name": "assert_that/Match"
      }
    }
  ], 
  "type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
 createTime: u'2019-12-02T21:13:11.454327Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2019-12-02_13_13_09-6568729113300381307'
 location: u'us-central1'
 name: u'beamapp-jenkins-1202211257-778953'
 projectId: u'apache-beam-testing'
 stageStates: []
 startTime: u'2019-12-02T21:13:11.454327Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id: 
[2019-12-02_13_13_09-6568729113300381307]
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_13_09-6568729113300381307?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2019-12-02_13_13_09-6568729113300381307 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:09.423Z: 
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 
2019-12-02_13_13_09-6568729113300381307.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:09.423Z: 
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2019-12-02_13_13_09-6568729113300381307. The number of workers will be between 
1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:13.284Z: 
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service 
Account.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:14.302Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.003Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.021Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step 
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.091Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.124Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.219Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.338Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.365Z: 
JOB_MESSAGE_DETAILED: Fusing consumer row to string into read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.398Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/KeyWithVoid into row to string
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.433Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
 into count/CombineGlobally(CountCombineFn)/KeyWithVoid
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.461Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.492Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.512Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.534Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.553Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/UnKey into 
count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.588Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s15 for input s13.out
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.614Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, 
into producer assert_that/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.643Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/GroupByKey/GroupByWindow into 
assert_that/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.669Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
assert_that/Group/Map(_merge_tagged_vals_under_key) into 
assert_that/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.695Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into 
assert_that/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.720Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.743Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s15-u31 for input s16-reify-value9-c29
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.773Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of 
assert_that/Group/GroupByKey/Write, through flatten 
assert_that/Group/Flatten/Unzipped-1, into producer 
assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.791Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into 
assert_that/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.824Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into 
assert_that/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.852Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into 
assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.882Z: 
JOB_MESSAGE_DETAILED: Fusing consumer 
count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into 
count/CombineGlobally(CountCombineFn)/DoOnce/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.908Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into 
count/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.937Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into 
assert_that/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:15.974Z: 
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into 
assert_that/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.012Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.037Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.070Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.102Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.422Z: 
JOB_MESSAGE_DEBUG: Executing wait step start41
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.503Z: 
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.540Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.553Z: 
JOB_MESSAGE_BASIC: Executing operation 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.574Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.615Z: 
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.628Z: 
JOB_MESSAGE_BASIC: Finished operation 
count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.677Z: 
JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.713Z: 
JOB_MESSAGE_DEBUG: Value 
"count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session" 
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.758Z: 
JOB_MESSAGE_BASIC: Executing operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:16.800Z: 
JOB_MESSAGE_BASIC: Executing operation read+row to 
string+count/CombineGlobally(CountCombineFn)/KeyWithVoid+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:17.400Z: 
JOB_MESSAGE_BASIC: BigQuery export job "dataflow_job_15449188239845921278" 
started. You can check its status with the bq tool: "bq show -j 
--project_id=apache-beam-testing dataflow_job_15449188239845921278".
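
For reference, the export job named above can also be polled from Python rather than the bq CLI; a minimal sketch, assuming the google-cloud-bigquery client library and credentials that can read jobs in apache-beam-testing:

    # Minimal sketch: inspect the BigQuery export job named in the log above.
    # Assumes google-cloud-bigquery is installed and default credentials have
    # access to the apache-beam-testing project.
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    job = client.get_job('dataflow_job_15449188239845921278')
    print(job.job_type, job.state)   # e.g. 'extract', 'DONE'
    if job.error_result:             # populated only if the job failed
        print(job.error_result)
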
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:47.744Z: 
JOB_MESSAGE_DETAILED: BigQuery export job progress: 
"dataflow_job_15449188239845921278" observed total of 1 exported files thus far.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:47.777Z: 
JOB_MESSAGE_BASIC: BigQuery export job finished: 
"dataflow_job_15449188239845921278"
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:13:51.055Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
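
The warning above reflects the Stackdriver limit of 100 custom metric descriptors per project; the cleanup it links to can be scripted against the Cloud Monitoring v3 API. A minimal sketch, assuming the google-cloud-monitoring library; the filter string is an assumption about how the stale Dataflow metrics are named, so check what it matches before deleting anything:

    # Minimal sketch: list and delete stale custom metric descriptors via the
    # Cloud Monitoring v3 API. The filter is an assumption; adjust it to the
    # metric types you actually want to remove.
    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    request = {
        'name': 'projects/apache-beam-testing',
        'filter': 'metric.type = starts_with("custom.googleapis.com/dataflow")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print('deleting', descriptor.type)
        client.delete_metric_descriptor(request={'name': descriptor.name})
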
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:05.823Z: 
JOB_MESSAGE_ERROR: Startup of the worker pool in zone us-central1-f failed to 
bring up any of the desired 1 workers. QUOTA_EXCEEDED: Quota 'CPUS' exceeded.  
Limit: 1250.0 in region us-central1.
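
The failure above is a regional resource limit (the project was already at its 1250-CPU cap in us-central1), not a defect in the pipeline under test. Current usage against that quota can be read through the Compute Engine API; a minimal sketch, assuming google-api-python-client and default credentials:

    # Minimal sketch: read the regional CPU quota the QUOTA_EXCEEDED error
    # refers to. Assumes google-api-python-client and default credentials.
    from googleapiclient import discovery

    compute = discovery.build('compute', 'v1')
    region = compute.regions().get(project='apache-beam-testing',
                                   region='us-central1').execute()
    for quota in region['quotas']:
        if quota['metric'] == 'CPUS':
            print(quota['usage'], '/', quota['limit'])  # e.g. 1250.0 / 1250.0
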
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:05.861Z: 
JOB_MESSAGE_ERROR: Workflow failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:05.914Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:05.914Z: 
JOB_MESSAGE_BASIC: Finished operation read+row to 
string+count/CombineGlobally(CountCombineFn)/KeyWithVoid+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+count/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+count/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:06.030Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:06.146Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:06.162Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:25.235Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2019-12-02T21:15:25.259Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job 
2019-12-02_13_13_09-6568729113300381307 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------

======================================================================
ERROR: test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py>", line 812, in run
    test(orig)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py>", line 45, in __call__
    return self.run(*arg, **kwarg)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py>", line 133, in run
    self.runTest(result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/case.py>", line 151, in runTest
    test(result)
  File "/usr/lib/python2.7/unittest/case.py", line 393, in __call__
    return self.run(*args, **kwds)
  File "/usr/lib/python2.7/unittest/case.py", line 329, in run
    testMethod()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>", line 740, in test_multiple_destinations_transform
    equal_to([(full_output_table_1, bad_record)]))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/pipeline.py>", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/test_dataflow_runner.py>", line 73, in run_pipeline
    self.result.cancel()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1464, in cancel
    return self.state
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1404, in state
    self._update_job()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>", line 1360, in _update_job
    self._job = self._runner.dataflow_client.get_job(self.job_id())
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/utils/retry.py>", line 209, in wrapper
    return fun(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/apiclient.py>", line 673, in get_job
    response = self._client.projects_locations_jobs.Get(request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/runners/dataflow/internal/clients/dataflow/dataflow_v1b3_client.py>", line 661, in Get
    config, request, global_params=global_params)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 731, in _RunMethod
    return self.ProcessHttpResponse(method_config, http_response, request)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 737, in ProcessHttpResponse
    self.__ProcessHttpResponse(method_config, http_response, request))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 620, in __ProcessHttpResponse
    return self.__client.DeserializeMessage(response_type, content)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/base_api.py>", line 446, in DeserializeMessage
    message = encoding.JsonToMessage(response_type, data)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py>", line 123, in JsonToMessage
    return _ProtoJsonApiTools.Get().decode_message(message_type, message)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py>", line 309, in decode_message
    message_type, result)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py>", line 214, in decode_message
    message = self.__decode_dictionary(message_type, dictionary)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/protorpclite/protojson.py>", line 287, in __decode_dictionary
    for item in value]
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py>", line 331, in decode_field
    field.message_type, json.dumps(value))
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py>", line 310, in decode_message
    result = _ProcessUnknownEnums(result, encoded_message)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/apitools/base/py/encoding_helper.py>", line 531, in _ProcessUnknownEnums
    decoded_message = json.loads(six.ensure_str(encoded_message))
  File "/usr/lib/python2.7/json/__init__.py", line 339, in loads
    return _default_decoder.decode(s)
  File "/usr/lib/python2.7/json/decoder.py", line 364, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib/python2.7/json/decoder.py", line 380, in raw_decode
    obj, end = self.scan_once(s, idx)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/nose/plugins/multiprocess.py>", line 276, in signalhandler
    raise TimedOutException()
TimedOutException: 'test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)'

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 5507.761s

FAILED (SKIP=5, errors=2)
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_19-4730468962750505463?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_12_23-119448814224082348?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_14_49-11999213596418577564?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_32_18-9248323002065203603?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_17-4584725342455771544?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_17_33-1305790076117006026?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_24_53-11059203174390592472?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_32_08-1150938119988014425?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_18-2095185521590466840?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_09_44-6449607306231753708?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_16_51-9674344046618286839?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_23_44-1418816685479814778?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_31_41-4175264225905358873?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_16-2094587883804841407?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_05_02-15459944219501791050?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_13_09-6568729113300381307?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_15_48-12675993783668797328?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_23_17-2215698956183388847?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_31_42-1321765033105109363?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_38_56-15047909077508743349?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_46_30-16011753843657935912?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_16-1344611298550093374?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_04_43-14662117601106516085?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_14_27-10521102793740008566?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_22_13-14926056476407583134?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_29_19-3993886651928485724?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_36_04-4422004491971162823?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_17-11020297155458256049?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_05_49-13456258886180144807?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_13_40-10808604804372799222?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_21_12-17981342819038252730?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_27_58-5772905708610951996?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_12_57_16-14423576763023242116?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_05_56-17199531945120722342?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_15_47-3545761448411806136?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_23_09-14250741536215217952?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-12-02_13_30_58-3820199892334211447?project=apache-beam-testing

> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle>' line: 70

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>' line: 85

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 32m 46s
120 actionable tasks: 94 executed, 23 from cache, 3 up-to-date

Publishing build scan...
https://scans.gradle.com/s/qe2fr54tpgxrm

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
