See 
<https://builds.apache.org/job/beam_PostCommit_Python36/736/display/redirect>

Changes:


------------------------------------------
[...truncated 87.80 KB...]
root: INFO: ==================== <function lift_combiners at 0x7f0631688c80> 
====================
root: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function expand_sdf at 0x7f0631688d08> 
====================
root: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function expand_gbk at 0x7f0631688d90> 
====================
root: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n
  must follow: 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function sink_flattens at 0x7f0631688ea0> 
====================
root: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n  
create/Read:beam:transform:read:v1\n  must follow: \n  downstream_side_inputs: 
', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n
  must follow: 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
root: INFO: ==================== <function greedily_fuse at 0x7f0631688f28> 
====================
root: DEBUG: 2 [6, 4]
root: DEBUG: Stages: 
['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  
create/Read:beam:transform:read:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  downstream_side_inputs: ']
root: INFO: ==================== <function read_to_impulse at 0x7f063168b048> 
====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: 
['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read/Impulse:beam:transform:impulse:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  downstream_side_inputs: ']
root: INFO: ==================== <function impulse_to_input at 0x7f063168b0d0> 
====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: 
['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  downstream_side_inputs: ']
root: INFO: ==================== <function inject_timer_pcollections at 
0x7f063168b268> ====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: 
['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  downstream_side_inputs: ']
root: INFO: ==================== <function sort_stages at 0x7f063168b2f0> 
====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: 
['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  downstream_side_inputs: ']
root: INFO: ==================== <function window_pcollection_coders at 
0x7f063168b378> ====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: 
['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
  downstream_side_inputs: ']
root: INFO: Creating state cache with size 100
root: INFO: Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 
0x7f063131bfd0> for environment urn: "beam:env:embedded_python:v1"

root: INFO: Running 
(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)
root: DEBUG: start <DataOutputOperation >
root: DEBUG: start <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) 
output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0,
 coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], 
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0,
 coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, 
FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/AddInsertIds 
output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, 
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/AppendDestination 
output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, 
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], 
len(consumers)=1]]>
root: DEBUG: start <ImpulseReadOperation 
receivers=[SingletonConsumerSet[create/Read.out0, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation 
receivers=[SingletonConsumerSet[create/Read/Impulse.out0, 
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <DataInputOperation 
receivers=[SingletonConsumerSet[create/Read/Impulse.out0, 
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <ImpulseReadOperation 
receivers=[SingletonConsumerSet[create/Read.out0, 
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/AppendDestination 
output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, 
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], 
len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/AddInsertIds 
output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, 
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0,
 coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, 
FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) 
output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0,
 coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], 
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DataOutputOperation >
root: DEBUG: Wait for the bundle bundle_16 to finish.
root: INFO: Running 
(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)
root: DEBUG: start <DoOperation 
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) 
output_tags=['out_FailedRows', 'out'], 
receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], 
ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: start <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0,
 coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)
 output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0,
 coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, 
FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0,
 coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], 
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: Creating or getting table <TableReference
 datasetId: 'python_write_to_table_15712920943129'
 projectId: 'apache-beam-testing'
 tableId: 'python_no_schema_table'> with schema None.
root: DEBUG: finish <DataInputOperation 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0,
 coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], 
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)
 output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0,
 coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, 
FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0,
 coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation 
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) 
output_tags=['out_FailedRows', 'out'], 
receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], 
ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: Attempting to flush to all destinations. Total buffered: 4
root: DEBUG: Flushing data to 
apache-beam-testing:python_write_to_table_15712920943129.python_no_schema_table.
 Total 4 rows.
root: DEBUG: Passed: True. Errors are []
root: DEBUG: Wait for the bundle bundle_17 to finish.
root: INFO: Attempting to perform query SELECT bytes, date, time FROM 
python_write_to_table_15712920943129.python_no_schema_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET 
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): 
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
google.auth.transport.requests: DEBUG: Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/844138762903-comp...@developer.gserviceaccount.com/token
 HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): 
www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/113009ac-6c0e-419f-a30a-a9e150a0a389?maxResults=0&location=US
 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anona82530c36f2c07901eff86152d859345c354c05a/data
 HTTP/1.1" 200 None
root: INFO: Result of query is: []
root: INFO: Deleting dataset python_write_to_table_15712920943129 in project 
apache-beam-testing
--------------------- >> end captured logging << ---------------------
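
The captured log above shows the fn_api_runner translating the failing
test's pipeline through its optimization phases (lift_combiners,
expand_sdf, expand_gbk, sink_flattens, greedily_fuse, read_to_impulse,
impulse_to_input, inject_timer_pcollections, sort_stages,
window_pcollection_coders), executing the two fused stages, flushing 4
rows to BigQuery, and then failing because the verification query
returned []. BigQuery streaming inserts are eventually consistent, so a
query issued immediately after the flush can legitimately come back
empty. A minimal sketch of a retrying verification step, under that
assumption (the helper below is illustrative, not Beam's actual matcher
code):

  # Hypothetical sketch: re-query BigQuery a few times before failing,
  # since rows written via streaming inserts may not be visible at once.
  import time
  from google.cloud import bigquery

  def wait_for_rows(project, dataset, table, expected, attempts=5):
      client = bigquery.Client(project=project)
      query = 'SELECT bytes, date, time FROM `%s.%s.%s`' % (
          project, dataset, table)
      rows = []
      for _ in range(attempts):
          rows = list(client.query(query).result())
          if len(rows) >= expected:
              return rows
          time.sleep(10)  # an empty result may be transient
      raise AssertionError('expected %d rows, got %d' % (expected, len(rows)))
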
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
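
The BeamDeprecationWarning repeated above flags reads of
<pipeline>.options: since the first stable release, code is expected to
keep a reference to the PipelineOptions the pipeline was constructed
with rather than reaching back through the pipeline object. A minimal
sketch of the preferred pattern (the flag value is illustrative):

  from apache_beam.options.pipeline_options import (
      GoogleCloudOptions, PipelineOptions)

  options = PipelineOptions(['--temp_location=gs://my-bucket/tmp'])
  # Deprecated: p.options.view_as(GoogleCloudOptions).temp_location
  # Preferred: read from the options object the pipeline was built with.
  temp_location = options.view_as(GoogleCloudOptions).temp_location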

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py36.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 30.993s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_01_39-1450264437031784985?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_24_20-18067325119135713950?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_42_51-17525182335701691527?project=apache-beam-testing
  kms_key=transform.kms_key))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_01_44-18283141388583200512?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_17_35-4965968311588270802?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_27_59-12924676170326452173?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_36_41-11536976747571312786?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_46_12-425040853720671802?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_01_43-12692452238602112249?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_14_57-11286410876644772802?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_24_14-17234934635610314494?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_34_27-13194836956843593008?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_44_37-4500442309971421547?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_01_39-13021754289981487896?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_23_14-4425151508690240161?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_34_03-5573852341139977874?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_01_39-10794668520420670962?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_12_14-2239181136358636233?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_22_59-13999267651615610810?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_33_04-821125531994561474?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_42_10-2207506484721832876?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:793:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232:
 FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243:
 FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_01_38-8875151182959758276?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:695:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_11_38-10835368774758146113?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73:
 BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use 
WriteToBigQuery instead.
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_21_46-10472898723378695790?project=apache-beam-testing
  kms_key=kms_key))
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_33_54-10164068722280483318?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_43_32-15957553672037010127?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_01_44-4951913423835087573?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_11_24-7434145400976567836?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:635:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_21_42-16122781625768072590?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_32_10-3169899393250975825?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_41_16-18428468990067188822?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_01_40-7798956484091819036?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_11_32-11238157407339485651?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_20_19-16607729326607797290?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_31_31-14678751212283581273?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_41_03-11040786191889997322?project=apache-beam-testing
Worker logs: 
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-16_23_50_50-6587279427837322777?project=apache-beam-testing
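
The BigQuerySink deprecation warnings interleaved with the worker-log
links above all point at the same migration: BigQuerySink has been
superseded by WriteToBigQuery since 2.11.0. A minimal sketch of the
replacement transform (table spec and schema are illustrative):

  import apache_beam as beam

  with beam.Pipeline() as p:
      _ = (p
           | beam.Create([{'name': 'a', 'value': 1}])
           | beam.io.WriteToBigQuery(
               'my-project:my_dataset.my_table',
               schema='name:STRING,value:INTEGER',
               create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
               write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
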
test_datastore_wordcount_it 
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
 ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due 
to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 
is addressed. 
test_streaming_wordcount_it 
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok
test_bigquery_tornadoes_it 
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) 
... ok
test_autocomplete_it 
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it 
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it 
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it 
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_hourly_team_score_it 
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
 ... ok
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This 
test still needs to be fixed on Python 3. TODO: BEAM-4543
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... 
ok
test_copy_batch 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) 
... ok
test_copy_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bqfl_streaming 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: 
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_value_provider_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_big_query_read 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types 
(apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: 
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... 
ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
Runs a streaming Dataflow job and verifies that user metrics are reported ... ok
test_streaming_data_only 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_job_python_from_python_it 
(apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql_kms_key_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... 
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_metrics_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) 
... ok

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3523.728s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Build file 
'<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/py36/build.gradle>'
 line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org
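
To reproduce the failing task locally with the extra diagnostics Gradle
suggests, an invocation along the lines of './gradlew
:sdks:python:test-suites:direct:py36:postCommitIT --stacktrace --info'
(from the Beam repo root) should work.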

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 59m 45s
64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/dtr4koppyizvq

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: builds-unsubscr...@beam.apache.org
For additional commands, e-mail: builds-h...@beam.apache.org
