See <https://builds.apache.org/job/beam_PostCommit_Python36/809/display/redirect>
Changes:
------------------------------------------
[...truncated 91.08 KB...]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function expand_gbk at 0x7f201ec3a510> ====================
root: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n must follow: \n downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n must follow: write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function sink_flattens at 0x7f201ec3a620> ====================
root: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
root: DEBUG: Stages: ['ref_AppliedPTransform_create/Read_3\n create/Read:beam:transform:read:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n must follow: \n downstream_side_inputs: ', 'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n must follow: write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ', 'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function greedily_fuse at 0x7f201ec3a6a8> ====================
root: DEBUG: 2 [4, 6]
root: DEBUG: Stages: ['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n downstream_side_inputs: ', '(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n create/Read:beam:transform:read:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function read_to_impulse at 0x7f201ec3a730> ====================
root: DEBUG: 2 [4, 7]
root: DEBUG: Stages: ['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n downstream_side_inputs: ', '(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read/Impulse:beam:transform:impulse:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function impulse_to_input at 0x7f201ec3a7b8> ====================
root: DEBUG: 2 [4, 7]
root: DEBUG: Stages: ['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n downstream_side_inputs: ', '(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function inject_timer_pcollections at 0x7f201ec3a950> ====================
root: DEBUG: 2 [4, 7]
root: DEBUG: Stages: ['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n downstream_side_inputs: ', '(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ']
root: INFO: ==================== <function sort_stages at 0x7f201ec3a9d8> ====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: ['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ', '(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n downstream_side_inputs: ']
root: INFO: ==================== <function window_pcollection_coders at 0x7f201ec3aa60> ====================
root: DEBUG: 2 [7, 4]
root: DEBUG: Stages: ['(((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n must follow: \n downstream_side_inputs: ', '(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n must follow: (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n downstream_side_inputs: ']
root: INFO: Creating state cache with size 100
root: INFO: Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f1feb0a0780> for environment urn: "beam:env:embedded_python:v1"
root: INFO: Running (((((ref_AppliedPTransform_create/Read_3)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)
root: DEBUG: start <DataOutputOperation >
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: start <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
root: DEBUG: finish <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DataOutputOperation >
root: DEBUG: Wait for the bundle bundle_16 to finish.
root: INFO: Running (((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)
root: DEBUG: start <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out', 'out_FailedRows'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: Creating or getting table <TableReference datasetId: 'python_write_to_table_15719856583242' projectId: 'apache-beam-testing' tableId: 'python_no_schema_table'> with schema None.
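The fused stages above are the FnApiRunner's plan for a Create-then-WriteToBigQuery pipeline using streaming inserts against the python_no_schema_table shown in the log. A minimal sketch of that shape of pipeline, assuming the table and project names from the log; this is a reconstruction, not the actual test code, and the row values and dispositions are illustrative:

import apache_beam as beam
from apache_beam.io.gcp.bigquery import BigQueryDisposition, WriteToBigQuery

with beam.Pipeline() as p:
    _ = (
        p
        # Illustrative rows; the real test writes 4 rows with bytes, date
        # and time fields.
        | 'create' >> beam.Create([{'bytes': 'xyw=', 'date': '2011-01-01',
                                    'time': '23:59:59'}])
        | 'write' >> WriteToBigQuery(
            'apache-beam-testing:python_write_to_table_15719856583242'
            '.python_no_schema_table',
            # No schema argument, matching "with schema None" in the log;
            # the table is assumed to exist already (an assumption here).
            method=WriteToBigQuery.Method.STREAMING_INSERTS,
            create_disposition=BigQueryDisposition.CREATE_NEVER,
            write_disposition=BigQueryDisposition.WRITE_APPEND))

Streaming inserts route the write through the _StreamToBigQuery stages seen above: AppendDestination and AddInsertIds tag each row, CommitInsertIds reshuffles to commit the insert ids, and StreamInsertRows/ParDo(BigQueryWriteFn) performs the tabledata.insertAll calls.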
root: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
root: DEBUG: finish <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out', 'out_FailedRows'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
root: DEBUG: Attempting to flush to all destinations. Total buffered: 4
root: DEBUG: Flushing data to apache-beam-testing:python_write_to_table_15719856583242.python_no_schema_table. Total 4 rows.
root: DEBUG: Passed: True. Errors are []
root: DEBUG: Wait for the bundle bundle_17 to finish.
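Up to this point the write path reports success: 4 rows buffered and flushed with no errors. The read-back check that follows is where the run fails; the SELECT returns no rows, a symptom consistent with freshly streamed rows not yet being visible to queries when the verification runs. A rough equivalent of that check, sketched with the google-cloud-bigquery client rather than the Beam test utilities the suite actually uses:

from google.cloud import bigquery

# Hedged sketch of the verification implied by the log below; the actual
# test drives this through Beam's own test matchers, not this client code.
client = bigquery.Client(project='apache-beam-testing')
query = ('SELECT bytes, date, time FROM '
         'python_write_to_table_15719856583242.python_no_schema_table')
rows = list(client.query(query).result())
# The log reports "Result of query is: []" even though 4 rows were flushed,
# which is what turns this run into a failure.
assert rows, 'expected the freshly written rows to be visible'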
root: INFO: Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_15719856583242.python_no_schema_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/e8b1475c-28f1-4132-a966-42a72f7038bf?maxResults=0&timeoutMs=10000&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/jobs/e8b1475c-28f1-4132-a966-42a72f7038bf?location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anonad36afb64511a4df66babaa7ec4477df616e7112/data HTTP/1.1" 200 None
root: INFO: Result of query is: []
root: INFO: Deleting dataset python_write_to_table_15719856583242 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 29.611s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:direct:py36:postCommitIT FAILED
> Task :sdks:python:test-suites:dataflow:py36:postCommitIT
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_41_02-10416546378007295427?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_55_16-12192459484686896442?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_03_39-10684148755344112397?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_11_16-1247208715958382073?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_19_24-12416489735244197184?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_40_57-5165486611726814996?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:717: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_00_18-18056847589558474202?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_09_06-10938778785362521085?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_16_44-3920586231872950694?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_41_04-2405698931681595757?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_53_49-9244400007491514191?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_02_56-2091628028528275086?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:717: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_11_15-12706053046607359959?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_18_55-16179026163848590891?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_40_56-981415623452939328?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:717: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_51_05-7216196938950825280?project=apache-beam-testing
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_59_14-2901653071030612473?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py>:73: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_07_57-276193940019276430?project=apache-beam-testing
  kms_key=kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_15_57-15980901056627510195?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1208: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_40_57-18058285576504812776?project=apache-beam-testing
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_02_41-1889219209848678586?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_40_56-5383406646154408096?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/runners/dataflow/dataflow_runner.py>:717: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=transform.kms_key))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_49_26-486762429256234133?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_59_09-15221497974585403563?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_test.py>:648: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_10_03-4575503729324483693?project=apache-beam-testing
  streaming = self.test_pipeline.options.view_as(StandardOptions).streaming
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_19_25-2821891579068115816?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  experiments = p.options.view_as(DebugOptions).experiments or []
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:232: FutureWarning: MatchAll is experimental.
  | 'GetPath' >> beam.Map(lambda metadata: metadata.path))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: MatchAll is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/fileio_test.py>:243: FutureWarning: ReadMatches is experimental.
  | 'Checksums' >> beam.Map(compute_hash))
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_41_05-3408894086446850163?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1211: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_51_08-375997547196772563?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_01_27-15803588341674239854?project=apache-beam-testing
  experiments = p.options.view_as(DebugOptions).experiments or []
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_10_18-16745347134871608628?project=apache-beam-testing
<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:795: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_19_00-1465741907323317631?project=apache-beam-testing
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_40_57-9701006106033484722?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_50_21-7818541692461479230?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-24_23_58_38-11557888905706291685?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_07_36-6160943644652781049?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_16_24-10893204215413913825?project=apache-beam-testing
Worker logs: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2019-10-25_00_23_57-12269712713677829219?project=apache-beam-testing
test_datastore_wordcount_it (apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... SKIP: Due to a known issue in the avro-python3 package, this test is skipped until BEAM-6522 is addressed.
test_bigquery_tornadoes_it (apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_autocomplete_it (apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_it (apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_leader_board_it (apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it (apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_hourly_team_score_it (apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT) ... ok
test_user_score_it (apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... SKIP: This test still needs to be fixed on Python 3. TODO: BEAM-4543
test_value_provider_transform (apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_rewrite_token (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_bqfl_streaming (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail (apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_metrics_fnapi_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_metrics_it (apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest) ... ok
test_job_python_from_python_it (apache_beam.transforms.external_test_it.ExternalTransformIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py36.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 45 tests in 3102.937s

OK (SKIP=6)

FAILURE: Build failed with an exception.

* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/direct/py36/build.gradle'> line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace.
Run with --info or --debug option to get more log output.
Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 52m 40s

64 actionable tasks: 47 executed, 17 from cache

Publishing build scan...
https://gradle.com/s/rr5tpla7lwds2

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
