See
<https://ci-beam.apache.org/job/beam_PostCommit_Python36/2584/display/redirect>
Changes:
------------------------------------------
[...truncated 6.09 MB...]
DEBUG:apache_beam.io.gcp.bigquery_tools:Query SELECT * FROM (SELECT "apple" as
fruit) UNION ALL (SELECT "orange" as fruit) does not reference any tables.
WARNING:apache_beam.io.gcp.bigquery_tools:Dataset
apache-beam-testing:temp_dataset_b207c841eae94b73939461adda750bf4 does not
exist so we will create it as temporary with location=None
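(The warning above is Beam's bigquery_tools wrapper creating a scratch dataset to hold query results. A minimal sketch of the equivalent call using the google-cloud-bigquery client — an illustration only; Beam itself drives the REST API through its own wrapper:

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    ds = bigquery.Dataset(
        'apache-beam-testing.'
        'temp_dataset_b207c841eae94b73939461adda750bf4')
    # location=None in the log means no explicit location was requested,
    # so BigQuery falls back to its default.
    client.create_dataset(ds, exists_ok=True)
)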
INFO:root:Job status: RUNNING
INFO:root:Job status: DONE
INFO:root:Job status: RUNNING
INFO:root:Job status: DONE
DEBUG:apache_beam.io.filesystem:Listing files in
'gs://temp-storage-for-end-to-end-tests/temp-it/1972a92f2e1f418cb26b36cd99ceac8c/bigquery-table-dump-'
DEBUG:apache_beam.io.filesystem:translate_pattern:
'gs://temp-storage-for-end-to-end-tests/temp-it/1972a92f2e1f418cb26b36cd99ceac8c/bigquery-table-dump-*.json'
->
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/1972a92f2e1f418cb26b36cd99ceac8c\\/bigquery\\-table\\-dump\\-[^/\\\\]*\\.json'
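(The translate_pattern pair above is a glob-to-regex conversion: literal characters are backslash-escaped and the '*' wildcard becomes a character class that cannot cross a path separator. A rough sketch of the idea — Beam's real implementation lives in apache_beam.io.filesystem:

    import re

    def glob_to_regex(pattern):
        # Escape all literals, then map the escaped glob star to a class
        # that stops at '/' or '\', matching the translated pattern in
        # the log. Note: on Python 3.6, re.escape backslash-escapes every
        # non-alphanumeric character, which is why ':' and '-' appear
        # escaped above; newer Pythons escape fewer characters.
        return re.escape(pattern).replace(r'\*', r'[^/\\]*')

    glob_to_regex('gs://bucket/bigquery-table-dump-*.json')
)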
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 1 files in 0.034546613693237305
seconds.
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
read/Read/_SDFBoundedSourceWrapper/Impulse
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/Impulse.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction
output_tags=['out'],
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction.out0,
coder=WindowedValueCoder[TupleCoder[BytesCoder, TupleCoder[DillCoder,
FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction
output_tags=['out'],
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction.out0,
coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder,
TupleCoder[LengthPrefixCoder[DillCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
ref_PCollection_PCollection_1_split/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_31 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(((((((ref_AppliedPTransform_write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse_24)+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda
at
core.py:2623>)_25))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)_27))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/Map(<lambda
at
bigquery_file_loads.py:872>)_28))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/GenerateFilePrefix_29))+(ref_PCollection_PCollection_14/Write))+(ref_PCollection_PCollection_15/Write))+(ref_PCollection_PCollection_16/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
ref_PCollection_PCollection_14/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
ref_PCollection_PCollection_15/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
ref_PCollection_PCollection_16/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/GenerateFilePrefix output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GenerateFilePrefix.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:872>)
output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Map(<lambda at
bigquery_file_loads.py:872>).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)
output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=3]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at
core.py:2623>) output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda
at core.py:2623>).out0, coder=WindowedValueCoder[BytesCoder],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Impulse.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda at
core.py:2623>) output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/FlatMap(<lambda
at core.py:2623>).out0, coder=WindowedValueCoder[BytesCoder],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode)
output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ImpulseSingleElementPC/Map(decode).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=3]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/Map(<lambda at bigquery_file_loads.py:872>)
output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Map(<lambda at
bigquery_file_loads.py:872>).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/GenerateFilePrefix output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GenerateFilePrefix.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
ref_PCollection_PCollection_16/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
ref_PCollection_PCollection_15/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
ref_PCollection_PCollection_14/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_32 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(((((((((ref_PCollection_PCollection_1_split/Read)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/RewindowIntoGlobal_30))+(ref_PCollection_PCollection_4/Write))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/AppendDestination_31))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)_33))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/ParDo(_ShardDestinations)_34))+(write/BigQueryBatchFileLoads/GroupShardedRows/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0 >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
ref_PCollection_PCollection_4/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
write/BigQueryBatchFileLoads/GroupShardedRows/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(_ShardDestinations).out0,
coder=WindowedValueCoder[TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
output_tags=['WrittenFiles', 'None', 'UnwrittenRecords'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1],
ConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out1,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0],
SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out2,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/AppendDestination output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/AppendDestination.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder,
FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/RewindowIntoGlobal output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RewindowIntoGlobal.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
output_tags=['cleanup_signal', 'None'],
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1],
SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out1,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start
<SdfProcessSizedElements
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process
output_tags=['None'],
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
ref_PCollection_PCollection_1_split/Read
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_1_split/Read.out0,
coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder,
TupleCoder[LengthPrefixCoder[DillCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
DEBUG:apache_beam.io.filesystem:translate_pattern:
'gs://temp-storage-for-end-to-end-tests/temp-it/1972a92f2e1f418cb26b36cd99ceac8c/bigquery-table-dump-000000000000.json'
->
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/1972a92f2e1f418cb26b36cd99ceac8c\\/bigquery\\-table\\-dump\\-000000000000\\.json'
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
ref_PCollection_PCollection_1_split/Read
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_1_split/Read.out0,
coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder,
TupleCoder[LengthPrefixCoder[DillCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish
<SdfProcessSizedElements
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process
output_tags=['None'],
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)
output_tags=['cleanup_signal', 'None'],
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1],
SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out1,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/RewindowIntoGlobal output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RewindowIntoGlobal.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/AppendDestination output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/AppendDestination.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder,
FastPrimitivesCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
output_tags=['WrittenFiles', 'None', 'UnwrittenRecords'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1],
ConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out1,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0],
SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile).out2,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/ParDo(_ShardDestinations) output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(_ShardDestinations).out0,
coder=WindowedValueCoder[TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
write/BigQueryBatchFileLoads/GroupShardedRows/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
ref_PCollection_PCollection_4/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/0 >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_33 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(((write/BigQueryBatchFileLoads/GroupShardedRows/Read)+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/DropShardNumber_36))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile_37))+(write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1 >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/DropShardNumber output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/DropShardNumber.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder,
IterableCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
write/BigQueryBatchFileLoads/GroupShardedRows/Read
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GroupShardedRows/Read.out0,
coder=WindowedValueCoder[TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
write/BigQueryBatchFileLoads/GroupShardedRows/Read
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GroupShardedRows/Read.out0,
coder=WindowedValueCoder[TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/DropShardNumber output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/DropShardNumber.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder,
IterableCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/WriteGroupedRecordsToFile.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
write/BigQueryBatchFileLoads/DestinationFilesUnion/Write/1 >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_34 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((write/BigQueryBatchFileLoads/DestinationFilesUnion/Read)+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/IdentityWorkaround_39))+(write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/IdentityWorkaround output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/IdentityWorkaround.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
write/BigQueryBatchFileLoads/DestinationFilesUnion/Read
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/DestinationFilesUnion/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
write/BigQueryBatchFileLoads/DestinationFilesUnion/Read
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/DestinationFilesUnion/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/IdentityWorkaround output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/IdentityWorkaround.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_35 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((((((((((write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read)+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)_42))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)_44))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables_53))+(ref_PCollection_PCollection_33/Write))+(ref_PCollection_PCollection_32/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/1))+(ref_PCollection_PCollection_42/Write))+(write/BigQueryBatchFileLoads/Flatten/Transcode/0))+(write/BigQueryBatchFileLoads/Flatten/Write/0))+(write/BigQueryBatchFileLoads/Flatten/Write/1)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
ref_PCollection_PCollection_32/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
ref_PCollection_PCollection_42/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
write/BigQueryBatchFileLoads/Flatten/Write/1 >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
write/BigQueryBatchFileLoads/Flatten/Write/0 >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
ref_PCollection_PCollection_33/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <FlattenOperation
write/BigQueryBatchFileLoads/Flatten/Transcode/1
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/1.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <FlattenOperation
write/BigQueryBatchFileLoads/Flatten/Transcode/0
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/0.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables
output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=2]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)
output_tags=['None', 'TemporaryTables'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=2],
SingletonConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out1,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
output_tags=['None', 'MULTIPLE_PARTITIONS', 'SINGLE_PARTITION'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0],
SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out1,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1],
SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out2,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.io.gcp.bigquery_file_loads:Load job has 1 files. Job name is
beam_load_2020_06_20_180634_94_d52191028bc6fe7723f320b5f012b310_6e43f1acbb974490a7fee2b865c63e5b.
INFO:apache_beam.io.gcp.bigquery_file_loads:Triggering job
beam_load_2020_06_20_180634_94_d52191028bc6fe7723f320b5f012b310_6e43f1acbb974490a7fee2b865c63e5b
to load data to BigQuery table <TableReference
datasetId: 'python_query_to_table_1592676380366'
projectId: 'apache-beam-testing'
tableId: 'output_table'>. Schema: {'fields': [{'name': 'fruit', 'type':
'STRING', 'mode': 'NULLABLE'}]}. Additional parameters: {}
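(The trigger above submits a BigQuery load job with the single-field schema shown, targeting output_table. For orientation, a hedged equivalent using the google-cloud-bigquery client — the staged GCS URI below is hypothetical, and Beam itself goes through its bigquery_tools wrapper rather than this client:

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    job_config = bigquery.LoadJobConfig(
        schema=[bigquery.SchemaField('fruit', 'STRING', mode='NULLABLE')],
        source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    )
    load_job = client.load_table_from_uri(
        'gs://temp-bucket/staged-file.json',  # hypothetical staged file
        'apache-beam-testing'
        '.python_query_to_table_1592676380366.output_table',
        job_config=job_config,
    )
    load_job.result()  # poll until DONE, like the Job status lines above
)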
DEBUG:apache_beam.runners.worker.operations:Processing
[(('apache-beam-testing:python_query_to_table_1592676380366.output_table',
<JobReference
jobId:
'beam_load_2020_06_20_180634_94_d52191028bc6fe7723f320b5f012b310_6e43f1acbb974490a7fee2b865c63e5b'
location: 'US'
projectId: 'apache-beam-testing'>), 9223371950454.773, (GlobalWindow,),
PaneInfo(first: True, last: True, timing: ON_TIME, index: 0,
nonspeculative_index: 0))] in <FlattenOperation
write/BigQueryBatchFileLoads/Flatten/Transcode/0
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/0.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
output_tags=['None', 'MULTIPLE_PARTITIONS', 'SINGLE_PARTITION'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0],
SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out1,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1],
SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles).out2,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)
output_tags=['None', 'TemporaryTables'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=2],
SingletonConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out1,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables
output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=2]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <FlattenOperation
write/BigQueryBatchFileLoads/Flatten/Transcode/0
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/0.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <FlattenOperation
write/BigQueryBatchFileLoads/Flatten/Transcode/1
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/Flatten/Transcode/1.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
ref_PCollection_PCollection_33/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
write/BigQueryBatchFileLoads/Flatten/Write/0 >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
write/BigQueryBatchFileLoads/Flatten/Write/1 >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
ref_PCollection_PCollection_42/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
ref_PCollection_PCollection_32/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_36 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(((ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_11)+(ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda
at
core.py:2623>)_12))+(ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14))+(ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles) output_tags=['None'],
receivers=[ConsumerSet[read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
read/_PassThroughThenCleanup/Create/Map(decode) output_tags=['None'],
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Map(decode).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2623>)
output_tags=['None'],
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/FlatMap(<lambda
at core.py:2623>).out0, coder=WindowedValueCoder[BytesCoder],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
read/_PassThroughThenCleanup/Create/Impulse
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Impulse.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.io.filesystem:Listing files in
'gs://temp-storage-for-end-to-end-tests/temp-it/1972a92f2e1f418cb26b36cd99ceac8c/bigquery-table-dump-'
DEBUG:apache_beam.io.filesystem:translate_pattern:
'gs://temp-storage-for-end-to-end-tests/temp-it/1972a92f2e1f418cb26b36cd99ceac8c/bigquery-table-dump-*.json'
->
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/1972a92f2e1f418cb26b36cd99ceac8c\\/bigquery\\-table\\-dump\\-[^/\\\\]*\\.json'
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 1 files in 0.030999422073364258
seconds.
DEBUG:root:RemoveJsonFiles: matched 1 files
DEBUG:apache_beam.io.filesystem:translate_pattern:
'gs://temp-storage-for-end-to-end-tests/temp-it/1972a92f2e1f418cb26b36cd99ceac8c/bigquery-table-dump-000000000000.json'
->
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/1972a92f2e1f418cb26b36cd99ceac8c\\/bigquery\\-table\\-dump\\-000000000000\\.json'
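(RemoveJsonFiles is the cleanup half of _PassThroughThenCleanup: once the read has finished, the exported table-dump files are matched by glob and deleted. A short sketch of that pattern with Beam's FileSystems API — an illustration, not the DoFn's exact body:

    from apache_beam.io.filesystems import FileSystems

    pattern = ('gs://temp-storage-for-end-to-end-tests/temp-it/'
               '1972a92f2e1f418cb26b36cd99ceac8c/bigquery-table-dump-*.json')
    # match() returns one MatchResult per pattern; collect the matched
    # paths and delete them in a single batch call.
    match = FileSystems.match([pattern])[0]
    FileSystems.delete([m.path for m in match.metadata_list])
)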
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
read/_PassThroughThenCleanup/Create/Impulse
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Impulse.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2623>)
output_tags=['None'],
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/FlatMap(<lambda
at core.py:2623>).out0, coder=WindowedValueCoder[BytesCoder],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
read/_PassThroughThenCleanup/Create/Map(decode) output_tags=['None'],
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Map(decode).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles) output_tags=['None'],
receivers=[ConsumerSet[read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_37 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(ref_PCollection_PCollection_14/Read)+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs_54)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
ref_PCollection_PCollection_14/Read
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_14/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
INFO:root:Job status: RUNNING
INFO:root:Job status: DONE
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
ref_PCollection_PCollection_14/Read
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_14/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/WaitForDestinationLoadJobs.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_38 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
(((ref_PCollection_PCollection_14/Read)+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs_45))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)_46))+(ref_PCollection_PCollection_35/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
ref_PCollection_PCollection_35/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs) output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
ref_PCollection_PCollection_14/Read
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_14/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
ref_PCollection_PCollection_14/Read
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_14/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/WaitForTempTableLoadJobs.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs) output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
ref_PCollection_PCollection_35/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_39 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((((ref_PCollection_PCollection_14/Read)+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/WaitForCopyJobs_47))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/RemoveTempTables/PassTables_48))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue_49))+(write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/PassTables.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/WaitForCopyJobs output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/WaitForCopyJobs.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
ref_PCollection_PCollection_14/Read
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_14/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
ref_PCollection_PCollection_14/Read
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_14/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/WaitForCopyJobs output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/WaitForCopyJobs.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/RemoveTempTables/PassTables output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/PassTables.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_40 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
write/BigQueryBatchFileLoads/Flatten/Read
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
write/BigQueryBatchFileLoads/Flatten/Read
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/Flatten/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
write/BigQueryBatchFileLoads/Flatten/Read
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/Flatten/Read.out0,
coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]],
len(consumers)=0]]>
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_41 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((ref_AppliedPTransform_write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse_19)+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda
at
core.py:2623>)_20))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode)_22)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode) output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2623>)
output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda
at core.py:2623>).out0, coder=WindowedValueCoder[BytesCoder],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/Impulse.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda at core.py:2623>)
output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/FlatMap(<lambda
at core.py:2623>).out0, coder=WindowedValueCoder[BytesCoder],
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode) output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/ImpulseEmptyPC/Map(decode).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_42 to finish.
INFO:apache_beam.runners.portability.fn_api_runner.fn_runner:Running
((write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read)+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames_51))+(ref_AppliedPTransform_write/BigQueryBatchFileLoads/RemoveTempTables/Delete_52)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/RemoveTempTables/Delete output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/Delete.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation
write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
output_tags=['None'],
receivers=[SingletonConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation
write/BigQueryBatchFileLoads/RemoveTempTables/Delete output_tags=['None'],
receivers=[ConsumerSet[write/BigQueryBatchFileLoads/RemoveTempTables/Delete.out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.portability.fn_api_runner.fn_runner:Wait for the
bundle bundle_43 to finish.
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query
SELECT fruit from `python_query_to_table_1592676380366.output_table`; to BQ
DEBUG:google.auth.transport._http_client:Making request: GET
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1):
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1):
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/queries/e8c8afcc-210b-4d8f-b0cb-a9e7540d5909?maxResults=0&location=US
HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anonf5ea691d313331c214d0de6745a1ee0a21d9d316/data
HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT
fruit from `python_query_to_table_1592676380366.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum:
158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
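(The matcher reduces the two result rows to a single checksum for comparison against an expected value. The 40 hex digits are consistent with SHA-1; a plausible sketch of such a digest — Beam's actual helper lives in apache_beam.testing.test_utils and its exact canonicalization may differ:

    import hashlib

    def checksum_rows(rows):
        # Sort first so the digest is independent of row order, then
        # hash a stable string form of each row.
        digest = hashlib.sha1()
        for row in sorted(str(r) for r in rows):
            digest.update(row.encode('utf-8'))
        return digest.hexdigest()

    checksum_rows([('apple',), ('orange',)])
)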
test_datastore_write_limit
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT)
... ok
test_read_via_sql
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_read_via_table
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_streaming_data_only
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_bigquery_read_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_spanner_error
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_spanner_update
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_write_batches
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_big_query_legacy_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types_avro
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql_kms_key_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... SKIP: This test doesn't work on DirectRunner.
----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py36.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 25 tests in 106.207s
OK (SKIP=1)
FAILURE: Build failed with an exception.
* Where:
Script
'<https://ci-beam.apache.org/job/beam_PostCommit_Python36/ws/src/sdks/python/test-suites/dataflow/common.gradle>'
line: 116
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py36:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 3m 34s
86 actionable tasks: 63 executed, 23 from cache
Publishing build scan...
https://gradle.com/s/eyur5u2v6ec74
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]