See <https://builds.apache.org/job/beam_PostCommit_Python37/1721/display/redirect?page=changes>
Changes:
[iemejia] [BEAM-8616] Make hadoop-client a provided dependency on ParquetIO
------------------------------------------
[...truncated 2.76 MB...]
apache_beam.runners.worker.statecache: INFO: Creating state cache with size 100
apache_beam.runners.portability.fn_api_runner: INFO: Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7f1856f23d30> for environment urn: "beam:env:embedded_python:v1"
apache_beam.runners.portability.fn_api_runner: INFO: Running (((ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_1_split/Write)
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataOutputOperation ref_PCollection_PCollection_1_split/Write >
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction output_tags=['out'], receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction.out0, coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder, TupleCoder[LengthPrefixCoder[DillCoder], LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction output_tags=['out'], receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction.out0, coder=WindowedValueCoder[TupleCoder[BytesCoder, TupleCoder[DillCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataInputOperation read/Read/_SDFBoundedSourceWrapper/Impulse receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
apache_beam.io.gcp.bigquery_tools: INFO: Using location 'US' from table <TableReference
datasetId: 'python_query_to_table_15826308369271'
projectId: 'apache-beam-testing'
tableId: 'python_new_types_table'> referenced by query SELECT bytes, date, time FROM [python_query_to_table_15826308369271.python_new_types_table]
apache_beam.io.gcp.bigquery_tools: WARNING: Dataset apache-beam-testing:temp_dataset_eea6fcc261c84dfb8846ed6eb045d341 does not exist so we will create it as temporary with location=US
root: INFO: Job status: RUNNING
root: INFO: Job status: DONE
root: INFO: Job status: RUNNING
root: INFO: Job status: DONE
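The alternating "Job status" lines above come from polling the BigQuery jobs issued while staging the test data. A minimal sketch of that kind of status poll, using the google-cloud-bigquery client as an assumed stand-in for the SDK's internal API calls (the job id is hypothetical):

    import time
    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    job = client.get_job('example-job-id')  # hypothetical job id
    while True:
        job.reload()  # refresh job state from the BigQuery API
        print('Job status: %s' % job.state)
        if job.state == 'DONE':
            break
        time.sleep(1)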
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/temp-it/260744147414438c93f83e17382699d7/bigquery-table-dump-'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/temp-it/260744147414438c93f83e17382699d7/bigquery-table-dump-*.json' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/temp\\-it/260744147414438c93f83e17382699d7/bigquery\\-table\\-dump\\-[^/\\\\]*\\.json'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 1 files in 0.05335569381713867 seconds.
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/temp-it/260744147414438c93f83e17382699d7/bigquery-table-dump-000000000000.json' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/temp\\-it/260744147414438c93f83e17382699d7/bigquery\\-table\\-dump\\-000000000000\\.json'
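The translate_pattern lines show the glob being compiled to a regex: literal characters such as '-' and '.' are escaped, and '*' becomes '[^/\\]*' so a wildcard cannot cross a path separator. A minimal sketch of that translation (illustrative, not Beam's exact implementation):

    import re

    def translate_pattern(pattern):
        # Escape everything literally, but turn '*' into a
        # segment-local wildcard, mirroring the log output above.
        regex = ''
        for ch in pattern:
            if ch == '*':
                regex += r'[^/\\]*'
            else:
                regex += re.escape(ch)
        return regex

    print(translate_pattern('gs://bucket/bigquery-table-dump-*.json'))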
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataInputOperation read/Read/_SDFBoundedSourceWrapper/Impulse receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction output_tags=['out'], receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction.out0, coder=WindowedValueCoder[TupleCoder[BytesCoder, TupleCoder[DillCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction output_tags=['out'], receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction.out0, coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder, TupleCoder[LengthPrefixCoder[DillCoder], LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataOutputOperation ref_PCollection_PCollection_1_split/Write >
apache_beam.runners.portability.fn_api_runner: DEBUG: Wait for the bundle bundle_5 to finish.
apache_beam.runners.portability.fn_api_runner: INFO: Running ((((((((ref_PCollection_PCollection_1_split/Read)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process))+(ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9))+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18))+(ref_PCollection_PCollection_4/Write))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataOutputOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write >
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataOutputOperation ref_PCollection_PCollection_4/Write >
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[VarIntCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough) output_tags=['out', 'out_cleanup_signal'], receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out1, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <SdfProcessSizedElements read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process output_tags=['out'], receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataInputOperation ref_PCollection_PCollection_1_split/Read receivers=[SingletonConsumerSet[ref_PCollection_PCollection_1_split/Read.out0, coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder, TupleCoder[LengthPrefixCoder[DillCoder], LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/temp-it/260744147414438c93f83e17382699d7/bigquery-table-dump-000000000000.json' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/temp\\-it/260744147414438c93f83e17382699d7/bigquery\\-table\\-dump\\-000000000000\\.json'
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataInputOperation ref_PCollection_PCollection_1_split/Read receivers=[SingletonConsumerSet[ref_PCollection_PCollection_1_split/Read.out0, coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder, TupleCoder[LengthPrefixCoder[DillCoder], LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <SdfProcessSizedElements read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process output_tags=['out'], receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough) output_tags=['out', 'out_cleanup_signal'], receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out1, coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[VarIntCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataOutputOperation ref_PCollection_PCollection_4/Write >
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataOutputOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write >
apache_beam.runners.portability.fn_api_runner: DEBUG: Wait for the bundle bundle_6 to finish.
apache_beam.runners.portability.fn_api_runner: INFO: Running (((ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_11)+(ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2637>)_12))+(ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14))+(ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15)
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles) output_tags=['out'], receivers=[ConsumerSet[read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation read/_PassThroughThenCleanup/Create/Map(decode) output_tags=['out'], receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Map(decode).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2637>) output_tags=['out'], receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2637>).out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataInputOperation read/_PassThroughThenCleanup/Create/Impulse receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
apache_beam.io.filesystem: DEBUG: Listing files in 'gs://temp-storage-for-end-to-end-tests/temp-it/260744147414438c93f83e17382699d7/bigquery-table-dump-'
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/temp-it/260744147414438c93f83e17382699d7/bigquery-table-dump-*.json' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/temp\\-it/260744147414438c93f83e17382699d7/bigquery\\-table\\-dump\\-[^/\\\\]*\\.json'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 1 files in 0.05103778839111328 seconds.
root: DEBUG: RemoveJsonFiles: matched 1 files
apache_beam.io.filesystem: DEBUG: translate_pattern: 'gs://temp-storage-for-end-to-end-tests/temp-it/260744147414438c93f83e17382699d7/bigquery-table-dump-000000000000.json' -> 'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/temp\\-it/260744147414438c93f83e17382699d7/bigquery\\-table\\-dump\\-000000000000\\.json'
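The "RemoveJsonFiles: matched 1 files" line above is the cleanup step deleting the temporary BigQuery export. A minimal sketch of the same match-then-delete flow with Beam's public FileSystems API (the pattern is copied from the log; the wiring into a cleanup DoFn is omitted):

    from apache_beam.io.filesystems import FileSystems

    pattern = ('gs://temp-storage-for-end-to-end-tests/temp-it/'
               '260744147414438c93f83e17382699d7/bigquery-table-dump-*.json')
    match_result = FileSystems.match([pattern])[0]
    paths = [metadata.path for metadata in match_result.metadata_list]
    print('RemoveJsonFiles: matched %d files' % len(paths))
    if paths:
        FileSystems.delete(paths)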
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataInputOperation read/_PassThroughThenCleanup/Create/Impulse receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2637>) output_tags=['out'], receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2637>).out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation read/_PassThroughThenCleanup/Create/Map(decode) output_tags=['out'], receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Map(decode).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles) output_tags=['out'], receivers=[ConsumerSet[read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
apache_beam.runners.portability.fn_api_runner: DEBUG: Wait for the bundle bundle_7 to finish.
apache_beam.runners.portability.fn_api_runner: INFO: Running (((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_28))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_29))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_31)
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out', 'out_FailedRows'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[VarIntCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataInputOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.io.gcp.bigquery: DEBUG: Creating or getting table <TableReference
datasetId: 'python_query_to_table_15826308369271'
projectId: 'apache-beam-testing'
tableId: 'output_table'> with schema {'fields': [{'name': 'bytes', 'type': 'BYTES', 'mode': 'NULLABLE'}, {'name': 'date', 'type': 'DATE', 'mode': 'NULLABLE'}, {'name': 'time', 'type': 'TIME', 'mode': 'NULLABLE'}]}.
apache_beam.io.gcp.bigquery_tools: DEBUG: Created the table with id output_table
apache_beam.io.gcp.bigquery_tools: INFO: Created table apache-beam-testing.python_query_to_table_15826308369271.output_table with schema <TableSchema
fields: [<TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'bytes'
type: 'BYTES'>, <TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'date'
type: 'DATE'>, <TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'time'
type: 'TIME'>]>. Result: <Table
creationTime: 1582630851868
etag: 'ROqRgBQv9JSO9suxivHTqQ=='
id: 'apache-beam-testing:python_query_to_table_15826308369271.output_table'
kind: 'bigquery#table'
lastModifiedTime: 1582630851931
location: 'US'
numBytes: 0
numLongTermBytes: 0
numRows: 0
schema: <TableSchema
fields: [<TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'bytes'
type: 'BYTES'>, <TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'date'
type: 'DATE'>, <TableFieldSchema
fields: []
mode: 'NULLABLE'
name: 'time'
type: 'TIME'>]>
selfLink: 'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_query_to_table_15826308369271/tables/output_table'
tableReference: <TableReference
datasetId: 'python_query_to_table_15826308369271'
projectId: 'apache-beam-testing'
tableId: 'output_table'>
type: 'TABLE'>.
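The table just created carries the three-field schema logged above. Since the warnings further down note that BigQuerySink is deprecated in favor of WriteToBigQuery, here is a minimal sketch of writing with that transform and the same schema (table names are copied from the log; the dispositions are assumptions):

    import apache_beam as beam

    write = beam.io.WriteToBigQuery(
        table='output_table',
        dataset='python_query_to_table_15826308369271',
        project='apache-beam-testing',
        schema='bytes:BYTES,date:DATE,time:TIME',
        create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
        write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND)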
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataInputOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[VarIntCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out', 'out_FailedRows'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
apache_beam.io.gcp.bigquery: DEBUG: Attempting to flush to all destinations. Total buffered: 4
apache_beam.io.gcp.bigquery: DEBUG: Flushing data to apache-beam-testing:python_query_to_table_15826308369271.output_table. Total 4 rows.
apache_beam.runners.portability.fn_api_runner: DEBUG: Wait for the bundle bundle_8 to finish.
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query SELECT bytes, date, time FROM `python_query_to_table_15826308369271.output_table`; to BQ
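The matcher verifies the pipeline output by running the query itself and checksumming the rows. A minimal sketch of that verification query, assuming the google-cloud-bigquery client as the tooling:

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    sql = ('SELECT bytes, date, time FROM '
           '`python_query_to_table_15826308369271.output_table`')
    rows = list(client.query(sql).result())
    print('total rows %d' % len(rows))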
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 192
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/64b936d4-7f98-4669-a232-b4575d6576aa?maxResults=0&location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anonb28bc41008afe516e9db7684e11659b5909ebe19/data HTTP/1.1" 200 None
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Read from given query (SELECT bytes, date, time FROM `python_query_to_table_15826308369271.output_table`;), total rows 0
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Generate checksum: da39a3ee5e6b4b0d3255bfef95601890afd80709
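That checksum is the SHA-1 of empty input: the query returned zero rows, so there was nothing to hash. A minimal sketch showing why (the exact row formatting the matcher hashes over is an assumption here):

    import hashlib

    rows = []  # the log reports "total rows 0"
    digest = hashlib.sha1()
    for row in sorted(map(str, rows)):
        digest.update(row.encode('utf-8'))
    # Prints da39a3ee5e6b4b0d3255bfef95601890afd80709, the SHA-1 of
    # empty input, matching the checksum in the log.
    print(digest.hexdigest())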
--------------------- >> end captured logging << ---------------------
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py:87: FutureWarning: _ReadFromBigQuery is experimental.
  kms_key=kms_key)
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery.py:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py:95: BeamDeprecationWarning: BigQuerySink is deprecated since 2.11.0. Use WriteToBigQuery instead.
  kms_key=kms_key))
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery_io_read_pipeline.py:84: FutureWarning: _ReadFromBigQuery is experimental.
  (options.view_as(GoogleCloudOptions).project, known_args.input_table))
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery.py:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:275: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery.py:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery_read_it_test.py:162: FutureWarning: _ReadFromBigQuery is experimental.
  query=self.query, use_standard_sql=True, project=self.project))
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery.py:1655: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = pcoll.pipeline.options.view_as(
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery.py:1463: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py:818: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
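The repeated BeamDeprecationWarnings flag reads of <pipeline>.options. The preferred pattern is to keep a reference to the PipelineOptions object you built and call view_as() on it directly. A minimal sketch (flag values are illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    options = PipelineOptions([
        '--project=apache-beam-testing',
        '--temp_location=gs://example-bucket/tmp',  # hypothetical bucket
    ])
    # Read settings from the options object, not from pipeline.options.
    temp_location = options.view_as(GoogleCloudOptions).temp_location
    with beam.Pipeline(options=options) as p:
        _ = p | beam.Create([1, 2, 3])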
----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py37.xml
----------------------------------------------------------------------
XML: /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/nosetests.xml
----------------------------------------------------------------------
Ran 19 tests in 53.011s
FAILED (SKIP=1, failures=1)
> Task :sdks:python:test-suites:direct:py37:postCommitIT FAILED
FATAL: command execution failed
java.io.IOException: Backing channel 'JNLP4-connect connection from 88.61.224.35.bc.googleusercontent.com/35.224.61.88:39318' is disconnected.
	at hudson.remoting.RemoteInvocationHandler.channelOrFail(RemoteInvocationHandler.java:214)
	at hudson.remoting.RemoteInvocationHandler.invoke(RemoteInvocationHandler.java:283)
	at com.sun.proxy.$Proxy139.isAlive(Unknown Source)
	at hudson.Launcher$RemoteLauncher$ProcImpl.isAlive(Launcher.java:1150)
	at hudson.Launcher$RemoteLauncher$ProcImpl.join(Launcher.java:1142)
	at hudson.Launcher$ProcStarter.join(Launcher.java:470)
	at hudson.plugins.gradle.Gradle.perform(Gradle.java:317)
	at hudson.tasks.BuildStepMonitor$1.perform(BuildStepMonitor.java:20)
	at hudson.model.AbstractBuild$AbstractBuildExecution.perform(AbstractBuild.java:741)
	at hudson.model.Build$BuildExecution.build(Build.java:206)
	at hudson.model.Build$BuildExecution.doRun(Build.java:163)
	at hudson.model.AbstractBuild$AbstractBuildExecution.run(AbstractBuild.java:504)
	at hudson.model.Run.execute(Run.java:1815)
	at hudson.model.FreeStyleBuild.run(FreeStyleBuild.java:43)
	at hudson.model.ResourceController.execute(ResourceController.java:97)
	at hudson.model.Executor.run(Executor.java:429)
Caused by: java.nio.channels.ClosedChannelException
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer.onReadClosed(ChannelApplicationLayer.java:209)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.onRecvClosed(ApplicationLayer.java:222)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.onRecvClosed(ProtocolStack.java:816)
	at org.jenkinsci.remoting.protocol.FilterLayer.onRecvClosed(FilterLayer.java:287)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.onRecvClosed(SSLEngineFilterLayer.java:181)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.switchToNoSecure(SSLEngineFilterLayer.java:283)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processWrite(SSLEngineFilterLayer.java:503)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.processQueuedWrites(SSLEngineFilterLayer.java:248)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doSend(SSLEngineFilterLayer.java:200)
	at org.jenkinsci.remoting.protocol.impl.SSLEngineFilterLayer.doCloseSend(SSLEngineFilterLayer.java:213)
	at org.jenkinsci.remoting.protocol.ProtocolStack$Ptr.doCloseSend(ProtocolStack.java:784)
	at org.jenkinsci.remoting.protocol.ApplicationLayer.doCloseWrite(ApplicationLayer.java:173)
	at org.jenkinsci.remoting.protocol.impl.ChannelApplicationLayer$ByteBufferCommandTransport.closeWrite(ChannelApplicationLayer.java:314)
	at hudson.remoting.Channel.close(Channel.java:1452)
	at hudson.remoting.Channel.close(Channel.java:1405)
	at hudson.slaves.SlaveComputer.closeChannel(SlaveComputer.java:847)
	at hudson.slaves.SlaveComputer.kill(SlaveComputer.java:814)
	at hudson.model.AbstractCIBase.killComputer(AbstractCIBase.java:89)
	at jenkins.model.Jenkins.access$2100(Jenkins.java:312)
	at jenkins.model.Jenkins$19.run(Jenkins.java:3464)
	at hudson.model.Queue._withLock(Queue.java:1379)
	at hudson.model.Queue.withLock(Queue.java:1256)
	at jenkins.model.Jenkins._cleanUpDisconnectComputers(Jenkins.java:3458)
	at jenkins.model.Jenkins.cleanUp(Jenkins.java:3336)
	at hudson.WebAppMain.contextDestroyed(WebAppMain.java:379)
	at org.apache.catalina.core.StandardContext.listenerStop(StandardContext.java:4732)
	at org.apache.catalina.core.StandardContext.stopInternal(StandardContext.java:5396)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1400)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1389)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
	at org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:976)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1400)
	at org.apache.catalina.core.ContainerBase$StopChild.call(ContainerBase.java:1389)
	at java.util.concurrent.FutureTask.run(FutureTask.java:266)
	at org.apache.tomcat.util.threads.InlineExecutorService.execute(InlineExecutorService.java:75)
	at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:134)
	at org.apache.catalina.core.ContainerBase.stopInternal(ContainerBase.java:976)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.StandardService.stopInternal(StandardService.java:473)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.core.StandardServer.stopInternal(StandardServer.java:994)
	at org.apache.catalina.util.LifecycleBase.stop(LifecycleBase.java:257)
	at org.apache.catalina.startup.Catalina.stop(Catalina.java:706)
	at org.apache.catalina.startup.Catalina.start(Catalina.java:668)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.catalina.startup.Bootstrap.start(Bootstrap.java:344)
	at org.apache.catalina.startup.Bootstrap.main(Bootstrap.java:475)
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
ERROR: apache-beam-jenkins-10 is offline; cannot locate JDK 1.8 (latest)
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]