Udi Meiri created BEAM-9148:
-------------------------------
Summary: test flakiness:
BigQueryQueryToTableIT.test_big_query_standard_sql
Key: BEAM-9148
URL: https://issues.apache.org/jira/browse/BEAM-9148
Project: Beam
Issue Type: Bug
Components: io-py-gcp, sdk-py-core, test-failures
Reporter: Udi Meiri
There might be other flaky test cases from the same class, but I'm focusing on
test_big_query_standard_sql here.
{code}
19:39:12 ======================================================================
19:39:12 FAIL: test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
19:39:12 ----------------------------------------------------------------------
19:39:12 Traceback (most recent call last):
19:39:12 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_it_test.py",
line 172, in test_big_query_standard_sql
19:39:12 big_query_query_to_table_pipeline.run_bq_pipeline(options)
19:39:12 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/big_query_query_to_table_pipeline.py",
line 84, in run_bq_pipeline
19:39:12 result = p.run()
19:39:12 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/testing/test_pipeline.py",
line 112, in run
19:39:12 else test_runner_api))
19:39:12 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/pipeline.py",
line 461, in run
19:39:12 self._options).run(False)
19:39:12 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/pipeline.py",
line 474, in run
19:39:12 return self.runner.run_pipeline(self, self._options)
19:39:12 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/runners/direct/test_direct_runner.py",
line 53, in run_pipeline
19:39:12 hc_assert_that(self.result, pickler.loads(on_success_matcher))
19:39:12 AssertionError:
19:39:12 Expected: (Test pipeline expected terminated in state: DONE and
Expected checksum is 158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72)
19:39:12 but: Expected checksum is
158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72 Actual checksum is
da39a3ee5e6b4b0d3255bfef95601890afd80709
19:39:12
19:39:12 -------------------- >> begin captured logging << --------------------
19:39:12 root: DEBUG: Unhandled type_constraint: Union[]
19:39:12 root: DEBUG: Unhandled type_constraint: Union[]
19:39:12 apache_beam.runners.direct.direct_runner: INFO: Running pipeline with
DirectRunner.
19:39:12 apache_beam.io.gcp.bigquery_tools: DEBUG: Query SELECT * FROM (SELECT
"apple" as fruit) UNION ALL (SELECT "orange" as fruit) does not reference any
tables.
19:39:12 apache_beam.io.gcp.bigquery_tools: WARNING: Dataset
apache-beam-testing:temp_dataset_90f5797bdb5f4137af750399f91a8e66 does not
exist so we will create it as temporary with location=None
19:39:12 apache_beam.io.gcp.bigquery: DEBUG: Creating or getting table
<TableReference
19:39:12 datasetId: 'python_query_to_table_15792323245106'
19:39:12 projectId: 'apache-beam-testing'
19:39:12 tableId: 'output_table'> with schema {'fields': [{'name': 'fruit',
'type': 'STRING', 'mode': 'NULLABLE'}]}.
19:39:12 apache_beam.io.gcp.bigquery_tools: DEBUG: Created the table with id
output_table
19:39:12 apache_beam.io.gcp.bigquery_tools: INFO: Created table
apache-beam-testing.python_query_to_table_15792323245106.output_table with
schema <TableSchema
19:39:12 fields: [<TableFieldSchema
19:39:12 fields: []
19:39:12 mode: 'NULLABLE'
19:39:12 name: 'fruit'
19:39:12 type: 'STRING'>]>. Result: <Table
19:39:12 creationTime: 1579232328576
19:39:12 etag: 'WYysl6UIvc8IWMmTiiKhbg=='
19:39:12 id:
'apache-beam-testing:python_query_to_table_15792323245106.output_table'
19:39:12 kind: 'bigquery#table'
19:39:12 lastModifiedTime: 1579232328629
19:39:12 location: 'US'
19:39:12 numBytes: 0
19:39:12 numLongTermBytes: 0
19:39:12 numRows: 0
19:39:12 schema: <TableSchema
19:39:12 fields: [<TableFieldSchema
19:39:12 fields: []
19:39:12 mode: 'NULLABLE'
19:39:12 name: 'fruit'
19:39:12 type: 'STRING'>]>
19:39:12 selfLink:
'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_query_to_table_15792323245106/tables/output_table'
19:39:12 tableReference: <TableReference
19:39:12 datasetId: 'python_query_to_table_15792323245106'
19:39:12 projectId: 'apache-beam-testing'
19:39:12 tableId: 'output_table'>
19:39:12 type: 'TABLE'>.
19:39:12 apache_beam.io.gcp.bigquery: DEBUG: Attempting to flush to all
destinations. Total buffered: 2
19:39:12 apache_beam.io.gcp.bigquery: DEBUG: Flushing data to
apache-beam-testing:python_query_to_table_15792323245106.output_table. Total 2
rows.
19:39:12 apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to
perform query SELECT fruit from
`python_query_to_table_15792323245106.output_table`; to BQ
19:39:12 google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
19:39:12 google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
19:39:12 urllib3.util.retry: DEBUG: Converted retries value: 3 ->
Retry(total=3, connect=None, read=None, redirect=None, status=None)
19:39:12 google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
19:39:12 urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
19:39:12 urllib3.connectionpool: DEBUG: http://metadata.google.internal:80
"GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true
HTTP/1.1" 200 144
19:39:12 google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
19:39:12 urllib3.connectionpool: DEBUG: http://metadata.google.internal:80
"GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 181
19:39:12 urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1):
www.googleapis.com:443
19:39:12 urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
19:39:12 urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/queries/032dabb2-92e3-472b-993e-f0e840ba0104?maxResults=0&location=US
HTTP/1.1" 200 None
19:39:12 urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon2425bb619e2b0611267244ec96ed403c7aae8d57/data
HTTP/1.1" 200 None
19:39:12 apache_beam.io.gcp.tests.bigquery_matcher: INFO: Read from given
query (SELECT fruit from `python_query_to_table_15792323245106.output_table`;),
total rows 0
19:39:12 apache_beam.io.gcp.tests.bigquery_matcher: INFO: Generate checksum:
da39a3ee5e6b4b0d3255bfef95601890afd80709
19:39:12 --------------------- >> end captured logging << ---------------------
{code}
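A side observation (my analysis, not stated in the log): the "Actual checksum" above, da39a3ee5e6b4b0d3255bfef95601890afd80709, is the SHA-1 digest of empty input, which is consistent with the matcher hashing zero rows; the captured log also reports "total rows 0". That suggests the verification query raced the freshly streamed rows (BigQuery eventual consistency) rather than wrong data being written. A minimal check:

```python
import hashlib

# SHA-1 of empty input matches the "Actual checksum" in the failure above,
# consistent with the verification query having returned zero rows.
empty_digest = hashlib.sha1(b"").hexdigest()
print(empty_digest)  # da39a3ee5e6b4b0d3255bfef95601890afd80709
```

If that diagnosis is right, one direction to investigate might be having the matcher retry the verification query on an empty result set, not only on request errors.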
https://builds.apache.org/job/beam_PostCommit_Python37/1387/timestamps/?time=HH:mm:ss&timeZone=GMT-8&appendLog&locale=en_US

A similar flake (verification reading back empty data) also appeared in test_big_query_write from the same suite:
{code}
04:09:22 ======================================================================
04:09:22 FAIL: test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests)
04:09:22 ----------------------------------------------------------------------
04:09:22 Traceback (most recent call last):
04:09:22 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/io/gcp/bigquery_write_it_test.py",
line 139, in test_big_query_write
04:09:22 write_disposition=beam.io.BigQueryDisposition.WRITE_EMPTY))
04:09:22 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/pipeline.py",
line 481, in __exit__
04:09:22 self.run().wait_until_finish()
04:09:22 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/pipeline.py",
line 461, in run
04:09:22 self._options).run(False)
04:09:22 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/pipeline.py",
line 474, in run
04:09:22 return self.runner.run_pipeline(self, self._options)
04:09:22 File
"/home/jenkins/jenkins-slave/workspace/beam_PostCommit_Python37/src/sdks/python/apache_beam/runners/direct/test_direct_runner.py",
line 53, in run_pipeline
04:09:22 hc_assert_that(self.result, pickler.loads(on_success_matcher))
04:09:22 AssertionError:
04:09:22 Expected: (Expected data is [(1, 'abc'), (2, 'def'), (3, '你好'), (4,
'привет')])
04:09:22 but: Expected data is [(1, 'abc'), (2, 'def'), (3, '你好'), (4,
'привет')] Actual data is []
04:09:22
04:09:22 -------------------- >> begin captured logging << --------------------
04:09:22 apache_beam.internal.gcp.auth: INFO: Setting socket default timeout
to 60 seconds.
04:09:22 apache_beam.internal.gcp.auth: INFO: socket default timeout is 60.0
seconds.
04:09:22 root: DEBUG: Connecting using Google Application Default Credentials.
04:09:22 oauth2client.transport: INFO: Attempting refresh to obtain initial
access_token
04:09:22 apache_beam.io.gcp.bigquery_write_it_test: INFO: Created dataset
python_write_to_table_15791765386341 in project apache-beam-testing
04:09:22 root: DEBUG: Unhandled type_constraint: Union[]
04:09:22 root: DEBUG: Unhandled type_constraint: Union[]
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function annotate_downstream_side_inputs at
0x7f127987d7b8> ====================
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 16
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages: ['ref_AppliedPTransform_create/Impulse_3\n
create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: ', 'ref_AppliedPTransform_create/FlatMap(<lambda at
core.py:2597>)_4\n create/FlatMap(<lambda at
core.py:2597>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n
create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n
create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/Map(decode)_16\n
create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19\n
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20\n
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22\n
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_25\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30\n
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32\n
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ']
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function fix_side_input_pcoll_coders at 0x7f127987d8c8>
====================
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 16
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages: ['ref_AppliedPTransform_create/Impulse_3\n
create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: ', 'ref_AppliedPTransform_create/FlatMap(<lambda at
core.py:2597>)_4\n create/FlatMap(<lambda at
core.py:2597>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n
create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n
create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/Map(decode)_16\n
create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19\n
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20\n
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22\n
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_25\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30\n
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32\n
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ']
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function lift_combiners at 0x7f127987d950>
====================
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 16
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages: ['ref_AppliedPTransform_create/Impulse_3\n
create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: ', 'ref_AppliedPTransform_create/FlatMap(<lambda at
core.py:2597>)_4\n create/FlatMap(<lambda at
core.py:2597>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n
create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n
create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/Map(decode)_16\n
create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19\n
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20\n
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22\n
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_25\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30\n
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32\n
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ']
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function expand_sdf at 0x7f127987d9d8>
====================
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 16
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages: ['ref_AppliedPTransform_create/Impulse_3\n
create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: ', 'ref_AppliedPTransform_create/FlatMap(<lambda at
core.py:2597>)_4\n create/FlatMap(<lambda at
core.py:2597>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n
create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey_10\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n
create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/Map(decode)_16\n
create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19\n
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20\n
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22\n
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_25\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30\n
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32\n
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ']
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function expand_gbk at 0x7f127987da60>
====================
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 18
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages: ['ref_AppliedPTransform_create/Impulse_3\n
create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: ', 'ref_AppliedPTransform_create/FlatMap(<lambda at
core.py:2597>)_4\n create/FlatMap(<lambda at
core.py:2597>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n
create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
must follow: \n downstream_side_inputs: ',
'create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n
must follow:
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write\n
downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n
create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/Map(decode)_16\n
create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19\n
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20\n
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22\n
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
must follow: \n downstream_side_inputs: ',
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n
must follow:
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30\n
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32\n
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ']
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function sink_flattens at 0x7f127987db70>
====================
04:09:22 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 18
[1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages: ['ref_AppliedPTransform_create/Impulse_3\n
create/Impulse:beam:transform:impulse:v1\n must follow: \n
downstream_side_inputs: ', 'ref_AppliedPTransform_create/FlatMap(<lambda at
core.py:2597>)_4\n create/FlatMap(<lambda at
core.py:2597>):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7\n
create/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
must follow: \n downstream_side_inputs: ',
'create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n
must follow:
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write\n
downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15\n
create/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_create/Map(decode)_16\n
create/Map(decode):beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19\n
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n must
follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20\n
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n must follow: \n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22\n
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
must follow: \n downstream_side_inputs: ',
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n
must follow:
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n
downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30\n
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ',
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32\n
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
must follow: \n downstream_side_inputs: ']
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function greedily_fuse at 0x7f127987dbf8>
====================
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 3
[4, 5, 9]
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages:
['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32)\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
must follow:
((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15))+(ref_AppliedPTransform_create/Map(decode)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
downstream_side_inputs: ',
'((((ref_AppliedPTransform_create/Impulse_3)+(ref_AppliedPTransform_create/FlatMap(<lambda
at
core.py:2597>)_4))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n
create/Impulse:beam:transform:impulse:v1\ncreate/FlatMap(<lambda at
core.py:2597>):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
must follow: \n downstream_side_inputs: ',
'((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15))+(ref_AppliedPTransform_create/Map(decode)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\ncreate/Map(decode):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
must follow:
((((ref_AppliedPTransform_create/Impulse_3)+(ref_AppliedPTransform_create/FlatMap(<lambda
at
core.py:2597>)_4))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n
downstream_side_inputs: ']
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function read_to_impulse at 0x7f127987dc80>
====================
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 3
[4, 5, 9]
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages:
[... Stages output identical to the previous phase (greedily_fuse); elided ...]
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function impulse_to_input at 0x7f127987dd08>
====================
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 3
[4, 5, 9]
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages:
['(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32)\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
must follow:
((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15))+(ref_AppliedPTransform_create/Map(decode)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
downstream_side_inputs: ',
'((((ref_AppliedPTransform_create/Impulse_3)+(ref_AppliedPTransform_create/FlatMap(<lambda
at
core.py:2597>)_4))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n
create/FlatMap(<lambda at
core.py:2597>):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Impulse:beam:source:runner:0.1\n
must follow: \n downstream_side_inputs: ',
'((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15))+(ref_AppliedPTransform_create/Map(decode)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\ncreate/Map(decode):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
must follow:
((((ref_AppliedPTransform_create/Impulse_3)+(ref_AppliedPTransform_create/FlatMap(<lambda
at
core.py:2597>)_4))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n
downstream_side_inputs: ']
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function inject_timer_pcollections at 0x7f127987dea0>
====================
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 3
[4, 5, 9]
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages:
[... Stages output identical to the previous phase (impulse_to_input); elided ...]
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function sort_stages at 0x7f127987df28>
====================
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 3
[5, 9, 4]
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages:
['((((ref_AppliedPTransform_create/Impulse_3)+(ref_AppliedPTransform_create/FlatMap(<lambda
at
core.py:2597>)_4))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n
create/FlatMap(<lambda at
core.py:2597>):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/AddRandomKeys:beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Impulse:beam:source:runner:0.1\n
must follow: \n downstream_side_inputs: ',
'((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15))+(ref_AppliedPTransform_create/Map(decode)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\ncreate/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\ncreate/MaybeReshuffle/Reshuffle/RemoveRandomKeys:beam:transform:pardo:v1\ncreate/Map(decode):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
must follow:
((((ref_AppliedPTransform_create/Impulse_3)+(ref_AppliedPTransform_create/FlatMap(<lambda
at
core.py:2597>)_4))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)\n
downstream_side_inputs: ',
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32)\n
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
must follow:
((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15))+(ref_AppliedPTransform_create/Map(decode)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)\n
downstream_side_inputs: ']
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: INFO:
==================== <function window_pcollection_coders at 0x7f127987f048>
====================
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 3
[5, 9, 4]
04:09:23 apache_beam.runners.portability.fn_api_runner_transforms: DEBUG:
Stages:
[... Stages output identical to the previous phase (sort_stages); elided ...]
04:09:23 apache_beam.runners.worker.statecache: INFO: Creating state cache
with size 100
04:09:23 apache_beam.runners.portability.fn_api_runner: INFO: Created Worker
handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler
object at 0x7f12798127f0> for environment urn: "beam:env:embedded_python:v1"
04:09:23
04:09:23 apache_beam.runners.portability.fn_api_runner: INFO: Running
((((ref_AppliedPTransform_create/Impulse_3)+(ref_AppliedPTransform_create/FlatMap(<lambda
at
core.py:2597>)_4))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/AddRandomKeys_7))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)_9))+(create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Write)
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DataOutputOperation >
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
output_tags=['out'],
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps).out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation create/MaybeReshuffle/Reshuffle/AddRandomKeys output_tags=['out'],
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/AddRandomKeys.out0,
coder=WindowedValueCoder[TupleCoder[VarIntCoder, BytesCoder]],
len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation create/FlatMap(<lambda at core.py:2597>) output_tags=['out'],
receivers=[SingletonConsumerSet[create/FlatMap(<lambda at core.py:2597>).out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DataInputOperation receivers=[SingletonConsumerSet[create/Impulse.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DataInputOperation receivers=[SingletonConsumerSet[create/Impulse.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation create/FlatMap(<lambda at core.py:2597>) output_tags=['out'],
receivers=[SingletonConsumerSet[create/FlatMap(<lambda at core.py:2597>).out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation create/MaybeReshuffle/Reshuffle/AddRandomKeys output_tags=['out'],
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/AddRandomKeys.out0,
coder=WindowedValueCoder[TupleCoder[VarIntCoder, BytesCoder]],
len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps)
output_tags=['out'],
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/Map(reify_timestamps).out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DataOutputOperation >
04:09:23 apache_beam.runners.portability.fn_api_runner: DEBUG: Wait for the
bundle bundle_1 to finish.
04:09:23 apache_beam.runners.portability.fn_api_runner: INFO: Running
((((((((create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)_14))+(ref_AppliedPTransform_create/MaybeReshuffle/Reshuffle/RemoveRandomKeys_15))+(ref_AppliedPTransform_create/Map(decode)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_19))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_20))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_22))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_24))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write)
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DataOutputOperation >
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)
output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys
output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0,
coder=WindowedValueCoder[TupleCoder[VarIntCoder,
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder,
FastPrimitivesCoder]]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder,
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]],
len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation create/Map(decode) output_tags=['out'],
receivers=[SingletonConsumerSet[create/Map(decode).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
output_tags=['out'],
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
output_tags=['out'],
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps).out0,
coder=WindowedValueCoder[TupleCoder[VarIntCoder, BytesCoder]],
len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DataInputOperation
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DataInputOperation
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/GroupByKey/Read.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation
create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps)
output_tags=['out'],
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/ReshufflePerKey/FlatMap(restore_timestamps).out0,
coder=WindowedValueCoder[TupleCoder[VarIntCoder, BytesCoder]],
len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation create/MaybeReshuffle/Reshuffle/RemoveRandomKeys
output_tags=['out'],
receivers=[SingletonConsumerSet[create/MaybeReshuffle/Reshuffle/RemoveRandomKeys.out0,
coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation create/Map(decode) output_tags=['out'],
receivers=[SingletonConsumerSet[create/Map(decode).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]],
len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder,
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys
output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0,
coder=WindowedValueCoder[TupleCoder[VarIntCoder,
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder,
FastPrimitivesCoder]]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)
output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DataOutputOperation >
04:09:23 apache_beam.runners.portability.fn_api_runner: DEBUG: Wait for the
bundle bundle_2 to finish.
04:09:23 apache_beam.runners.portability.fn_api_runner: INFO: Running
(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_29))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_30))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_32)
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)
output_tags=['out', 'out_FailedRows'],
receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0],
ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys
output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder,
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DoOperation
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)
output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0,
coder=WindowedValueCoder[TupleCoder[VarIntCoder,
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder,
FastPrimitivesCoder]]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: start
<DataInputOperation
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
04:09:23 apache_beam.io.gcp.bigquery: DEBUG: Creating or getting table
<TableReference
04:09:23 datasetId: 'python_write_to_table_15791765386341'
04:09:23 projectId: 'apache-beam-testing'
04:09:23 tableId: 'python_write_table'> with schema {'fields': [{'name':
'number', 'type': 'INTEGER'}, {'name': 'str', 'type': 'STRING'}]}.
04:09:23 apache_beam.io.gcp.bigquery_tools: DEBUG: Created the table with id
python_write_table
04:09:23 apache_beam.io.gcp.bigquery_tools: INFO: Created table
apache-beam-testing.python_write_to_table_15791765386341.python_write_table
with schema <TableSchema
04:09:23 fields: [<TableFieldSchema
04:09:23 fields: []
04:09:23 mode: 'NULLABLE'
04:09:23 name: 'number'
04:09:23 type: 'INTEGER'>, <TableFieldSchema
04:09:23 fields: []
04:09:23 mode: 'NULLABLE'
04:09:23 name: 'str'
04:09:23 type: 'STRING'>]>. Result: <Table
04:09:23 creationTime: 1579176540190
04:09:23 etag: 'JfW1AfAzzEB0skdmmPxv+A=='
04:09:23 id:
'apache-beam-testing:python_write_to_table_15791765386341.python_write_table'
04:09:23 kind: 'bigquery#table'
04:09:23 lastModifiedTime: 1579176540332
04:09:23 location: 'US'
04:09:23 numBytes: 0
04:09:23 numLongTermBytes: 0
04:09:23 numRows: 0
04:09:23 schema: <TableSchema
04:09:23 fields: [<TableFieldSchema
04:09:23 fields: []
04:09:23 mode: 'NULLABLE'
04:09:23 name: 'number'
04:09:23 type: 'INTEGER'>, <TableFieldSchema
04:09:23 fields: []
04:09:23 mode: 'NULLABLE'
04:09:23 name: 'str'
04:09:23 type: 'STRING'>]>
04:09:23 selfLink:
'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_write_to_table_15791765386341/tables/python_write_table'
04:09:23 tableReference: <TableReference
04:09:23 datasetId: 'python_write_to_table_15791765386341'
04:09:23 projectId: 'apache-beam-testing'
04:09:23 tableId: 'python_write_table'>
04:09:23 type: 'TABLE'>.
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DataInputOperation
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0,
coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder],
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)
output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0,
coder=WindowedValueCoder[TupleCoder[VarIntCoder,
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder,
FastPrimitivesCoder]]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys
output_tags=['out'],
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0,
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder,
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
04:09:23 apache_beam.runners.worker.bundle_processor: DEBUG: finish
<DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)
output_tags=['out', 'out_FailedRows'],
receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0],
ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1,
coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
04:09:23 apache_beam.io.gcp.bigquery: DEBUG: Attempting to flush to all
destinations. Total buffered: 4
04:09:23 apache_beam.io.gcp.bigquery: DEBUG: Flushing data to
apache-beam-testing:python_write_to_table_15791765386341.python_write_table.
Total 4 rows.
04:09:23 apache_beam.runners.portability.fn_api_runner: DEBUG: Wait for the
bundle bundle_3 to finish.
04:09:23 apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to
perform query SELECT number, str FROM
python_write_to_table_15791765386341.python_write_table to BQ
04:09:23 google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
04:09:23 google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
04:09:23 urllib3.util.retry: DEBUG: Converted retries value: 3 ->
Retry(total=3, connect=None, read=None, redirect=None, status=None)
04:09:23 google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
04:09:23 urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
04:09:23 urllib3.connectionpool: DEBUG: http://metadata.google.internal:80
"GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true
HTTP/1.1" 200 144
04:09:23 google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
04:09:23 urllib3.connectionpool: DEBUG: http://metadata.google.internal:80
"GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 181
04:09:23 urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1):
www.googleapis.com:443
04:09:23 urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
04:09:23 urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/queries/4bf07fbd-75d6-4b57-bea0-4c52e218e965?maxResults=0&location=US
HTTP/1.1" 200 None
04:09:23 urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon86dd7594b23527458141a853c6f14cdda5465a50/data
HTTP/1.1" 200 None
04:09:23 apache_beam.io.gcp.tests.bigquery_matcher: INFO: Result of query is:
[]
04:09:23 apache_beam.io.gcp.bigquery_write_it_test: INFO: Deleting dataset
python_write_to_table_15791765386341 in project apache-beam-testing
04:09:23 --------------------- >> end captured logging << ---------------------
{code}
https://builds.apache.org/job/beam_PostCommit_Python37/1379/timestamps/?time=HH:mm:ss&timeZone=GMT-8&appendLog&locale=en_US
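The log shows BigQueryWriteFn flushing 4 rows to the table, yet the verification query issued immediately afterwards returns []. BigQuery streaming inserts are only eventually consistent, so a likely cause is that the matcher queries before the rows are visible. A minimal sketch of one possible mitigation (polling the query with backoff before asserting — a hypothetical helper, not the existing bigquery_matcher API):

```python
import time


def wait_for_nonempty_query_result(run_query, timeout_secs=300,
                                   initial_delay_secs=1.0):
  """Polls run_query() until it returns a non-empty result or times out.

  run_query is any zero-arg callable returning a list of rows (e.g. a
  closure around the matcher's BigQuery query). Hypothetical helper for
  illustration only.
  """
  deadline = time.time() + timeout_secs
  delay = initial_delay_secs
  while True:
    rows = run_query()
    if rows:
      return rows
    if time.time() >= deadline:
      # Give up and return the (possibly empty) last result so the
      # caller's assertion still fires with a real value.
      return rows
    time.sleep(delay)
    delay = min(delay * 2, 60)  # exponential backoff, capped
```

The test would then assert on the polled result instead of the first query's result, tolerating the streaming-insert visibility delay seen in the log above.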
--
This message was sent by Atlassian Jira
(v8.3.4#803005)