See <https://builds.apache.org/job/beam_PostCommit_Python35/1938/display/redirect>

Changes:


------------------------------------------
[...truncated 1.66 MB...]
 type: 'DATE'>, <TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'time'
 type: 'TIME'>]>
 selfLink: 'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_query_to_table_15836477398913/tables/output_table'
 tableReference: <TableReference
 datasetId: 'python_query_to_table_15836477398913'
 projectId: 'apache-beam-testing'
 tableId: 'output_table'>
 type: 'TABLE'>.
INFO:apache_beam.io.gcp.bigquery_tools:Writing 4 rows to apache-beam-testing:python_query_to_table_15836477398913.output_table table.
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query SELECT bytes, date, time FROM `python_query_to_table_15836477398913.output_table`; to BQ
DEBUG:google.auth.transport._http_client:Making request: GET http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
DEBUG:google.auth.transport.requests:Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/bfea144a-cb8c-4ca5-8af5-1d59cb91c6c1?maxResults=0&location=US HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon622de179_8a8a_4813_b461_01ff1aa490c3/data HTTP/1.1" 200 None
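
The google.auth traffic above is the standard GCE metadata-server credential flow: resolve the project id, enumerate the default service account, then exchange it for a short-lived OAuth2 access token that the subsequent bigquery.googleapis.com calls carry. A minimal sketch of that token fetch, assuming the requests library (the endpoint and the mandatory Metadata-Flavor header are the documented metadata-server interface; retries and error handling are omitted):

    import requests

    # Documented GCE metadata-server endpoint for the default service account.
    METADATA_TOKEN_URL = ('http://metadata.google.internal/computeMetadata/v1/'
                          'instance/service-accounts/default/token')

    def fetch_access_token():
        # The metadata server rejects requests that lack this header.
        resp = requests.get(METADATA_TOKEN_URL,
                            headers={'Metadata-Flavor': 'Google'})
        resp.raise_for_status()
        # The response JSON carries access_token, expires_in and token_type.
        return resp.json()['access_token']
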
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT bytes, date, time FROM `python_query_to_table_15836477398913.output_table`;), total rows 4
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 24de460c4d344a4b77ccc4cc1acb7b7ffc11a214
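
The matcher verifies the pipeline output by hashing the queried rows rather than comparing them one by one; the 40 hex digits above are consistent with SHA-1. A sketch of an order-insensitive checksum of that shape, assuming rows are stringified and sorted before hashing (an assumption for illustration; the matcher's actual row normalization may differ):

    import hashlib

    def compute_checksum(rows):
        # Sort first so the digest does not depend on row order, then feed
        # the stringified rows into SHA-1 (hexdigest() is 40 hex digits).
        digest = hashlib.sha1()
        for row in sorted(str(r) for r in rows):
            digest.update(row.encode('utf-8'))
        return digest.hexdigest()
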
DEBUG:root:gcs_location is empty, using temp_location instead
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function annotate_downstream_side_inputs at 0x7f4eadc4c268> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:15 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:Stages: [
  'ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5\n  read/Read/_SDFBoundedSourceWrapper/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)_6\n  read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9\n  read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_11\n  read/_PassThroughThenCleanup/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>)_12\n  read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14\n  read/_PassThroughThenCleanup/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15\n  read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_24\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_28\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_29\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_31\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
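
Each banner line in this log ('==================== <function ...> ====================') marks one optimization phase of the portable FnApiRunner: a phase takes the current list of stages and returns a rewritten list, and the DEBUG line under the banner logs the new stage count followed by what appears to be the per-stage transform counts (compare '4 [4, 9, 4, 4]' after greedily_fuse below). A minimal sketch of that phase-chaining pattern, with a hypothetical Stage type (carrying a .transforms list) standing in for Beam's internals:

    import logging

    _LOGGER = logging.getLogger(__name__)

    def run_phases(stages, phases):
        # Apply each graph-rewriting phase in order, logging the same
        # banner / stage-count pair seen throughout this build log.
        for phase in phases:
            _LOGGER.info('==================== %s ====================', phase)
            stages = list(phase(stages))
            _LOGGER.debug('%s %s', len(stages),
                          [len(s.transforms) for s in stages])
        return stages
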
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function fix_side_input_pcoll_coders at 0x7f4eadc4c378> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:15 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function lift_combiners at 0x7f4eadc4c400> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:15 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_sdf at 0x7f4eadc4c488> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:17 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:Stages: [
  'ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5\n  read/Read/_SDFBoundedSourceWrapper/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction\n  read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction:beam:transform:sdf_pair_with_restriction:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction\n  read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction:beam:transform:sdf_split_and_size_restrictions:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process\n  read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process:beam:transform:sdf_process_sized_element_and_restrictions:v1\n  must follow: read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction\n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9\n  read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_11\n  read/_PassThroughThenCleanup/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>)_12\n  read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14\n  read/_PassThroughThenCleanup/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15\n  read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_24\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_28\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_29\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_31\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
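
expand_sdf is the phase that replaced the single ParDo(SDFBoundedSourceDoFn) with the three-step splittable-DoFn protocol visible above: pair each element with an initial restriction, split the restriction into sized pieces, then process each piece (the log shows Process as the only step that must follow its splitter). A toy illustration of that decomposition for a list-backed source, with hypothetical helpers; real restrictions are richer objects such as offset ranges:

    def pair_with_restriction(source):
        # Step 1: pair the element with a restriction covering all of it.
        return (source, (0, len(source)))

    def split_and_size(pair, chunk=2):
        # Step 2: split the restriction into sized sub-restrictions.
        source, (start, stop) = pair
        for lo in range(start, stop, chunk):
            hi = min(lo + chunk, stop)
            yield (source, (lo, hi)), hi - lo  # ((element, restriction), size)

    def process(sized_pair):
        # Step 3: read only the claimed sub-range.
        (source, (lo, hi)), _size = sized_pair
        yield from source[lo:hi]

Each sized sub-restriction is an independent element after this rewrite, which is what lets the runner parallelize and dynamically split the read.
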
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function expand_gbk at 0x7f4eadc4c510> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:18 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:Stages: [
  'ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5\n  read/Read/_SDFBoundedSourceWrapper/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction\n  read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction:beam:transform:sdf_pair_with_restriction:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction\n  read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction:beam:transform:sdf_split_and_size_restrictions:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process\n  read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process:beam:transform:sdf_process_sized_element_and_restrictions:v1\n  must follow: read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction\n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9\n  read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_11\n  read/_PassThroughThenCleanup/Create/Impulse:beam:transform:impulse:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>)_12\n  read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14\n  read/_PassThroughThenCleanup/Create/Map(decode):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15\n  read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18\n  write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19\n  write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21\n  write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n  must follow: \n  downstream_side_inputs: ',
  'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n  must follow: write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_28\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_29\n  write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ',
  'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_31\n  write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: \n  downstream_side_inputs: ']
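
expand_gbk splits each logical GroupByKey into a runner-materialized pair: GroupByKey/Write (beam:sink:runner:0.1) feeds grouped state that GroupByKey/Read (beam:source:runner:0.1) later replays, with the Read stage required to follow the Write stage. For a bounded in-memory runner the buffer between the two amounts to something like this sketch (a stand-in for illustration, not the runner's actual grouping buffer):

    from collections import defaultdict

    class GroupingBuffer:
        # Sits between GroupByKey/Write and GroupByKey/Read.
        def __init__(self):
            self._groups = defaultdict(list)

        def write(self, kv):
            key, value = kv              # sink side: append value under key
            self._groups[key].append(value)

        def read(self):
            # source side: replay (key, values) pairs once writing is done
            return iter(self._groups.items())
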
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sink_flattens at 0x7f4eadc4c620> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:18 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function greedily_fuse at 0x7f4eadc4c6a8> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:4 [4, 9, 4, 4]
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:Stages: [
  '(((ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_1_split/Write)\n  read/Read/_SDFBoundedSourceWrapper/Impulse:beam:transform:impulse:v1\nread/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction:beam:transform:sdf_pair_with_restriction:v1\nread/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction:beam:transform:sdf_split_and_size_restrictions:v1\nref_PCollection_PCollection_1_split/Write:beam:sink:runner:0.1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  '(((ref_PCollection_PCollection_1_split/Read)+(((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+(((ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19)))+(ref_PCollection_PCollection_4/Write)))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  ref_PCollection_PCollection_1_split/Read:beam:source:runner:0.1\nread/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process:beam:transform:sdf_process_sized_element_and_restrictions:v1\nread/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nref_PCollection_PCollection_4/Write:beam:sink:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n  must follow: (((ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_1_split/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_4',
  '((ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_11)+((ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>)_12)+(ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14)))+(ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15)\n  read/_PassThroughThenCleanup/Create/Impulse:beam:transform:impulse:v1\nread/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\nread/_PassThroughThenCleanup/Create/Map(decode):beam:transform:pardo:v1\nread/_PassThroughThenCleanup/ParDo(RemoveJsonFiles):beam:transform:pardo:v1\n  must follow: (((ref_PCollection_PCollection_1_split/Read)+(((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+(((ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19)))+(ref_PCollection_PCollection_4/Write)))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ',
  '(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_28)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_29))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_31))\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: (((ref_PCollection_PCollection_1_split/Read)+(((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+(((ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19)))+(ref_PCollection_PCollection_4/Write)))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ']
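
greedily_fuse is where the stage count collapses from 18 to 4 ('4 [4, 9, 4, 4]'): adjacent producer/consumer stages are merged into one fused stage unless the PCollection between them has to be materialized, which is why the only boundaries left above sit at the runner-level GroupByKey Write/Read pair and around the side-input PCollection ref_PCollection_PCollection_4. A toy greedy pass over a linear chain of stages, each modeled as a plain list of transform names (not Beam's data model):

    def greedily_fuse(stages, must_materialize):
        # Merge each stage into its predecessor unless the edge between
        # them has to be materialized (GBK boundary, side input, ...).
        fused = [stages[0]]
        for stage in stages[1:]:
            if must_materialize(fused[-1], stage):
                fused.append(stage)
            else:
                fused[-1] = fused[-1] + stage  # concatenate transform lists
        return fused

    # e.g. fusing everything except across the GroupByKey boundary:
    # greedily_fuse([['Impulse'], ['Process'], ['GBK/Write'], ['GBK/Read'],
    #                ['WriteFn']], lambda a, b: b == ['GBK/Read'])
    # yields two fused stages.
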
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function read_to_impulse at 0x7f4eadc4c730> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:4 [4, 9, 4, 4]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function impulse_to_input at 0x7f4eadc4c7b8> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:4 [4, 9, 4, 4]
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:Stages: [
  '(((ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_1_split/Write)\n  read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction:beam:transform:sdf_pair_with_restriction:v1\nread/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction:beam:transform:sdf_split_and_size_restrictions:v1\nref_PCollection_PCollection_1_split/Write:beam:sink:runner:0.1\nread/Read/_SDFBoundedSourceWrapper/Impulse:beam:source:runner:0.1\n  must follow: \n  downstream_side_inputs: ref_PCollection_PCollection_4',
  '(((ref_PCollection_PCollection_1_split/Read)+(((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+(((ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19)))+(ref_PCollection_PCollection_4/Write)))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  ref_PCollection_PCollection_1_split/Read:beam:source:runner:0.1\nread/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process:beam:transform:sdf_process_sized_element_and_restrictions:v1\nread/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nref_PCollection_PCollection_4/Write:beam:sink:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n  must follow: (((ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_1_split/Write)\n  downstream_side_inputs: ref_PCollection_PCollection_4',
  '((ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_11)+((ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>)_12)+(ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14)))+(ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15)\n  read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>):beam:transform:pardo:v1\nread/_PassThroughThenCleanup/Create/Map(decode):beam:transform:pardo:v1\nread/_PassThroughThenCleanup/ParDo(RemoveJsonFiles):beam:transform:pardo:v1\nread/_PassThroughThenCleanup/Create/Impulse:beam:source:runner:0.1\n  must follow: (((ref_PCollection_PCollection_1_split/Read)+(((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+(((ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19)))+(ref_PCollection_PCollection_4/Write)))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ',
  '(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_28)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_29))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_31))\n  write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n  must follow: (((ref_PCollection_PCollection_1_split/Read)+(((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+(((ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19)))+(ref_PCollection_PCollection_4/Write)))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n  downstream_side_inputs: ']
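
impulse_to_input, in turn, rewrote each remaining beam:transform:impulse:v1 primitive into a runner-provided input: beam:source:runner:0.1 now appears in the payload lists above where the Impulse entries used to be. Semantically an Impulse is just a single empty byte string that kicks off processing, so after this phase the runner injects that element itself instead of asking the SDK harness to evaluate an impulse transform; in sketch form:

    def impulse():
        # An Impulse emits exactly one element: the empty byte string.
        # Everything downstream (e.g. the SDF read above) is triggered
        # by this one element arriving.
        yield b''
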
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function inject_timer_pcollections at 0x7f4eadc4c950> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:4 [4, 9, 4, 4]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function sort_stages at 0x7f4eadc4c9d8> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:4 [4, 9, 4, 4]
INFO:apache_beam.runners.portability.fn_api_runner_transforms:==================== <function window_pcollection_coders at 0x7f4eadc4ca60> ====================
DEBUG:apache_beam.runners.portability.fn_api_runner_transforms:4 [4, 9, 4, 4]
  downstream_side_inputs: ', 
'(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_28)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_29))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_31))\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
(((ref_PCollection_PCollection_1_split/Read)+(((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+(((ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19)))+(ref_PCollection_PCollection_4/Write)))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  downstream_side_inputs: ']
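
The write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey stages in the
dump above are Beam's Reshuffle pattern: pair each element with a random
key, reify its timestamp, group by key to force a redistribution, then
restore timestamps and strip the keys again. A simplified sketch of that
pattern (the real apache_beam.transforms.util.Reshuffle also performs the
Map(reify_timestamps)/FlatMap(restore_timestamps) steps visible above):

    import random
    import apache_beam as beam

    class ReshuffleSketch(beam.PTransform):
        def expand(self, pcoll):
            return (pcoll
                    # AddRandomKeys: decouple elements from their origin.
                    | beam.Map(lambda v: (random.getrandbits(32), v))
                    # GroupByKey: materializes and redistributes the data.
                    | beam.GroupByKey()
                    # RemoveRandomKeys: emit grouped values, unkeyed.
                    | beam.FlatMap(lambda kv: kv[1]))
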
INFO:apache_beam.runners.worker.statecache:Creating state cache with size 100
INFO:apache_beam.runners.portability.fn_api_runner:Created Worker handler 
<apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 
0x7f4ead750080> for environment urn: "beam:env:embedded_python:v1"

INFO:apache_beam.runners.portability.fn_api_runner:Running 
(((ref_AppliedPTransform_read/Read/_SDFBoundedSourceWrapper/Impulse_5)+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction))+(read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction))+(ref_PCollection_PCollection_1_split/Write)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation 
ref_PCollection_PCollection_1_split/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction
 output_tags=['out'], 
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction.out0,
 coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder, 
TupleCoder[LengthPrefixCoder[DillCoder], 
LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction
 output_tags=['out'], 
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction.out0,
 coder=WindowedValueCoder[TupleCoder[BytesCoder, TupleCoder[DillCoder, 
FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation 
read/Read/_SDFBoundedSourceWrapper/Impulse 
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/Impulse.out0,
 coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.io.gcp.bigquery_tools:Query SELECT * FROM (SELECT "apple" as 
fruit) UNION ALL (SELECT "orange" as fruit) does not reference any tables.
WARNING:apache_beam.io.gcp.bigquery_tools:Dataset 
apache-beam-testing:temp_dataset_12d0accdd31145ff9475ab7479d9c102 does not 
exist so we will create it as temporary with location=None
INFO:root:Job status: RUNNING
INFO:root:Job status: DONE
INFO:root:Job status: RUNNING
INFO:root:Job status: DONE
DEBUG:apache_beam.io.filesystem:Listing files in 
'gs://temp-storage-for-end-to-end-tests/temp-it/3fd9ee7737db47ecb459d7e818b3bcd5/bigquery-table-dump-'
DEBUG:apache_beam.io.filesystem:translate_pattern: 
'gs://temp-storage-for-end-to-end-tests/temp-it/3fd9ee7737db47ecb459d7e818b3bcd5/bigquery-table-dump-*.json'
 -> 
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/3fd9ee7737db47ecb459d7e818b3bcd5\\/bigquery\\-table\\-dump\\-[^/\\\\]*\\.json'
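translate_pattern is converting the GCS glob into the escaped regex shown
above: '*' becomes '[^/\\]*' (any run of characters except a path
separator) and every other character is escaped literally. A rough
equivalent of those two rules (note that re.escape on Python 3.5, as used
here, escapes all non-alphanumerics, which is why '-' and ':' appear
backslashed in the log):

    import re

    def glob_to_regex(pattern):
        # '*' -> any run of non-separator characters; everything else
        # is treated as a literal, matching the translation logged above.
        return ''.join(
            r'[^/\\]*' if ch == '*' else re.escape(ch)
            for ch in pattern)
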
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 1 files in 0.041459083557128906 
seconds.
DEBUG:apache_beam.io.filesystem:translate_pattern: 
'gs://temp-storage-for-end-to-end-tests/temp-it/3fd9ee7737db47ecb459d7e818b3bcd5/bigquery-table-dump-000000000000.json'
 -> 
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/3fd9ee7737db47ecb459d7e818b3bcd5\\/bigquery\\-table\\-dump\\-000000000000\\.json'
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation 
read/Read/_SDFBoundedSourceWrapper/Impulse 
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/Impulse.out0,
 coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction
 output_tags=['out'], 
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/PairWithRestriction.out0,
 coder=WindowedValueCoder[TupleCoder[BytesCoder, TupleCoder[DillCoder, 
FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction
 output_tags=['out'], 
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/SplitAndSizeRestriction.out0,
 coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder, 
TupleCoder[LengthPrefixCoder[DillCoder], 
LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation 
ref_PCollection_PCollection_1_split/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner:Wait for the bundle 
bundle_9 to finish.
INFO:apache_beam.runners.portability.fn_api_runner:Running 
(((ref_PCollection_PCollection_1_split/Read)+(((read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process)+(((ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_18))+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_19)))+(ref_PCollection_PCollection_4/Write)))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_21))+((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_23)+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataOutputOperation 
ref_PCollection_PCollection_4/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) 
output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0,
 coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], 
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0,
 coder=WindowedValueCoder[TupleCoder[VarIntCoder, 
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, 
FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
write/_StreamToBigQuery/AddInsertIds output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, 
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
write/_StreamToBigQuery/AppendDestination output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, 
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], 
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough) 
output_tags=['None', 'cleanup_signal'], 
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], 
SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out1,
 coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], 
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start 
<SdfProcessSizedElements 
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process 
output_tags=['None'], 
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process.out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation 
ref_PCollection_PCollection_1_split/Read 
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_1_split/Read.out0, 
coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder, 
TupleCoder[LengthPrefixCoder[DillCoder], 
LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
DEBUG:apache_beam.io.filesystem:translate_pattern: 
'gs://temp-storage-for-end-to-end-tests/temp-it/3fd9ee7737db47ecb459d7e818b3bcd5/bigquery-table-dump-000000000000.json'
 -> 
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/3fd9ee7737db47ecb459d7e818b3bcd5\\/bigquery\\-table\\-dump\\-000000000000\\.json'
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation 
ref_PCollection_PCollection_1_split/Read 
receivers=[SingletonConsumerSet[ref_PCollection_PCollection_1_split/Read.out0, 
coder=WindowedValueCoder[TupleCoder[TupleCoder[BytesCoder, 
TupleCoder[LengthPrefixCoder[DillCoder], 
LengthPrefixCoder[FastPrimitivesCoder]]], FloatCoder]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish 
<SdfProcessSizedElements 
read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process 
output_tags=['None'], 
receivers=[SingletonConsumerSet[read/Read/_SDFBoundedSourceWrapper/ParDo(SDFBoundedSourceDoFn)/Process.out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough) 
output_tags=['None', 'cleanup_signal'], 
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1], 
SingletonConsumerSet[read/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).out1,
 coder=WindowedValueCoder[LengthPrefixCoder[FastPrimitivesCoder]], 
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
write/_StreamToBigQuery/AppendDestination output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, 
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], 
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
write/_StreamToBigQuery/AddInsertIds output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, 
coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0,
 coder=WindowedValueCoder[TupleCoder[VarIntCoder, 
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, 
FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) 
output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0,
 coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], 
LengthPrefixCoder[FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation 
ref_PCollection_PCollection_4/Write >
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataOutputOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write >
DEBUG:apache_beam.runners.portability.fn_api_runner:Wait for the bundle 
bundle_10 to finish.
INFO:apache_beam.runners.portability.fn_api_runner:Running 
((ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Impulse_11)+((ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/FlatMap(<lambda
 at 
core.py:2643>)_12)+(ref_AppliedPTransform_read/_PassThroughThenCleanup/Create/Map(decode)_14)))+(ref_AppliedPTransform_read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)_15)
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles) output_tags=['None'], 
receivers=[ConsumerSet[read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
read/_PassThroughThenCleanup/Create/Map(decode) output_tags=['None'], 
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Map(decode).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>) 
output_tags=['None'], 
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/FlatMap(<lambda
 at core.py:2643>).out0, coder=WindowedValueCoder[BytesCoder], 
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation 
read/_PassThroughThenCleanup/Create/Impulse 
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Impulse.out0,
 coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.io.filesystem:Listing files in 
'gs://temp-storage-for-end-to-end-tests/temp-it/3fd9ee7737db47ecb459d7e818b3bcd5/bigquery-table-dump-'
DEBUG:apache_beam.io.filesystem:translate_pattern: 
'gs://temp-storage-for-end-to-end-tests/temp-it/3fd9ee7737db47ecb459d7e818b3bcd5/bigquery-table-dump-*.json'
 -> 
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/3fd9ee7737db47ecb459d7e818b3bcd5\\/bigquery\\-table\\-dump\\-[^/\\\\]*\\.json'
INFO:apache_beam.io.gcp.gcsio:Starting the size estimation of the input
INFO:apache_beam.io.gcp.gcsio:Finished listing 1 files in 0.03489804267883301 
seconds.
DEBUG:root:RemoveJsonFiles: matched 1 files
DEBUG:apache_beam.io.filesystem:translate_pattern: 
'gs://temp-storage-for-end-to-end-tests/temp-it/3fd9ee7737db47ecb459d7e818b3bcd5/bigquery-table-dump-000000000000.json'
 -> 
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/temp\\-it\\/3fd9ee7737db47ecb459d7e818b3bcd5\\/bigquery\\-table\\-dump\\-000000000000\\.json'
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation 
read/_PassThroughThenCleanup/Create/Impulse 
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Impulse.out0,
 coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
read/_PassThroughThenCleanup/Create/FlatMap(<lambda at core.py:2643>) 
output_tags=['None'], 
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/FlatMap(<lambda
 at core.py:2643>).out0, coder=WindowedValueCoder[BytesCoder], 
len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
read/_PassThroughThenCleanup/Create/Map(decode) output_tags=['None'], 
receivers=[SingletonConsumerSet[read/_PassThroughThenCleanup/Create/Map(decode).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles) output_tags=['None'], 
receivers=[ConsumerSet[read/_PassThroughThenCleanup/ParDo(RemoveJsonFiles).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.portability.fn_api_runner:Wait for the bundle 
bundle_11 to finish.
INFO:apache_beam.runners.portability.fn_api_runner:Running 
(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_28)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_29))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_31))
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) 
output_tags=['FailedRows', 'None'], 
receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], 
ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0,
 coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)
 output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0,
 coder=WindowedValueCoder[TupleCoder[VarIntCoder, 
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, 
FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:start <DataInputOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0,
 coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], 
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.io.gcp.bigquery:Creating or getting table <TableReference
 datasetId: 'python_query_to_table_15836477468105'
 projectId: 'apache-beam-testing'
 tableId: 'output_table'> with schema {'fields': [{'name': 'fruit', 'mode': 
'NULLABLE', 'type': 'STRING'}]}.
DEBUG:apache_beam.io.gcp.bigquery_tools:Created the table with id output_table
INFO:apache_beam.io.gcp.bigquery_tools:Created table 
apache-beam-testing.python_query_to_table_15836477468105.output_table with 
schema <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'fruit'
 type: 'STRING'>]>. Result: <Table
 creationTime: 1583647761396
 etag: '5cZQRzJWhQYuHdtiEsQCuw=='
 id: 'apache-beam-testing:python_query_to_table_15836477468105.output_table'
 kind: 'bigquery#table'
 lastModifiedTime: 1583647761531
 location: 'US'
 numBytes: 0
 numLongTermBytes: 0
 numRows: 0
 schema: <TableSchema
 fields: [<TableFieldSchema
 fields: []
 mode: 'NULLABLE'
 name: 'fruit'
 type: 'STRING'>]>
 selfLink: 
'https://www.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/python_query_to_table_15836477468105/tables/output_table'
 tableReference: <TableReference
 datasetId: 'python_query_to_table_15836477468105'
 projectId: 'apache-beam-testing'
 tableId: 'output_table'>
 type: 'TABLE'>.
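The schema dict logged when the table is created is the expanded form of
the 'fruit:STRING' shorthand that WriteToBigQuery also accepts. A
hypothetical converter showing the relationship (Beam has its own internal
helper for this; the function below is purely illustrative):

    def schema_from_shorthand(spec):
        # 'name:TYPE,name2:TYPE2' -> {'fields': [{'name': ..., ...}]}
        # Fields default to NULLABLE, as in the dict logged above.
        return {'fields': [
            {'name': name, 'type': ftype, 'mode': 'NULLABLE'}
            for name, ftype in (f.split(':') for f in spec.split(','))
        ]}
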
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DataInputOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0,
 coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], 
IterableCoder[LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)
 output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0,
 coder=WindowedValueCoder[TupleCoder[VarIntCoder, 
TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, 
FastPrimitivesCoder]]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['None'], 
receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0,
 coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, 
TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
DEBUG:apache_beam.runners.worker.bundle_processor:finish <DoOperation 
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) 
output_tags=['FailedRows', 'None'], 
receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], 
ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1,
 coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
DEBUG:apache_beam.io.gcp.bigquery:Attempting to flush to all destinations. 
Total buffered: 2
DEBUG:apache_beam.io.gcp.bigquery:Flushing data to 
apache-beam-testing:python_query_to_table_15836477468105.output_table. Total 2 
rows.
DEBUG:apache_beam.runners.portability.fn_api_runner:Wait for the bundle 
bundle_12 to finish.
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query 
SELECT fruit from `python_query_to_table_15836477468105.output_table`; to BQ
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 192
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/f48aa91c-a292-4680-a739-0b27e95e77ba?maxResults=0&location=US
 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon27d3f823_f045_48a4_b1f2_e09bc74f4ed3/data
 HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT 
fruit from `python_query_to_table_15836477468105.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 
158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
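The matcher's verification is checksum-based rather than row-by-row: it
reads the rows back with the query above, normalizes them, and compares a
SHA-1 digest (the 40-hex-character values logged) against an expected
checksum baked into the test. A sketch of that idea; the exact row
formatting and separator used by Beam's matcher are assumptions here:

    import hashlib

    def compute_checksum(rows):
        # Sort the stringified rows first so the digest is insensitive
        # to the (unordered) result order BigQuery returns.
        row_strs = sorted(str(row) for row in rows)
        return hashlib.sha1(
            ','.join(row_strs).encode('utf-8')).hexdigest()
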
test_datastore_write_limit 
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) 
... ok
test_streaming_data_only 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) 
... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) 
... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql_kms_key_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... SKIP: This test doesn't work on DirectRunner.

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py35.xml
----------------------------------------------------------------------
XML: 
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 19 tests in 57.774s

OK (SKIP=1)

FAILURE: Build completed with 2 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:dataflow:py35:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task 
':sdks:python:test-suites:portable:py35:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.
==============================================================================
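Both failures come from installGcpTest tasks whose underlying 'sh' command
exited non-zero; the pip/virtualenv output that actually failed is not in
this excerpt. Following Gradle's own suggestion, a local rerun of the first
failing task with diagnostics enabled would look like:

    ./gradlew :sdks:python:test-suites:dataflow:py35:installGcpTest --stacktrace --info
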

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 2m 58s
81 actionable tasks: 59 executed, 22 from cache

Publishing build scan...
https://gradle.com/s/nzrqh5fpbky72

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
