See <https://builds.apache.org/job/beam_PostCommit_Python35/1097/display/redirect?page=changes>

Changes:

[amyrvold] [BEAM-8832] Allow GCS staging upload chunk size to be increased >1M when


------------------------------------------
[...truncated 95.48 KB...]
Requirement already satisfied: zipp>=0.5 in <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages> (from importlib-metadata>=0.12; python_version < "3.8"->pytest<5.0,>=4.4.0->apache-beam==2.18.0.dev0) (0.6.0)
Collecting apipkg>=1.4
  Using cached https://files.pythonhosted.org/packages/67/08/4815a09603fc800209431bec5b8bd2acf2f95abdfb558a44a42507fb94da/apipkg-1.5-py2.py3-none-any.whl
Installing collected packages: crcmod, dill, fastavro, future, idna, chardet, urllib3, certifi, requests, docopt, hdfs, httplib2, pbr, mock, numpy, pymongo, pyasn1, pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, pytz, avro-python3, pyarrow, cachetools, monotonic, fasteners, google-apitools, google-auth, googleapis-common-protos, google-api-core, google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, google-cloud-pubsub, google-resumable-media, google-cloud-bigquery, google-cloud-bigtable, nose, nose-xunitmp, pandas, parameterized, pyhamcrest, pyyaml, requests-mock, tenacity, pathlib2, attrs, atomicwrites, wcwidth, packaging, pytest, pytest-forked, apipkg, execnet, pytest-xdist, apache-beam

> Task :runners:google-cloud-dataflow-java:worker:shadowJar

> Task :sdks:python:test-suites:direct:py35:postCommitIT
>>> RUNNING integration tests with pipeline options: --runner=TestDirectRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=build/apache-beam.tar.gz --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --tests=apache_beam.examples.wordcount_it_test:WordCountIT.test_wordcount_it,apache_beam.io.gcp.pubsub_integration_test:PubSubIntegrationTest,apache_beam.io.gcp.big_query_query_to_table_it_test:BigQueryQueryToTableIT,apache_beam.io.gcp.bigquery_io_read_it_test,apache_beam.io.gcp.bigquery_read_it_test,apache_beam.io.gcp.bigquery_write_it_test,apache_beam.io.gcp.datastore.v1new.datastore_write_it_test --nocapture --processes=8 --process-timeout=4500
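
For reference, a minimal sketch of how a flag list like the one echoed above is turned into pipeline options, using the public apache_beam options API; the flag values are copied from the log, and this is illustrative rather than the test harness's actual code:

    # Minimal sketch, assuming the public apache_beam options API: the harness
    # parses flags like those echoed above into a PipelineOptions object.
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    flags = [
        '--runner=TestDirectRunner',
        '--project=apache-beam-testing',
        '--temp_location=gs://temp-storage-for-end-to-end-tests/temp-it',
    ]
    options = PipelineOptions(flags)
    # Typed views expose the individual flags:
    assert options.view_as(GoogleCloudOptions).project == 'apache-beam-testing'
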
running nosetests
running egg_info
writing entry points to apache_beam.egg-info/entry_points.txt
writing apache_beam.egg-info/PKG-INFO
writing top-level names to apache_beam.egg-info/top_level.txt
writing requirements to apache_beam.egg-info/requires.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/1398941889/lib/python3.5/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.18.0.dev' to '2.18.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'

> Task :runners:flink:1.9:job-server:shadowJar

> Task :sdks:python:test-suites:dataflow:py35:installGcpTest
  Running setup.py develop for apache-beam
Successfully installed apache-beam apipkg-1.5 atomicwrites-1.3.0 attrs-19.3.0 avro-python3-1.9.1 cachetools-3.1.1 certifi-2019.9.11 chardet-3.0.4 crcmod-1.7 dill-0.3.1.1 docopt-0.6.2 execnet-1.7.1 fastavro-0.21.24 fasteners-0.15 future-0.18.2 google-api-core-1.14.3 google-apitools-0.5.28 google-auth-1.7.1 google-cloud-bigquery-1.17.1 google-cloud-bigtable-1.0.0 google-cloud-core-1.0.3 google-cloud-datastore-1.7.4 google-cloud-pubsub-1.0.2 google-resumable-media-0.4.1 googleapis-common-protos-1.6.0 grpc-google-iam-v1-0.12.3 hdfs-2.5.8 httplib2-0.12.0 idna-2.8 mock-2.0.0 monotonic-1.5 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.17.4 oauth2client-3.0.0 packaging-19.2 pandas-0.24.2 parameterized-0.6.3 pathlib2-2.3.5 pbr-5.4.4 pyarrow-0.15.1 pyasn1-0.4.8 pyasn1-modules-0.2.7 pydot-1.4.1 pyhamcrest-1.9.0 pymongo-3.9.0 pyparsing-2.4.5 pytest-4.6.6 pytest-forked-1.1.3 pytest-xdist-1.30.0 python-dateutil-2.8.1 pytz-2019.3 pyyaml-5.1.2 requests-2.22.0 requests-mock-1.7.0 rsa-4.0 tenacity-5.1.5 urllib3-1.25.7 wcwidth-0.1.7

> Task :sdks:python:test-suites:dataflow:py35:postCommitIT FAILED
>>> RUNNING integration tests with pipeline options: --runner=TestDataflowRunner --project=apache-beam-testing --staging_location=gs://temp-storage-for-end-to-end-tests/staging-it --temp_location=gs://temp-storage-for-end-to-end-tests/temp-it --output=gs://temp-storage-for-end-to-end-tests/py-it-cloud/output --sdk_location=<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/build/apache-beam.tar.gz> --requirements_file=postcommit_requirements.txt --num_workers=1 --sleep_secs=20 --dataflow_worker_jar=<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/runners/google-cloud-dataflow-java/worker/build/libs/beam-runners-google-cloud-dataflow-java-fn-api-worker-2.18.0-SNAPSHOT.jar> --kms_key_name=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test --dataflow_kms_key=projects/apache-beam-testing/locations/global/keyRings/beam-it/cryptoKeys/test
>>>   test options: --nocapture --processes=8 --process-timeout=4500 --attr=IT
running nosetests
running egg_info
writing apache_beam.egg-info/PKG-INFO
writing requirements to apache_beam.egg-info/requires.txt
writing top-level names to apache_beam.egg-info/top_level.txt
writing dependency_links to apache_beam.egg-info/dependency_links.txt
writing entry points to apache_beam.egg-info/entry_points.txt
reading manifest file 'apache_beam.egg-info/SOURCES.txt'
reading manifest template 'MANIFEST.in'
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/build/gradleenv/-1734967054/lib/python3.5/site-packages/setuptools/dist.py>:476: UserWarning: Normalizing '2.18.0.dev' to '2.18.0.dev0'
  normalized_version,
warning: no files found matching 'README.md'
warning: no files found matching 'NOTICE'
warning: no files found matching 'LICENSE'
writing manifest file 'apache_beam.egg-info/SOURCES.txt'
error: [Errno 17] File exists: '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/.eggs/pexpect-4.7.0-py3.5.egg>'

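The EEXIST above typically indicates two concurrent setup.py invocations racing to materialize the same egg under .eggs/. A hypothetical sketch of the failure mode, not the setuptools code itself (the egg path is reused from the log; names are illustrative):

    # Hypothetical sketch of the race behind [Errno 17]: a bare mkdir raises
    # FileExistsError (EEXIST) when another process created the path first,
    # whereas an idempotent create does not.
    import os

    os.makedirs('.eggs', exist_ok=True)         # parent directory
    egg_path = '.eggs/pexpect-4.7.0-py3.5.egg'  # path taken from the log above
    try:
        os.mkdir(egg_path)                      # loser of the race gets EEXIST
    except FileExistsError:
        pass                                    # tolerate a concurrent winner
    os.makedirs(egg_path, exist_ok=True)        # or create idempotently
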
> Task :runners:spark:job-server:shadowJar

> Task :sdks:python:test-suites:direct:py35:postCommitIT
test_bigquery_read_1M_python (apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_datastore_write_limit (apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) ... ok
test_big_query_read (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_big_query_read_new_types (apache_beam.io.gcp.bigquery_read_it_test.BigQueryReadIntegrationTests) ... ok
test_streaming_data_only (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes (apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_big_query_legacy_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_new_types (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... ok
test_big_query_standard_sql_kms_key_native (apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) ... SKIP: This test doesn't work on DirectRunner.
test_big_query_write (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... FAIL
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok

======================================================================
FAIL: test_big_query_write_without_schema (apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_write_it_test.py>", line 272, in test_big_query_write_without_schema
    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 436, in __exit__
    self.run().wait_until_finish()
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 416, in run
    self._options).run(False)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/pipeline.py>", line 429, in run
    return self.runner.run_pipeline(self, self._options)
  File "<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/runners/direct/test_direct_runner.py>", line 51, in run_pipeline
    hc_assert_that(self.result, pickler.loads(on_success_matcher))
AssertionError: 
Expected: (Expected data is [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0))])
     but: Expected data is [(b'xyw', datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'abc', datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', datetime.date(3000, 12, 31), datetime.time(23, 59, 59)), (b'\xab\xac\xad', datetime.date(2000, 1, 1), datetime.time(0, 0))] Actual data is []

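For context, this failing check is the on_success_matcher pattern these integration tests use. A simplified sketch based on the apache_beam test utilities (the dataset/table names and expected rows here are illustrative, not the test's actual fixtures):

    # Simplified sketch of the verification pattern behind the assertion above:
    # the pipeline runs, then the matcher issues a SELECT against BigQuery and
    # hamcrest compares the returned rows with the expected list. "Actual data
    # is []" therefore means the verification query saw no rows yet (streaming
    # inserts can be slow to become visible), not necessarily a failed write.
    import hamcrest as hc

    import apache_beam as beam
    from apache_beam.io.gcp.tests.bigquery_matcher import BigqueryFullResultMatcher
    from apache_beam.testing.test_pipeline import TestPipeline

    test_pipeline = TestPipeline(is_integration_test=True)
    verifier = BigqueryFullResultMatcher(
        project='apache-beam-testing',
        query='SELECT bytes, date, time FROM mydataset.python_no_schema_table',
        data=[(b'xyw', '2011-01-01', '23:59:59.999999')])  # illustrative rows
    args = test_pipeline.get_full_options_as_args(
        on_success_matcher=hc.all_of(verifier))
    with beam.Pipeline(argv=args) as p:
        _ = (p
             | beam.Create([{'bytes': b'xyw', 'date': '2011-01-01',
                             'time': '23:59:59.999999'}])
             | beam.io.WriteToBigQuery(
                 'mydataset.python_no_schema_table',
                 write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND))
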
-------------------- >> begin captured logging << --------------------
apache_beam.io.gcp.bigquery_write_it_test: INFO: Created dataset python_write_to_table_15748883127345 in project apache-beam-testing
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function annotate_downstream_side_inputs at 0x7fd943e1fd08> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  
must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function fix_side_input_pcoll_coders at 0x7fd943e1fe18> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  
must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function lift_combiners at 0x7fd943e1fea0> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  
must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function expand_sdf at 0x7fd943e1ff28> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 9 [1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  
must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey_12\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey:beam:transform:group_by_key:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function expand_gbk at 0x7fd943e20048> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  
must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n
  must follow: 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function sink_flattens at 0x7fd943e20158> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 10 [1, 1, 1, 1, 1, 1, 1, 1, 1, 1]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['ref_AppliedPTransform_create/Read_3\n  create/Read:beam:transform:read:v1\n  
must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6\n  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\n  must 
follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7\n  
write/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\n  must follow: \n 
 downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9\n
  
write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\n 
 must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read\n  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\n
  must follow: 
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write\n  
downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17\n
  
write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ', 
'ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19\n
  
write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: \n  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function greedily_fuse at 0x7fd943e201e0> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 2 [6, 4]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  
create/Read:beam:transform:read:v1\nwrite/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function read_to_impulse at 0x7fd943e20268> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 2 [7, 4]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read/Impulse:beam:transform:impulse:v1\ncreate/Read:beam:transform:read_from_impulse_python:v1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function impulse_to_input at 0x7fd943e202f0> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 2 [7, 4]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function inject_timer_pcollections at 0x7fd943e20488> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 2 [7, 4]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function sort_stages at 0x7fd943e20510> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 2 [7, 4]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  downstream_side_inputs: ']
apache_beam.runners.portability.fn_api_runner_transforms: INFO: ==================== <function window_pcollection_coders at 0x7fd943e20598> ====================
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: 2 [7, 4]
apache_beam.runners.portability.fn_api_runner_transforms: DEBUG: Stages: 
['((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  
write/_StreamToBigQuery/AppendDestination:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/AddInsertIds:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/AddRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write:beam:sink:runner:0.1\ncreate/Read:beam:transform:read_from_impulse_python:v1\ncreate/Read/Impulse:beam:source:runner:0.1\n
  must follow: \n  downstream_side_inputs: ', 
'(((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)\n
  
write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read:beam:source:runner:0.1\nwrite/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps):beam:transform:pardo:v1\nwrite/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys:beam:transform:pardo:v1\nwrite/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn):beam:transform:pardo:v1\n
  must follow: 
((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))\n
  downstream_side_inputs: ']
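
The banners above (annotate_downstream_side_inputs through window_pcollection_coders) are the portable FnApiRunner's pipeline-optimization passes, which TestDirectRunner delegates to here. A minimal sketch that exercises the same passes, assuming the 2.18-era module layout:

    # Minimal sketch, assuming the 2.18-era module layout: any pipeline executed
    # on the portable FnApiRunner goes through the optimization phases logged
    # above (lift_combiners, greedily_fuse, ...) before the fused stages run.
    import logging

    import apache_beam as beam
    from apache_beam.runners.portability.fn_api_runner import FnApiRunner

    logging.getLogger().setLevel(logging.DEBUG)   # surface the phase banners
    with beam.Pipeline(runner=FnApiRunner()) as p:
        _ = (p
             | beam.Create([('k', 1), ('k', 2)])
             | beam.GroupByKey()                  # triggers the expand_gbk pass
             | beam.Map(print))
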
apache_beam.runners.worker.statecache: INFO: Creating state cache with size 100
apache_beam.runners.portability.fn_api_runner: INFO: Created Worker handler <apache_beam.runners.portability.fn_api_runner.EmbeddedWorkerHandler object at 0x7fd943ae9da0> for environment urn: "beam:env:embedded_python:v1"

apache_beam.runners.portability.fn_api_runner: INFO: Running ((ref_AppliedPTransform_create/Read_3)+((ref_AppliedPTransform_write/_StreamToBigQuery/AppendDestination_6)+(ref_AppliedPTransform_write/_StreamToBigQuery/AddInsertIds_7)))+(((ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys_9)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps)_11))+(write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Write))
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataOutputOperation >
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[create/Read/Impulse.out0, coder=WindowedValueCoder[BytesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <ImpulseReadOperation receivers=[SingletonConsumerSet[create/Read.out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/AppendDestination output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AppendDestination.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/AddInsertIds output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/AddInsertIds.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/AddRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/Map(reify_timestamps).out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], LengthPrefixCoder[FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataOutputOperation >
apache_beam.runners.portability.fn_api_runner: DEBUG: Wait for the bundle bundle_16 to finish.
apache_beam.runners.portability.fn_api_runner: INFO: Running (((write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read)+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps)_16))+(ref_AppliedPTransform_write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys_17))+(ref_AppliedPTransform_write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn)_19)
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: start <DataInputOperation receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
apache_beam.io.gcp.bigquery: DEBUG: Creating or getting table <TableReference
 datasetId: 'python_write_to_table_15748883127345'
 projectId: 'apache-beam-testing'
 tableId: 'python_no_schema_table'> with schema None.
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DataInputOperation receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/GroupByKey/Read.out0, coder=WindowedValueCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], IterableCoder[TupleCoder[TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], TupleCoder[LengthPrefixCoder[FastPrimitivesCoder], LengthPrefixCoder[FastPrimitivesCoder]]], LengthPrefixCoder[FastPrimitivesCoder]]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps) output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/ReshufflePerKey/FlatMap(restore_timestamps).out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys output_tags=['out'], receivers=[SingletonConsumerSet[write/_StreamToBigQuery/CommitInsertIds/RemoveRandomKeys.out0, coder=WindowedValueCoder[TupleCoder[FastPrimitivesCoder, TupleCoder[FastPrimitivesCoder, FastPrimitivesCoder]]], len(consumers)=1]]>
apache_beam.runners.worker.bundle_processor: DEBUG: finish <DoOperation write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn) output_tags=['out_FailedRows', 'out'], receivers=[ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out0, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0], ConsumerSet[write/_StreamToBigQuery/StreamInsertRows/ParDo(BigQueryWriteFn).out1, coder=WindowedValueCoder[FastPrimitivesCoder], len(consumers)=0]]>
apache_beam.io.gcp.bigquery: DEBUG: Attempting to flush to all destinations. Total buffered: 4
apache_beam.io.gcp.bigquery: DEBUG: Flushing data to apache-beam-testing:python_write_to_table_15748883127345.python_no_schema_table. Total 4 rows.
apache_beam.runners.portability.fn_api_runner: DEBUG: Wait for the bundle bundle_17 to finish.
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Attempting to perform query SELECT bytes, date, time FROM python_write_to_table_15748883127345.python_no_schema_table to BQ
google.auth.transport._http_client: DEBUG: Making request: GET http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3, connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1): metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 200 144
google.auth.transport.requests: DEBUG: Making request: GET http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET /computeMetadata/v1/instance/service-accounts/[email protected]/token HTTP/1.1" 200 181
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1): www.googleapis.com:443
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "POST /bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/queries/52826e4a-699a-48d1-a148-3ee7420940ea?location=US&timeoutMs=10000&maxResults=0 HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/jobs/52826e4a-699a-48d1-a148-3ee7420940ea?location=US HTTP/1.1" 200 None
urllib3.connectionpool: DEBUG: https://www.googleapis.com:443 "GET /bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anond8d245a4b53a85e7cf1be96ca0845f6fa313724b/data HTTP/1.1" 200 None
apache_beam.io.gcp.tests.bigquery_matcher: INFO: Result of query is: []
apache_beam.io.gcp.bigquery_write_it_test: INFO: Deleting dataset python_write_to_table_15748883127345 in project apache-beam-testing
--------------------- >> end captured logging << ---------------------
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1217: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:797: BeamDeprecationWarning: options is deprecated since First stable release. References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location

----------------------------------------------------------------------
XML: nosetests-postCommitIT-direct-py35.xml
----------------------------------------------------------------------
XML: <https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 15 tests in 25.046s

FAILED (SKIP=1, failures=1)

> Task :sdks:python:test-suites:direct:py35:postCommitIT FAILED

FAILURE: Build completed with 3 failures.

1: Task failed with an exception.
-----------
* What went wrong:
Execution failed for task ':sdks:python:test-suites:portable:py35:installGcpTest'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

2: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/dataflow/py35/build.gradle>' line: 56

* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

3: Task failed with an exception.
-----------
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Python35/ws/src/sdks/python/test-suites/direct/py35/build.gradle>' line: 51

* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py35:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.
==============================================================================

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1m 41s
80 actionable tasks: 60 executed, 20 from cache

Publishing build scan...
https://gradle.com/s/kqm6w3cp3xpom

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
