See 
<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/131/display/redirect?page=changes>

Changes:

[Robert Bradshaw] [BEAM-10988] Partition dataframes according to size estimates.

[Robert Bradshaw] [BEAM-10988] Batch dataframes across partitions on the same 
worker.

[noreply] [BEAM-9196] Update testcontainers to 1.15.0-rc2 (#13031)

[Kenneth Knowles] Add gradle target for ValidatesRunner against Dataflow 
forcing streaming

[Kenneth Knowles] Add Jenkins job for ValidatesRunner against Dataflow forcing 
streaming

[noreply] Delete unneeded PCollections in pipeline_from_stages() (#13014)

[noreply] Revert "[BEAM-9196] Update testcontainers to 1.15.0-rc2 (#13031)"


------------------------------------------
[...truncated 35.43 KB...]
  Using cached oauthlib-3.1.0-py2.py3-none-any.whl (147 kB)
Collecting pycparser
  Using cached pycparser-2.20-py2.py3-none-any.whl (112 kB)
Collecting mypy-extensions>=0.3.0
  Using cached mypy_extensions-0.4.3-py2.py3-none-any.whl (4.5 kB)
Building wheels for collected packages: apache-beam
  Building wheel for apache-beam (setup.py): started
  Building wheel for apache-beam (setup.py): finished with status 'done'
  Created wheel for apache-beam: 
filename=apache_beam-2.26.0.dev0-py3-none-any.whl size=2288122 
sha256=ffcd792b7de1828baf8106b5ea70f5f93091add5c06aaa8b961cd942db47a61d
  Stored in directory: 
/home/jenkins/.cache/pip/wheels/d1/c9/65/2dcfb578dbb9566e63f5563e55b8c503d2d116c400c6121d1c
Successfully built apache-beam
Installing collected packages: crcmod, dill, fastavro, idna, certifi, urllib3, 
chardet, requests, docopt, hdfs, httplib2, pbr, mock, numpy, pymongo, pyasn1, 
pyasn1-modules, rsa, oauth2client, pyparsing, pydot, python-dateutil, pytz, 
typing-extensions, avro-python3, pyarrow, jmespath, botocore, s3transfer, 
boto3, azure-core, isodate, oauthlib, requests-oauthlib, msrest, pycparser, 
cffi, cryptography, azure-storage-blob, cachetools, monotonic, fasteners, 
google-apitools, google-auth, googleapis-common-protos, google-api-core, 
google-cloud-core, google-cloud-datastore, grpc-google-iam-v1, 
google-cloud-pubsub, google-crc32c, google-resumable-media, 
google-cloud-bigquery, google-cloud-bigtable, google-cloud-spanner, grpcio-gcp, 
google-cloud-dlp, google-cloud-language, google-cloud-videointelligence, 
google-cloud-vision, pyyaml, mypy-extensions, typing-inspect, libcst, 
proto-plus, google-cloud-build, freezegun, nose, nose-xunitmp, parameterized, 
pyhamcrest, requests-mock, tenacity, attrs, packaging, more-itertools, 
atomicwrites, wcwidth, pytest, apipkg, execnet, pytest-forked, pytest-xdist, 
pytest-timeout, pandas, sqlalchemy, psycopg2-binary, deprecation, wrapt, 
colorama, crayons, blindspin, websocket-client, docker, testcontainers, 
apache-beam
Successfully installed apache-beam-2.26.0.dev0 apipkg-1.5 atomicwrites-1.4.0 
attrs-20.2.0 avro-python3-1.9.2.1 azure-core-1.8.2 azure-storage-blob-12.5.0 
blindspin-2.0.1 boto3-1.15.16 botocore-1.18.16 cachetools-4.1.1 
certifi-2020.6.20 cffi-1.14.3 chardet-3.0.4 colorama-0.4.3 crayons-0.4.0 
crcmod-1.7 cryptography-3.1.1 deprecation-2.1.0 dill-0.3.1.1 docker-4.3.1 
docopt-0.6.2 execnet-1.7.1 fastavro-1.0.0.post1 fasteners-0.15 freezegun-1.0.0 
google-api-core-1.22.4 google-apitools-0.5.31 google-auth-1.22.1 
google-cloud-bigquery-1.28.0 google-cloud-bigtable-1.5.1 
google-cloud-build-2.0.0 google-cloud-core-1.4.3 google-cloud-datastore-1.15.3 
google-cloud-dlp-1.0.0 google-cloud-language-1.3.0 google-cloud-pubsub-1.7.0 
google-cloud-spanner-1.19.0 google-cloud-videointelligence-1.16.0 
google-cloud-vision-1.0.0 google-crc32c-1.0.0 google-resumable-media-1.1.0 
googleapis-common-protos-1.52.0 grpc-google-iam-v1-0.12.3 grpcio-gcp-0.2.2 
hdfs-2.5.8 httplib2-0.17.4 idna-2.10 isodate-0.6.0 jmespath-0.10.0 
libcst-0.3.12 mock-2.0.0 monotonic-1.5 more-itertools-8.5.0 msrest-0.6.19 
mypy-extensions-0.4.3 nose-1.3.7 nose-xunitmp-0.4.1 numpy-1.19.2 
oauth2client-4.1.3 oauthlib-3.1.0 packaging-20.4 pandas-1.1.3 
parameterized-0.7.4 pbr-5.5.0 proto-plus-1.10.1 psycopg2-binary-2.8.6 
pyarrow-0.17.1 pyasn1-0.4.8 pyasn1-modules-0.2.8 pycparser-2.20 pydot-1.4.1 
pyhamcrest-1.10.1 pymongo-3.11.0 pyparsing-2.4.7 pytest-4.6.11 
pytest-forked-1.3.0 pytest-timeout-1.4.2 pytest-xdist-1.34.0 
python-dateutil-2.8.1 pytz-2020.1 pyyaml-5.3.1 requests-2.24.0 
requests-mock-1.8.0 requests-oauthlib-1.3.0 rsa-4.6 s3transfer-0.3.3 
sqlalchemy-1.3.19 tenacity-5.1.5 testcontainers-3.1.0 typing-extensions-3.7.4.3 
typing-inspect-0.6.0 urllib3-1.25.10 wcwidth-0.2.5 websocket-client-0.57.0 
wrapt-1.12.1

> Task :sdks:python:apache_beam:testing:load_tests:run
<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/bigquery.py>:1679:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  self.table_reference.projectId = pcoll.pipeline.options.view_as(
<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/bigquery_file_loads.py>:900:
 BeamDeprecationWarning: options is deprecated since First stable release. 
References to <pipeline>.options will not be supported
  temp_location = p.options.view_as(GoogleCloudOptions).temp_location
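
The two BeamDeprecationWarnings above come from inside the SDK, where it still
reads <pipeline>.options. In user code the deprecated accessor is avoided by
keeping a reference to the PipelineOptions the pipeline was built with; a
minimal sketch (project and bucket names are illustrative):

    import apache_beam as beam
    from apache_beam.options.pipeline_options import (
        GoogleCloudOptions, PipelineOptions)

    # Illustrative flags; a real run needs a valid project and bucket.
    options = PipelineOptions(
        ['--project=my-project', '--temp_location=gs://my-bucket/tmp'])

    with beam.Pipeline(options=options) as p:
        # Read settings from the options object itself rather than the
        # deprecated p.options attribute flagged in the warnings above.
        temp_location = options.view_as(GoogleCloudOptions).temp_location
        _ = p | beam.Create([temp_location]) | beam.Map(print)
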
INFO:apache_beam.runners.portability.stager:Copying Beam SDK 
"<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/build/apache-beam.tar.gz";>
 to staging location.
WARNING:root:Make sure that locally built Python SDK docker image has Python 
3.7 interpreter.
INFO:root:Using Python SDK docker image: apache/beam_python3.7_sdk:2.26.0.dev. 
If the image is not available at local, we will try to pull from hub.docker.com
INFO:apache_beam.runners.dataflow.internal.apiclient:Defaulting to the 
temp_location as staging_location: gs://temp-storage-for-perf-tests/loadtests
INFO:apache_beam.internal.gcp.auth:Setting socket default timeout to 60 seconds.
INFO:apache_beam.internal.gcp.auth:socket default timeout is 60.0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-write-python-batch-10gb1010150334.1602344670.188016/pipeline.pb...
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:oauth2client.transport:Attempting refresh to obtain initial access_token
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-write-python-batch-10gb1010150334.1602344670.188016/pipeline.pb
 in 0 seconds.
INFO:apache_beam.runners.dataflow.internal.apiclient:Starting GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-write-python-batch-10gb1010150334.1602344670.188016/dataflow_python_sdk.tar...
INFO:apache_beam.runners.dataflow.internal.apiclient:Completed GCS upload to 
gs://temp-storage-for-perf-tests/loadtests/performance-tests-bqio-write-python-batch-10gb1010150334.1602344670.188016/dataflow_python_sdk.tar
 in 0 seconds.
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--output_dataset=beam_performance', '--output_table=bqio_write_10GB']
WARNING:apache_beam.options.pipeline_options:Discarding unparseable args: 
['--output_dataset=beam_performance', '--output_table=bqio_write_10GB']
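
The "Discarding unparseable args" warnings are expected here: --output_dataset
and --output_table are consumed by the load-test harness, not by
PipelineOptions. Custom flags parse cleanly when registered on a
PipelineOptions subclass; a sketch using the same flag names:

    from apache_beam.options.pipeline_options import PipelineOptions

    class OutputOptions(PipelineOptions):
        @classmethod
        def _add_argparse_args(cls, parser):
            # Registered flags are parsed instead of being discarded.
            parser.add_argument('--output_dataset')
            parser.add_argument('--output_table')

    opts = PipelineOptions(
        ['--output_dataset=beam_performance', '--output_table=bqio_write_10GB'])
    print(opts.view_as(OutputOptions).output_table)  # bqio_write_10GB
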
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
 createTime: '2020-10-10T15:44:32.164960Z'
 currentStateTime: '1970-01-01T00:00:00Z'
 id: '2020-10-10_08_44_31-16933922622764105253'
 location: 'us-central1'
 name: 'performance-tests-bqio-write-python-batch-10gb1010150334'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-10-10T15:44:32.164960Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2020-10-10_08_44_31-16933922622764105253]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2020-10-10_08_44_31-16933922622764105253
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-10-10_08_44_31-16933922622764105253?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-10-10_08_44_31-16933922622764105253 is in state JOB_STATE_PENDING
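
The job id reported above starts in JOB_STATE_PENDING and is polled until it
finishes; programmatically the same lifecycle is observed through the
PipelineResult returned by run(). A sketch with illustrative flags:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',      # flags are illustrative
        '--project=my-project',
        '--region=us-central1',
        '--temp_location=gs://my-bucket/tmp',
    ])
    p = beam.Pipeline(options=options)
    _ = p | beam.Create([1, 2, 3]) | beam.Map(print)
    result = p.run()                    # Dataflow jobs start out PENDING
    result.wait_until_finish()          # blocks until DONE (or FAILED)
    print(result.state)
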
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:35.030Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-c.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:35.868Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:35.916Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables: GroupByKey 
not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:35.957Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations: GroupByKey not 
followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:35.994Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows: GroupByKey not followed by a 
combiner.
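
"Combiner lifting" is the Dataflow optimization that pre-combines values on
workers before the shuffle; it is skipped above because those GroupByKey steps
feed plain ParDos rather than combiners. A sketch of the shape that does get
lifted, with made-up data:

    import apache_beam as beam

    with beam.Pipeline() as p:
        kvs = p | beam.Create([('a', 1), ('a', 2), ('b', 3)])
        # CombinePerKey lets the runner pre-sum each key locally on workers,
        # shuffling one partial sum per key instead of every element.
        totals = kvs | beam.CombinePerKey(sum)
        totals | beam.Map(print)
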
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.039Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.062Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.197Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.253Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.282Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s19 for input s13.WrittenFiles
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.313Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of Write to 
BigQuery/BigQueryBatchFileLoads/IdentityWorkaround, through flatten Write to 
BigQuery/BigQueryBatchFileLoads/DestinationFilesUnion, into producer Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.350Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write into Write 
to BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.383Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow 
into Write to BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.417Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles) 
into Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.449Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)
 into Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.488Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
 into Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.525Z: 
JOB_MESSAGE_DETAILED: Unzipping flatten s19-u32 for input s20.None-c30
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.557Z: 
JOB_MESSAGE_DETAILED: Fusing unzipped copy of Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify, through 
flatten Write to 
BigQuery/BigQueryBatchFileLoads/DestinationFilesUnion/Unzipped-1, into producer 
Write to BigQuery/BigQueryBatchFileLoads/IdentityWorkaround
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.590Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/IdentityWorkaround into Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.650Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify into Write 
to BigQuery/BigQueryBatchFileLoads/IdentityWorkaround
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.675Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Count messages into Produce rows
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.698Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Format into Count messages
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.720Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Measure time into Format
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.755Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal into Measure time
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.783Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/AppendDestination into Write to 
BigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.813Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
 into Write to BigQuery/BigQueryBatchFileLoads/AppendDestination
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.842Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations) into Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.877Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify into Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.914Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write into Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.944Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow into Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:36.968Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/DropShardNumber into Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.002Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile
 into Write to BigQuery/BigQueryBatchFileLoads/DropShardNumber
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.083Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix into Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.117Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix into Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.152Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/GenerateFilePrefix into Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.187Z: 
JOB_MESSAGE_DETAILED: Fusing siblings Write to 
BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs
 and Write to 
BigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.219Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs) 
into Write to 
BigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.256Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables into 
Write to BigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.280Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue into Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.312Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify into 
Write to BigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.364Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write into 
Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.399Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
 into Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.422Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames into Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.477Z: 
JOB_MESSAGE_DETAILED: Fusing consumer Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete into Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.510Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.541Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.574Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.598Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.833Z: 
JOB_MESSAGE_DEBUG: Executing wait step start46
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.912Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+Write to 
BigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/GenerateFilePrefix
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.949Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.973Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:37.982Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.008Z: 
JOB_MESSAGE_BASIC: Starting 5 workers in us-central1-c...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.021Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.057Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.073Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.073Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.074Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.095Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.161Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.194Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Session" 
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.232Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Session" 
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:38.265Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseEmptyPC/Read.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-10-10_08_44_31-16933922622764105253 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:44:57.561Z: 
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric 
descriptors, so new user metrics of the form custom.googleapis.com/* will not 
be created. However, all user metrics are also available in the metric 
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics, 
you can delete old / unused metric descriptors. See 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:45:02.971Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 5 based on the 
rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:46:46.020Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:46:46.061Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:17.676Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read+Write to 
BigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix+Write to 
BigQuery/BigQueryBatchFileLoads/GenerateFilePrefix
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:17.760Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/ImpulseSingleElementPC/Read.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:17.802Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/LoadJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:17.836Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/CopyJobNamePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:17.867Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/GenerateFilePrefix.out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:17.904Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:17.949Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:17.984Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:17.988Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.003Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.011Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.041Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.042Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.072Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.076Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/_UnpickledSideInput(LoadJobNamePrefix.out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.103Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.113Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/_UnpickledSideInput(LoadJobNamePrefix.out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.146Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/_UnpickledSideInput(CopyJobNamePrefix.out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.177Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.216Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/_UnpickledSideInput(GenerateFilePrefix.out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:50:18.287Z: 
JOB_MESSAGE_BASIC: Executing operation Produce rows+Count 
messages+Format+Measure time+Write to 
BigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+Write to 
BigQuery/BigQueryBatchFileLoads/AppendDestination+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write
 to BigQuery/BigQueryBatchFileLoads/IdentityWorkaround+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:42.664Z: 
JOB_MESSAGE_BASIC: Finished operation Produce rows+Count 
messages+Format+Measure time+Write to 
BigQuery/BigQueryBatchFileLoads/RewindowIntoGlobal+Write to 
BigQuery/BigQueryBatchFileLoads/AppendDestination+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)/ParDo(WriteRecordsToFile)+Write
 to BigQuery/BigQueryBatchFileLoads/IdentityWorkaround+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write+Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(_ShardDestinations)+Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Reify+Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:42.783Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:42.838Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:42.913Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+Write to 
BigQuery/BigQueryBatchFileLoads/DropShardNumber+Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+Write
 to BigQuery/BigQueryBatchFileLoads/IdentityWorkaround+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:43.214Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/Read+Write to 
BigQuery/BigQueryBatchFileLoads/GroupShardedRows/GroupByWindow+Write to 
BigQuery/BigQueryBatchFileLoads/DropShardNumber+Write to 
BigQuery/BigQueryBatchFileLoads/WriteGroupedRecordsToFile/WriteGroupedRecordsToFile+Write
 to BigQuery/BigQueryBatchFileLoads/IdentityWorkaround+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Reify+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:43.287Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:43.340Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:43.428Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+Write
 to 
BigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+Write
 to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+Write
 to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:46.878Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/Read+Write to 
BigQuery/BigQueryBatchFileLoads/GroupFilesByTableDestinations/GroupByWindow+Write
 to 
BigQuery/BigQueryBatchFileLoads/ParDo(PartitionFiles)/ParDo(PartitionFiles)+Write
 to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs)/ParDo(TriggerLoadJobs)+Write
 to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables/TriggerLoadJobsWithoutTempTables
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:46.964Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).out"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.007Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithTempTables/ParDo(TriggerLoadJobs).TemporaryTables"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.036Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/TriggerLoadJobsWithoutTempTables.out" 
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.069Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.109Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.135Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.140Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.165Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.176Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.198Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.211Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/_UnpickledSideInput(ParDo(TriggerLoadJobs).out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.249Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/_UnpickledSideInput(ParDo(TriggerLoadJobs).TemporaryTables.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.254Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.287Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/_UnpickledSideInput(TriggerLoadJobsWithoutTempTables.out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.324Z: 
JOB_MESSAGE_DEBUG: Value "Write to BigQuery/BigQueryBatchFileLoads/Flatten.out" 
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:54:47.359Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+Write
 to 
BigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+Write
 to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:20.839Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForDestinationLoadJobs/WaitForDestinationLoadJobs+Write
 to 
BigQuery/BigQueryBatchFileLoads/WaitForTempTableLoadJobs/WaitForTempTableLoadJobs+Write
 to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs)/ParDo(TriggerCopyJobs)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:20.931Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/ParDo(TriggerCopyJobs).out" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:21.030Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:21.088Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:21.147Z: 
JOB_MESSAGE_DEBUG: Value "Write to 
BigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:21.248Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+Write 
to BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:22.478Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+Write 
to BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:22.534Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:22.586Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:22.663Z: 
JOB_MESSAGE_BASIC: Executing operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+Write 
to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+Write
 to BigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:22.846Z: 
JOB_MESSAGE_BASIC: Finished operation Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+Write 
to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+Write
 to BigQuery/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+Write to 
BigQuery/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:22.924Z: 
JOB_MESSAGE_DEBUG: Executing success step success44
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:23.034Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:23.093Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:55:23.125Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:56:14.855Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 5 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:56:14.931Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-10-10T15:56:14.973Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-10-10_08_44_31-16933922622764105253 is in state JOB_STATE_DONE
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Load test results 
for test: 039b2fa35a064686953116589fab29b3 and timestamp: 1602345388.425404:
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: 
bqio_write_10GB_results_count_messag_total_messages Value: 10485760
INFO:apache_beam.testing.load_tests.load_test_metrics_utils:Metric: 
bqio_write_10GB_results_runtime Value: 260
INFO:apache_beam.io.gcp.tests.utils:Clean up a BigQuery table with project: 
apache-beam-testing, dataset: beam_performance, table: bqio_write_10GB.
Traceback (most recent call last):
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>", line 104, in delete_bq_table
    client.delete_table(table_ref)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/bigquery/client.py>", line 1512, in delete_table
    timeout=timeout,
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/bigquery/client.py>", line 641, in _call_api
    return call()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/api_core/retry.py>", line 286, in retry_wrapped_func
    on_error=on_error,
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/api_core/retry.py>", line 184, in retry_target
    return target()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/build/gradleenv/1329484227/lib/python3.7/site-packages/google/cloud/_http.py>", line 435, in api_request
    raise exceptions.from_http_response(response)
google.api_core.exceptions.NotFound: 404 DELETE 
https://bigquery.googleapis.com/bigquery/v2/projects/apache-beam-testing/datasets/beam_performance/tables/bqio_write_10GB?prettyPrint=false:
 Not found: Table apache-beam-testing:beam_performance.bqio_write_10GB

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/usr/lib/python3.7/runpy.py", line 193, in _run_module_as_main
    "__main__", mod_spec)
  File "/usr/lib/python3.7/runpy.py", line 85, in _run_code
    exec(code, run_globals)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/bigquery_write_perf_test.py>", line 109, in <module>
    BigQueryWritePerfTest().run()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/load_test.py>", line 155, in run
    self.cleanup()
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/bigquery_write_perf_test.py>", line 104, in cleanup
    self.project_id, self.output_dataset, self.output_table)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/utils/retry.py>", line 260, in wrapper
    return fun(*args, **kwargs)
  File "<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/io/gcp/tests/utils.py>", line 106, in delete_bq_table
    raise GcpTestIOError('BigQuery table does not exist: %s' % table_ref)
apache_beam.io.gcp.tests.utils.GcpTestIOError: BigQuery table does not exist: 
TableReference(DatasetReference('apache-beam-testing', 'beam_performance'), 
'bqio_write_10GB')
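
The pipeline itself succeeded (JOB_STATE_DONE above, metrics published); the
failure is in test cleanup, where delete_bq_table turns the client's 404 into
a GcpTestIOError because bqio_write_10GB was already gone. With
google-cloud-bigquery 1.28.0 (installed above), a cleanup that tolerates a
missing table could pass not_found_ok; a hypothetical helper, not the Beam
test util:

    from google.cloud import bigquery

    def delete_table_if_exists(project, dataset, table):
        client = bigquery.Client(project=project)
        # not_found_ok=True turns the 404 NotFound seen above into a no-op.
        client.delete_table(f'{project}.{dataset}.{table}', not_found_ok=True)

    delete_table_if_exists(
        'apache-beam-testing', 'beam_performance', 'bqio_write_10GB')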

> Task :sdks:python:apache_beam:testing:load_tests:run FAILED

FAILURE: Build failed with an exception.

* Where:
Build file '<https://ci-beam.apache.org/job/beam_PerformanceTests_BiqQueryIO_Write_Python_Batch/ws/src/sdks/python/apache_beam/testing/load_tests/build.gradle>' line: 58

* What went wrong:
Execution failed for task ':sdks:python:apache_beam:testing:load_tests:run'.
> error occurred

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 13m 11s
5 actionable tasks: 5 executed

Publishing build scan...
https://gradle.com/s/nel5cvikup7sw

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

