See 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/2799/display/redirect?page=changes>

Changes:

[noreply] More logging for missing next work index. (#12718)


------------------------------------------
[...truncated 21.72 MB...]
 id: '2020-09-04_00_05_24-8701404319086773166'
 location: 'us-central1'
 name: 'beamapp-jenkins-0904070517-131710'
 projectId: 'apache-beam-testing'
 stageStates: []
 startTime: '2020-09-04T07:05:25.836179Z'
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id: 
[2020-09-04_00_05_24-8701404319086773166]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job: 
2020-09-04_00_05_24-8701404319086773166
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow 
monitoring console, please navigate to 
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-04_00_05_24-8701404319086773166?project=apache-beam-testing
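
For reference, a Dataflow job like the one submitted above is driven entirely by
pipeline options. Below is a minimal sketch, assuming the standard Beam Python
SDK; the project and region mirror the log, but the temp_location bucket is a
placeholder, not a value from this run.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    runner='DataflowRunner',
    project='apache-beam-testing',          # project from the log above
    region='us-central1',                   # location from the log above
    temp_location='gs://<your-bucket>/tmp', # placeholder bucket
)

# Submitting any pipeline with these options produces the "Created job
# with id" and monitoring-console log lines seen above.
with beam.Pipeline(options=options) as p:
    _ = p | beam.Create([1, 2, 3]) | beam.Map(lambda x: x * 2)
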
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:22.973Z: 
JOB_MESSAGE_DEBUG: Value 
"write/BigQueryBatchFileLoads/WaitForCopyJobs/_UnpickledSideInput(ParDo(TriggerCopyJobs).out.0).output"
 materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:23.132Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:26.131Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/WaitForCopyJobs/WaitForCopyJobs+write/BigQueryBatchFileLoads/RemoveTempTables/PassTables/PassTables+write/BigQueryBatchFileLoads/RemoveTempTables/AddUselessValue+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Reify+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:26.260Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:26.394Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:26.520Z: 
JOB_MESSAGE_BASIC: Executing operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:29.272Z: 
JOB_MESSAGE_BASIC: Finished operation 
write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/Read+write/BigQueryBatchFileLoads/RemoveTempTables/DeduplicateTables/GroupByWindow+write/BigQueryBatchFileLoads/RemoveTempTables/GetTableNames+write/BigQueryBatchFileLoads/RemoveTempTables/Delete
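
The write/BigQueryBatchFileLoads stages above (WaitForCopyJobs,
RemoveTempTables, DeduplicateTables) are the expansion of WriteToBigQuery
running in batch file-loads mode. A minimal sketch of such a transform; the
table and schema here are placeholders, not the ones used by this test.

import apache_beam as beam

# FILE_LOADS is the write method that expands into the BigQueryBatchFileLoads
# stages, including the temp-table cleanup steps logged above.
write = beam.io.WriteToBigQuery(
    table='apache-beam-testing:my_dataset.my_table',  # placeholder
    schema='bytes:BYTES,date:DATE,time:TIME',         # placeholder
    method=beam.io.WriteToBigQuery.Method.FILE_LOADS,
    create_disposition=beam.io.BigQueryDisposition.CREATE_IF_NEEDED,
    write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
)
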
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:29.399Z: 
JOB_MESSAGE_DEBUG: Executing success step success44
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:29.543Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:29.729Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:29.803Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-09-04_00_05_24-8701404319086773166 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:24.793Z: 
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 
2020-09-04_00_05_24-8701404319086773166.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:24.793Z: 
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 
2020-09-04_00_05_24-8701404319086773166. The number of workers will be between 
1 and 1000.
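
The 1-to-1000 worker bounds reported above are controlled by the job's
autoscaling options (here they match the Dataflow defaults). A minimal sketch
of the relevant settings; the values are illustrative.

from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions(
    num_workers=1,                             # initial pool size
    max_num_workers=1000,                      # autoscaling upper bound
    autoscaling_algorithm='THROUGHPUT_BASED',  # default scaling mode
)
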
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:30.212Z: 
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.493Z: 
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.517Z: 
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not 
followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.561Z: 
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.599Z: 
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into 
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.679Z: 
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.723Z: 
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.758Z: 
JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.792Z: 
JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.828Z: 
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.861Z: 
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.897Z: 
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into 
GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.934Z: 
JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
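
The fusion messages above show Dataflow collapsing adjacent steps into single
stages, with the GroupByKey acting as a fusion barrier: Reify and Write fuse
upstream of the shuffle; Read, GroupByWindow, and m_out fuse downstream. A
hedged reconstruction of a pipeline with this shape; the step labels mirror
the log, but the function bodies are illustrative.

import apache_beam as beam

with beam.Pipeline() as p:
    (p
     | 'Create' >> beam.Create(range(10))
     | 'metrics' >> beam.Map(lambda x: (x % 2, x))            # illustrative
     | 'map_to_common_key' >> beam.Map(lambda kv: ('key', kv))
     | 'GroupByKey' >> beam.GroupByKey()
     | 'm_out' >> beam.Map(print))
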
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:33.971Z: 
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:34.011Z: 
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:34.040Z: 
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:34.074Z: 
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:34.649Z: 
JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:34.720Z: 
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:34.765Z: 
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:34.800Z: 
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:34.844Z: 
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:34.911Z: 
JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:35.013Z: 
JOB_MESSAGE_BASIC: Executing operation 
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:05:58.644Z: 
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric 
descriptors and Stackdriver will not create new Dataflow custom metrics for 
this job. Each unique user-defined metric name (independent of the DoFn in 
which it is defined) produces a new metric descriptor. To delete old / unused 
metric descriptors see 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
 and 
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
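
The cleanup this warning suggests can be scripted against the Monitoring REST
endpoints it links. A hedged sketch, assuming Application Default Credentials
and the google-auth package; the filter is illustrative, pagination is
omitted, and deletion is irreversible.

import google.auth
from google.auth.transport.requests import AuthorizedSession

creds, project = google.auth.default(
    scopes=['https://www.googleapis.com/auth/monitoring'])
session = AuthorizedSession(creds)
base = 'https://monitoring.googleapis.com/v3/projects/%s' % project

# List user-defined metric descriptors (they live under
# custom.googleapis.com/), then delete each one. Irreversible!
resp = session.get(
    base + '/metricDescriptors',
    params={'filter': 'metric.type = starts_with("custom.googleapis.com/")'})
for desc in resp.json().get('metricDescriptors', []):
    session.delete('https://monitoring.googleapis.com/v3/' + desc['name'])
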
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:11.226Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on 
the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:21.598Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:21.698Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:21.786Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-09-03_23_58_35-8594446532781886035 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query 
SELECT bytes, date, time FROM 
python_write_to_table_15992027027899.python_no_schema_table to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of 
auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth 
process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using 
them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth 
process...
DEBUG:google.auth._default:No App Engine library was found so cannot 
authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/5f221c9a-04d0-49f1-b956-a279f4e75875?maxResults=0&location=US
 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon1077b80da2bc2c15e2fb40de8121c57f7aae9b7b/data
 HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Result of query is: [(b'xyw', 
datetime.date(2011, 1, 1), datetime.time(23, 59, 59, 999999)), (b'abc', 
datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xab\xac\xad', 
datetime.date(2000, 1, 1), datetime.time(0, 0)), (b'\xe4\xbd\xa0\xe5\xa5\xbd', 
datetime.date(3000, 12, 31), datetime.time(23, 59, 59))]
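
The matcher's verification step boils down to running the query with a
BigQuery client and comparing rows. A minimal sketch, assuming the
google-cloud-bigquery package; the dataset name is the generated one from
this run and will not exist after teardown.

from google.cloud import bigquery

client = bigquery.Client(project='apache-beam-testing')
query = ('SELECT bytes, date, time FROM '
         'python_write_to_table_15992027027899.python_no_schema_table')
# Each result row is a Row object; values() yields a plain tuple.
rows = [tuple(row.values()) for row in client.query(query).result()]
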
INFO:apache_beam.io.gcp.bigquery_write_it_test:Deleting dataset 
python_write_to_table_15992027027899 in project apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:38.846Z: 
JOB_MESSAGE_BASIC: Executing BigQuery import job 
"dataflow_job_6493492625448460821". You can check its status with the bq tool: 
"bq show -j --project_id=apache-beam-testing dataflow_job_6493492625448460821".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:49.667Z: 
JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_6493492625448460821" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:50.525Z: 
JOB_MESSAGE_BASIC: Finished operation read+write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:50.596Z: 
JOB_MESSAGE_DEBUG: Executing success step success1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:50.667Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:50.859Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:06:50.880Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:07:47.835Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:07:47.875Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:07:47.914Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:07:50.293Z: 
JOB_MESSAGE_BASIC: Finished operation 
Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type
 matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type 
matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type 
matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:07:53.428Z: 
JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type 
matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type 
matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:07:53.503Z: 
JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:07:53.558Z: 
JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:07:53.637Z: 
JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type 
matches/Group/GroupByKey/GroupByWindow+Type 
matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type 
matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-09-04_00_00_51-1108042432122239676 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query 
SELECT fruit from `python_query_to_table_1599202840883.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of 
auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth 
process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using 
them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth 
process...
DEBUG:google.auth._default:No App Engine library was found so cannot 
authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET 
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3, 
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1): 
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1" 
200 144
DEBUG:google.auth.transport.requests:Making request: GET 
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET 
/computeMetadata/v1/instance/service-accounts/[email protected]/token
 HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1): 
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST 
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/queries/09a081c0-9e83-4585-b312-326a50c4c5d8?maxResults=0&location=US
 HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET 
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anone6c0d5d3ed2ffa58fb0157026c0882762e33dfe4/data
 HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT 
fruit from `python_query_to_table_1599202840883.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum: 
158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
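
The checksum above is an order-independent digest of the query results. A
minimal sketch of one way to compute such a digest (not necessarily the
matcher's exact canonicalization):

import hashlib

def rows_checksum(rows):
    # Sort the string forms first so the digest ignores row order.
    canonical = sorted(str(row) for row in rows)
    return hashlib.sha1('\n'.join(canonical).encode('utf-8')).hexdigest()

print(rows_checksum([('apple',), ('banana',)]))  # 40-char hex digest
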
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:02.817Z: 
JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type 
matches/Group/GroupByKey/GroupByWindow+Type 
matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type 
matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:02.880Z: 
JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:02.961Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:03.022Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:03.054Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:10.538Z: 
JOB_MESSAGE_BASIC: Finished operation 
Create/Read+ExternalTransform(simple)/Map(<lambda at 
external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:13.659Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:13.709Z: 
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:13.761Z: 
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:13.824Z: 
JOB_MESSAGE_BASIC: Executing operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:23.021Z: 
JOB_MESSAGE_BASIC: Finished operation 
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:23.074Z: 
JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:23.143Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:23.188Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:23.219Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:23.522Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:23.552Z: 
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:56.284Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:56.337Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:08:56.370Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-09-04_00_01_47-16772668296626516727 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:09:03.942Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:09:03.977Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:09:04.012Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-09-04_00_01_47-229864367004271508 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:12:17.041Z: 
JOB_MESSAGE_BASIC: Finished operation 
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:12:17.139Z: 
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:12:17.193Z: 
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:12:17.273Z: 
JOB_MESSAGE_BASIC: Executing operation 
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:12:26.341Z: 
JOB_MESSAGE_BASIC: Finished operation 
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:12:26.426Z: 
JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:12:26.503Z: 
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:12:26.564Z: 
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:12:26.602Z: 
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:13:21.186Z: 
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:13:21.231Z: 
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-04T07:13:21.258Z: 
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job 
2020-09-04_00_05_24-8701404319086773166 is in state JOB_STATE_DONE
test_autocomplete_it 
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_streaming_wordcount_debugging_it 
(apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT)
 ... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_bigquery_tornadoes_it 
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT) 
... ok
test_datastore_wordcount_it 
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
 ... ok
test_leader_board_it 
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it 
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it 
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT) 
... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_user_score_it 
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it 
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
 ... ok
test_read_via_sql 
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
 ... ok
test_read_via_table 
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
 ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) 
... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests) 
... ok
test_bqfl_streaming 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP: 
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail 
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python 
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_spanner_error 
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
 ... ok
test_spanner_update 
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
 ... ok
test_write_batches 
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
 ... ok
test_avro_file_load 
(apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... 
ok
test_copy_batch 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) 
... ok
test_copy_rewrite_token 
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_value_provider_transform 
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
 ... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit 
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT) 
... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP: 
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... 
ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances 
(apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs 
(apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_streaming_data_only 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes 
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax 
(apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_text_detection_with_language_hint 
(apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution 
(apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP: 
The "TestDataflowRunner", does not support the TestStream transform. Supported 
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP: 
The "TestDataflowRunner", does not support the TestStream transform. Supported 
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ... 
SKIP: The "TestDataflowRunner" does not support the TestStream transform. 
Supported runners: ['DirectRunner', 'SwitchingDirectRunner'] (see the 
TestStream sketch after this listing)
test_label_detection_with_video_context 
(apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ... 
ok
Runs a streaming Dataflow job and verifies that user metrics are reported ... ok
test_big_query_write 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... 
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema 
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_legacy_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types_avro 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_new_types_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_big_query_standard_sql_kms_key_native 
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT) 
... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_job_python_from_python_it 
(apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
test_metrics_it 
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
 ... ok
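
As the TestStream skips in the listing above note, those tests only run on
the direct runners. A minimal TestStream sketch, runnable on DirectRunner;
the elements and timestamps are illustrative.

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions
from apache_beam.testing.test_stream import TestStream

# Build a deterministic stream of elements with explicit watermark control.
stream = (TestStream()
          .advance_watermark_to(0)
          .add_elements(['a', 'b'])
          .advance_watermark_to_infinity())

with beam.Pipeline(options=PipelineOptions(streaming=True)) as p:
    _ = p | stream | beam.Map(print)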

----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML: 
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 4235.406s

OK (SKIP=7)

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':sdks:python:container:py37:docker'.
> Process 'command 'docker'' finished with non-zero exit value 2

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug 
option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

Deprecated Gradle features were used in this build, making it incompatible with 
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See 
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings

BUILD FAILED in 1h 13m 13s
170 actionable tasks: 130 executed, 36 from cache, 4 up-to-date

Publishing build scan...
https://gradle.com/s/6vubz2fr35dy6

Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure

---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
