See
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/2893/display/redirect>
Changes:
------------------------------------------
[...truncated 34.22 MB...]
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "m_out.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s4"
},
"serialized_fn":
"QlpoOTFBWSZTWWeYg/EAApJ/8H/////////////cwr///+ZjwCAAYEBAAvYZlAZGGqZKeap6nqaD1Bk2o9T9UeoGgAaAaDQaNPSeoaD1PUAGnqeoMj0gZNBKmiZNCYmI1J4mSbSYg0yZAAGgABoDQaABoNAyMgGmSIQ0pspptTYiDTTQNBgQyaAGIyAGg0Bp6BNqeoGgeoOAAAAAAAAAAaAAAAAAAAAADvptRksy3rYqqrRyJX2qH0NSOBctdVUdLQAm1Vh2UANxiG8xcsPs2/V+wqRYe1ULTTsFGP9a61H8xkkgE7und2KQKCYLXStXLn3eFthcDyETfV4UGFKZNeBBEHUSJLWyzBEAWMCthwxMLlv1cDyano9cgF8al+wdq5r/REAkG1Xg8aacyW/LCkBXoTuB4iGoncNkgf7kZTycpPhEogIUlcmUXbLXkkTvSvGWUVI8gorCTQi4qGCTsUwnjQkyt2ZyOO9OYuufzYucYr+tb6skz+fLbYmGltqvc0ZyO62U1BN/063ZqwM9sHukMYDtXzQzuCo6I8hcg9bN+uA5zVxEToRovLZIoOS0PpZ6DtewoIpXjNEHdMET2MhmBwtiJO/7kFGvRXCBJCR3E0VT+tlPoVcp7u8FEolFZEAQgtDOCFCYBounGasGioKCtPYuiTpMtBVhRemCeN1kw3xz7+O3HdpAYYCUXQKJCFvkypPA1qn+cbuIJzJofIN2BcCmDlvIBCstlwFz2aQYXYogZO/bA/V1tacCaIv1lmiMtOWCxQ22hA4/JeLrKatgfimoi1XTBYqKI1UlwpqoFFxMxI7eKk1vELwoiakO73oRuciOUowiTwDyilzBw1LIP03KHIxylREq0IIBlhEEIgRqAmRKSZspXcY8WJZc1S79gLtt2o65KoRUptqAP0C0mEjQrLUMMdcwvLxdDZ2YWF3a2YmEc6V3Flq/zGko0q1i7CE9F60VVmlLVbxGN1Kc8GIjGCmM2+eg7pKnM7xezfVKYmJBAxbGHNEIVArMrZhiGweBLCVomDBhtYxZft2DIMzallLUjBweKHOZYLJJaYlEbRaHX4A2iWd0kmIYgPGrVkkFdUkblNPWmlouz7UYWBJGThIYIMpTkP5/pJL782MpXbJs/HJr2mwipgUqF1LL6WhjchFIhEb70ijjxamTOr2EAb6pUqdP+qQ4F4FkIiHj/i7kinChIM8xB+I=",
"user_name": "m_out"
}
}
],
"type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
createTime: '2020-09-27T19:01:45.210216Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2020-09-27_12_01_43-1165577789104850003'
location: 'us-central1'
name: 'beamapp-jenkins-0927190138-148334'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2020-09-27T19:01:45.210216Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id:
[2020-09-27_12_01_43-1165577789104850003]
INFO:apache_beam.runners.dataflow.internal.apiclient:Submitted job:
2020-09-27_12_01_43-1165577789104850003
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_12_01_43-1165577789104850003?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-09-27_12_01_43-1165577789104850003?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-09-27_12_01_43-1165577789104850003 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:43.925Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2020-09-27_12_01_43-1165577789104850003.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:43.925Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2020-09-27_12_01_43-1165577789104850003. The number of workers will be between
1 and 1000.
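The 1-to-1000 range above reflects the job's Dataflow worker bounds; a pipeline can
set those bounds explicitly through the standard worker options. A minimal sketch
(the temp bucket path is a placeholder, not taken from this build):

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',
        '--region=us-central1',
        '--autoscaling_algorithm=THROUGHPUT_BASED',
        '--num_workers=1',           # initial worker count
        '--max_num_workers=1000',    # upper bound reported in the log
        '--temp_location=gs://<your-bucket>/tmp',  # placeholder bucket
    ])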
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:48.185Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-b.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.132Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.171Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step GroupByKey: GroupByKey not
followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.218Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.244Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.313Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.357Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.393Z:
JOB_MESSAGE_DETAILED: Fusing consumer metrics into Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.426Z:
JOB_MESSAGE_DETAILED: Fusing consumer map_to_common_key into metrics
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.459Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Reify into map_to_common_key
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.494Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/Write into GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.516Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupByKey/GroupByWindow into
GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.547Z:
JOB_MESSAGE_DETAILED: Fusing consumer m_out into GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.570Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.602Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.635Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.669Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.803Z:
JOB_MESSAGE_DEBUG: Executing wait step start13
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.867Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.914Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.945Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-b...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:49.990Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:50.059Z:
JOB_MESSAGE_DEBUG: Value "GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:01:50.128Z:
JOB_MESSAGE_BASIC: Executing operation
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:02:13.512Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on
the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:02:19.732Z:
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric
descriptors, so new user metrics of the form custom.googleapis.com/* will not
be created. However, all user metrics are also available in the metric
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics,
you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
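If the 100-descriptor limit becomes a problem, old custom metric descriptors can
also be listed and deleted programmatically. A hypothetical cleanup sketch using
the google-cloud-monitoring client (not part of this test suite; the delete call
is left commented out):

    from google.cloud import monitoring_v3

    project = "apache-beam-testing"
    client = monitoring_v3.MetricServiceClient()
    request = {
        "name": f"projects/{project}",
        "filter": 'metric.type = starts_with("custom.googleapis.com/")',
    }
    for descriptor in client.list_metric_descriptors(request=request):
        print("candidate for deletion:", descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)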
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:11.462Z:
JOB_MESSAGE_BASIC: Finished operation Type matches/Create/Read+Type
matches/Group/pair_with_0+Type matches/Group/GroupByKey/Reify+Type
matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:15.256Z:
JOB_MESSAGE_BASIC: Finished operation
Create/Read+InspectForDetails/ParDo(_InspectFn)+ParDo(CallableWrapperDoFn)/ParDo(CallableWrapperDoFn)+Type
matches/WindowInto(WindowIntoFn)+Type matches/ToVoidKey+Type
matches/Group/pair_with_1+Type matches/Group/GroupByKey/Reify+Type
matches/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:15.315Z:
JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:15.365Z:
JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:15.432Z:
JOB_MESSAGE_BASIC: Executing operation Type matches/Group/GroupByKey/Read+Type
matches/Group/GroupByKey/GroupByWindow+Type
matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type
matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:19.570Z:
JOB_MESSAGE_BASIC: Executing BigQuery import job
"dataflow_job_5672647391000191655". You can check its status with the bq tool:
"bq show -j --project_id=apache-beam-testing dataflow_job_5672647391000191655".
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:24.629Z:
JOB_MESSAGE_BASIC: Finished operation Type matches/Group/GroupByKey/Read+Type
matches/Group/GroupByKey/GroupByWindow+Type
matches/Group/Map(_merge_tagged_vals_under_key)+Type matches/Unkey+Type
matches/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:24.683Z:
JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:24.760Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:24.794Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:24.827Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:30.106Z:
JOB_MESSAGE_BASIC: BigQuery import job "dataflow_job_5672647391000191655" done.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:30.821Z:
JOB_MESSAGE_BASIC: Finished operation
read/ReadFromBigQuery/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough)+write/Write/NativeWrite
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:30.886Z:
JOB_MESSAGE_DEBUG: Value
"read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(PassThrough)/ParDo(PassThrough).cleanup_signal"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:30.966Z:
JOB_MESSAGE_BASIC: Executing operation
read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:31.030Z:
JOB_MESSAGE_BASIC: Finished operation
read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:31.112Z:
JOB_MESSAGE_DEBUG: Value
"read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/_UnpickledSideInput(ParDo(PassThrough).cleanup_signal.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:31.202Z:
JOB_MESSAGE_BASIC: Executing operation
read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/ParDo(RemoveJsonFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:36.919Z:
JOB_MESSAGE_BASIC: Finished operation
read/ReadFromBigQuery/_PassThroughThenCleanup/Create/Read+read/ReadFromBigQuery/_PassThroughThenCleanup/ParDo(RemoveJsonFiles)/ParDo(RemoveJsonFiles)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:36.997Z:
JOB_MESSAGE_DEBUG: Executing success step success2
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:37.089Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:37.185Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:37.221Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:35.974Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:42.350Z:
JOB_MESSAGE_BASIC: Finished operation
Create/Read+ExternalTransform(simple)/Map(<lambda at
external_it_test.py:43>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:42.412Z:
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:42.463Z:
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:42.527Z:
JOB_MESSAGE_BASIC: Executing operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:49.957Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:49.986Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:51.711Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:51.794Z:
JOB_MESSAGE_DEBUG: Executing success step success19
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:51.871Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:51.926Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:03:51.954Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:04:06.210Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:04:06.256Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:04:06.295Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-09-27_11_57_32-3403967312454559348 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:04:31.776Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:04:31.841Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:04:31.884Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-09-27_11_57_31-13789166174429209981 is in state JOB_STATE_DONE
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Attempting to perform query
SELECT fruit from `python_query_to_table_16012330406553.output_table`; to BQ
DEBUG:google.auth._default:Checking None for explicit credentials as part of
auth process...
DEBUG:google.auth._default:Checking Cloud SDK credentials as part of auth
process...
DEBUG:google.auth._default:Cloud SDK credentials not found on disk; not using
them
DEBUG:google.auth._default:Checking for App Engine runtime as part of auth
process...
DEBUG:google.auth._default:No App Engine library was found so cannot
authenticate via App Engine Identity Credentials.
DEBUG:google.auth.transport._http_client:Making request: GET
http://169.254.169.254
DEBUG:google.auth.transport._http_client:Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
DEBUG:urllib3.util.retry:Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
DEBUG:urllib3.connectionpool:Starting new HTTP connection (1):
metadata.google.internal:80
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
DEBUG:google.auth.transport.requests:Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
DEBUG:urllib3.connectionpool:http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 221
DEBUG:urllib3.connectionpool:Starting new HTTPS connection (1):
bigquery.googleapis.com:443
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/jobs HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/queries/2efffde0-ce92-4cca-991f-ce285325e3ed?maxResults=0&location=US
HTTP/1.1" 200 None
DEBUG:urllib3.connectionpool:https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/_7357fab0f784d2a7327ddbe81cdd1f4ca7e429cd/tables/anon7f727a58e11fd6250696b4b28f5bc00064e90567/data
HTTP/1.1" 200 None
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Read from given query (SELECT
fruit from `python_query_to_table_16012330406553.output_table`;), total rows 2
INFO:apache_beam.io.gcp.tests.bigquery_matcher:Generate checksum:
158a8ea1c254fcf40d4ed3e7c0242c3ea0a29e72
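The checksum above is a fingerprint of the query results: roughly, the rows are
rendered canonically, sorted, and hashed. A rough sketch of that idea (the
matcher's actual normalization may differ):

    import hashlib

    def checksum(rows):
        canonical = sorted(str(row) for row in rows)
        return hashlib.sha1(','.join(canonical).encode('utf-8')).hexdigest()

    # illustrative rows only, not the real output_table contents
    print(checksum([('apple',), ('orange',)]))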
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:04:42.130Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:04:42.162Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:04:42.191Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-09-27_11_57_48-16469637630514701175 is in state JOB_STATE_DONE
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:07:39.812Z:
JOB_MESSAGE_BASIC: Finished operation
Create/Read+metrics+map_to_common_key+GroupByKey/Reify+GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:07:39.904Z:
JOB_MESSAGE_BASIC: Executing operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:07:39.961Z:
JOB_MESSAGE_BASIC: Finished operation GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:07:40.032Z:
JOB_MESSAGE_BASIC: Executing operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:07:49.128Z:
JOB_MESSAGE_BASIC: Finished operation
GroupByKey/Read+GroupByKey/GroupByWindow+m_out
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:07:49.194Z:
JOB_MESSAGE_DEBUG: Executing success step success11
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:07:49.297Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:07:49.366Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:07:49.396Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:08:34.624Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:08:34.678Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-09-27T19:08:34.720Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-09-27_12_01_43-1165577789104850003 is in state JOB_STATE_DONE
test_bigquery_tornadoes_it
(apache_beam.examples.cookbook.bigquery_tornadoes_it_test.BigqueryTornadoesIT)
... ok
test_streaming_wordcount_debugging_it
(apache_beam.examples.streaming_wordcount_debugging_it_test.StreamingWordcountDebuggingIT)
... SKIP: Skipped due to [BEAM-3377]: assert_that not working for streaming
test_datastore_wordcount_it
(apache_beam.examples.cookbook.datastore_wordcount_it_test.DatastoreWordCountIT)
... ok
test_autocomplete_it
(apache_beam.examples.complete.autocomplete_test.AutocompleteTest) ... ok
test_leader_board_it
(apache_beam.examples.complete.game.leader_board_it_test.LeaderBoardIT) ... ok
test_game_stats_it
(apache_beam.examples.complete.game.game_stats_it_test.GameStatsIT) ... ok
test_streaming_wordcount_it
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT) ... ok
test_wordcount_it (apache_beam.examples.wordcount_it_test.WordCountIT) ... ok
test_wordcount_fnapi_it (apache_beam.examples.wordcount_it_test.WordCountIT)
... ok
test_user_score_it
(apache_beam.examples.complete.game.user_score_it_test.UserScoreIT) ... ok
test_avro_it (apache_beam.examples.fastavro_it_test.FastavroIT) ... ok
test_hourly_team_score_it
(apache_beam.examples.complete.game.hourly_team_score_it_test.HourlyTeamScoreIT)
... ok
test_read_via_sql
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_read_via_table
(apache_beam.io.gcp.experimental.spannerio_read_it_test.SpannerReadIntegrationTest)
... ok
test_bqfl_streaming
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... SKIP:
TestStream is not supported on TestDataflowRunner
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_one_job_fails_all_jobs_fail
(apache_beam.io.gcp.bigquery_file_loads_test.BigQueryFileLoadsIT) ... ok
test_bigquery_read_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_bigquery_read_custom_1M_python
(apache_beam.io.gcp.bigquery_io_read_it_test.BigqueryIOReadIT) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadNewTypesTests)
... ok
test_avro_file_load
(apache_beam.io.gcp.bigquery_test.BigQueryFileLoadsIntegrationTests) ... ok
test_iobase_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_native_source (apache_beam.io.gcp.bigquery_read_it_test.ReadTests) ... ok
test_spanner_error
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_spanner_update
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_write_batches
(apache_beam.io.gcp.experimental.spannerio_write_it_test.SpannerWriteIntegrationTest)
... ok
test_copy (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ...
ok
test_copy_batch
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_kms
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_batch_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_copy_kms (apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest)
... ok
test_copy_rewrite_token
(apache_beam.io.gcp.gcsio_integration_test.GcsIOIntegrationTest) ... ok
test_multiple_destinations_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_value_provider_transform
(apache_beam.io.gcp.bigquery_test.BigQueryStreamingInsertTransformIntegrationTests)
... ok
test_transform_on_gcs (apache_beam.io.fileio_test.MatchIntegrationTest) ... ok
test_datastore_write_limit
(apache_beam.io.gcp.datastore.v1new.datastore_write_it_test.DatastoreWriteIT)
... ok
test_parquetio_it (apache_beam.io.parquetio_it_test.TestParquetIT) ... ok
test_dicom_search_instances
(apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_dicom_store_instance_from_gcs
(apache_beam.io.gcp.dicomio_integration_test.DICOMIoIntegrationTest) ... ok
test_file_loads (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ... SKIP:
https://issuetracker.google.com/issues/118375066
test_streaming_inserts (apache_beam.io.gcp.bigquery_test.PubSubBigQueryIT) ...
ok
test_streaming_data_only
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_streaming_with_attributes
(apache_beam.io.gcp.pubsub_integration_test.PubSubIntegrationTest) ... ok
test_analyzing_syntax
(apache_beam.ml.gcp.naturallanguageml_test_it.NaturalLanguageMlTestIT) ... ok
test_text_detection_with_language_hint
(apache_beam.ml.gcp.visionml_test_it.VisionMlTestIT) ... ok
test_basic_execution
(apache_beam.testing.test_stream_it_test.TestStreamIntegrationTests) ... SKIP:
The "TestDataflowRunner", does not support the TestStream transform. Supported
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream supports emitting to multiple PCollections. ... SKIP:
The "TestDataflowRunner", does not support the TestStream transform. Supported
runners: ['DirectRunner', 'SwitchingDirectRunner']
Tests that the TestStream can independently control output watermarks. ...
SKIP: The "TestDataflowRunner", does not support the TestStream transform.
Supported runners: ['DirectRunner', 'SwitchingDirectRunner']
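As the skip messages note, TestStream only runs on the direct runners. A minimal
DirectRunner sketch of the kind of pipeline being skipped here:

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions, StandardOptions
    from apache_beam.testing.test_stream import TestStream
    from apache_beam.testing.util import assert_that, equal_to

    options = PipelineOptions(['--runner=DirectRunner'])
    options.view_as(StandardOptions).streaming = True

    with beam.Pipeline(options=options) as p:
        events = (TestStream()
                  .add_elements(['a', 'b'])
                  .advance_watermark_to_infinity())
        result = p | events | beam.Map(str.upper)
        assert_that(result, equal_to(['A', 'B']))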
test_label_detection_with_video_context
(apache_beam.ml.gcp.videointelligenceml_test_it.VideoIntelligenceMlTestIT) ...
ok
test_big_query_write
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_new_types
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
test_big_query_write_schema_autodetect
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ...
SKIP: DataflowRunner does not support schema autodetection
test_big_query_write_without_schema
(apache_beam.io.gcp.bigquery_write_it_test.BigQueryWriteIntegrationTests) ... ok
Runs streaming Dataflow job and verifies that user metrics are reported ... ok
test_deidentification (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_inspection (apache_beam.ml.gcp.cloud_dlp_it_test.CloudDLPIT) ... ok
test_big_query_legacy_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types_avro
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_new_types_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_big_query_standard_sql_kms_key_native
(apache_beam.io.gcp.big_query_query_to_table_it_test.BigQueryQueryToTableIT)
... ok
test_job_python_from_python_it
(apache_beam.transforms.external_it_test.ExternalTransformIT) ... ok
test_metrics_fnapi_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
test_metrics_it
(apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest)
... ok
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df-py37.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 66 tests in 3926.103s
OK (SKIP=7)
FAILURE: Build failed with an exception.
* Where:
Script
'<https://ci-beam.apache.org/job/beam_PostCommit_Python37/ws/src/sdks/python/test-suites/direct/common.gradle'>
line: 135
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py37:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.6.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 8m 23s
174 actionable tasks: 123 executed, 47 from cache, 4 up-to-date
Publishing build scan...
https://gradle.com/s/osmmyvsazfnsk
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure