See
<https://ci-beam.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/1830/display/redirect?page=changes>
Changes:
[Robin Qiu] Fix bug in JoinScanWithRefConverter
[ningk] [BEAM-10771] Added Whitespacelint job to pull request template
[noreply] Use setCoder, not setSchema (#12662)
[noreply] [BEAM-10703] Prepare Dataflow Java runner for shardable states (#12578)
[noreply] [BEAM-10549] Improve runtime type checking performance for the Python
[Kyle Weaver] [BEAM-10460] Remove SparkPortableExecutionTest altogether.
[noreply] [BEAM-10788] Also disable immutability checking for transforms within
[noreply] [BEAM-5715, BEAM-8862, BEAM-8702] Removes all references to grpc_all and
[noreply] [BEAM-10761] Prefer TestPubsub#assertSubscriptionEventuallyExists when
[noreply] [BEAM-10654] Implemented ExternalSchemaIOTransformRegistrar for jdbc
[noreply] Merge pull request #12580 from [BEAM-2855] nexmark python suite
------------------------------------------
[...truncated 291.59 KB...]
}
},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_5"
},
{
"@type":
"FastPrimitivesCoder$QlpoOTFBWSZTWYQR6NMAAEDXwH8QgCEJAEBAv279AmAAIABqEqnqGgaABpoyNAGVHpGgMgDQBk0oR6IeoECBiqqU5NY23ndshzT2UPUOGrg42YPi9VyA8lbwwPJgtghxs5Qq1aWwExCDeMa0RHC2QigTCdizz1nx+LuSKcKEhCCPRpg=",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_5"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "Write/Write/WriteImpl/FinalizeWrite.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s7"
},
"serialized_fn":
"QlpoOTFBWSZTWeFWTx8ACKN/////////////////////////////////////////////4AjT6gN26rdTHZtCxsgE9g0k4g0kIaSYnpo0j0TENPSbSep6TT0EbUMho00DT1NAGmmgaAAAAAABoAAANA0DQMjQ0AGgAaNNqGTJo00PSaaDMkGSEATEYU8gmmpPKPUNHqabUAAADINGmQAAAAAGQAAAA0AAANAAAAAAAAAAAAABSVP2Sj1BDATJoHqGJgEaZMEyaDCMJkyYAEGEwDQmRoxGmhk9CMRkw0NEGmTEaBiDAmE0YmQMmmmBGmhoaGAQep/qkGITACNMTTABMEaZMBGmmTAEwBGACYAIwARppiYATAAAjTAAJgAAmmAAIMIaNAwAJMABMAATAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAASJE0CEwgjE1T2hT9CCNB4oHqZqPUyabSMQAA0DTTQBkGmaamgAAADQ0AAAAAAAAAAAAAAAGVTCp3UwgWJMuK7LkU8b4mlfEsN9fU4MOAY6S2yB5YuLjyTcUT9q0420yd5dhWhmzNO+7RRAuZ+APGO/uGsEQgIUKBASNMSbSAZqaJ32aS5ZMY13ToHrEpsY0JTEIJ1zIIAxZ3kLAUCeJCOAUh0L4URW696a5du1X8Fm1bzbUwlpqGExGceM/waaR4c0AicC1maZqmzNY4rJqBpFt5cdKRaLexnnPEzAdKeGJBYYgxd+KzxF4YriZWivGSDv9NzmKwrnQmEwnBOQpWkphMCdhPx81OUBHP+Ajh9F0bbnOclKviRWm5s4kjSshmHn6iU6Qi+/obp15TEEsUBGMlDkokSkNhM0yUkNjeUghsOLIUbckMlouaZDaZGM6+M5YniM82QVGkTLVkDRNQoWEywpb1gLQhnORiZy6qOvBIzIwfT0O34P/WiQibVPLr+4xT0yxEp3+d4EAJpSUn2fl+6oXFkc0zUyL4HzqefQYMTRoi+XYHxoeLL9WbNnOv2iiiiijoYxCEIQxjEIknOPwq6KBWAQtgEAzCQo13bMC30nA57XXKSOEXcqKoXTRNNXsLllJzZMTdTIWTh51RdBe1AqLZcKjC48WAb7mDlzv7pcn4DCMTGNrxCyp5kFmoyIqiFwlG+NUrXDn0Th5ngWesykaKdw+wJwNnNQauT3HcgoZINODKB1+KFEwcHjheL9h6hFwkQPqC2KIuHJiVphUogKV2qBTMi7EdDBx/DXGFqmzYmeOJEzlQw3td5AImuINe2b+J8CAJOhGT/Rllp/Dsr/+Jx2GUQLFE9xCpKj0oCBGwmDADBIRpAY2fz+n2e13KO+sEllpXLzPhnJ340H5poN3BjGBhDq1dkeFryc9E9ErPJDoiLLIKILPWUlnAeAIKCd74K6AOiDL7faRjH1KtmGwEIBwDDU0vlljWrokdZbovzTrnKMY5z5kaqj7HHpdqykduZbHIvyJ0sZFacg4HSZ45jsdmNc7uy+QwBbrYJ/GzYLLH6J1UECClViz4IuaRzyMOSMMYzEiBQrGLMmJNZXgEnCHnIdbLc6aipuIvlApNchKsOMpMqLIqlfwIwZBWqVZfEncTnup5NYpvytdZVbCyY8ym0Cy85eVlWCoGF/N297R2NMVqSVirDELKZFgF1wFcxNnGqWe6gOOA2A+gk0XO54y3O9AtIAyFqUyOGgbV4FQV16WW0IVCYCAJOSokCRA8k5SSSfIgDg+VonDcYg3nEHUrR1+3b+1B8Ot5Q+U6Mse9y6jumHfqwFi6ubmOvLpSYEZYrYFXa8kIAmABkgiVARAMxAKIz5F5E1QlIv4RWaZ2gkMPcFA6Q8FabGwqk1LHLXMMRmQGuuutiBMcgLqAgTztkzA4ZrEGQOqfjMcD5LG3bvMW9dR3wvIMfIWqwWNcct+rlU3fRdWUq1FWBYkCpS2HZWKFXN1SbrrNJJLCv+23WkzGRwCd2/rK54pceEWBnSNcppz5kVLxatMzKOoldETIv0GgdJoI5dQTA2TBQaVCdqxuMAqciqri8eqFdKPxe4NoBctz35mnC16WoG1XqQrFsFxQEgOBKVsEoDQWFuRJXOfABSFrqKyjEFqKvNTzcIQGD6U3PzNHCkG8QgfgnAwIUrYFyjb6kKtgpln8qKGmKNwIAliyquVfQfYS8QMEuTRhKgVE+r1DiPsYhcQGbShp2QXAMgDAXMCmoKNtGAPScOTQwdHV3irY8P64lxFFBYYB5A5K/kTg2gpHJNXs5MZZdvlb105gaG2a2aWUIyGW0VOb6jLjQZNWoHeLWUIgQ0yiGGLVJyDtBY0UlFKn74Qh8C3iA19ZGOZgMWCoXTIce8QabZQvIu8zKFc2MkK8g2y8b9MBQltQ0qg2pn36dtMYQKxii94R4ruAnoZ1hXZDVgk2CWZqaks9XAyamDKJt3lNGCYseIi05QsVGuIqruDgFg19pUFg0ZzRSwMAVK8tamhOhBFUAYnIU/Z4ZibbvZdBRhJUauFEOZcmMk1BCu3xjnPRNrBu+Y0RSxdz3DFs0gjZGwaWkUM1ismnaGEkoadpVCnus2WWi7eLTM4qzKiRdXtoy41abm2o2BFGr0ZmbZaiBYzo1LDDYNmvAUWDaRYlZGGNkpWZGA2aBam2mq1zXq+tUln4yGpc7D5PnYiKnXdTWteaLq6x1j7rrtZjrzTUlSI9KkbzzzxeicYudTS3d+RiI3JJGpaVeUxkXuWHbb7VdtpjkC7LSmn36kwUgfUI4VBlFLepNQlQ8PEwgDBhKVigAAAAAAAAAAFgADQACMWLJhQ0CABBBBBBG8N0SehktNDQuuriLJNtt3XRJ3XXF11999M0jBgyypgiL3LLlzpNttt1VYJVyqk223dklLoVdOcRGDLKVJotYs2GRs84q0mlKaW33G1nhTWoJjGaysNc2sBV9nWwcU7d5ixApCVpfuVus2dO26sdkJccwUWDjcK0VMGMg5kNE0jUVYt0Ue7HNjUt4upYSSztPY3h+ZvskH4AumVwVgXRi0A6Icbjw4gxTHKZWrAlFCr5cYvktOSloF0ZoFhw5pBlexuPCFRqjPSDOyhu8hjMEt60phOWbYxVZazLYJ8ClwiSz3AixaUbzRx6szRs1SsUYlcqywyZcriBMMw1pBLDK4oIMrBcacb9h9zWwiJ+DFLFD+FdmJmrNwzlYCB2Q3qRPAIEIqwPazOZsfL9f85GgjDsAmZeJLizlqgC7jGrka61ODppGmr6EjEjZ8apI1K9YlKxba01ihkktuO9wwbVxVawlWVYjcqqlLLitYN4p2m+aGya+kxTTAjaV/DeHsY4S4MEBAAJoAGNNs2cq5++zrnz2xXt+M/D0FD8/pa6L6qKNR9vv+GiWkSqqX9U37/n+p6hq2qV9+D8KWyHCLAZAAgvB8CIAJwDAGCAQCUAxgLnhquKeoup4DY6GaxLY+pcA1atro3AxsQHQVKLoEog+9H7BkgbBpg2DTBsGL7kQGyB0IIFIaAbSQ22gfuYEc20HlWhWDS2vqf/i7kinChIcKsnj4",
"user_name": "Write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
}
}
],
"type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
createTime: '2020-08-22T00:46:56.656851Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2020-08-21_17_46_54-2519339669040960941'
location: 'us-central1'
name: 'performance-tests-wordcount-python37-batch-1gb0822000637'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2020-08-22T00:46:56.656851Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id:
[2020-08-21_17_46_54-2519339669040960941]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job:
2020-08-21_17_46_54-2519339669040960941
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-21_17_46_54-2519339669040960941?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-08-21_17_46_54-2519339669040960941 is in state JOB_STATE_RUNNING
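When a pipeline is submitted this way from the Python SDK, the returned result
handle blocks on and reports the same job states that the runner logs below; a
minimal sketch, assuming 'pipeline' is an apache_beam.Pipeline already
configured for DataflowRunner:

    # Sketch: block on a submitted Dataflow job and inspect its terminal state.
    # Assumes `pipeline` is an apache_beam.Pipeline configured for DataflowRunner.
    result = pipeline.run()
    result.wait_until_finish()  # returns once the job reaches a terminal state
    print(result.state)         # 'DONE' on success, mirroring JOB_STATE_DONE below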
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:46:59.276Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.014Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.053Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
Write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.120Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.152Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.249Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.293Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.324Z:
JOB_MESSAGE_DETAILED: Fusing consumer Split into Read/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.356Z:
JOB_MESSAGE_DETAILED: Fusing consumer PairWIthOne into Split
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.385Z:
JOB_MESSAGE_DETAILED: Fusing consumer
GroupAndSum/GroupByKey+GroupAndSum/Combine/Partial into PairWIthOne
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.418Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/GroupByKey/Reify into
GroupAndSum/GroupByKey+GroupAndSum/Combine/Partial
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.451Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/GroupByKey/Write into
GroupAndSum/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.477Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/Combine into
GroupAndSum/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.511Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/Combine/Extract into
GroupAndSum/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.530Z:
JOB_MESSAGE_DETAILED: Fusing consumer Format into GroupAndSum/Combine/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.555Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Write/Write/WriteImpl/WindowInto(WindowIntoFn) into Format
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.589Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Write/Write/WriteImpl/WriteBundles/WriteBundles into
Write/Write/WriteImpl/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.625Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/Pair into
Write/Write/WriteImpl/WriteBundles/WriteBundles
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.664Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/GroupByKey/Reify
into Write/Write/WriteImpl/Pair
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.692Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/GroupByKey/Write
into Write/Write/WriteImpl/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.735Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Write/Write/WriteImpl/GroupByKey/GroupByWindow into
Write/Write/WriteImpl/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.766Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/Extract into
Write/Write/WriteImpl/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.798Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/InitializeWrite
into Write/Write/WriteImpl/DoOnce/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.850Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.883Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.922Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:00.958Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
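The transform labels in the fusion messages above (Split, PairWIthOne,
GroupAndSum, Format, Write) come from the wordcount pipeline under test. Its
shape is roughly the following sketch, with illustrative input/output paths
rather than the exact test configuration:

    import re
    import apache_beam as beam

    def build_wordcount(p, input_path, output_path):
        # Labels chosen to mirror the step names Dataflow reports above;
        # GroupAndSum (CombinePerKey) is what expands into GroupByKey + Combine.
        return (
            p
            | 'Read' >> beam.io.ReadFromText(input_path)
            | 'Split' >> beam.FlatMap(lambda line: re.findall(r"[\w']+", line))
            | 'PairWIthOne' >> beam.Map(lambda word: (word, 1))
            | 'GroupAndSum' >> beam.CombinePerKey(sum)
            | 'Format' >> beam.MapTuple(lambda word, count: '%s: %d' % (word, count))
            | 'Write' >> beam.io.WriteToText(output_path))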
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.183Z:
JOB_MESSAGE_DEBUG: Executing wait step start35
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.252Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/DoOnce/Read+Write/Write/WriteImpl/InitializeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.286Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.293Z:
JOB_MESSAGE_BASIC: Executing operation Write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.327Z:
JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.337Z:
JOB_MESSAGE_BASIC: Executing operation GroupAndSum/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.387Z:
JOB_MESSAGE_BASIC: Finished operation GroupAndSum/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.387Z:
JOB_MESSAGE_BASIC: Finished operation Write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.484Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/GroupByKey/Session"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.519Z:
JOB_MESSAGE_DEBUG: Value "GroupAndSum/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:01.720Z:
JOB_MESSAGE_BASIC: Executing operation
Read/Read+Split+PairWIthOne+GroupAndSum/GroupByKey+GroupAndSum/Combine/Partial+GroupAndSum/GroupByKey/Reify+GroupAndSum/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:17.692Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
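For reference, this 100-descriptor quota can be relieved by pruning old custom
metric descriptors through the Cloud Monitoring API; a rough sketch using the
google-cloud-monitoring client (assuming a recent release of that library;
the project name and filter are illustrative, and the delete call is left
commented out because it is irreversible):

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = 'projects/apache-beam-testing'
    descriptors = client.list_metric_descriptors(
        request={
            'name': project_name,
            'filter': 'metric.type = starts_with("custom.googleapis.com/dataflow/")',
        })
    for descriptor in descriptors:
        print(descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)  # irreversible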
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:28.230Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 7 based on the
rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:28.409Z:
JOB_MESSAGE_DETAILED: Resized worker pool to 7, though goal was 10. This could
be a quota issue.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:33.638Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 9 based on the
rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:33.679Z:
JOB_MESSAGE_DETAILED: Resized worker pool to 9, though goal was 10. This could
be a quota issue.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:47:38.940Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:49:04.256Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:49:04.292Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.159Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/DoOnce/Read+Write/Write/WriteImpl/InitializeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.227Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/DoOnce/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.272Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/InitializeWrite.out"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.353Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.390Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.409Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.426Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.445Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.465Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.480Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.510Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:52:24.545Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:54:42.508Z:
JOB_MESSAGE_BASIC: Finished operation
Read/Read+Split+PairWIthOne+GroupAndSum/GroupByKey+GroupAndSum/Combine/Partial+GroupAndSum/GroupByKey/Reify+GroupAndSum/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:54:42.594Z:
JOB_MESSAGE_BASIC: Executing operation GroupAndSum/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:54:42.646Z:
JOB_MESSAGE_BASIC: Finished operation GroupAndSum/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:54:42.721Z:
JOB_MESSAGE_BASIC: Executing operation
GroupAndSum/GroupByKey/Read+GroupAndSum/Combine+GroupAndSum/Combine/Extract+Format+Write/Write/WriteImpl/WindowInto(WindowIntoFn)+Write/Write/WriteImpl/WriteBundles/WriteBundles+Write/Write/WriteImpl/Pair+Write/Write/WriteImpl/GroupByKey/Reify+Write/Write/WriteImpl/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:22.526Z:
JOB_MESSAGE_BASIC: Finished operation
GroupAndSum/GroupByKey/Read+GroupAndSum/Combine+GroupAndSum/Combine/Extract+Format+Write/Write/WriteImpl/WindowInto(WindowIntoFn)+Write/Write/WriteImpl/WriteBundles/WriteBundles+Write/Write/WriteImpl/Pair+Write/Write/WriteImpl/GroupByKey/Reify+Write/Write/WriteImpl/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:22.607Z:
JOB_MESSAGE_BASIC: Executing operation Write/Write/WriteImpl/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:22.667Z:
JOB_MESSAGE_BASIC: Finished operation Write/Write/WriteImpl/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:22.757Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/GroupByKey/Read+Write/Write/WriteImpl/GroupByKey/GroupByWindow+Write/Write/WriteImpl/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:25.158Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/GroupByKey/Read+Write/Write/WriteImpl/GroupByKey/GroupByWindow+Write/Write/WriteImpl/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:25.270Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/Extract.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:25.350Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:25.393Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:25.412Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:25.442Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:25.506Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:25.541Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:25.626Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/PreFinalize/PreFinalize
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:28.113Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/PreFinalize/PreFinalize
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:28.214Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/PreFinalize.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:28.289Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:28.333Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:28.402Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:28.496Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:31.008Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:31.102Z:
JOB_MESSAGE_DEBUG: Executing success step success33
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:31.237Z:
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:31.313Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:57:31.348Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:58:23.623Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 10 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:58:23.677Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-22T00:58:23.709Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-08-21_17_46_54-2519339669040960941 is in state JOB_STATE_DONE
apache_beam.io.filesystem: DEBUG: Listing files in
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results'
apache_beam.io.filesystem: DEBUG: translate_pattern:
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results*-of-*'
->
'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/1598057212033/results[^/\\\\]*\\-of\\-[^/\\\\]*'
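The translate_pattern step converts the file glob into a regular expression
before matching shard names; conceptually it behaves like the simplified
sketch below (not Beam's exact implementation), which reproduces the
translation shown above:

    import re

    def glob_to_regex(pattern):
        # Escape everything, then turn each '*' back into "any run of
        # characters that is not a path separator".
        escaped = re.escape(pattern)
        return escaped.replace(r'\*', r'[^/\\]*')

    print(glob_to_regex('results*-of-*'))  # -> results[^/\\]*\-of\-[^/\\]*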
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 30 files in
0.06270146369934082 seconds.
apache_beam.testing.pipeline_verifiers: INFO: Found 30 files in
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results*-of-*:
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00000-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00001-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00002-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00003-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00004-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00005-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00006-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00007-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00008-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00009-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00010-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00011-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00012-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00013-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00014-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00015-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00016-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00017-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00018-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00019-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00020-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00021-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00022-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00023-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00024-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00025-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00026-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00027-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00028-of-00030
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results-00029-of-00030
apache_beam.testing.pipeline_verifiers: INFO: Read from given path
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results*-of-*,
26186927 lines, checksum: ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710.
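The checksum reported here is computed by the pipeline verifier over every
line of every output shard; a rough local equivalent (assuming SHA-1 over the
sorted lines, consistent with the 40-hex-digit digest above) would be:

    import hashlib

    def checksum_lines(lines):
        # Assumption: SHA-1 over the sorted lines; feed in the contents of
        # every results-XXXXX-of-00030 shard to approximate the verifier's value.
        digest = hashlib.sha1()
        for line in sorted(lines):
            digest.update(line.encode('utf-8'))
        return digest.hexdigest()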
google.auth._default: DEBUG: Checking None for explicit credentials as part of
auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth
process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using
them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth
process...
google.auth._default: DEBUG: No App Engine library was found so cannot
authenticate via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 221
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1):
bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/beam_performance HTTP/1.1"
200 None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/beam_performance/tables/wordcount_py37_pkb_results
HTTP/1.1" 200 None
apache_beam.testing.load_tests.load_test_metrics_utils: INFO: Load test results
for test: 0ede2197deb24b1f81f7c6c306532576 and timestamp: 1598058022.8185709:
apache_beam.testing.load_tests.load_test_metrics_utils: INFO: Metric: runtime
Value: 810
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/datasets/beam_performance/tables/wordcount_py37_pkb_results/insertAll
HTTP/1.1" 200 None
apache_beam.testing.load_tests.load_test_metrics_utils: ERROR: no such field.
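The "no such field." error above is typically BigQuery's streaming-insert
response when a row contains a key that is not in the destination table's
schema; it is returned per row rather than as an HTTP failure, which is why
the insertAll POST still came back 200. A comparable insert with the BigQuery
client library, using illustrative field names (not necessarily this table's
real schema), would look like:

    from google.cloud import bigquery

    client = bigquery.Client(project='apache-beam-testing')
    table_id = 'apache-beam-testing.beam_performance.wordcount_py37_pkb_results'
    rows = [{
        # Illustrative field names; the real table schema may differ.
        'test_id': '0ede2197deb24b1f81f7c6c306532576',
        'timestamp': 1598058022.8185709,
        'metric': 'runtime',
        'value': 810,
    }]
    errors = client.insert_rows_json(table_id, rows)  # per-row errors, e.g. "no such field"
    if errors:
        raise RuntimeError('BigQuery insert failed: %s' % errors)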
apache_beam.io.filesystem: DEBUG: Listing files in
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results'
apache_beam.io.filesystem: DEBUG: translate_pattern:
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1598057212033/results*'
->
'gs://temp\\-storage\\-for\\-end\\-to\\-end\\-tests/py\\-it\\-cloud/output/1598057212033/results[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 30 files in
0.11221504211425781 seconds.
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-runPerformanceTest-df-py37.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PerformanceTests_WordCountIT_Py37/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 811.428s
FAILED (errors=1)
> Task :sdks:python:test-suites:dataflow:py37:runPerformanceTest FAILED
:sdks:python:test-suites:dataflow:py37:runPerformanceTest (Thread[Execution
worker for ':',5,main]) completed. Took 13 mins 35.933 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py37:runPerformanceTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 15m 4s
5 actionable tasks: 5 executed
Publishing build scan...
https://gradle.com/s/6lfmkdmkas6fe
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]