See
<https://ci-beam.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/1845/display/redirect?page=changes>
Changes:
[zyichi] Allow Nexmark launcher to publish human-readable events to pubsub.
[irvi.fa] [BEAM-10753] Add Slack link invitation on README
[Luke Cwik] [BEAM-10756] Fix empty pull response to not ack and to not throw
[noreply] Merge pull request #12597: [BEAM-10685] Added integration test for
[ettarapp] clarifying unclear comments
[noreply] Update README.md
[noreply] [BEAM-3301] Adding SDF Go Dataflow translation. (#12629)
[noreply] [BEAM-10752] Use TestPubsubSignal in PubsubToBigqueryIT (#12625)
------------------------------------------
[...truncated 243.68 KB...]
"step_name": "SideInput-s20"
}
},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "FastPrimitivesCoder$<string of 176 bytes>",
"component_encodings": [
{
"@type": "FastPrimitivesCoder$<string of 176 bytes>",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_5"
},
{
"@type": "FastPrimitivesCoder$<string of 176 bytes>",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_5"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "Write/Write/WriteImpl/FinalizeWrite.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s7"
},
"serialized_fn": "<string of 3492 bytes>",
"user_name": "Write/Write/WriteImpl/FinalizeWrite/FinalizeWrite"
}
}
],
"type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
createTime: u'2020-08-19T18:09:55.126538Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2020-08-19_11_09_54-1938108206869501114'
location: u'us-central1'
name: u'performance-tests-wordcount-python27-batch-1gb0819150314'
projectId: u'apache-beam-testing'
stageStates: []
startTime: u'2020-08-19T18:09:55.126538Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id:
[2020-08-19_11_09_54-1938108206869501114]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job:
2020-08-19_11_09_54-1938108206869501114
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-08-19_11_09_54-1938108206869501114?project=apache-beam-testing
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-08-19_11_09_54-1938108206869501114 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:58.028Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:58.961Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:58.991Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
Write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.072Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.112Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.237Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.311Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.355Z:
JOB_MESSAGE_DETAILED: Fusing consumer Split into Read/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.389Z:
JOB_MESSAGE_DETAILED: Fusing consumer PairWIthOne into Split
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.417Z:
JOB_MESSAGE_DETAILED: Fusing consumer
GroupAndSum/GroupByKey+GroupAndSum/Combine/Partial into PairWIthOne
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.442Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/GroupByKey/Reify into
GroupAndSum/GroupByKey+GroupAndSum/Combine/Partial
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.479Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/GroupByKey/Write into
GroupAndSum/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.515Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/Combine into
GroupAndSum/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.551Z:
JOB_MESSAGE_DETAILED: Fusing consumer GroupAndSum/Combine/Extract into
GroupAndSum/Combine
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.587Z:
JOB_MESSAGE_DETAILED: Fusing consumer Format into GroupAndSum/Combine/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.612Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Write/Write/WriteImpl/WindowInto(WindowIntoFn) into Format
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.650Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Write/Write/WriteImpl/WriteBundles/WriteBundles into
Write/Write/WriteImpl/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.685Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/Pair into
Write/Write/WriteImpl/WriteBundles/WriteBundles
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.716Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/GroupByKey/Reify
into Write/Write/WriteImpl/Pair
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.746Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/GroupByKey/Write
into Write/Write/WriteImpl/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.785Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Write/Write/WriteImpl/GroupByKey/GroupByWindow into
Write/Write/WriteImpl/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.829Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/Extract into
Write/Write/WriteImpl/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.865Z:
JOB_MESSAGE_DETAILED: Fusing consumer Write/Write/WriteImpl/InitializeWrite
into Write/Write/WriteImpl/DoOnce/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.896Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.928Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:09:59.966Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.003Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.296Z:
JOB_MESSAGE_DEBUG: Executing wait step start35
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.363Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/DoOnce/Read+Write/Write/WriteImpl/InitializeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.401Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.401Z:
JOB_MESSAGE_BASIC: Executing operation Write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.437Z:
JOB_MESSAGE_BASIC: Starting 10 workers in us-central1-f...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.437Z:
JOB_MESSAGE_BASIC: Executing operation GroupAndSum/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.500Z:
JOB_MESSAGE_BASIC: Finished operation Write/Write/WriteImpl/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.501Z:
JOB_MESSAGE_BASIC: Finished operation GroupAndSum/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.574Z:
JOB_MESSAGE_DEBUG: Value "GroupAndSum/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.611Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/GroupByKey/Session"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.648Z:
JOB_MESSAGE_BASIC: Executing operation
Read/Read+Split+PairWIthOne+GroupAndSum/GroupByKey+GroupAndSum/Combine/Partial+GroupAndSum/GroupByKey/Reify+GroupAndSum/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:00.722Z:
JOB_MESSAGE_WARNING: The Compute Engine API has not fully initialized. Please
try again in a few minutes. Causes: Unable to get information for region
us-central1.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:31.945Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 8 based on the
rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:31.986Z:
JOB_MESSAGE_DETAILED: Resized worker pool to 8, though goal was 10. This could
be a quota issue.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:36.334Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:10:37.326Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 10 based on
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:12:05.747Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:12:05.799Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:13:01.115Z:
JOB_MESSAGE_WARNING: The Compute Engine API has not fully initialized. Please
try again in a few minutes. Causes: Unable to get information for region
us-central1.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.667Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/DoOnce/Read+Write/Write/WriteImpl/InitializeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.733Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/DoOnce/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.769Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/InitializeWrite.out"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.829Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.862Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.891Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.897Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.930Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.960Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:13.968Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/WriteBundles/_UnpickledSideInput(InitializeWrite.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:14.010Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(InitializeWrite.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:15:14.048Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(InitializeWrite.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:17:55.482Z:
JOB_MESSAGE_BASIC: Finished operation
Read/Read+Split+PairWIthOne+GroupAndSum/GroupByKey+GroupAndSum/Combine/Partial+GroupAndSum/GroupByKey/Reify+GroupAndSum/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:17:55.570Z:
JOB_MESSAGE_BASIC: Executing operation GroupAndSum/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:17:55.622Z:
JOB_MESSAGE_BASIC: Finished operation GroupAndSum/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:17:55.695Z:
JOB_MESSAGE_BASIC: Executing operation
GroupAndSum/GroupByKey/Read+GroupAndSum/Combine+GroupAndSum/Combine/Extract+Format+Write/Write/WriteImpl/WindowInto(WindowIntoFn)+Write/Write/WriteImpl/WriteBundles/WriteBundles+Write/Write/WriteImpl/Pair+Write/Write/WriteImpl/GroupByKey/Reify+Write/Write/WriteImpl/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:19:01.339Z:
JOB_MESSAGE_WARNING: The Compute Engine API has not fully initialized. Please
try again in a few minutes. Causes: Unable to get GCE information for project
apache-beam-testing.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:16.155Z:
JOB_MESSAGE_BASIC: Finished operation
GroupAndSum/GroupByKey/Read+GroupAndSum/Combine+GroupAndSum/Combine/Extract+Format+Write/Write/WriteImpl/WindowInto(WindowIntoFn)+Write/Write/WriteImpl/WriteBundles/WriteBundles+Write/Write/WriteImpl/Pair+Write/Write/WriteImpl/GroupByKey/Reify+Write/Write/WriteImpl/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:16.293Z:
JOB_MESSAGE_BASIC: Executing operation Write/Write/WriteImpl/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:16.374Z:
JOB_MESSAGE_BASIC: Finished operation Write/Write/WriteImpl/GroupByKey/Close
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:16.481Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/GroupByKey/Read+Write/Write/WriteImpl/GroupByKey/GroupByWindow+Write/Write/WriteImpl/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:19.391Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/GroupByKey/Read+Write/Write/WriteImpl/GroupByKey/GroupByWindow+Write/Write/WriteImpl/Extract
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:19.569Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/Extract.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:19.724Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:19.783Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:19.818Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:19.859Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:19.913Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:19.948Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/PreFinalize/_UnpickledSideInput(Extract.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:20.017Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/PreFinalize/PreFinalize
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:23.053Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/PreFinalize/PreFinalize
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:23.184Z:
JOB_MESSAGE_DEBUG: Value "Write/Write/WriteImpl/PreFinalize.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:23.267Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:23.321Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:23.395Z:
JOB_MESSAGE_DEBUG: Value
"Write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(PreFinalize.out.0).output"
materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:23.461Z:
JOB_MESSAGE_BASIC: Executing operation
Write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:26.078Z:
JOB_MESSAGE_BASIC: Finished operation
Write/Write/WriteImpl/FinalizeWrite/FinalizeWrite
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:26.163Z:
JOB_MESSAGE_DEBUG: Executing success step success33
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:26.243Z:
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:26.395Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:21:26.449Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:22:18.047Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 10 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:22:18.100Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2020-08-19T18:22:18.135Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2020-08-19_11_09_54-1938108206869501114 is in state JOB_STATE_DONE
apache_beam.io.filesystem: DEBUG: Listing files in
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results'
apache_beam.io.filesystem: DEBUG: translate_pattern:
'gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results*-of-*'
->
'gs\\:\\/\\/temp\\-storage\\-for\\-end\\-to\\-end\\-tests\\/py\\-it\\-cloud\\/output\\/1597860590279\\/results[^/\\\\]*\\-of\\-[^/\\\\]*'
apache_beam.io.gcp.gcsio: INFO: Starting the size estimation of the input
apache_beam.io.gcp.gcsio: INFO: Finished listing 32 files in 0.0665290355682
seconds.
apache_beam.testing.pipeline_verifiers: INFO: Found 32 files in
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results*-of-*:
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00015-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00028-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00010-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00025-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00009-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00019-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00008-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00020-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00004-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00026-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00012-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00016-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00006-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00003-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00017-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00002-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00022-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00023-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00007-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00027-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00018-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00030-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00024-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00021-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00031-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00013-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00029-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00005-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00001-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00011-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00014-of-00032
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results-00000-of-00032
apache_beam.testing.pipeline_verifiers: INFO: Read from given path
gs://temp-storage-for-end-to-end-tests/py-it-cloud/output/1597860590279/results*-of-*,
26186927 lines, checksum: ea0ca2e5ee4ea5f218790f28d0b9fe7d09d8d710.
google.auth._default: DEBUG: Checking None for explicit credentials as part of
auth process...
google.auth._default: DEBUG: Checking Cloud SDK credentials as part of auth
process...
google.auth._default: DEBUG: Cloud SDK credentials not found on disk; not using
them
google.auth._default: DEBUG: Checking for App Engine runtime as part of auth
process...
google.auth._default: DEBUG: No App Engine library was found so cannot
authenticate via App Engine Identity Credentials.
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
urllib3.util.retry: DEBUG: Converted retries value: 3 -> Retry(total=3,
connect=None, read=None, redirect=None, status=None)
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 221
urllib3.connectionpool: DEBUG: Starting new HTTPS connection (1):
bigquery.googleapis.com:443
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/beam_performance HTTP/1.1"
200 None
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "GET
/bigquery/v2/projects/apache-beam-testing/datasets/beam_performance/tables/wordcount_py27_pkb_results
HTTP/1.1" 200 None
apache_beam.testing.load_tests.load_test_metrics_utils: INFO: Load test results
for test: 1beafc9086934a4a8634630dbe34fd1b and timestamp: 1597861484.09:
apache_beam.testing.load_tests.load_test_metrics_utils: INFO: Metric: runtime
Value: 893
urllib3.connectionpool: DEBUG: https://bigquery.googleapis.com:443 "POST
/bigquery/v2/projects/apache-beam-testing/datasets/beam_performance/tables/wordcount_py27_pkb_results/insertAll
HTTP/1.1" 200 None
apache_beam.testing.load_tests.load_test_metrics_utils: ERROR: no such field.
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-runPerformanceTest-df-py27.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PerformanceTests_WordCountIT_Py27/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 1 test in 894.402s
FAILED (errors=1)
> Task :sdks:python:test-suites:dataflow:py2:runPerformanceTest FAILED
:sdks:python:test-suites:dataflow:py2:runPerformanceTest (Thread[Execution worker
for ':',5,main]) completed. Took 14 mins 57.56 secs.
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':sdks:python:test-suites:dataflow:py2:runPerformanceTest'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to
get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 16m 20s
5 actionable tasks: 5 executed
Publishing build scan...
https://gradle.com/s/ljxvbfevlxrue
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]