See
<https://builds.apache.org/job/beam_PostCommit_Python2/2138/display/redirect?page=changes>
Changes:
[aldaircr] Change: Fixing typos on javadoc
------------------------------------------
[...truncated 11.84 MB...]
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_5"
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [],
"pipeline_proto_coder_id":
"ref_Coder_FastPrimitivesCoder_5"
}
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "assert_that/Match.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "None",
"step_name": "s22"
},
"serialized_fn": "<string of 1740 bytes>",
"user_name": "assert_that/Match"
}
}
],
"type": "JOB_TYPE_BATCH"
}
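For orientation, here is a hypothetical sketch (not the actual test source) of a pipeline shaped like the step graph above: a Datastore read, a global count, and an assert_that on the result. The function name and the read_transform parameter are placeholders; only the transform labels echo names from the job description.

    import apache_beam as beam
    from apache_beam.testing.test_pipeline import TestPipeline
    from apache_beam.testing.util import assert_that, equal_to

    def check_entity_count(read_transform, expected_count):
        # read_transform stands in for whatever Datastore read the test uses.
        with TestPipeline() as p:
            count = (
                p
                | 'read from datastore' >> read_transform
                | 'Globally' >> beam.combiners.Count.Globally())
            # assert_that expands into the assert_that/* steps (Group, Unkey,
            # Match) that appear in the job description above.
            assert_that(count, equal_to([expected_count]))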
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
createTime: u'2020-04-04T03:28:07.775723Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2020-04-03_20_28_05-15826380811483054618'
location: u'us-central1'
name: u'beamapp-jenkins-0404025646-418083'
projectId: u'apache-beam-testing'
stageStates: []
startTime: u'2020-04-04T03:28:07.775723Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id:
[2020-04-03_20_28_05-15826380811483054618]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_28_05-15826380811483054618?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-04-03_20_28_05-15826380811483054618 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:05.974Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2020-04-03_20_28_05-15826380811483054618.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:05.974Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2020-04-03_20_28_05-15826380811483054618. The number of workers will be between
1 and 1000.
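As a rough sketch of how such a job is launched (assumptions, not this job's actual command line): the project and region below come from the log; the remaining values are placeholders.

    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions([
        '--runner=DataflowRunner',
        '--project=apache-beam-testing',            # from the log above
        '--region=us-central1',                     # from the log above
        '--temp_location=gs://<your-bucket>/temp',  # placeholder bucket
        '--max_num_workers=1000',  # matches the autoscaling bound reported above
    ])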
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:11.040Z:
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service
Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:11.744Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.340Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.375Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.448Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step read from
datastore/GroupByKey: GroupByKey not followed by a combiner.
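(Illustrative aside: combiner lifting applies only when a GroupByKey is immediately followed by a combiner, so partial combining can run before the shuffle; a bare GroupByKey such as assert_that/Group/GroupByKey is left as-is. A minimal Python sketch, assuming a PCollection of key/value pairs:)

    import apache_beam as beam

    def sum_per_key(pcoll):
        # Written as CombinePerKey (GroupByKey + CombineValues) so the runner
        # can lift the combiner and pre-combine values before the shuffle.
        return pcoll | 'SumPerKey' >> beam.CombinePerKey(sum)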
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.492Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.524Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.661Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.716Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.746Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/SplitQuery into read
from datastore/UserQuery/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.771Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/Reify into
read from datastore/SplitQuery
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.806Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/Write into
read from datastore/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.840Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from
datastore/GroupByKey/GroupByWindow into read from datastore/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.878Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Values into read from
datastore/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.915Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Flatten into read
from datastore/Values
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.947Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Read into read from
datastore/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:12.985Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/KeyWithVoid into read from
datastore/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.019Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
into Globally/CombineGlobally(CountCombineFn)/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.049Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.073Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.110Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.148Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.182Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/UnKey into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.215Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s19 for input s17.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.251Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten,
into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.280Z:
JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/GroupByKey/GroupByWindow into
assert_that/Group/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.317Z:
JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/Map(_merge_tagged_vals_under_key) into
assert_that/Group/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.346Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into
assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.386Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.413Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s19-u40 for input s20-reify-value9-c38
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.438Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/GroupByKey/Write, through flatten
assert_that/Group/Flatten/Unzipped-1, into producer
assert_that/Group/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.472Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into
assert_that/Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.502Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into
assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.544Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into
assert_that/Group/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.572Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.593Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into
Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.635Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into
assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.669Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into
assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.709Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.734Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.772Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:13.806Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.011Z:
JOB_MESSAGE_DEBUG: Executing wait step start51
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.084Z:
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.117Z:
JOB_MESSAGE_BASIC: Executing operation read from datastore/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.130Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.146Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.164Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.211Z:
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.226Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.227Z:
JOB_MESSAGE_BASIC: Finished operation read from datastore/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.280Z:
JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.313Z:
JOB_MESSAGE_DEBUG: Value "read from datastore/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.346Z:
JOB_MESSAGE_DEBUG: Value
"Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.387Z:
JOB_MESSAGE_BASIC: Executing operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:14.421Z:
JOB_MESSAGE_BASIC: Executing operation read from datastore/UserQuery/Read+read
from datastore/SplitQuery+read from datastore/GroupByKey/Reify+read from
datastore/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:29.931Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
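A hypothetical cleanup sketch, in case the warning above applies to your project: it uses the google-cloud-monitoring client to list and delete Dataflow-created custom metric descriptors, as the API-explorer links suggest. The metric-type prefix is an assumption; adjust it to match the descriptors you actually see.

    from google.cloud import monitoring_v3

    def delete_stale_dataflow_descriptors(project_id,
                                          prefix='custom.googleapis.com/dataflow'):
        client = monitoring_v3.MetricServiceClient()
        parent = 'projects/{}'.format(project_id)
        for descriptor in client.list_metric_descriptors(name=parent):
            if descriptor.type.startswith(prefix):
                # Irreversible: removes the descriptor and its time series.
                client.delete_metric_descriptor(name=descriptor.name)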
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:28:44.152Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on
the rate of progress in the currently running stage(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:30:00.832Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:30:00.866Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:35.365Z:
JOB_MESSAGE_BASIC: Finished operation read from datastore/UserQuery/Read+read
from datastore/SplitQuery+read from datastore/GroupByKey/Reify+read from
datastore/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:35.436Z:
JOB_MESSAGE_BASIC: Executing operation read from datastore/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:35.490Z:
JOB_MESSAGE_BASIC: Finished operation read from datastore/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:35.568Z:
JOB_MESSAGE_BASIC: Executing operation read from datastore/GroupByKey/Read+read
from datastore/GroupByKey/GroupByWindow+read from datastore/Values+read from
datastore/Flatten+read from
datastore/Read+Globally/CombineGlobally(CountCombineFn)/KeyWithVoid+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:42.125Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:48.826Z:
JOB_MESSAGE_BASIC: Finished operation read from datastore/GroupByKey/Read+read
from datastore/GroupByKey/GroupByWindow+read from datastore/Values+read from
datastore/Flatten+read from
datastore/Read+Globally/CombineGlobally(CountCombineFn)/KeyWithVoid+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:48.911Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:48.968Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:49.053Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract+Globally/CombineGlobally(CountCombineFn)/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:54.574Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract+Globally/CombineGlobally(CountCombineFn)/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:54.665Z:
JOB_MESSAGE_DEBUG: Value "Globally/CombineGlobally(CountCombineFn)/UnKey.out"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:54.741Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:54.790Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:54.849Z:
JOB_MESSAGE_DEBUG: Value
"Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:54.912Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read+Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:59.755Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read+Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:59.842Z:
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:59.883Z:
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:33:59.978Z:
JOB_MESSAGE_BASIC: Executing operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:34:09.644Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:34:09.724Z:
JOB_MESSAGE_DEBUG: Executing success step success49
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:34:09.858Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:34:09.911Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:34:09.953Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:36:09.369Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:36:09.428Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-04-04T03:36:09.463Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-04-03_20_28_05-15826380811483054618 is in state JOB_STATE_DONE
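(For context, a minimal sketch of how a test typically blocks until the job reaches a terminal state such as JOB_STATE_DONE; `pipeline` is assumed to be an already-configured apache_beam.Pipeline.)

    result = pipeline.run()
    result.wait_until_finish()  # returns once the job is DONE, FAILED, or CANCELLED
    print(result.state)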
test_datastore_write_limit
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 59 tests in 4356.092s
OK (SKIP=8)
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_24_07-7844699458913680617?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_47_10-16923152172577250064?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_56_59-17912857481082748296?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_04_06-12202873676650383578?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_12_23-15915915864389775420?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_20_39-5363012068306100835?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_28_05-15826380811483054618?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_24_11-5728193425271653872?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_38_29-17582500359408321400?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_48_32-9742994725675111529?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_58_08-11680727585832273271?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_06_36-17834335802034147397?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_15_04-443505572428352237?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_24_10-1323979917608554734?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_36_34-10194166556290713973?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_44_57-17551895183698443607?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_53_09-4339986363695955455?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_01_23-11394131662131261103?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_09_27-16645705405366172407?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_24_09-5449677154200099173?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_43_00-8085423509767224304?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_51_02-16113991477166123644?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_08_20-8260631793384568185?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_24_08-7379401530177046368?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_33_42-7427614555512481748?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_42_26-11459370800906164402?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_50_17-10573659456841475229?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_58_56-6844740364287082034?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_06_34-17852034885663195096?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_15_21-15570466863472189587?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_24_07-8483724708275025442?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_32_07-13188731199628436410?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_40_43-6898734715438903902?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_49_11-6847764059840994843?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_57_45-2816216757517282059?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_04_54-18066481010936346961?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_11_35-4846834802410175816?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_19_16-2701666470055121051?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_24_09-603169983025842355?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_33_43-14092984136798536149?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_42_27-477530953731655494?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_50_18-10715159727788686912?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_58_12-10741096487988235330?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_07_15-16085721153639773305?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_24_08-5612496585316950190?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_34_42-7784620560890109569?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_46_07-12099244648470976500?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_19_53_42-130874291684089095?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_03_05-1513769715047862848?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-04-03_20_10_59-12973567442623588294?project=apache-beam-testing
FAILURE: Build completed with 2 failures.
1: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'>
line: 81
* What went wrong:
Execution failed for task
':sdks:python:test-suites:direct:py2:hdfsIntegrationTest'.
> Process 'command 'sh'' finished with non-zero exit value 255
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
2: Task failed with an exception.
-----------
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/direct/py2/build.gradle'>
line: 72
* What went wrong:
Execution failed for task ':sdks:python:test-suites:direct:py2:directRunnerIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
==============================================================================
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 14m 23s
127 actionable tasks: 100 executed, 24 from cache, 3 up-to-date
Publishing build scan...
https://gradle.com/s/fmd4aqgfti6xq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure