See
<https://builds.apache.org/job/beam_PostCommit_Python2/1818/display/redirect?page=changes>
Changes:
[github] [BEAM-8458] Add option to set temp dataset in BigQueryIO.Read (#9852)
------------------------------------------
[...truncated 10.45 MB...]
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "assert_that/Match.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "None",
"step_name": "s22"
},
"serialized_fn": "<string of 1736 bytes>",
"user_name": "assert_that/Match"
}
}
],
"type": "JOB_TYPE_BATCH"
}
INFO:apache_beam.runners.dataflow.internal.apiclient:Create job: <Job
createTime: u'2020-02-26T21:19:13.686958Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2020-02-26_13_19_07-1549784008491091172'
location: u'us-central1'
name: u'beamapp-jenkins-0226205158-993117'
projectId: u'apache-beam-testing'
stageStates: []
startTime: u'2020-02-26T21:19:13.686958Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
INFO:apache_beam.runners.dataflow.internal.apiclient:Created job with id:
[2020-02-26_13_19_07-1549784008491091172]
INFO:apache_beam.runners.dataflow.internal.apiclient:To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_19_07-1549784008491091172?project=apache-beam-testing
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-02-26_13_19_07-1549784008491091172 is in state JOB_STATE_RUNNING
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:07.508Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2020-02-26_13_19_07-1549784008491091172. The number of workers will be between
1 and 1000.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:07.508Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2020-02-26_13_19_07-1549784008491091172.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:16.122Z:
JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service
Account.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:17.326Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:17.991Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.030Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.182Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step read from
datastore/GroupByKey: GroupByKey not followed by a combiner.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.222Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.257Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.376Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.427Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.459Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/SplitQuery into read
from datastore/UserQuery/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.493Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/Reify into
read from datastore/SplitQuery
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.521Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/GroupByKey/Write into
read from datastore/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.551Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from
datastore/GroupByKey/GroupByWindow into read from datastore/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.579Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Values into read from
datastore/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.614Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Flatten into read
from datastore/Values
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.654Z:
JOB_MESSAGE_DETAILED: Fusing consumer read from datastore/Read into read from
datastore/Flatten
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.687Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/KeyWithVoid into read from
datastore/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.721Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
into Globally/CombineGlobally(CountCombineFn)/KeyWithVoid
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.755Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.791Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.817Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.851Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.884Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/UnKey into
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.919Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s19 for input s17.None
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.958Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten,
into producer assert_that/Group/pair_with_0
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:18.992Z:
JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/GroupByKey/GroupByWindow into
assert_that/Group/GroupByKey/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.019Z:
JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/Map(_merge_tagged_vals_under_key) into
assert_that/Group/GroupByKey/GroupByWindow
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.052Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into
assert_that/Group/Map(_merge_tagged_vals_under_key)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.089Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.127Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s19-u40 for input s20-reify-value9-c38
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.162Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/GroupByKey/Write, through flatten
assert_that/Group/Flatten/Unzipped-1, into producer
assert_that/Group/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.196Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into
assert_that/Create/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.229Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into
assert_that/Group/pair_with_1
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.254Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into
assert_that/Group/GroupByKey/Reify
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.292Z:
JOB_MESSAGE_DETAILED: Fusing consumer
Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault into
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.328Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into
Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.366Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into
assert_that/WindowInto(WindowIntoFn)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.399Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into
assert_that/ToVoidKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.444Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.473Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.505Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.537Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.680Z:
JOB_MESSAGE_DEBUG: Executing wait step start51
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.755Z:
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.776Z:
JOB_MESSAGE_BASIC: Executing operation read from datastore/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.802Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.811Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.837Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.880Z:
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.893Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.893Z:
JOB_MESSAGE_BASIC: Finished operation read from datastore/GroupByKey/Create
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.949Z:
JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:19.978Z:
JOB_MESSAGE_DEBUG: Value
"Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Session"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:20.004Z:
JOB_MESSAGE_DEBUG: Value "read from datastore/GroupByKey/Session" materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:20.043Z:
JOB_MESSAGE_BASIC: Executing operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:20.076Z:
JOB_MESSAGE_BASIC: Executing operation read from datastore/UserQuery/Read+read
from datastore/SplitQuery+read from datastore/GroupByKey/Reify+read from
datastore/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:43.730Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on
the rate of progress in the currently running step(s).
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:19:46.062Z:
JOB_MESSAGE_WARNING: Your project already contains 100 Dataflow-created metric
descriptors and Stackdriver will not create new Dataflow custom metrics for
this job. Each unique user-defined metric name (independent of the DoFn in
which it is defined) produces a new metric descriptor. To delete old / unused
metric descriptors see
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:20:51.152Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:20:51.198Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:28.629Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:32.235Z:
JOB_MESSAGE_BASIC: Finished operation read from datastore/UserQuery/Read+read
from datastore/SplitQuery+read from datastore/GroupByKey/Reify+read from
datastore/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:32.306Z:
JOB_MESSAGE_BASIC: Executing operation read from datastore/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:32.358Z:
JOB_MESSAGE_BASIC: Finished operation read from datastore/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:32.424Z:
JOB_MESSAGE_BASIC: Executing operation read from datastore/GroupByKey/Read+read
from datastore/GroupByKey/GroupByWindow+read from datastore/Values+read from
datastore/Flatten+read from
datastore/Read+Globally/CombineGlobally(CountCombineFn)/KeyWithVoid+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:41.978Z:
JOB_MESSAGE_BASIC: Finished operation read from datastore/GroupByKey/Read+read
from datastore/GroupByKey/GroupByWindow+read from datastore/Values+read from
datastore/Flatten+read from
datastore/Read+Globally/CombineGlobally(CountCombineFn)/KeyWithVoid+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Partial+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Reify+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:42.049Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:42.101Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:42.171Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract+Globally/CombineGlobally(CountCombineFn)/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:47.246Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/CombinePerKey/GroupByKey/Read+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine+Globally/CombineGlobally(CountCombineFn)/CombinePerKey/Combine/Extract+Globally/CombineGlobally(CountCombineFn)/UnKey
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:47.315Z:
JOB_MESSAGE_DEBUG: Value "Globally/CombineGlobally(CountCombineFn)/UnKey.out"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:47.393Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:47.443Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0)
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:47.528Z:
JOB_MESSAGE_DEBUG: Value
"Globally/CombineGlobally(CountCombineFn)/InjectDefault/_UnpickledSideInput(UnKey.out.0).output"
materialized.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:47.601Z:
JOB_MESSAGE_BASIC: Executing operation
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read+Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:52.167Z:
JOB_MESSAGE_BASIC: Finished operation
Globally/CombineGlobally(CountCombineFn)/DoOnce/Read+Globally/CombineGlobally(CountCombineFn)/InjectDefault/InjectDefault+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:52.248Z:
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:52.311Z:
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Close
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:23:52.377Z:
JOB_MESSAGE_BASIC: Executing operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:24:01.653Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Group/GroupByKey/Read+assert_that/Group/GroupByKey/GroupByWindow+assert_that/Group/Map(_merge_tagged_vals_under_key)+assert_that/Unkey+assert_that/Match
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:24:01.742Z:
JOB_MESSAGE_DEBUG: Executing success step success49
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:24:01.852Z:
JOB_MESSAGE_DETAILED: Cleaning up.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:24:01.900Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:24:01.933Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:25:21.046Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:25:21.092Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
INFO:apache_beam.runners.dataflow.dataflow_runner:2020-02-26T21:25:21.118Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
INFO:apache_beam.runners.dataflow.dataflow_runner:Job
2020-02-26_13_19_07-1549784008491091172 is in state JOB_STATE_DONE
test_datastore_write_limit
(apache_beam.io.gcp.datastore_write_it_test.DatastoreWriteIT) ... ok
======================================================================
ERROR: test_streaming_wordcount_it
(apache_beam.examples.streaming_wordcount_it_test.StreamingWordCountIT)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/apache_beam/examples/streaming_wordcount_it_test.py>", line 69, in setUp
    ack_deadline_seconds=60)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/pubsub_v1/_gapic.py>", line 40, in <lambda>
    fx = lambda self, *a, **kw: wrapped_fx(self.api, *a, **kw)  # noqa
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/cloud/pubsub_v1/gapic/subscriber_client.py>", line 414, in create_subscription
    request, retry=retry, timeout=timeout, metadata=metadata
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/gapic_v1/method.py>", line 143, in __call__
    return wrapped_func(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/retry.py>", line 286, in retry_wrapped_func
    on_error=on_error,
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/retry.py>", line 184, in retry_target
    return target()
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/timeout.py>", line 214, in func_with_timeout
    return func(*args, **kwargs)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/google/api_core/grpc_helpers.py>", line 59, in error_remapped_callable
    six.raise_from(exceptions.from_grpc_error(exc), exc)
  File "<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/build/gradleenv/-194514014/local/lib/python2.7/site-packages/six.py>", line 738, in raise_from
    raise value
DeadlineExceeded: 504 Deadline Exceeded
-------------------- >> begin captured logging << --------------------
apache_beam.options.pipeline_options: WARNING: --region not set; will default
to us-central1. Future releases of Beam will require the user to set --region
explicitly, or else have a default set via the gcloud tool.
https://cloud.google.com/compute/docs/regions-zones
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 192
google.auth.transport._http_client: DEBUG: Making request: GET
http://169.254.169.254
google.auth.transport._http_client: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/project/project-id
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/default/?recursive=true
urllib3.connectionpool: DEBUG: Starting new HTTP connection (1):
metadata.google.internal:80
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/default/?recursive=true HTTP/1.1"
200 144
google.auth.transport.requests: DEBUG: Making request: GET
http://metadata.google.internal/computeMetadata/v1/instance/service-accounts/[email protected]/token
urllib3.connectionpool: DEBUG: http://metadata.google.internal:80 "GET
/computeMetadata/v1/instance/service-accounts/[email protected]/token
HTTP/1.1" 200 192
--------------------- >> end captured logging << ---------------------
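For context on the failure above: the `DeadlineExceeded: 504 Deadline Exceeded` is raised by google.api_core's retry wrapper once its overall deadline elapses while `create_subscription` keeps failing. The following is a minimal pure-Python sketch of that retry-with-deadline pattern; `retry_with_deadline`, its parameters, and the local `DeadlineExceeded` class are illustrative stand-ins for this log, not google.api_core's actual implementation.

```python
import time


class DeadlineExceeded(Exception):
    """Stand-in for the 504 error raised when the overall retry deadline elapses."""


def retry_with_deadline(target, deadline=10.0, initial=0.1, multiplier=2.0,
                        maximum=2.0, sleep=time.sleep, clock=time.monotonic):
    """Call `target` with exponential backoff until it succeeds or `deadline` seconds pass."""
    start = clock()
    delay = initial
    while True:
        try:
            return target()
        except Exception as exc:
            # Give up once the total elapsed time exceeds the deadline.
            if clock() - start >= deadline:
                raise DeadlineExceeded("504 Deadline Exceeded (last error: %r)" % exc)
            sleep(min(delay, maximum))
            delay *= multiplier


# Example: a flaky call that succeeds on the third attempt.
attempts = []

def flaky():
    attempts.append(1)
    if len(attempts) < 3:
        raise IOError("transient")
    return "subscription-created"

result = retry_with_deadline(flaky, deadline=5.0, sleep=lambda s: None)
```

In the test above, the RPC never became healthy within the retry deadline, so the wrapper surfaced the last gRPC error as a 504 out of `setUp`.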
----------------------------------------------------------------------
XML: nosetests-postCommitIT-df.xml
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 52 tests in 3546.457s
FAILED (SKIP=7, errors=1)
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_26_56-12352090299307970186?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_40_42-14905908562876185945?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_47_24-6355524218843426789?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_54_41-4227693402216256755?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_02_17-9412547350388823088?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_26_45-7662639406905958347?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_45_58-17271680656411414174?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_26_48-7779107861484340030?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_38_43-2610397512748666862?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_47_07-1012339759002597069?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_55_08-1096020300890936799?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_02_38-16415978887944049088?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_26_45-4590946242897188153?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_42_43-2862192647408841202?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_49_30-103188872736448475?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_55_51-5631692380663201355?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_02_38-5762508723537799046?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_26_45-8323035113532744490?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_34_29-29124323360412366?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_41_12-4002126965435556787?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_48_30-12201569391163669166?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_55_20-4134308367919320906?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_01_41-2545382258642685360?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_26_45-1332491042007729046?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_34_21-5808602080366161783?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_41_28-5466015946567961570?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_48_21-4593636518920520288?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_55_03-11305818875193244370?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_02_00-15268302774302028370?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_26_48-10822939083837645674?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_33_59-5154226098665411724?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_41_34-11905498279877925883?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_48_23-15567665250218443117?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_55_15-6753494903526407732?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_01_50-17835827606658048059?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_26_45-18174294260674802170?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_34_59-9810680128899152175?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_44_58-15253721319349130064?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_52_15-14414914258930388208?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_12_58_35-17969375982690412647?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_05_47-10210956313231355475?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_12_59-1513897537324023315?project=apache-beam-testing
Worker logs:
https://console.cloud.google.com/dataflow/jobs/us-central1/2020-02-26_13_19_07-1549784008491091172?project=apache-beam-testing
> Task :sdks:python:test-suites:dataflow:py2:postCommitIT FAILED
FAILURE: Build failed with an exception.
* Where:
Build file
'<https://builds.apache.org/job/beam_PostCommit_Python2/ws/src/sdks/python/test-suites/dataflow/py2/build.gradle>'
line: 85
* What went wrong:
Execution failed for task ':sdks:python:test-suites:dataflow:py2:postCommitIT'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 6.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/5.2.1/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 1h 0m 25s
124 actionable tasks: 100 executed, 21 from cache, 3 up-to-date
Publishing build scan...
https://gradle.com/s/ezhagfk5avkxq
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]