See <https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1153/display/redirect?page=changes>
Changes:
[herohde] Add Go support for universal runners, incl Flink
[herohde] CR: Fixed comments for job service helper functions
[herohde] [BEAM-3893] Add fallback to unauthenticated access for GCS IO
[robertwb] [BEAM-2927] Python support for dataflow portable side inputs over Fn API
[herohde] CR: fix typo
[aaltay] [BEAM-3861] Improve test infra in Python SDK for streaming end-to-end
------------------------------------------
[...truncated 774.29 KB...]
        "serialized_fn": "<string of 1160 bytes>",
        "user_name": "assert_that/Match"
      }
    },
    {
      "kind": "ParallelDo",
      "name": "s16",
      "properties": {
        "display_data": [
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
            "type": "STRING",
            "value": "<lambda>"
          },
          {
            "key": "fn",
            "label": "Transform Function",
            "namespace": "apache_beam.transforms.core.ParDo",
            "shortValue": "CallableWrapperDoFn",
            "type": "STRING",
            "value": "apache_beam.transforms.core.CallableWrapperDoFn"
          }
        ],
        "non_parallel_inputs": {},
        "output_info": [
          {
            "encoding": {
              "@type": "kind:windowed_value",
              "component_encodings": [
                {
                  "@type": "kind:pair",
                  "component_encodings": [
                    {
                      "@type": "kind:bytes"
                    },
                    {
                      "@type": "VarIntCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxhiUWeeSXOIA5XIYNmYyFjbSFTkh4A89cR+g==",
                      "component_encodings": []
                    }
                  ],
                  "is_pair_like": true
                },
                {
                  "@type": "kind:global_window"
                }
              ],
              "is_wrapper": true
            },
            "output_name": "out",
            "user_name": "compute/MapToVoidKey0.out"
          }
        ],
        "parallel_input": {
          "@type": "OutputReference",
          "output_name": "out",
          "step_name": "s2"
        },
        "serialized_fn": "<string of 968 bytes>",
        "user_name": "compute/MapToVoidKey0"
      }
    }
  ],
  "type": "JOB_TYPE_BATCH"
}
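The "encoding" block above nests Dataflow coder specs: a kind:windowed_value wrapping a kind:pair of kind:bytes and a serialized VarIntCoder, windowed by kind:global_window. A stdlib-only sketch that renders such a spec as an indented tree — the spec literal below is a simplified stand-in, with "kind:varint" in place of the base64 coder payload from the job graph:

```python
def coder_tree(spec, depth=0):
    """Flatten a nested coder spec dict into indented 'kind' lines."""
    lines = ["  " * depth + spec.get("@type", "?")]
    for child in spec.get("component_encodings", []):
        lines.extend(coder_tree(child, depth + 1))
    return lines

# Simplified stand-in for the output_info encoding above; "kind:varint"
# replaces the serialized VarIntCoder payload for readability.
spec = {
    "@type": "kind:windowed_value",
    "component_encodings": [
        {
            "@type": "kind:pair",
            "component_encodings": [
                {"@type": "kind:bytes"},
                {"@type": "kind:varint", "component_encodings": []},
            ],
        },
        {"@type": "kind:global_window"},
    ],
}

print("\n".join(coder_tree(spec)))
```

This reading of the spec — a windowed (bytes, varint) pair in the global window — matches the key-to-void-key shape that `compute/MapToVoidKey0` produces for the side input.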
root: INFO: Create job: <Job
createTime: u'2018-03-21T04:20:30.096409Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2018-03-20_21_20_28-237846840698724677'
location: u'us-central1'
name: u'beamapp-jenkins-0321042020-611623'
projectId: u'apache-beam-testing'
stageStates: []
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-20_21_20_28-237846840698724677]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_20_28-237846840698724677?project=apache-beam-testing
root: INFO: Job 2018-03-20_21_20_28-237846840698724677 is in state JOB_STATE_PENDING
root: INFO: 2018-03-21T04:20:29.040Z: JOB_MESSAGE_WARNING: Job 2018-03-20_21_20_28-237846840698724677 might autoscale up to 250 workers.
root: INFO: 2018-03-21T04:20:29.049Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-03-20_21_20_28-237846840698724677. The number of workers will be between 1 and 250.
root: INFO: 2018-03-21T04:20:29.066Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-03-20_21_20_28-237846840698724677.
root: INFO: 2018-03-21T04:20:31.699Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-03-21T04:20:31.863Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-03-21T04:20:32.570Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-21T04:20:32.602Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-03-21T04:20:32.623Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-03-21T04:20:32.646Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-21T04:20:32.679Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-03-21T04:20:32.711Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-21T04:20:32.736Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-03-21T04:20:32.759Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-03-21T04:20:32.774Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-21T04:20:32.797Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-21T04:20:32.819Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-21T04:20:32.850Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-21T04:20:32.885Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-03-21T04:20:32.915Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-21T04:20:32.941Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-21T04:20:32.971Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/MapToVoidKey0 into side/Read
root: INFO: 2018-03-21T04:20:32.997Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-21T04:20:33.014Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-03-21T04:20:33.040Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-21T04:20:33.061Z: JOB_MESSAGE_DETAILED: Fusing consumer compute/compute into start/Read
root: INFO: 2018-03-21T04:20:33.091Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-03-21T04:20:33.113Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into compute/compute
root: INFO: 2018-03-21T04:20:33.125Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-21T04:20:33.146Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-03-21T04:20:33.176Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-03-21T04:20:33.196Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-03-21T04:20:33.219Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-21T04:20:33.354Z: JOB_MESSAGE_DEBUG: Executing wait step start22
root: INFO: Job 2018-03-20_21_20_28-237846840698724677 is in state JOB_STATE_RUNNING
root: INFO: 2018-03-21T04:20:33.406Z: JOB_MESSAGE_BASIC: Executing operation side/Read+compute/MapToVoidKey0+compute/MapToVoidKey0
root: INFO: 2018-03-21T04:20:33.436Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-21T04:20:33.448Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-03-21T04:20:33.478Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-03-21T04:20:33.549Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-03-21T04:20:33.610Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-21T04:20:42.880Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T04:20:58.723Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T04:21:14.963Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-03-21T04:25:22.900Z: JOB_MESSAGE_DEBUG: Value "compute/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-21T04:25:22.959Z: JOB_MESSAGE_BASIC: Executing operation compute/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-21T04:25:23.064Z: JOB_MESSAGE_DEBUG: Value "compute/_DataflowIterableSideInput(MapToVoidKey0.out.0).output" materialized.
root: INFO: 2018-03-21T04:25:23.126Z: JOB_MESSAGE_BASIC: Executing operation start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-21T04:25:28.795Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 341, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 373, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-21T04:25:32.320Z: JOB_MESSAGE_ERROR: [...identical traceback omitted...]
root: INFO: 2018-03-21T04:25:35.757Z: JOB_MESSAGE_ERROR: [...identical traceback omitted...]
root: INFO: 2018-03-21T04:25:39.395Z: JOB_MESSAGE_ERROR: [...identical traceback omitted...]
root: INFO: 2018-03-21T04:25:39.432Z: JOB_MESSAGE_DEBUG: Executing failure step failure21
root: INFO: 2018-03-21T04:25:39.458Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S05:start/Read+compute/compute+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: beamapp-jenkins-032104202-03202120-e80a-harness-qkjt, beamapp-jenkins-032104202-03202120-e80a-harness-qkjt, beamapp-jenkins-032104202-03202120-e80a-harness-qkjt, beamapp-jenkins-032104202-03202120-e80a-harness-qkjt
root: INFO: 2018-03-21T04:25:39.571Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-21T04:25:39.623Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-03-21T04:25:39.648Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-21T04:27:05.639Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-03-21T04:27:05.665Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-03-21T04:27:05.695Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-03-20_21_20_28-237846840698724677 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
Ran 16 tests in 1779.193s
FAILED (errors=9)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_20_58_52-3846411807089581415?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_05_58-10374766789742304390?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_13_38-614279896730368333?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_21_12-2709281492746898301?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_20_58_53-2088183411613223643?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_06_13-17946732127191601318?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_13_18-11992113256982728437?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_20_28-237846840698724677?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_20_58_53-14719454182198321188?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_05_47-7068178831499494931?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_12_42-18221228106339468969?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_19_22-17774021534607364972?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_20_58_52-16935153965861701745?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_05_58-16193925093597832948?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_13_08-2131554022236270317?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-20_21_20_13-2995835851342444359?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user [email protected]
Not sending mail to unregistered user [email protected]