See
<https://builds.apache.org/job/beam_PostCommit_Python_ValidatesRunner_Dataflow/1198/display/redirect>
------------------------------------------
[...truncated 979.90 KB...]
"component_encodings": [
{
"@type": "kind:pair",
"component_encodings": [
{
"@type": "kind:bytes"
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
}
],
"is_pair_like": true
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "concatenate/MapToVoidKey1.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s3"
},
"serialized_fn": "<string of 968 bytes>",
"user_name": "concatenate/MapToVoidKey1"
}
}
],
"type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
createTime: u'2018-03-27T21:12:43.344355Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2018-03-27_14_12_42-8666691247344616596'
location: u'us-central1'
name: u'beamapp-jenkins-0327211232-279365'
projectId: u'apache-beam-testing'
stageStates: []
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-03-27_14_12_42-8666691247344616596]
root: INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_12_42-8666691247344616596?project=apache-beam-testing
root: INFO: Job 2018-03-27_14_12_42-8666691247344616596 is in state
JOB_STATE_PENDING
root: INFO: 2018-03-27T21:12:42.408Z: JOB_MESSAGE_WARNING: Job
2018-03-27_14_12_42-8666691247344616596 might autoscale up to 1000 workers.
root: INFO: 2018-03-27T21:12:42.442Z: JOB_MESSAGE_DETAILED: Autoscaling is
enabled for job 2018-03-27_14_12_42-8666691247344616596. The number of workers
will be between 1 and 1000.
root: INFO: 2018-03-27T21:12:42.479Z: JOB_MESSAGE_DETAILED: Autoscaling was
automatically enabled for job 2018-03-27_14_12_42-8666691247344616596.
root: INFO: 2018-03-27T21:12:45.703Z: JOB_MESSAGE_DETAILED: Checking required
Cloud APIs are enabled.
root: INFO: 2018-03-27T21:12:45.823Z: JOB_MESSAGE_DETAILED: Checking
permissions granted to controller Service Account.
root: INFO: 2018-03-27T21:12:46.601Z: JOB_MESSAGE_DETAILED: Expanding
CoGroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T21:12:46.645Z: JOB_MESSAGE_DEBUG: Combiner lifting
skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a
combiner.
root: INFO: 2018-03-27T21:12:46.685Z: JOB_MESSAGE_DETAILED: Expanding
GroupByKey operations into optimizable parts.
root: INFO: 2018-03-27T21:12:46.777Z: JOB_MESSAGE_DETAILED: Lifting
ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-03-27T21:12:46.811Z: JOB_MESSAGE_DEBUG: Annotating graph with
Autotuner information.
root: INFO: 2018-03-27T21:12:46.901Z: JOB_MESSAGE_DETAILED: Fusing adjacent
ParDo, Read, Write, and Flatten operations
root: INFO: 2018-03-27T21:12:46.925Z: JOB_MESSAGE_DETAILED: Fusing consumer
concatenate/MapToVoidKey1 into side pairs/Read
root: INFO: 2018-03-27T21:12:46.960Z: JOB_MESSAGE_DETAILED: Fusing consumer
concatenate/MapToVoidKey1 into side pairs/Read
root: INFO: 2018-03-27T21:12:47.037Z: JOB_MESSAGE_DETAILED: Fusing consumer
concatenate/MapToVoidKey0 into side list/Read
root: INFO: 2018-03-27T21:12:47.095Z: JOB_MESSAGE_DETAILED: Fusing consumer
concatenate/MapToVoidKey0 into side list/Read
root: INFO: 2018-03-27T21:12:47.190Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/GroupByKey/GroupByWindow into
assert_that/Group/GroupByKey/Read
root: INFO: 2018-03-27T21:12:47.225Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/Map(_merge_tagged_vals_under_key) into
assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-03-27T21:12:47.304Z: JOB_MESSAGE_DETAILED: Unzipping flatten
s14 for input s12.out
root: INFO: 2018-03-27T21:12:47.343Z: JOB_MESSAGE_DETAILED: Fusing unzipped
copy of assert_that/Group/GroupByKey/Reify, through flatten
assert_that/Group/Flatten, into producer assert_that/Group/pair_with_0
root: INFO: 2018-03-27T21:12:47.411Z: JOB_MESSAGE_DETAILED: Unzipping flatten
s14-u13 for input s15-reify-value0-c11
root: INFO: 2018-03-27T21:12:47.445Z: JOB_MESSAGE_DETAILED: Fusing unzipped
copy of assert_that/Group/GroupByKey/Write, through flatten s14-u13, into
producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T21:12:47.484Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-03-27T21:12:47.568Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Match into assert_that/Unkey
root: INFO: 2018-03-27T21:12:47.608Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-03-27T21:12:47.693Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_1
root: INFO: 2018-03-27T21:12:47.766Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-03-27T21:12:47.794Z: JOB_MESSAGE_DETAILED: Fusing consumer
concatenate/concatenate into main input/Read
root: INFO: 2018-03-27T21:12:47.833Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/WindowInto(WindowIntoFn) into concatenate/concatenate
root: INFO: 2018-03-27T21:12:47.904Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-03-27T21:12:47.935Z: JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
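(The "Fusing consumer X into Y" messages above are the Dataflow optimizer collapsing adjacent per-element steps into a single stage, so elements flow straight from producer to consumer without materializing the intermediate PCollection. A toy sketch of the idea — illustrative only, not the service's actual implementation; all names below are made up:)

```python
def fuse(*stages):
    """Compose per-element stage functions into one fused stage.

    A fused stage processes each element end-to-end in a single pass,
    with no intermediate collection written between the steps.
    """
    def fused(element):
        for stage in stages:
            element = stage(element)
        return element
    return fused


# Illustrative stand-ins for a fused pair such as
# "Fusing consumer concatenate/concatenate into main input/Read":
read = lambda x: x                  # source emits the element unchanged
concatenate = lambda x: x + "-out"  # per-element transform

fused_stage = fuse(read, concatenate)
print(fused_stage("elem"))  # -> elem-out
```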
root: INFO: 2018-03-27T21:12:48.016Z: JOB_MESSAGE_DEBUG: Workflow config is
missing a default resource spec.
root: INFO: 2018-03-27T21:12:48.047Z: JOB_MESSAGE_DEBUG: Adding StepResource
setup and teardown to workflow graph.
root: INFO: 2018-03-27T21:12:48.076Z: JOB_MESSAGE_DEBUG: Adding workflow start
and stop steps.
root: INFO: 2018-03-27T21:12:48.150Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-03-27T21:12:48.363Z: JOB_MESSAGE_DEBUG: Executing wait step
start23
root: INFO: 2018-03-27T21:12:48.436Z: JOB_MESSAGE_BASIC: Executing operation
side pairs/Read+concatenate/MapToVoidKey1+concatenate/MapToVoidKey1
root: INFO: 2018-03-27T21:12:48.523Z: JOB_MESSAGE_BASIC: Executing operation
side list/Read+concatenate/MapToVoidKey0+concatenate/MapToVoidKey0
root: INFO: 2018-03-27T21:12:48.536Z: JOB_MESSAGE_DEBUG: Starting worker pool
setup.
root: INFO: 2018-03-27T21:12:48.559Z: JOB_MESSAGE_BASIC: Executing operation
assert_that/Group/GroupByKey/Create
root: INFO: 2018-03-27T21:12:48.562Z: JOB_MESSAGE_BASIC: Starting 1 workers in
us-central1-f...
root: INFO: 2018-03-27T21:12:48.745Z: JOB_MESSAGE_DEBUG: Value
"assert_that/Group/GroupByKey/Session" materialized.
root: INFO: Job 2018-03-27_14_12_42-8666691247344616596 is in state
JOB_STATE_RUNNING
root: INFO: 2018-03-27T21:12:48.806Z: JOB_MESSAGE_BASIC: Executing operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T21:12:58.494Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised
the number of workers to 0 based on the rate of progress in the currently
running step(s).
root: INFO: 2018-03-27T21:13:14.356Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised
the number of workers to 1 based on the rate of progress in the currently
running step(s).
root: INFO: 2018-03-27T21:15:21.834Z: JOB_MESSAGE_DETAILED: Workers have
started successfully.
root: INFO: 2018-03-27T21:18:38.828Z: JOB_MESSAGE_DEBUG: Value
"concatenate/MapToVoidKey1.out" materialized.
root: INFO: 2018-03-27T21:18:38.911Z: JOB_MESSAGE_BASIC: Executing operation
concatenate/_DataflowIterableSideInput(MapToVoidKey1.out.0)
root: INFO: 2018-03-27T21:18:39.035Z: JOB_MESSAGE_DEBUG: Value
"concatenate/_DataflowIterableSideInput(MapToVoidKey1.out.0).output"
materialized.
root: INFO: 2018-03-27T21:18:57.854Z: JOB_MESSAGE_DEBUG: Value
"concatenate/MapToVoidKey0.out" materialized.
root: INFO: 2018-03-27T21:18:57.930Z: JOB_MESSAGE_BASIC: Executing operation
concatenate/_DataflowIterableSideInput(MapToVoidKey0.out.0)
root: INFO: 2018-03-27T21:18:58.046Z: JOB_MESSAGE_DEBUG: Value
"concatenate/_DataflowIterableSideInput(MapToVoidKey0.out.0).output"
materialized.
root: INFO: 2018-03-27T21:18:58.123Z: JOB_MESSAGE_BASIC: Executing operation
main input/Read+concatenate/concatenate+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-03-27T21:19:02.856Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-27T21:19:06.362Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-27T21:19:08.751Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-27T21:19:09.173Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 339, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 372, in apache_beam.runners.worker.operations.DoOperation.start
    self.dofn_runner = common.DoFnRunner(
  File "apache_beam/runners/common.py", line 483, in apache_beam.runners.common.DoFnRunner.__init__
    self.do_fn_invoker = DoFnInvoker.create_invoker(
  File "apache_beam/runners/common.py", line 203, in apache_beam.runners.common.DoFnInvoker.create_invoker
    return PerWindowInvoker(
  File "apache_beam/runners/common.py", line 313, in apache_beam.runners.common.PerWindowInvoker.__init__
    input_args, input_kwargs, [si[global_window] for si in side_inputs])
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/transforms/sideinputs.py", line 62, in __getitem__
    self._cache[window] = self._view_class._from_runtime_iterable(
AttributeError: type object '_DataflowIterableSideInput' has no attribute '_from_runtime_iterable'
root: INFO: 2018-03-27T21:19:09.223Z: JOB_MESSAGE_DEBUG: Executing failure step
failure22
root: INFO: 2018-03-27T21:19:09.257Z: JOB_MESSAGE_ERROR: Workflow failed.
Causes: S07:main input/Read+concatenate/concatenate+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
failed., A work item was attempted 4 times without success. Each time the
worker eventually lost contact with the service. The work item was attempted
on:
beamapp-jenkins-032721123-03271412-8153-harness-58mk,
beamapp-jenkins-032721123-03271412-8153-harness-58mk,
beamapp-jenkins-032721123-03271412-8153-harness-58mk,
beamapp-jenkins-032721123-03271412-8153-harness-58mk
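(The "attempted 4 times without success" failure above reflects Dataflow's bounded retry policy for batch work items: each item gets a fixed attempt budget, and exhausting it fails the whole stage. A generic sketch of that behavior — the function name and the constant 4 are assumptions taken from the message, not the service's code:)

```python
MAX_ATTEMPTS = 4  # batch Dataflow gives each work item four attempts


def run_work_item(execute, max_attempts=MAX_ATTEMPTS):
    """Retry a work item until it succeeds or the attempt budget is spent."""
    errors = []
    for attempt in range(1, max_attempts + 1):
        try:
            return execute()
        except Exception as e:  # a lost worker surfaces as a failed attempt
            errors.append("attempt %d: %s" % (attempt, e))
    raise RuntimeError(
        "A work item was attempted %d times without success: %s"
        % (max_attempts, "; ".join(errors)))


print(run_work_item(lambda: "ok"))  # -> ok
```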
root: INFO: 2018-03-27T21:19:09.382Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-03-27T21:19:09.435Z: JOB_MESSAGE_DEBUG: Starting worker pool
teardown.
root: INFO: 2018-03-27T21:19:09.471Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-03-27T21:20:47.349Z: JOB_MESSAGE_DETAILED: Autoscaling:
Reduced the number of workers to 0 based on the rate of progress in the
currently running step(s).
root: INFO: 2018-03-27T21:20:47.445Z: JOB_MESSAGE_DEBUG: Tearing down pending
resources...
root: INFO: Job 2018-03-27_14_12_42-8666691247344616596 is in state
JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
Ran 16 tests in 929.102s
FAILED (errors=12)
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-6766723166623993968?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_13_31-10929177711809778955?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_15_16-16354086526836046220?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-1201976823224416842?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_07_27-7271190073261436645?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_09_20-12600433319759080049?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_10_50-798850488600387760?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_12_42-8666691247344616596?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-1765600968155479915?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_13_26-10060460114915233142?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_14_51-11698478401267540067?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_16_36-2122733496028554464?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_05_50-16039646830469645721?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_13_00-6879600078595900695?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_14_46-15759488323730311834?project=apache-beam-testing.
Found:
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-03-27_14_16_22-5100222963713141427?project=apache-beam-testing.
Build step 'Execute shell' marked build as failure
Not sending mail to unregistered user [email protected]