See
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/596/display/redirect?page=changes>
Changes:
[github] [BEAM-4752] Add support for newer dill dependency (#5931)
------------------------------------------
[...truncated 1.31 MB...]
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s14"
},
"serialized_fn": "<string of 1124 bytes>",
"user_name": "assert_that/Match"
}
}
],
"type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
createTime: u'2018-07-13T18:06:48.527308Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2018-07-13_11_06_47-6503774580458991719'
location: u'us-central1'
name: u'beamapp-jenkins-0713180641-930106'
projectId: u'apache-beam-testing'
stageStates: []
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2018-07-13_11_06_47-6503774580458991719]
root: INFO: To access the Dataflow monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_06_47-6503774580458991719?project=apache-beam-testing
root: INFO: Job 2018-07-13_11_06_47-6503774580458991719 is in state JOB_STATE_RUNNING
root: INFO: 2018-07-13T18:06:47.811Z: JOB_MESSAGE_DETAILED: Autoscaling is enabled for job 2018-07-13_11_06_47-6503774580458991719. The number of workers will be between 1 and 1000.
root: INFO: 2018-07-13T18:06:47.854Z: JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job 2018-07-13_11_06_47-6503774580458991719.
root: INFO: 2018-07-13T18:06:50.209Z: JOB_MESSAGE_DETAILED: Checking required Cloud APIs are enabled.
root: INFO: 2018-07-13T18:06:50.333Z: JOB_MESSAGE_DETAILED: Checking permissions granted to controller Service Account.
root: INFO: 2018-07-13T18:06:51.176Z: JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-f.
root: INFO: 2018-07-13T18:06:51.738Z: JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-07-13T18:06:51.787Z: JOB_MESSAGE_DEBUG: Combiner lifting skipped for step assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-07-13T18:06:51.832Z: JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-07-13T18:06:51.869Z: JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-07-13T18:06:51.952Z: JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
root: INFO: 2018-07-13T18:06:51.998Z: JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-07-13T18:06:52.036Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11 for input s10.out
root: INFO: 2018-07-13T18:06:52.069Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten, into producer assert_that/Group/pair_with_1
root: INFO: 2018-07-13T18:06:52.104Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/Map(_merge_tagged_vals_under_key) into assert_that/Group/GroupByKey/GroupByWindow
root: INFO: 2018-07-13T18:06:52.144Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
root: INFO: 2018-07-13T18:06:52.185Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into assert_that/Group/Map(_merge_tagged_vals_under_key)
root: INFO: 2018-07-13T18:06:52.229Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/GroupByWindow into assert_that/Group/GroupByKey/Read
root: INFO: 2018-07-13T18:06:52.271Z: JOB_MESSAGE_DETAILED: Unzipping flatten s11-u13 for input s12-reify-value0-c11
root: INFO: 2018-07-13T18:06:52.307Z: JOB_MESSAGE_DETAILED: Fusing unzipped copy of assert_that/Group/GroupByKey/Write, through flatten s11-u13, into producer assert_that/Group/GroupByKey/Reify
root: INFO: 2018-07-13T18:06:52.346Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into assert_that/Group/pair_with_0
root: INFO: 2018-07-13T18:06:52.387Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into assert_that/Group/GroupByKey/Reify
root: INFO: 2018-07-13T18:06:52.430Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into assert_that/Create/Read
root: INFO: 2018-07-13T18:06:52.472Z: JOB_MESSAGE_DETAILED: Fusing consumer Map(<lambda at sideinputs_test.py:213>)/Map(<lambda at sideinputs_test.py:213>) into main input/Read
root: INFO: 2018-07-13T18:06:52.512Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into Map(<lambda at sideinputs_test.py:213>)/Map(<lambda at sideinputs_test.py:213>)
root: INFO: 2018-07-13T18:06:52.549Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into assert_that/ToVoidKey
root: INFO: 2018-07-13T18:06:52.592Z: JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into assert_that/WindowInto(WindowIntoFn)
root: INFO: 2018-07-13T18:06:52.631Z: JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
root: INFO: 2018-07-13T18:06:52.676Z: JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-07-13T18:06:52.712Z: JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
root: INFO: 2018-07-13T18:06:52.757Z: JOB_MESSAGE_DEBUG: Assigning stage ids.
root: INFO: 2018-07-13T18:06:52.931Z: JOB_MESSAGE_DEBUG: Executing wait step start21
root: INFO: 2018-07-13T18:06:53.012Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
root: INFO: 2018-07-13T18:06:53.048Z: JOB_MESSAGE_BASIC: Executing operation side list/Read
root: INFO: 2018-07-13T18:06:53.071Z: JOB_MESSAGE_DEBUG: Starting worker pool setup.
root: INFO: 2018-07-13T18:06:53.117Z: JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-f...
root: INFO: 2018-07-13T18:06:53.141Z: JOB_MESSAGE_DEBUG: Value "side list/Read.out" materialized.
root: INFO: 2018-07-13T18:06:53.203Z: JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
root: INFO: 2018-07-13T18:06:53.237Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:213>)/_UnpickledSideInput(Read.out.0)
root: INFO: 2018-07-13T18:06:53.271Z: JOB_MESSAGE_BASIC: Executing operation Map(<lambda at sideinputs_test.py:213>)/_UnpickledSideInput(Read.out.1)
root: INFO: 2018-07-13T18:06:53.316Z: JOB_MESSAGE_BASIC: Executing operation assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-07-13T18:06:53.361Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:213>)/_UnpickledSideInput(Read.out.0).output" materialized.
root: INFO: 2018-07-13T18:06:53.397Z: JOB_MESSAGE_DEBUG: Value "Map(<lambda at sideinputs_test.py:213>)/_UnpickledSideInput(Read.out.1).output" materialized.
root: INFO: 2018-07-13T18:06:53.478Z: JOB_MESSAGE_BASIC: Executing operation main input/Read+Map(<lambda at sideinputs_test.py:213>)/Map(<lambda at sideinputs_test.py:213>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
root: INFO: 2018-07-13T18:07:03.133Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-07-13T18:07:34.533Z: JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-07-13T18:07:34.576Z: JOB_MESSAGE_DETAILED: Autoscaling: Would further reduce the number of workers but reached the minimum number allowed for the job.
root: INFO: 2018-07-13T18:08:46.323Z: JOB_MESSAGE_DETAILED: Workers have started successfully.
root: INFO: 2018-07-13T18:11:26.770Z: JOB_MESSAGE_ERROR: Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 642, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 156, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 344, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 345, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 350, in apache_beam.runners.worker.operations.DoOperation.start
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 238, in loads
    return dill.loads(s)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 277, in loads
    return load(file)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 266, in load
    obj = pik.load()
  File "/usr/lib/python2.7/pickle.py", line 864, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1096, in load_global
    klass = self.find_class(module, name)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 423, in find_class
    return StockUnpickler.find_class(self, module, name)
  File "/usr/lib/python2.7/pickle.py", line 1130, in find_class
    __import__(module)
ImportError: No module named _dill
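(Editor's note, not part of the captured log.) The likely mechanism behind the ImportError above: a pickle payload stores the defining module of every function it references by name, so when the submission side serializes with a newer dill whose internals live in a differently named module, a worker running an older dill cannot import that name at load time. The sketch below reproduces the shape of the failure with a stand-in module called `new_internals` (a hypothetical name, not real dill code); whether this exactly matches the dill versions involved in this build is an assumption.

```python
import pickle
import sys
import types

# "new_internals" stands in for a renamed serializer-internal module
# (hypothetical name). Register it so pickling by reference succeeds.
mod = types.ModuleType("new_internals")

def _create_function():
    return "reconstructed"

_create_function.__module__ = "new_internals"
mod._create_function = _create_function
sys.modules["new_internals"] = mod

# The payload embeds the module name, not the function body.
payload = pickle.dumps(_create_function)
assert b"new_internals" in payload

# A process without that module (the worker with the older library)
# fails at unpickling time, just like the traceback above.
del sys.modules["new_internals"]
err = None
try:
    pickle.loads(payload)
except ImportError as e:  # ModuleNotFoundError subclasses ImportError
    err = e
print("unpickle failed:", err)
```

This is why SDK/worker dependency versions must match for pickled user code: the fix tracked in BEAM-4752 was to make the SDK tolerate the newer dill layout.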
[... same ImportError traceback repeated for the retries at 2018-07-13T18:11:29.950Z, 18:11:33.161Z, and 18:11:36.347Z ...]
root: INFO: 2018-07-13T18:11:36.397Z: JOB_MESSAGE_DEBUG: Executing failure step failure20
root: INFO: 2018-07-13T18:11:36.444Z: JOB_MESSAGE_ERROR: Workflow failed. Causes: S06:main input/Read+Map(<lambda at sideinputs_test.py:213>)/Map(<lambda at sideinputs_test.py:213>)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write failed., A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on:
  beamapp-jenkins-071318064-07131106-eubu-harness-030m,
  beamapp-jenkins-071318064-07131106-eubu-harness-030m,
  beamapp-jenkins-071318064-07131106-eubu-harness-030m,
  beamapp-jenkins-071318064-07131106-eubu-harness-030m
root: INFO: 2018-07-13T18:11:36.621Z: JOB_MESSAGE_DETAILED: Cleaning up.
root: INFO: 2018-07-13T18:11:36.678Z: JOB_MESSAGE_DEBUG: Starting worker pool teardown.
root: INFO: 2018-07-13T18:11:36.722Z: JOB_MESSAGE_BASIC: Stopping worker pool...
root: INFO: 2018-07-13T18:13:14.536Z: JOB_MESSAGE_DETAILED: Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-07-13T18:13:14.587Z: JOB_MESSAGE_BASIC: Worker pool stopped.
root: INFO: 2018-07-13T18:13:14.619Z: JOB_MESSAGE_DEBUG: Tearing down pending resources...
root: INFO: Job 2018-07-13_11_06_47-6503774580458991719 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML:
<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 16 tests in 763.905s
FAILED (errors=15)
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_01_04-8861744825699680641?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_07_57-16361049430650504259?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_01_05-9924871110631356788?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_07_08-3767373755959007140?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_01_05-3715755387236352179?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_07_08-12883390187929332919?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_01_04-15291842371424654171?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_06_58-6497891430942434023?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_01_04-11024953510760491991?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_06_37-6721000336552321767?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_01_04-8375333924129771047?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_07_22-4157069115947863857?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_01_04-8345425708502502648?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_06_47-6503774580458991719?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_01_05-7292488924303189325?project=apache-beam-testing.
Found: https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-07-13_11_06_43-5067674007401283983?project=apache-beam-testing.
> Task :beam-sdks-python:validatesRunnerTests FAILED
:beam-sdks-python:validatesRunnerTests (Thread[Task worker for ':',5,main]) completed. Took 12 mins 45.718 secs.
FAILURE: Build failed with an exception.
* Where:
Build file '<https://builds.apache.org/job/beam_PostCommit_Py_VR_Dataflow/ws/src/sdks/python/build.gradle>' line: 237
* What went wrong:
Execution failed for task ':beam-sdks-python:validatesRunnerTests'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --debug option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with Gradle 5.0.
See https://docs.gradle.org/4.8/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 13m 14s
3 actionable tasks: 3 executed
Publishing build scan...
https://gradle.com/s/dodzjwzhubkvm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure