See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/2317/display/redirect?page=changes>
Changes:
[robertwb] More robust gen_protos on jenkins.
------------------------------------------
[...truncated 576.70 KB...]
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
}
]
},
"output_name": "out",
"user_name":
"write/Write/WriteImpl/FinalizeWrite/SideInput-s16.output"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s14"
},
"user_name": "write/Write/WriteImpl/FinalizeWrite/SideInput-s16"
}
},
{
"kind": "ParallelDo",
"name": "s17",
"properties": {
"display_data": [
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
"type": "STRING",
"value": "_finalize_write"
},
{
"key": "fn",
"label": "Transform Function",
"namespace": "apache_beam.transforms.core.ParDo",
"shortValue": "CallableWrapperDoFn",
"type": "STRING",
"value": "apache_beam.transforms.core.CallableWrapperDoFn"
}
],
"non_parallel_inputs": {
"SideInput-s15": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "SideInput-s15"
},
"SideInput-s16": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "SideInput-s16"
}
},
"output_info": [
{
"encoding": {
"@type": "kind:windowed_value",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": [
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
},
{
"@type":
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
"component_encodings": []
}
],
"is_pair_like": true
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "out",
"user_name": "write/Write/WriteImpl/FinalizeWrite.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "out",
"step_name": "s7"
},
"serialized_fn": "<string of 1056 bytes>",
"user_name": "write/Write/WriteImpl/FinalizeWrite/Do"
}
}
],
"type": "JOB_TYPE_BATCH"
}
root: INFO: Create job: <Job
createTime: u'2017-05-25T20:33:52.834648Z'
currentStateTime: u'1970-01-01T00:00:00Z'
id: u'2017-05-25_13_33_52-10908932325153537174'
location: u'global'
name: u'beamapp-jenkins-0525203351-247367'
projectId: u'apache-beam-testing'
stageStates: []
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2017-05-25_13_33_52-10908932325153537174]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2017-05-25_13_33_52-10908932325153537174
root: INFO: Job 2017-05-25_13_33_52-10908932325153537174 is in state JOB_STATE_RUNNING
root: INFO: 2017-05-25T20:33:52.299Z: JOB_MESSAGE_WARNING: (97645026d78d7dd2):
Setting the number of workers (1) disables autoscaling for this job. If you are
trying to cap autoscaling, consider only setting max_num_workers. If you want
to disable autoscaling altogether, the documented way is to explicitly use
autoscalingAlgorithm=NONE.
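For reference, a minimal sketch of how the settings named in the warning above are usually passed to the Beam Python SDK; the flag names follow the warning text, while the import path and the example values are assumptions, not taken from this job:

from apache_beam.options.pipeline_options import PipelineOptions

# Pinning the pool to one worker, as this job does, disables autoscaling.
fixed_pool = PipelineOptions(['--num_workers=1'])

# Capping autoscaling instead: let the service scale up to 4 workers.
capped = PipelineOptions(['--max_num_workers=4'])

# Disabling autoscaling explicitly, the documented alternative.
no_autoscaling = PipelineOptions(['--autoscaling_algorithm=NONE', '--num_workers=1'])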
root: INFO: 2017-05-25T20:33:54.443Z: JOB_MESSAGE_DETAILED: (fde36659798150df):
Checking required Cloud APIs are enabled.
root: INFO: 2017-05-25T20:33:55.459Z: JOB_MESSAGE_DEBUG: (fde366597981550e):
Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey
not followed by a combiner.
root: INFO: 2017-05-25T20:33:55.461Z: JOB_MESSAGE_DEBUG: (fde36659798158d0):
Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2017-05-25T20:33:55.464Z: JOB_MESSAGE_DETAILED: (fde3665979815c92):
Expanding GroupByKey operations into optimizable parts.
root: INFO: 2017-05-25T20:33:55.468Z: JOB_MESSAGE_DETAILED: (fde3665979815054):
Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2017-05-25T20:33:55.476Z: JOB_MESSAGE_DEBUG: (fde3665979815b9a):
Annotating graph with Autotuner information.
root: INFO: 2017-05-25T20:33:55.490Z: JOB_MESSAGE_DETAILED: (fde366597981531e):
Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2017-05-25T20:33:55.493Z: JOB_MESSAGE_DETAILED: (fde36659798156e0):
Fusing consumer split into read/Read
root: INFO: 2017-05-25T20:33:55.495Z: JOB_MESSAGE_DETAILED: (fde3665979815aa2):
Fusing consumer group/Write into group/Reify
root: INFO: 2017-05-25T20:33:55.498Z: JOB_MESSAGE_DETAILED: (fde3665979815e64):
Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2017-05-25T20:33:55.502Z: JOB_MESSAGE_DETAILED: (fde3665979815226):
Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into
write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2017-05-25T20:33:55.504Z: JOB_MESSAGE_DETAILED: (fde36659798155e8):
Fusing consumer write/Write/WriteImpl/GroupByKey/Write into
write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2017-05-25T20:33:55.509Z: JOB_MESSAGE_DETAILED: (fde3665979815d6c):
Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into
write/Write/WriteImpl/Pair
root: INFO: 2017-05-25T20:33:55.512Z: JOB_MESSAGE_DETAILED: (fde366597981512e):
Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into
write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2017-05-25T20:33:55.516Z: JOB_MESSAGE_DETAILED: (fde36659798154f0):
Fusing consumer pair_with_one into split
root: INFO: 2017-05-25T20:33:55.518Z: JOB_MESSAGE_DETAILED: (fde36659798158b2):
Fusing consumer group/Reify into pair_with_one
root: INFO: 2017-05-25T20:33:55.521Z: JOB_MESSAGE_DETAILED: (fde3665979815c74):
Fusing consumer write/Write/WriteImpl/WriteBundles/Do into format
root: INFO: 2017-05-25T20:33:55.526Z: JOB_MESSAGE_DETAILED: (fde3665979815036):
Fusing consumer write/Write/WriteImpl/Pair into
write/Write/WriteImpl/WriteBundles/Do
root: INFO: 2017-05-25T20:33:55.529Z: JOB_MESSAGE_DETAILED: (fde36659798153f8):
Fusing consumer format into count
root: INFO: 2017-05-25T20:33:55.532Z: JOB_MESSAGE_DETAILED: (fde36659798157ba):
Fusing consumer write/Write/WriteImpl/Extract into
write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2017-05-25T20:33:55.535Z: JOB_MESSAGE_DETAILED: (fde3665979815b7c):
Fusing consumer count into group/GroupByWindow
root: INFO: 2017-05-25T20:33:55.549Z: JOB_MESSAGE_DETAILED: (fde3665979815a84):
Fusing consumer write/Write/WriteImpl/InitializeWrite into
write/Write/WriteImpl/DoOnce/Read
root: INFO: 2017-05-25T20:33:55.643Z: JOB_MESSAGE_DEBUG: (fde3665979815b40):
Workflow config is missing a default resource spec.
root: INFO: 2017-05-25T20:33:55.647Z: JOB_MESSAGE_DETAILED: (fde3665979815f02):
Adding StepResource setup and teardown to workflow graph.
root: INFO: 2017-05-25T20:33:55.651Z: JOB_MESSAGE_DEBUG: (fde36659798152c4):
Adding workflow start and stop steps.
root: INFO: 2017-05-25T20:33:55.654Z: JOB_MESSAGE_DEBUG: (fde3665979815686):
Assigning stage ids.
root: INFO: 2017-05-25T20:33:55.698Z: JOB_MESSAGE_DEBUG: (e0f7eccdcc6dc18):
Executing wait step start25
root: INFO: 2017-05-25T20:33:55.711Z: JOB_MESSAGE_BASIC: (e0f7eccdcc6d3fa):
Executing operation
write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2017-05-25T20:33:55.716Z: JOB_MESSAGE_BASIC: (5649c2f38b72d0fb):
Executing operation group/Create
root: INFO: 2017-05-25T20:33:55.918Z: JOB_MESSAGE_DEBUG: (e463699c10aeac4f):
Starting worker pool setup.
root: INFO: 2017-05-25T20:33:55.921Z: JOB_MESSAGE_BASIC: (e463699c10aea855):
Starting 1 workers...
root: INFO: 2017-05-25T20:33:55.941Z: JOB_MESSAGE_DEBUG: (5649c2f38b72dac4):
Value "group/Session" materialized.
root: INFO: 2017-05-25T20:33:55.952Z: JOB_MESSAGE_BASIC: (5649c2f38b72d8cc):
Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2017-05-25T20:36:47.990Z: JOB_MESSAGE_DETAILED: (ec660d236c030370):
Workers have started successfully.
root: INFO: 2017-05-25T20:38:35.245Z: JOB_MESSAGE_ERROR: (9eea00070bd5cecd): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in _import_module
    return __import__(import_name)
ImportError: No module named gen_protos
root: INFO: 2017-05-25T20:38:37.315Z: JOB_MESSAGE_ERROR: (9eea00070bd5cf65): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in _import_module
    return __import__(import_name)
ImportError: No module named gen_protos
root: INFO: 2017-05-25T20:38:39.377Z: JOB_MESSAGE_ERROR: (9eea00070bd5cffd): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in _import_module
    return __import__(import_name)
ImportError: No module named gen_protos
root: INFO: 2017-05-25T20:38:41.442Z: JOB_MESSAGE_ERROR: (9eea00070bd5c095): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in _import_module
    return __import__(import_name)
ImportError: No module named gen_protos
root: INFO: 2017-05-25T20:38:42.686Z: JOB_MESSAGE_DEBUG: (e0f7eccdcc6d705):
Executing failure step failure24
root: INFO: 2017-05-25T20:38:42.689Z: JOB_MESSAGE_ERROR: (e0f7eccdcc6defb): Workflow failed. Causes: (e0f7eccdcc6d719): S01:write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite failed., (134af2bc2c846c94): Failed to split source.
root: INFO: 2017-05-25T20:38:42.763Z: JOB_MESSAGE_DETAILED: (fde3665979815331):
Cleaning up.
root: INFO: 2017-05-25T20:38:42.769Z: JOB_MESSAGE_DEBUG: (fde36659798156f3):
Starting worker pool teardown.
root: INFO: 2017-05-25T20:38:42.774Z: JOB_MESSAGE_BASIC: (fde3665979815ab5):
Stopping worker pool...
root: INFO: 2017-05-25T20:40:02.786Z: JOB_MESSAGE_BASIC: (fde3665979815858):
Worker pool stopped.
root: INFO: 2017-05-25T20:40:02.831Z: JOB_MESSAGE_DEBUG: (fde3665979815760):
Tearing down pending resources...
root: INFO: Job 2017-05-25_13_33_52-10908932325153537174 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
Ran 2 tests in 377.787s
FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/job/2017-05-25_13_33_52-10908932325153537174?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-25_13_33_52-12774029545839592167?project=apache-beam-testing
Build step 'Execute shell' marked build as failure
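The repeated ImportError above comes from reloading the pickled main session on workers that cannot import gen_protos: dill pickles modules referenced from __main__ by name and re-imports them on load. A self-contained sketch of that failure mode, for illustration only (the fake_gen_protos module and the temp-directory setup are stand-ins, not part of this job):

import os
import sys
import tempfile

import dill

# Stand-in for the Jenkins-local gen_protos module (hypothetical name).
workdir = tempfile.mkdtemp()
with open(os.path.join(workdir, 'fake_gen_protos.py'), 'w') as src:
    src.write('VERSION = 1\n')
del src  # keep the pickled session free of file handles
sys.path.insert(0, workdir)

import fake_gen_protos  # noqa: F401  -- now referenced by the __main__ session

session_file = os.path.join(workdir, 'session.pkl')
dill.dump_session(session_file)  # imported modules are pickled by reference

# Simulate the Dataflow worker, where the module is not importable.
sys.path.remove(workdir)
del sys.modules['fake_gen_protos']

try:
    dill.load_session(session_file)  # re-imports fake_gen_protos by name
except ImportError as exc:
    print('reproduced: %s' % exc)  # -> No module named fake_gen_protos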