See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/2312/display/redirect?page=changes>
Changes:

[robertwb] Automatically generate Python proto and grpc files.

[robertwb] Remove auto-generated proto and grpc files.

[robertwb] A couple of worker fixes.

[robertwb] Adding a snippet for metrics

------------------------------------------
[...truncated 577.87 KB...]
            {
              "encoding": {
                "@type": "kind:windowed_value",
                "component_encodings": [
                  {
                    "@type": "kind:windowed_value",
                    "component_encodings": [
                      {
                        "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                        "component_encodings": [
                          {
                            "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                            "component_encodings": []
                          },
                          {
                            "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                            "component_encodings": []
                          }
                        ],
                        "is_pair_like": true
                      },
                      {
                        "@type": "kind:global_window"
                      }
                    ],
                    "is_wrapper": true
                  }
                ]
              },
              "output_name": "out",
              "user_name": "write/Write/WriteImpl/FinalizeWrite/SideInput-s16.output"
            }
          ],
          "parallel_input": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "s14"
          },
          "user_name": "write/Write/WriteImpl/FinalizeWrite/SideInput-s16"
        }
      },
      {
        "kind": "ParallelDo",
        "name": "s17",
        "properties": {
          "display_data": [
            {
              "key": "fn",
              "label": "Transform Function",
              "namespace": "apache_beam.transforms.core.CallableWrapperDoFn",
              "type": "STRING",
              "value": "_finalize_write"
            },
            {
              "key": "fn",
              "label": "Transform Function",
              "namespace": "apache_beam.transforms.core.ParDo",
              "shortValue": "CallableWrapperDoFn",
              "type": "STRING",
              "value": "apache_beam.transforms.core.CallableWrapperDoFn"
            }
          ],
          "non_parallel_inputs": {
            "SideInput-s15": {
              "@type": "OutputReference",
              "output_name": "out",
              "step_name": "SideInput-s15"
            },
            "SideInput-s16": {
              "@type": "OutputReference",
              "output_name": "out",
              "step_name": "SideInput-s16"
            }
          },
          "output_info": [
            {
              "encoding": {
                "@type": "kind:windowed_value",
                "component_encodings": [
                  {
                    "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                    "component_encodings": [
                      {
                        "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                        "component_encodings": []
                      },
                      {
                        "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/",
                        "component_encodings": []
                      }
                    ],
                    "is_pair_like": true
                  },
                  {
                    "@type": "kind:global_window"
                  }
                ],
                "is_wrapper": true
              },
              "output_name": "out",
              "user_name": "write/Write/WriteImpl/FinalizeWrite.out"
            }
          ],
          "parallel_input": {
            "@type": "OutputReference",
            "output_name": "out",
            "step_name": "s7"
          },
          "serialized_fn": "<string of 1056 bytes>",
          "user_name": "write/Write/WriteImpl/FinalizeWrite/Do"
        }
      }
    ],
    "type": "JOB_TYPE_BATCH"
  }
root: INFO: Create job: <Job
 createTime: u'2017-05-24T23:33:26.535110Z'
 currentStateTime: u'1970-01-01T00:00:00Z'
 id: u'2017-05-24_16_33_25-3759598730270575457'
 location: u'global'
 name: u'beamapp-jenkins-0524233324-227157'
 projectId: u'apache-beam-testing'
 stageStates: []
 steps: []
 tempFiles: []
 type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
root: INFO: Created job with id: [2017-05-24_16_33_25-3759598730270575457]
root: INFO: To access the Dataflow monitoring console, please navigate to https://console.developers.google.com/project/apache-beam-testing/dataflow/job/2017-05-24_16_33_25-3759598730270575457
root: INFO: Job 2017-05-24_16_33_25-3759598730270575457 is in state JOB_STATE_RUNNING
root: INFO: 2017-05-24T23:33:25.876Z: JOB_MESSAGE_WARNING: (342cc7203fc98d55): Setting the number of workers (1) disables autoscaling for this job. If you are trying to cap autoscaling, consider only setting max_num_workers.
If you want to disable autoscaling altogether, the documented way is to explicitly use autoscalingAlgorithm=NONE.
root: INFO: 2017-05-24T23:33:28.186Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c49f): Checking required Cloud APIs are enabled.
root: INFO: 2017-05-24T23:33:29.220Z: JOB_MESSAGE_DEBUG: (546cb52e93e4c1ca): Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2017-05-24T23:33:29.225Z: JOB_MESSAGE_DEBUG: (546cb52e93e4cc14): Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2017-05-24T23:33:29.227Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c65e): Expanding GroupByKey operations into optimizable parts.
root: INFO: 2017-05-24T23:33:29.231Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c0a8): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2017-05-24T23:33:29.240Z: JOB_MESSAGE_DEBUG: (546cb52e93e4cf86): Annotating graph with Autotuner information.
root: INFO: 2017-05-24T23:33:29.278Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c41a): Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2017-05-24T23:33:29.282Z: JOB_MESSAGE_DETAILED: (546cb52e93e4ce64): Fusing consumer split into read/Read
root: INFO: 2017-05-24T23:33:29.284Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c8ae): Fusing consumer group/Write into group/Reify
root: INFO: 2017-05-24T23:33:29.286Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c2f8): Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2017-05-24T23:33:29.289Z: JOB_MESSAGE_DETAILED: (546cb52e93e4cd42): Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2017-05-24T23:33:29.292Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c78c): Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2017-05-24T23:33:29.299Z: JOB_MESSAGE_DETAILED: (546cb52e93e4cc20): Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2017-05-24T23:33:29.302Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c66a): Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2017-05-24T23:33:29.305Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c0b4): Fusing consumer pair_with_one into split
root: INFO: 2017-05-24T23:33:29.308Z: JOB_MESSAGE_DETAILED: (546cb52e93e4cafe): Fusing consumer group/Reify into pair_with_one
root: INFO: 2017-05-24T23:33:29.311Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c548): Fusing consumer write/Write/WriteImpl/WriteBundles/Do into format
root: INFO: 2017-05-24T23:33:29.337Z: JOB_MESSAGE_DETAILED: (546cb52e93e4cf92): Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/Do
root: INFO: 2017-05-24T23:33:29.342Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c9dc): Fusing consumer format into count
root: INFO: 2017-05-24T23:33:29.344Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c426): Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2017-05-24T23:33:29.346Z: JOB_MESSAGE_DETAILED: (546cb52e93e4ce70): Fusing consumer count into group/GroupByWindow
root: INFO: 2017-05-24T23:33:29.356Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c798): Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2017-05-24T23:33:29.475Z: JOB_MESSAGE_DEBUG: (546cb52e93e4cc44): Workflow config is missing a default resource spec.
root: INFO: 2017-05-24T23:33:29.478Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c68e): Adding StepResource setup and teardown to workflow graph.
root: INFO: 2017-05-24T23:33:29.481Z: JOB_MESSAGE_DEBUG: (546cb52e93e4c0d8): Adding workflow start and stop steps.
root: INFO: 2017-05-24T23:33:29.484Z: JOB_MESSAGE_DEBUG: (546cb52e93e4cb22): Assigning stage ids.
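The JOB_MESSAGE_WARNING earlier in this log explains the interaction between num_workers and autoscaling: pinning num_workers disables autoscaling, while setting only max_num_workers merely caps it. A toy sketch of that rule, using argparse with the same flag names (this is an illustration of the documented behavior, not the Dataflow service's actual implementation):

```python
import argparse

# Hypothetical stand-in parser for the Dataflow worker-pool flags named
# in the warning above (num_workers, max_num_workers, autoscaling_algorithm).
parser = argparse.ArgumentParser()
parser.add_argument('--num_workers', type=int, default=None)
parser.add_argument('--max_num_workers', type=int, default=None)
parser.add_argument('--autoscaling_algorithm', default='THROUGHPUT_BASED')

def autoscaling_enabled(argv):
    opts = parser.parse_args(argv)
    # Explicitly requesting NONE disables autoscaling, as does pinning
    # num_workers; capping only max_num_workers leaves autoscaling on.
    if opts.autoscaling_algorithm == 'NONE':
        return False
    return opts.num_workers is None

print(autoscaling_enabled(['--num_workers=1']))       # this job's setup -> False
print(autoscaling_enabled(['--max_num_workers=10']))  # capped but enabled -> True
```

This mirrors why the job above got the warning: it passed num_workers=1, which the service treats as a fixed pool size.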
root: INFO: 2017-05-24T23:33:29.530Z: JOB_MESSAGE_DEBUG: (6d69d9988ce4eddc): Executing wait step start25
root: INFO: 2017-05-24T23:33:29.539Z: JOB_MESSAGE_BASIC: (6d69d9988ce4e3c6): Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2017-05-24T23:33:29.541Z: JOB_MESSAGE_BASIC: (1e5138f9b2d4cbb): Executing operation group/Create
root: INFO: 2017-05-24T23:33:29.743Z: JOB_MESSAGE_DEBUG: (d21e228fdb83a76a): Starting worker pool setup.
root: INFO: 2017-05-24T23:33:29.747Z: JOB_MESSAGE_BASIC: (d21e228fdb83a918): Starting 1 workers...
root: INFO: 2017-05-24T23:33:29.767Z: JOB_MESSAGE_DEBUG: (1e5138f9b2d4aa4): Value "group/Session" materialized.
root: INFO: 2017-05-24T23:33:29.782Z: JOB_MESSAGE_BASIC: (1e5138f9b2d41ac): Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: 2017-05-24T23:34:27.759Z: JOB_MESSAGE_DETAILED: (5c69f0b7af924bb2): Workers have started successfully.
root: INFO: 2017-05-24T23:36:48.906Z: JOB_MESSAGE_ERROR: (56ea43f5850cf099): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in _import_module
    return __import__(import_name)
ImportError: No module named gen_protos
root: INFO: 2017-05-24T23:36:50.962Z: JOB_MESSAGE_ERROR:
(56ea43f5850cf431): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in _import_module
    return __import__(import_name)
ImportError: No module named gen_protos
root: INFO: 2017-05-24T23:36:53.118Z: JOB_MESSAGE_ERROR: (56ea43f5850cf7c9): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in _import_module
    return __import__(import_name)
ImportError: No module named gen_protos
root: INFO: 2017-05-24T23:36:55.167Z: JOB_MESSAGE_ERROR: (56ea43f5850cfb61): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 706, in run
    self._load_main_session(self.local_staging_directory)
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 446, in _load_main_session
    pickler.load_session(session_file)
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 247, in load_session
    return dill.load_session(file_path)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 363, in load_session
    module = unpickler.load()
  File "/usr/lib/python2.7/pickle.py", line 858, in load
    dispatch[key](self)
  File "/usr/lib/python2.7/pickle.py", line 1133, in load_reduce
    value = func(*args)
  File "/usr/local/lib/python2.7/dist-packages/dill/dill.py", line 766, in _import_module
    return __import__(import_name)
ImportError: No module named gen_protos
root: INFO: 2017-05-24T23:36:55.321Z: JOB_MESSAGE_DEBUG: (1e5138f9b2d4d57): Executing failure step failure24
root: INFO: 2017-05-24T23:36:55.323Z: JOB_MESSAGE_ERROR: (1e5138f9b2d4b19): Workflow failed. Causes: (1e5138f9b2d41d3): S05:read/Read+split+pair_with_one+group/Reify+group/Write failed., (5aa7c181116a26a4): Failed to split source.
root: INFO: 2017-05-24T23:36:55.384Z: JOB_MESSAGE_DETAILED: (546cb52e93e4c3b9): Cleaning up.
root: INFO: 2017-05-24T23:36:55.511Z: JOB_MESSAGE_DEBUG: (546cb52e93e4ce03): Starting worker pool teardown.
root: INFO: 2017-05-24T23:36:55.513Z: JOB_MESSAGE_BASIC: (546cb52e93e4c84d): Stopping worker pool...
root: INFO: 2017-05-24T23:38:05.521Z: JOB_MESSAGE_BASIC: (546cb52e93e4c7bc): Worker pool stopped.
root: INFO: 2017-05-24T23:38:05.572Z: JOB_MESSAGE_DEBUG: (546cb52e93e4c0e4): Tearing down pending resources...
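The repeated ImportError above comes from the worker unpickling the staged main session (pickler.load_session calling dill.load_session), which re-imports by name every module the session referenced; gen_protos existed on the Jenkins machine that submitted the job but was never staged to the workers. A minimal sketch of that mechanism using plain pickle (the gen_protos module body here is invented purely for illustration):

```python
import pickle
import sys
import types

# Build a throwaway module reusing the name from the traceback above;
# its contents are hypothetical.
mod = types.ModuleType('gen_protos')
exec("def build(): return 'ok'", mod.__dict__)
sys.modules['gen_protos'] = mod

# pickle stores module-level functions by reference ("gen_protos", "build"),
# not by value -- the same way a pickled main session refers to its imports.
payload = pickle.dumps(mod.build)

# Simulate a worker on which gen_protos was never installed or staged.
del sys.modules['gen_protos']
try:
    pickle.loads(payload)  # unpickling re-imports gen_protos by name
except ImportError as err:
    print('worker fails like the log above:', err)
```

The fix on the Dataflow side is to make sure every module the main session references is importable on the worker (installed, staged, or vendored), rather than only on the submitting machine.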
root: INFO: Job 2017-05-24_16_33_25-3759598730270575457 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
Ran 2 tests in 349.172s

FAILED (errors=1)
Found: https://console.cloud.google.com/dataflow/job/2017-05-24_16_33_24-9956225657063932747?project=apache-beam-testing
Found: https://console.cloud.google.com/dataflow/job/2017-05-24_16_33_25-3759598730270575457?project=apache-beam-testing
Build step 'Execute shell' marked build as failure