See <https://builds.apache.org/job/beam_PostCommit_Python_Verify/4181/display/redirect>
------------------------------------------ [...truncated 1.54 MB...] "encoding": { "@type": "kind:windowed_value", "component_encodings": [ { "@type": "kind:windowed_value", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] }, { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] } ], "is_pair_like": true }, { "@type": "kind:global_window" } ], "is_wrapper": true } ] }, "output_name": "out", "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0).output" } ], "parallel_input": { "@type": "OutputReference", "output_name": "out", "step_name": "s14" }, "user_name": "write/Write/WriteImpl/FinalizeWrite/_UnpickledSideInput(Extract.out.0)" } }, { "kind": "ParallelDo", "name": "s17", "properties": { "display_data": [ { "key": "fn", "label": "Transform Function", "namespace": "apache_beam.transforms.core.CallableWrapperDoFn", "type": "STRING", "value": "_finalize_write" }, { "key": "fn", "label": "Transform Function", "namespace": "apache_beam.transforms.core.ParDo", "shortValue": "CallableWrapperDoFn", "type": "STRING", "value": "apache_beam.transforms.core.CallableWrapperDoFn" } ], "non_parallel_inputs": { "SideInput-s15": { "@type": "OutputReference", "output_name": "out", "step_name": "SideInput-s15" }, "SideInput-s16": { "@type": "OutputReference", "output_name": "out", "step_name": "SideInput-s16" } }, "output_info": [ { "encoding": { "@type": "kind:windowed_value", "component_encodings": [ { "@type": 
"FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [ { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] }, { "@type": "FastPrimitivesCoder$eNprYEpOLEhMzkiNT0pNzNVLzk9JLSqGUlxuicUlAUWZuZklmWWpxc4gQa5CBs3GQsbaQqZQ/vi0xJycpMTk7Hiw+kJmPEYFZCZn56RCjWABGsFaW8iWVJykBwDlGS3/", "component_encodings": [] } ], "is_pair_like": true }, { "@type": "kind:global_window" } ], "is_wrapper": true }, "output_name": "out", "user_name": "write/Write/WriteImpl/FinalizeWrite.out" } ], "parallel_input": { "@type": "OutputReference", "output_name": "out", "step_name": "s7" }, "serialized_fn": "ref_AppliedPTransform_write/Write/WriteImpl/FinalizeWrite_20", "user_name": "write/Write/WriteImpl/FinalizeWrite/FinalizeWrite" } } ], "type": "JOB_TYPE_BATCH" } root: INFO: Create job: <Job createTime: u'2018-02-08T20:41:20.713075Z' currentStateTime: u'1970-01-01T00:00:00Z' id: u'2018-02-08_12_41_19-4541629610088221710' location: u'us-central1' name: u'beamapp-jenkins-0208204118-252128' projectId: u'apache-beam-testing' stageStates: [] steps: [] tempFiles: [] type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> root: INFO: Created job with id: [2018-02-08_12_41_19-4541629610088221710] root: INFO: To access the Dataflow monitoring console, please navigate to https://console.cloud.google.com/dataflow/jobsDetail/locations/us-central1/jobs/2018-02-08_12_41_19-4541629610088221710?project=apache-beam-testing root: INFO: Job 2018-02-08_12_41_19-4541629610088221710 is in state JOB_STATE_PENDING root: INFO: 2018-02-08T20:41:19.903Z: JOB_MESSAGE_WARNING: (3f071c115ab441d2): Job 2018-02-08_12_41_19-4541629610088221710 might autoscale up to 30 workers. 
root: INFO: 2018-02-08T20:41:19.923Z: JOB_MESSAGE_DETAILED: (3f071c115ab44f43): Autoscaling is enabled for job 2018-02-08_12_41_19-4541629610088221710. The number of workers will be between 1 and 30.
root: INFO: 2018-02-08T20:41:19.960Z: JOB_MESSAGE_DETAILED: (3f071c115ab44cb4): Autoscaling was automatically enabled for job 2018-02-08_12_41_19-4541629610088221710.
root: INFO: 2018-02-08T20:41:22.277Z: JOB_MESSAGE_DETAILED: (92eee085f775f008): Checking required Cloud APIs are enabled.
root: INFO: 2018-02-08T20:41:22.506Z: JOB_MESSAGE_DETAILED: (92eee085f775f8d8): Expanding CoGroupByKey operations into optimizable parts.
root: INFO: 2018-02-08T20:41:22.523Z: JOB_MESSAGE_DEBUG: (92eee085f775ff7f): Combiner lifting skipped for step write/Write/WriteImpl/GroupByKey: GroupByKey not followed by a combiner.
root: INFO: 2018-02-08T20:41:22.554Z: JOB_MESSAGE_DEBUG: (92eee085f775fe99): Combiner lifting skipped for step group: GroupByKey not followed by a combiner.
root: INFO: 2018-02-08T20:41:22.585Z: JOB_MESSAGE_DETAILED: (92eee085f775fdb3): Expanding GroupByKey operations into optimizable parts.
root: INFO: 2018-02-08T20:41:22.609Z: JOB_MESSAGE_DETAILED: (92eee085f775fccd): Lifting ValueCombiningMappingFns into MergeBucketsMappingFns
root: INFO: 2018-02-08T20:41:22.642Z: JOB_MESSAGE_DEBUG: (92eee085f775fa1b): Annotating graph with Autotuner information.
root: INFO: 2018-02-08T20:41:22.668Z: JOB_MESSAGE_DETAILED: (92eee085f775f84f): Fusing adjacent ParDo, Read, Write, and Flatten operations
root: INFO: 2018-02-08T20:41:22.696Z: JOB_MESSAGE_DETAILED: (92eee085f775f769): Fusing consumer split into read/Read
root: INFO: 2018-02-08T20:41:22.729Z: JOB_MESSAGE_DETAILED: (92eee085f775f683): Fusing consumer group/Write into group/Reify
root: INFO: 2018-02-08T20:41:22.762Z: JOB_MESSAGE_DETAILED: (92eee085f775f59d): Fusing consumer group/GroupByWindow into group/Read
root: INFO: 2018-02-08T20:41:22.794Z: JOB_MESSAGE_DETAILED: (92eee085f775f4b7): Fusing consumer write/Write/WriteImpl/GroupByKey/GroupByWindow into write/Write/WriteImpl/GroupByKey/Read
root: INFO: 2018-02-08T20:41:22.819Z: JOB_MESSAGE_DETAILED: (92eee085f775f3d1): Fusing consumer write/Write/WriteImpl/GroupByKey/Write into write/Write/WriteImpl/GroupByKey/Reify
root: INFO: 2018-02-08T20:41:22.848Z: JOB_MESSAGE_DETAILED: (92eee085f775f2eb): Fusing consumer write/Write/WriteImpl/WindowInto(WindowIntoFn) into write/Write/WriteImpl/Pair
root: INFO: 2018-02-08T20:41:22.878Z: JOB_MESSAGE_DETAILED: (92eee085f775f205): Fusing consumer write/Write/WriteImpl/GroupByKey/Reify into write/Write/WriteImpl/WindowInto(WindowIntoFn)
root: INFO: 2018-02-08T20:41:22.904Z: JOB_MESSAGE_DETAILED: (92eee085f775f11f): Fusing consumer pair_with_one into split
root: INFO: 2018-02-08T20:41:22.915Z: JOB_MESSAGE_DETAILED: (92eee085f775f039): Fusing consumer group/Reify into pair_with_one
root: INFO: 2018-02-08T20:41:22.942Z: JOB_MESSAGE_DETAILED: (92eee085f775ff53): Fusing consumer write/Write/WriteImpl/WriteBundles/WriteBundles into format
root: INFO: 2018-02-08T20:41:22.968Z: JOB_MESSAGE_DETAILED: (92eee085f775fe6d): Fusing consumer write/Write/WriteImpl/Pair into write/Write/WriteImpl/WriteBundles/WriteBundles
root: INFO: 2018-02-08T20:41:22.995Z: JOB_MESSAGE_DETAILED: (92eee085f775fd87): Fusing consumer format into count
root: INFO: 2018-02-08T20:41:23.018Z: JOB_MESSAGE_DETAILED: (92eee085f775fca1): Fusing consumer write/Write/WriteImpl/Extract into write/Write/WriteImpl/GroupByKey/GroupByWindow
root: INFO: 2018-02-08T20:41:23.052Z: JOB_MESSAGE_DETAILED: (92eee085f775fbbb): Fusing consumer count into group/GroupByWindow
root: INFO: 2018-02-08T20:41:23.086Z: JOB_MESSAGE_DETAILED: (92eee085f775fad5): Fusing consumer write/Write/WriteImpl/InitializeWrite into write/Write/WriteImpl/DoOnce/Read
root: INFO: 2018-02-08T20:41:23.114Z: JOB_MESSAGE_DEBUG: (92eee085f775f9ef): Workflow config is missing a default resource spec.
root: INFO: 2018-02-08T20:41:23.145Z: JOB_MESSAGE_DEBUG: (92eee085f775f909): Adding StepResource setup and teardown to workflow graph.
root: INFO: 2018-02-08T20:41:23.169Z: JOB_MESSAGE_DEBUG: (92eee085f775f823): Adding workflow start and stop steps.
root: INFO: 2018-02-08T20:41:23.199Z: JOB_MESSAGE_DEBUG: (92eee085f775f73d): Assigning stage ids.
root: INFO: 2018-02-08T20:41:23.311Z: JOB_MESSAGE_DEBUG: (e23224b7e92083d6): Executing wait step start25
root: INFO: 2018-02-08T20:41:23.377Z: JOB_MESSAGE_BASIC: (a2e00ca6b90abb30): Executing operation write/Write/WriteImpl/DoOnce/Read+write/Write/WriteImpl/InitializeWrite
root: INFO: 2018-02-08T20:41:23.400Z: JOB_MESSAGE_BASIC: (e23224b7e9208d7c): Executing operation group/Create
root: INFO: 2018-02-08T20:41:23.411Z: JOB_MESSAGE_DEBUG: (bec9a3401cb34e21): Starting worker pool setup.
root: INFO: 2018-02-08T20:41:23.447Z: JOB_MESSAGE_BASIC: (bec9a3401cb34d7b): Starting 1 workers in us-central1-f...
root: INFO: 2018-02-08T20:41:23.513Z: JOB_MESSAGE_DEBUG: (e23224b7e92082e6): Value "group/Session" materialized.
root: INFO: 2018-02-08T20:41:23.572Z: JOB_MESSAGE_BASIC: (fef37066d90d0532): Executing operation read/Read+split+pair_with_one+group/Reify+group/Write
root: INFO: Job 2018-02-08_12_41_19-4541629610088221710 is in state JOB_STATE_RUNNING
root: INFO: 2018-02-08T20:41:31.955Z: JOB_MESSAGE_DETAILED: (12e782f1271849f): Autoscaling: Raised the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-02-08T20:41:47.648Z: JOB_MESSAGE_DETAILED: (12e782f12718976): Autoscaling: Raised the number of workers to 1 based on the rate of progress in the currently running step(s).
root: INFO: 2018-02-08T20:43:33.843Z: JOB_MESSAGE_DETAILED: (d05db0f41ce9d2f): Workers have started successfully.
root: INFO: 2018-02-08T20:46:15.827Z: JOB_MESSAGE_ERROR: (8204ecdead94b6d4): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 341, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 346, in apache_beam.runners.worker.operations.DoOperation.start
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 217, in loads
    s = zlib.decompress(c)
error: Error -3 while decompressing data: incorrect header check
root: INFO: 2018-02-08T20:46:19.174Z: JOB_MESSAGE_ERROR: (8204ecdead94bf48): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 341, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 346, in apache_beam.runners.worker.operations.DoOperation.start
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 217, in loads
    s = zlib.decompress(c)
error: Error -3 while decompressing data: incorrect header check
root: INFO: 2018-02-08T20:46:22.505Z: JOB_MESSAGE_ERROR: (8204ecdead94b7bc): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 341, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 346, in apache_beam.runners.worker.operations.DoOperation.start
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 217, in loads
    s = zlib.decompress(c)
error: Error -3 while decompressing data: incorrect header check
root: INFO: 2018-02-08T20:46:25.858Z: JOB_MESSAGE_ERROR: (8204ecdead94b030): Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/batchworker.py", line 609, in do_work
    work_executor.execute()
  File "/usr/local/lib/python2.7/dist-packages/dataflow_worker/executor.py", line 167, in execute
    op.start()
  File "apache_beam/runners/worker/operations.py", line 340, in apache_beam.runners.worker.operations.DoOperation.start
    def start(self):
  File "apache_beam/runners/worker/operations.py", line 341, in apache_beam.runners.worker.operations.DoOperation.start
    with self.scoped_start_state:
  File "apache_beam/runners/worker/operations.py", line 346, in apache_beam.runners.worker.operations.DoOperation.start
    pickler.loads(self.spec.serialized_fn))
  File "/usr/local/lib/python2.7/dist-packages/apache_beam/internal/pickler.py", line 217, in loads
    s = zlib.decompress(c)
error: Error -3 while decompressing data: incorrect header check
root: INFO: 2018-02-08T20:46:25.897Z: JOB_MESSAGE_DEBUG: (a28958f0997a32ba): Executing failure step failure24
root: INFO: 2018-02-08T20:46:25.935Z: JOB_MESSAGE_ERROR: (a28958f0997a30d8): Workflow failed. Causes: (fef37066d90d0d53): S05:read/Read+split+pair_with_one+group/Reify+group/Write failed., (43fb6c8d75e9ff90): A work item was attempted 4 times without success. Each time the worker eventually lost contact with the service. The work item was attempted on: beamapp-jenkins-020820411-02081241-4461-harness-hm94, beamapp-jenkins-020820411-02081241-4461-harness-hm94, beamapp-jenkins-020820411-02081241-4461-harness-hm94, beamapp-jenkins-020820411-02081241-4461-harness-hm94
root: INFO: 2018-02-08T20:46:26.057Z: JOB_MESSAGE_DETAILED: (92eee085f775fb32): Cleaning up.
root: INFO: 2018-02-08T20:46:26.105Z: JOB_MESSAGE_DEBUG: (92eee085f775f966): Starting worker pool teardown.
root: INFO: 2018-02-08T20:46:26.130Z: JOB_MESSAGE_BASIC: (92eee085f775f880): Stopping worker pool...
root: INFO: 2018-02-08T20:48:48.012Z: JOB_MESSAGE_DETAILED: (12e782f12718a2f): Autoscaling: Reduced the number of workers to 0 based on the rate of progress in the currently running step(s).
root: INFO: 2018-02-08T20:48:48.052Z: JOB_MESSAGE_BASIC: (92eee085f775f4e8): Worker pool stopped.
root: INFO: 2018-02-08T20:48:48.084Z: JOB_MESSAGE_DEBUG: (92eee085f775f31c): Tearing down pending resources...
root: INFO: Job 2018-02-08_12_41_19-4541629610088221710 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
Ran 3 tests in 465.602s

FAILED (errors=3)
Build step 'Execute shell' marked build as failure
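All four worker tracebacks fail at the same point: pickler.loads hands spec.serialized_fn to zlib.decompress, which raises "Error -3 ... incorrect header check" whenever the leading bytes of its input are not a valid zlib stream header. A minimal sketch of that failure mode, using only the standard zlib module (the bad payload below is a hypothetical stand-in for the real serialized_fn bytes, not the actual job data):

```python
import zlib

# A round trip through zlib works: compress() emits a proper zlib header
# that decompress() recognizes.
assert zlib.decompress(zlib.compress(b"serialized DoFn")) == b"serialized DoFn"

# But handing decompress() bytes that do not start with a zlib header
# reproduces the exact error the workers logged.
bad_payload = b"not-a-zlib-stream"  # hypothetical stand-in for serialized_fn
try:
    zlib.decompress(bad_payload)
except zlib.error as exc:
    print(exc)  # Error -3 while decompressing data: incorrect header check
```

This suggests the payload the workers received was never zlib-compressed (or was corrupted in transit), rather than a bug inside zlib itself.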