See
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/390/display/redirect?page=changes>
Changes:
[noreply] [BEAM-11211] Update pandas and pyarrow in python container (#13987)
[noreply] Update go version to 1.12.7 (#13996)
------------------------------------------
[...truncated 1.61 MB...]
],
"is_pair_like": true,
"pipeline_proto_coder_id": "ref_Coder_FastPrimitivesCoder_5"
},
{
"@type": "kind:global_window"
}
],
"is_wrapper": true
},
"output_name": "None",
"user_name": "CheckSide/Match.out"
}
],
"parallel_input": {
"@type": "OutputReference",
"output_name": "None",
"step_name": "s23"
},
"serialized_fn":
"QlpoOTFBWSZTWVBf6zgABEp/9n//////////////////////+QABgYAQABARUAT3ji6glADEFoamRNEyKeo9NCDwo9T0jaBPUzJB6TQ9RpiaA09IaPSGnqeUHqGIaNPIj2kI0xoJ6g5poBpoZJ6nkQ0yaMRpoaZDIYmE0BghiMTIMmQGAmjRoGjAgyYICaBGpPSQ9TaZQGIGmQDQGgAwIGgMIAAABoZADTRoDTRBhMEZMBMAmCYIDAACYAJgBGmEMRgI0YRhNGgGTJkBKEmEIyNJP1NMmUyYnop5EYmmg0AD1B6hoNGjJoDQAaDQMg0ekD0gAYjumHBwEQh8EoeDFBJDlRzmJjFREAQwZBbLBQlxyDJ8yEEAKAKULm0T6Md3HTUuMQRFO1ymOZE6ankJD2h3nu8UcgJAU5qonO5zBHcUWXoUvDEon6FM5u+/mLIEPovIcuipV3TPw6ZkY1rMsByz5XjlFUn32BTd0d7BWvVzui6hNJZnWm+aELgfPxHvSCBH891LApDlTBO+ZrnBdVremIQ87448YsaQqqqIuqlrGrJ4QeM5o1pKAvEVmDUq5oAqIjRZ6MnJ96ZAC8LMEaTC0ntYteQvkIWUjCkJmCGaAZkVBmRIdCGsDQra6JnawzCpOysEVZ3N594vQtLOV1VmoVekrJ3IKZU3L/Mr1o2HfB5lZOAnFyqcK1jXsyI8tyZBCGPVc1iCDSBITquqJAIuQTdec1yRNSKMKfgcMkTvJJmVNn5WDiqlPPTa2JqhfvCQ+JmVvY3k3Vx2eAI8tLDoXBK1U0s4l+a2WUhZQJLhWKG52DDTBSSnTkqczcsyKEgu5UQw4rg/u68VFlcvhHCBB0Y/Z9FCRiVb7Ura8uPK5t1dwbko2RbIYU+biMYQJh/OTezGntPrVZh0nZFDw36dP4H9czcn6OpSwfdmaHsrPtL76e1+dBd5n69P9Oz9jLasW1d3tu1R/K+QJdEcyp0VI7pGK79FpRYGAOuJoaLRM8JCBDw0TYFJCGWiVjG0yTw0rB+tTDybCLyocLaPfdBFjtyOtzM5e+iyXhakEpK5z1RIxFUVd+M5bsWz+V35Sj4irGjbJjoYa8GIkIVjc197DdFqnEmfY0qrkbN79fVbWLRx6kICX1wIJGlG0DQIqxFqSvKWdFs3YbEnzF6utlRLomkVhJVFW9otHIbGoNOcA648ltR2hDMPW+IctSlJoqJGCbKoU9hCGvslc6TS+sJwiQfS6FrKtHOISa3PJihzycgOlOzJgtLmdoEYKYH1CTTEmy0A6lHv0AjIOJsa276evM7d3uz4OLhGMQ6uLBtGiWZ+TLCYH5JgRjjKPynhGs2N5ky3gdpnWYH69N7WFLPU8tt8HB+582YLTCXaUTComZhtsWvFni5cUtFKIuEcolEVm5VJpFx6RFSLLeciCqHDRb9KPgXNrCSKichYJGXAbUkMqSfEKWZmGBgmasZWVFCgRZgVKmNHhOvhVybxraY4oGFeuF6F/VKxIXYYkdmHx0akVKq1kWCHUBjkINcVVL9tMpFrrcdnFNQ0eNt9SLiX897S2DOOlnbkIuohNCvjMZJyWinXbMDXVHUlnhSnFArnCslC0jNEIy01xr2XC00k9NKlMkx6h2LDqsqENFkZNeZzwOBFB7nK4raCrnwBUeU1ZIohmbXoipRUnBxF5Lk4MAdklraRAhpiT10YWz+2sEDJkrofWkKN5mFDjXTZiWmujZWRxxxxpYejfm0tgwk4lDbg9KTVCD7pYWuMFJi6FdyRAUHGO3/tvjy7HgV1iz20mW7ghplhOPZIPFVUYpwbRJHVTlyUjT4TS8k3pddb2q/vC5ycl5Cxb7p0Ki+al8io/PSItnm3NSfkXCe47SyV3e30CywjfuFyd0yLqsQmaIvB1I6G/Qjmmr2RRVKwBArNDLcdv6UkGozdU3fHgWDY2Mxum/2dChqiW/nfKhRc713N3Z6SxDIXiA0bb969RBSwoj4e7vnj/07izIxWxZ2Mkk62Jgyudz4HKt6scrOPREFmvLiW8Hj/8XckU4UJBQX+s4A=",
"user_name": "CheckSide/Match"
}
}
],
"type": "JOB_TYPE_BATCH"
}
apache_beam.runners.dataflow.internal.apiclient: INFO: Create job: <Job
createTime: '2021-02-18T06:27:57.837448Z'
currentStateTime: '1970-01-01T00:00:00Z'
id: '2021-02-17_22_27_56-10123685652871005666'
location: 'us-central1'
name: 'beamapp-jenkins-0218062749-098103'
projectId: 'apache-beam-testing'
stageStates: []
startTime: '2021-02-18T06:27:57.837448Z'
steps: []
tempFiles: []
type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)>
apache_beam.runners.dataflow.internal.apiclient: INFO: Created job with id:
[2021-02-17_22_27_56-10123685652871005666]
apache_beam.runners.dataflow.internal.apiclient: INFO: Submitted job:
2021-02-17_22_27_56-10123685652871005666
apache_beam.runners.dataflow.internal.apiclient: INFO: To access the Dataflow
monitoring console, please navigate to
https://console.cloud.google.com/dataflow/jobs/us-central1/2021-02-17_22_27_56-10123685652871005666?project=apache-beam-testing
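The project, region, and job id in this message are enough to inspect the job outside the test harness. A minimal sketch (not part of this log) of fetching the same job's state through the public Dataflow REST API, assuming google-api-python-client and application-default credentials are available on the machine running the check:

    from googleapiclient.discovery import build

    PROJECT = "apache-beam-testing"
    REGION = "us-central1"
    JOB_ID = "2021-02-17_22_27_56-10123685652871005666"

    # dataflow.projects.locations.jobs.get returns the Job resource, including
    # currentState (JOB_STATE_RUNNING here, JOB_STATE_FAILED by the end of this log).
    dataflow = build("dataflow", "v1b3")
    job = (
        dataflow.projects()
        .locations()
        .jobs()
        .get(projectId=PROJECT, location=REGION, jobId=JOB_ID)
        .execute()
    )
    print(job["currentState"])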
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2021-02-17_22_27_56-10123685652871005666 is in state JOB_STATE_RUNNING
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:00.011Z:
JOB_MESSAGE_DETAILED: Autoscaling is enabled for job
2021-02-17_22_27_56-10123685652871005666. The number of workers will be between
1 and 1000.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:00.393Z:
JOB_MESSAGE_DETAILED: Autoscaling was automatically enabled for job
2021-02-17_22_27_56-10123685652871005666.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:02.908Z:
JOB_MESSAGE_BASIC: Worker configuration: n1-standard-1 in us-central1-a.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.617Z:
JOB_MESSAGE_DETAILED: Expanding CoGroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.653Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
CheckSide/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.682Z:
JOB_MESSAGE_DEBUG: Combiner lifting skipped for step
assert_that/Group/GroupByKey: GroupByKey not followed by a combiner.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.718Z:
JOB_MESSAGE_DETAILED: Expanding GroupByKey operations into optimizable parts.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.749Z:
JOB_MESSAGE_DETAILED: Lifting ValueCombiningMappingFns into
MergeBucketsMappingFns
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.826Z:
JOB_MESSAGE_DEBUG: Annotating graph with Autotuner information.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.894Z:
JOB_MESSAGE_DETAILED: Fusing adjacent ParDo, Read, Write, and Flatten operations
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.918Z:
JOB_MESSAGE_DETAILED: Fusing consumer CheckSide/WindowInto(WindowIntoFn) into
ExternalTransform(beam:transforms:xlang:test:multi)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.951Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/WindowInto(WindowIntoFn) into
ExternalTransform(beam:transforms:xlang:test:multi)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:03.977Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s10 for input s8.None
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.013Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/GroupByKey/Reify, through flatten assert_that/Group/Flatten,
into producer assert_that/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.036Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Reify into
assert_that/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.061Z:
JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/GroupByKey/GroupByWindow into
assert_that/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.090Z:
JOB_MESSAGE_DETAILED: Fusing consumer
assert_that/Group/Map(_merge_tagged_vals_under_key) into
assert_that/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.128Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Unkey into
assert_that/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.188Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Match into assert_that/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.224Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s20 for input s18.None
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.249Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of CheckSide/Group/GroupByKey/Reify,
through flatten CheckSide/Group/Flatten, into producer
CheckSide/Group/pair_with_0
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.272Z:
JOB_MESSAGE_DETAILED: Fusing consumer CheckSide/Group/GroupByKey/Reify into
CheckSide/Group/pair_with_1
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.300Z:
JOB_MESSAGE_DETAILED: Fusing consumer CheckSide/Group/GroupByKey/GroupByWindow
into CheckSide/Group/GroupByKey/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.334Z:
JOB_MESSAGE_DETAILED: Fusing consumer
CheckSide/Group/Map(_merge_tagged_vals_under_key) into
CheckSide/Group/GroupByKey/GroupByWindow
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.364Z:
JOB_MESSAGE_DETAILED: Fusing consumer CheckSide/Unkey into
CheckSide/Group/Map(_merge_tagged_vals_under_key)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.395Z:
JOB_MESSAGE_DETAILED: Fusing consumer CheckSide/Match into CheckSide/Unkey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.429Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s10-u22 for input s11-reify-value0-c20
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.459Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of
assert_that/Group/GroupByKey/Write, through flatten
assert_that/Group/Flatten/Unzipped-1, into producer
assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.505Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/GroupByKey/Write into
assert_that/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.539Z:
JOB_MESSAGE_DETAILED: Unzipping flatten s20-u27 for input s21-reify-value9-c25
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.561Z:
JOB_MESSAGE_DETAILED: Fusing unzipped copy of CheckSide/Group/GroupByKey/Write,
through flatten CheckSide/Group/Flatten/Unzipped-1, into producer
CheckSide/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.597Z:
JOB_MESSAGE_DETAILED: Fusing consumer CheckSide/Group/GroupByKey/Write into
CheckSide/Group/GroupByKey/Reify
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.631Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/ToVoidKey into
assert_that/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.655Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_1 into
assert_that/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.689Z:
JOB_MESSAGE_DETAILED: Fusing consumer CheckSide/ToVoidKey into
CheckSide/WindowInto(WindowIntoFn)
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.725Z:
JOB_MESSAGE_DETAILED: Fusing consumer CheckSide/Group/pair_with_1 into
CheckSide/ToVoidKey
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.757Z:
JOB_MESSAGE_DETAILED: Fusing consumer assert_that/Group/pair_with_0 into
assert_that/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.791Z:
JOB_MESSAGE_DETAILED: Fusing consumer CheckSide/Group/pair_with_0 into
CheckSide/Create/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.828Z:
JOB_MESSAGE_DEBUG: Workflow config is missing a default resource spec.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.888Z:
JOB_MESSAGE_DEBUG: Adding StepResource setup and teardown to workflow graph.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.916Z:
JOB_MESSAGE_DEBUG: Adding workflow start and stop steps.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:04.939Z:
JOB_MESSAGE_DEBUG: Assigning stage ids.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.090Z:
JOB_MESSAGE_DEBUG: Executing wait step start39
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.148Z:
JOB_MESSAGE_BASIC: Executing operation Main1/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.182Z:
JOB_MESSAGE_BASIC: Executing operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.183Z:
JOB_MESSAGE_BASIC: Finished operation Main1/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.207Z:
JOB_MESSAGE_BASIC: Executing operation Main2/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.221Z:
JOB_MESSAGE_DEBUG: Starting worker pool setup.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.227Z:
JOB_MESSAGE_BASIC: Executing operation CheckSide/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.243Z:
JOB_MESSAGE_BASIC: Starting 1 workers in us-central1-a...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.253Z:
JOB_MESSAGE_BASIC: Executing operation Side/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.276Z:
JOB_MESSAGE_DEBUG: Value "Main1/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.278Z:
JOB_MESSAGE_BASIC: Finished operation Main2/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.278Z:
JOB_MESSAGE_BASIC: Finished operation Side/Read
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.345Z:
JOB_MESSAGE_DEBUG: Value "Side/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.370Z:
JOB_MESSAGE_DEBUG: Value "Main2/Read.out" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.684Z:
JOB_MESSAGE_BASIC: Finished operation assert_that/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.709Z:
JOB_MESSAGE_BASIC: Finished operation CheckSide/Group/GroupByKey/Create
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.807Z:
JOB_MESSAGE_DEBUG: Value "assert_that/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.842Z:
JOB_MESSAGE_DEBUG: Value "CheckSide/Group/GroupByKey/Session" materialized.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.903Z:
JOB_MESSAGE_BASIC: Executing operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.935Z:
JOB_MESSAGE_BASIC: Executing operation
ExternalTransform(beam:transforms:xlang:test:multi)+CheckSide/WindowInto(WindowIntoFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write+CheckSide/ToVoidKey+CheckSide/Group/pair_with_1+CheckSide/Group/GroupByKey/Reify+CheckSide/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:05.969Z:
JOB_MESSAGE_BASIC: Executing operation
CheckSide/Create/Read+CheckSide/Group/pair_with_0+CheckSide/Group/GroupByKey/Reify+CheckSide/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:23.530Z:
JOB_MESSAGE_BASIC: Your project already contains 100 Dataflow-created metric
descriptors, so new user metrics of the form custom.googleapis.com/* will not
be created. However, all user metrics are also available in the metric
dataflow.googleapis.com/job/user_counter. If you rely on the custom metrics,
you can delete old / unused metric descriptors. See
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.list
and
https://developers.google.com/apis-explorer/#p/monitoring/v3/monitoring.projects.metricDescriptors.delete
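As the message above notes, the 100-descriptor limit only blocks creation of new custom.googleapis.com/* descriptors; old or unused ones can be deleted. A minimal sketch of that cleanup (not part of this log), assuming the google-cloud-monitoring 2.x client and sufficient permissions; deletion is irreversible, so the delete call is left commented out:

    from google.cloud import monitoring_v3

    client = monitoring_v3.MetricServiceClient()
    project_name = "projects/apache-beam-testing"

    # List descriptors and keep only the custom ones the message refers to.
    stale = [
        d for d in client.list_metric_descriptors(name=project_name)
        if d.type.startswith("custom.googleapis.com/")
    ]
    for descriptor in stale:
        print("would delete", descriptor.type)
        # client.delete_metric_descriptor(name=descriptor.name)  # uncomment to actually delete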
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:28:30.872Z:
JOB_MESSAGE_DETAILED: Autoscaling: Raised the number of workers to 1 based on
the rate of progress in the currently running stage(s).
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:29:03.091Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:29:03.142Z:
JOB_MESSAGE_DETAILED: Workers have started successfully.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:37.034Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py",
line 649, in do_work
work_executor.execute()
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py",
line 179, in execute
op.start()
File "dataflow_worker/native_operations.py", line 38, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 39, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 44, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 48, in
dataflow_worker.native_operations.NativeReadOperation.start
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/inmemory.py",
line 108, in __iter__
yield self._source.coder.decode(value)
File "/usr/local/lib/python3.6/site-packages/apache_beam/coders/coders.py",
line 456, in decode
return self.get_impl().decode(encoded)
File "apache_beam/coders/coder_impl.py", line 226, in
apache_beam.coders.coder_impl.StreamCoderImpl.decode
File "apache_beam/coders/coder_impl.py", line 228, in
apache_beam.coders.coder_impl.StreamCoderImpl.decode
File "apache_beam/coders/coder_impl.py", line 483, in
apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.decode_from_stream
ValueError: Unknown type tag 7f
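The ValueError above (repeated three more times below for the retried work item) means the bytes handed to the in-memory read were not written by FastPrimitivesCoder: the first byte, 0x7f, is not a type tag that coder knows. In a cross-language test like this one, that is consistent with the Java expansion service and the Python worker disagreeing about a PCollection's coder, though the log alone does not prove which side is wrong. A minimal sketch (not from this job) that reproduces the same error through the same public decode path the traceback goes through:

    from apache_beam.coders import coders

    coder = coders.FastPrimitivesCoder()

    # Round-tripping with the same coder works.
    payload = coder.encode(("a", 1))
    assert coder.decode(payload) == ("a", 1)

    # Bytes whose leading type tag (0x7f) was never written by this coder fail
    # inside FastPrimitivesCoderImpl.decode_from_stream, as in the worker log above.
    try:
        coder.decode(b"\x7f\x00")
    except ValueError as err:
        print(err)  # e.g. "Unknown type tag 7f"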
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:40.457Z:
JOB_MESSAGE_BASIC: Finished operation
CheckSide/Create/Read+CheckSide/Group/pair_with_0+CheckSide/Group/GroupByKey/Reify+CheckSide/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:43.720Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py",
line 649, in do_work
work_executor.execute()
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py",
line 179, in execute
op.start()
File "dataflow_worker/native_operations.py", line 38, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 39, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 44, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 48, in
dataflow_worker.native_operations.NativeReadOperation.start
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/inmemory.py",
line 108, in __iter__
yield self._source.coder.decode(value)
File "/usr/local/lib/python3.6/site-packages/apache_beam/coders/coders.py",
line 456, in decode
return self.get_impl().decode(encoded)
File "apache_beam/coders/coder_impl.py", line 226, in
apache_beam.coders.coder_impl.StreamCoderImpl.decode
File "apache_beam/coders/coder_impl.py", line 228, in
apache_beam.coders.coder_impl.StreamCoderImpl.decode
File "apache_beam/coders/coder_impl.py", line 483, in
apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.decode_from_stream
ValueError: Unknown type tag 7f
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:47.007Z:
JOB_MESSAGE_BASIC: Finished operation
assert_that/Create/Read+assert_that/Group/pair_with_0+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:50.263Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py",
line 649, in do_work
work_executor.execute()
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py",
line 179, in execute
op.start()
File "dataflow_worker/native_operations.py", line 38, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 39, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 44, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 48, in
dataflow_worker.native_operations.NativeReadOperation.start
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/inmemory.py",
line 108, in __iter__
yield self._source.coder.decode(value)
File "/usr/local/lib/python3.6/site-packages/apache_beam/coders/coders.py",
line 456, in decode
return self.get_impl().decode(encoded)
File "apache_beam/coders/coder_impl.py", line 226, in
apache_beam.coders.coder_impl.StreamCoderImpl.decode
File "apache_beam/coders/coder_impl.py", line 228, in
apache_beam.coders.coder_impl.StreamCoderImpl.decode
File "apache_beam/coders/coder_impl.py", line 483, in
apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.decode_from_stream
ValueError: Unknown type tag 7f
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:53.542Z:
JOB_MESSAGE_ERROR: Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/batchworker.py",
line 649, in do_work
work_executor.execute()
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/executor.py",
line 179, in execute
op.start()
File "dataflow_worker/native_operations.py", line 38, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 39, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 44, in
dataflow_worker.native_operations.NativeReadOperation.start
File "dataflow_worker/native_operations.py", line 48, in
dataflow_worker.native_operations.NativeReadOperation.start
File "/usr/local/lib/python3.6/site-packages/dataflow_worker/inmemory.py",
line 108, in __iter__
yield self._source.coder.decode(value)
File "/usr/local/lib/python3.6/site-packages/apache_beam/coders/coders.py",
line 456, in decode
return self.get_impl().decode(encoded)
File "apache_beam/coders/coder_impl.py", line 226, in
apache_beam.coders.coder_impl.StreamCoderImpl.decode
File "apache_beam/coders/coder_impl.py", line 228, in
apache_beam.coders.coder_impl.StreamCoderImpl.decode
File "apache_beam/coders/coder_impl.py", line 483, in
apache_beam.coders.coder_impl.FastPrimitivesCoderImpl.decode_from_stream
ValueError: Unknown type tag 7f
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:53.568Z:
JOB_MESSAGE_BASIC: Finished operation
ExternalTransform(beam:transforms:xlang:test:multi)+CheckSide/WindowInto(WindowIntoFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write+CheckSide/ToVoidKey+CheckSide/Group/pair_with_1+CheckSide/Group/GroupByKey/Reify+CheckSide/Group/GroupByKey/Write
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:53.636Z:
JOB_MESSAGE_DEBUG: Executing failure step failure38
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:53.670Z:
JOB_MESSAGE_ERROR: Workflow failed. Causes:
S04:ExternalTransform(beam:transforms:xlang:test:multi)+CheckSide/WindowInto(WindowIntoFn)+assert_that/WindowInto(WindowIntoFn)+assert_that/ToVoidKey+assert_that/Group/pair_with_1+assert_that/Group/GroupByKey/Reify+assert_that/Group/GroupByKey/Write+CheckSide/ToVoidKey+CheckSide/Group/pair_with_1+CheckSide/Group/GroupByKey/Reify+CheckSide/Group/GroupByKey/Write
failed., The job failed because a work item has failed 4 times. Look in
previous log entries for the cause of each one of the 4 failures. For more
information, see https://cloud.google.com/dataflow/docs/guides/common-errors.
The work item was attempted on these workers:
beamapp-jenkins-021806274-02172227-l6ev-harness-w3vx
Root cause: Work item failed.,
beamapp-jenkins-021806274-02172227-l6ev-harness-w3vx
Root cause: Work item failed.,
beamapp-jenkins-021806274-02172227-l6ev-harness-w3vx
Root cause: Work item failed.,
beamapp-jenkins-021806274-02172227-l6ev-harness-w3vx
Root cause: Work item failed.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:53.749Z:
JOB_MESSAGE_DETAILED: Cleaning up.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:53.816Z:
JOB_MESSAGE_DEBUG: Starting worker pool teardown.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:34:53.851Z:
JOB_MESSAGE_BASIC: Stopping worker pool...
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:35:47.234Z:
JOB_MESSAGE_DETAILED: Autoscaling: Resized worker pool from 1 to 0.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:35:47.277Z:
JOB_MESSAGE_BASIC: Worker pool stopped.
apache_beam.runners.dataflow.dataflow_runner: INFO: 2021-02-18T06:35:47.318Z:
JOB_MESSAGE_DEBUG: Tearing down pending resources...
apache_beam.runners.dataflow.dataflow_runner: INFO: Job
2021-02-17_22_27_56-10123685652871005666 is in state JOB_STATE_FAILED
--------------------- >> end captured logging << ---------------------
----------------------------------------------------------------------
XML: nosetests-xlangValidateRunner.xml
----------------------------------------------------------------------
XML:
<https://ci-beam.apache.org/job/beam_PostCommit_XVR_Dataflow/ws/src/sdks/python/nosetests.xml>
----------------------------------------------------------------------
Ran 8 tests in 508.383s
FAILED (errors=8)
> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava FAILED
> Task :runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerCleanup
> Task :runners:google-cloud-dataflow-java:cleanUpDockerImages
Digests:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e878c24f531a411d3042a59c54ee3b8e5910800206e9fd9534d873144dd6b0c4
Associated tags:
- 20210218060108
Tags:
- us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210218060108
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java:20210218060108].
Deleted [us.gcr.io/apache-beam-testing/java-postcommit-it/java@sha256:e878c24f531a411d3042a59c54ee3b8e5910800206e9fd9534d873144dd6b0c4].
FAILURE: Build failed with an exception.
* What went wrong:
Execution failed for task
':runners:google-cloud-dataflow-java:validatesCrossLanguageRunnerPythonUsingJava'.
> Process 'command 'sh'' finished with non-zero exit value 1
* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug
option to get more log output. Run with --scan to get full insights.
* Get more help at https://help.gradle.org
Deprecated Gradle features were used in this build, making it incompatible with
Gradle 7.0.
Use '--warning-mode all' to show the individual deprecation warnings.
See
https://docs.gradle.org/6.8/userguide/command_line_interface.html#sec:command_line_warnings
BUILD FAILED in 35m 39s
112 actionable tasks: 100 executed, 8 from cache, 4 up-to-date
Gradle was unable to watch the file system for changes. The inotify watches
limit is too low.
Publishing build scan...
https://gradle.com/s/y5uvuhega6hzm
Build step 'Invoke Gradle script' changed build result to FAILURE
Build step 'Invoke Gradle script' marked build as failure
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]