tvalentyn opened a new issue, #23277:
URL: https://github.com/apache/beam/issues/23277

   ### What happened?
   
   Looking at the failure logs in https://ci-beam.apache.org/job/beam_PostCommit_Py_ValCont_PR/240/, I don't see the Dataflow Job ID, so I'd have to find the failing job through the UI or by other means.
   
   ```
   20:07:19 apache_beam/runners/dataflow/dataflow_runner.py:1658: Failed
   20:07:19 ______________ ExerciseMetricsPipelineTest.test_metrics_fnapi_it _______________
   20:07:19 [gw0] linux -- Python 3.9.10 /home/jenkins/jenkins-slave/workspace/beam_PostCommit_Py_ValCont_PR/src/build/gradleenv/-1734967050/bin/python3.9
   20:07:19 
   20:07:19 self = <apache_beam.runners.dataflow.dataflow_exercise_metrics_pipeline_test.ExerciseMetricsPipelineTest testMethod=test_metrics_fnapi_it>
   20:07:19 
   20:07:19     @pytest.mark.it_postcommit
   20:07:19     @pytest.mark.it_validatescontainer
   20:07:19     def test_metrics_fnapi_it(self):
   20:07:19 >     result = self.run_pipeline(experiment='beam_fn_api')
   20:07:19 
   20:07:19 apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py:57: 
   20:07:19 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
   20:07:19 apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline_test.py:43: in run_pipeline
   20:07:19     return dataflow_exercise_metrics_pipeline.apply_and_run(p)
   20:07:19 apache_beam/runners/dataflow/dataflow_exercise_metrics_pipeline.py:176: in apply_and_run
   20:07:19     result = pipeline.run()
   20:07:19 apache_beam/pipeline.py:547: in run
   20:07:19     return Pipeline.from_runner_api(
   20:07:19 apache_beam/pipeline.py:574: in run
   20:07:19     return self.runner.run_pipeline(self, self._options)
   20:07:19 apache_beam/runners/dataflow/test_dataflow_runner.py:66: in run_pipeline
   20:07:19     self.result.wait_until_finish(duration=wait_duration)
   20:07:19 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
   20:07:19 
   20:07:19 self = <DataflowPipelineResult <Job
   20:07:19  clientRequestId: '20220916024924607233-3728'
   20:07:19  createTime: '2022-09-16T02:49:26.025361Z'
   20:07:19 ...022-09-16T02:49:26.025361Z'
   20:07:19  steps: []
   20:07:19  tempFiles: []
   20:07:19  type: TypeValueValuesEnum(JOB_TYPE_BATCH, 1)> at 0x7f67a40ddcd0>
   20:07:19 duration = None
   20:07:19 
   20:07:19     def wait_until_finish(self, duration=None):
   20:07:19       if not self.is_in_terminal_state():
   20:07:19         if not self.has_job:
   20:07:19           raise IOError('Failed to get the Dataflow job id.')
   20:07:19         consoleUrl = (
   20:07:19             "Console URL: https://console.cloud.google.com/"
   20:07:19             f"dataflow/jobs/<RegionId>/{self.job_id()}"
   20:07:19             "?project=<ProjectId>")
   20:07:19         thread = threading.Thread(
   20:07:19             target=DataflowRunner.poll_for_job_completion,
   20:07:19             args=(self._runner, self, duration))
   20:07:19     
   20:07:19         # Mark the thread as a daemon thread so a keyboard interrupt on the main
   20:07:19         # thread will terminate everything. This is also the reason we will not
   20:07:19         # use thread.join() to wait for the polling thread.
   20:07:19         thread.daemon = True
   20:07:19         thread.start()
   20:07:19         while thread.is_alive():
   20:07:19 >         time.sleep(5.0)
   20:07:19 E         Failed: Timeout >900.0s
   20:07:19 
   20:07:19 apache_beam/runners/dataflow/dataflow_runner.py:1658: Failed
   ```
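   
   It would help if the Job ID (or the console URL) were logged as soon as the job is submitted, so it shows up in the CI log even when `wait_until_finish()` later times out. A rough sketch of what I mean, assuming a `DataflowPipelineResult` is available right after submission (the helper name and the `<RegionId>`/`<ProjectId>` placeholders are illustrative, not existing Beam APIs; `has_job` and `job_id()` are the members visible in the traceback above):
   
   ```python
   import logging
   
   _LOGGER = logging.getLogger(__name__)
   
   
   def log_console_url(result, region='<RegionId>', project='<ProjectId>'):
     """Illustrative helper: surface the Dataflow Job ID in the test logs.
   
     `result` is assumed to be the DataflowPipelineResult returned by the runner.
     """
     if result.has_job:
       _LOGGER.info(
           'Dataflow Job ID: %s, console URL: '
           'https://console.cloud.google.com/dataflow/jobs/%s/%s?project=%s',
           result.job_id(), region, result.job_id(), project)
   ```
   
   Calling something along these lines right after the job is submitted (e.g. before polling in `wait_until_finish`) would make it possible to jump from the CI log straight to the failing Dataflow job.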
   
   ### Issue Priority
   
   Priority: 2
   
   ### Issue Component
   
   Component: testing

