PanJ commented on issue #28715:
URL: https://github.com/apache/beam/issues/28715#issuecomment-1766896214

   I have a similar issue, posted [here](https://stackoverflow.com/questions/77293191/apache-beam-pythons-writetobigtable-sometimes-causes-a-step-to-keep-running-inf).

   In my case, even a small `WriteToBigTable` job can get stuck, though only very rarely. Not sure if my logs help with the diagnosis:
   
   ```
   Unable to perform SDK-split for work-id: 5193980908353266575 due to error: INTERNAL: Empty split returned. [type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.TrailProto] { trail_point { source_file_loc { filepath: "dist_proc/dax/workflow/worker/fnapi_operators.cc" line: 2738 } } }']
   === Source Location Trace: ===
   dist_proc/dax/internal/status_utils.cc:236
   And could not Checkpoint reader due to error: OUT_OF_RANGE: Cannot checkpoint when range tracker is finished. [type.googleapis.com/util.MessageSetPayload='[dist_proc.dax.internal.TrailProto] { trail_point { source_file_loc { filepath: "dist_proc/dax/workflow/worker/operator.cc" line: 340 } } }']
   === Source Location Trace: ===
   dist_proc/dax/io/dax_reader_driver.cc:253
   dist_proc/dax/workflow/worker/operator.cc:340
   ```
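   
   For context, a minimal sketch of the kind of pipeline I mean is below. The project, instance, and table IDs, the `cf1` column family, the row contents, and the `to_direct_row` helper are all placeholders for illustration, not my actual job; the only point is that a `PCollection` of `DirectRow` mutations is fed into `WriteToBigTable`:
   
   ```python
   # Minimal sketch of the kind of job that occasionally hangs; the project,
   # instance, table, and column family names below are placeholders.
   import datetime
   
   import apache_beam as beam
   from apache_beam.io.gcp.bigtableio import WriteToBigTable
   from google.cloud.bigtable.row import DirectRow
   
   
   def to_direct_row(kv):
       """Turn a (key, value) pair into a Bigtable DirectRow mutation."""
       key, value = kv
       row = DirectRow(row_key=key.encode("utf-8"))
       # 'cf1' is a placeholder column family that is assumed to exist.
       row.set_cell("cf1", b"value", value.encode("utf-8"),
                    timestamp=datetime.datetime.utcnow())
       return row
   
   
   with beam.Pipeline() as p:
       (
           p
           | "CreateRows" >> beam.Create([("key-1", "a"), ("key-2", "b")])
           | "ToDirectRow" >> beam.Map(to_direct_row)
           | "WriteToBigTable" >> WriteToBigTable(
               project_id="my-project",    # placeholder
               instance_id="my-instance",  # placeholder
               table_id="my-table")        # placeholder
       )
   ```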
   
   Also, the issue still occurs on version `2.51.0`.

