w41ter opened a new pull request, #46009:
URL: https://github.com/apache/doris/pull/46009
### What problem does this PR solve?
Issue Number: close #xxx
Related PR: #xxx
Problem Summary:
The clone task compares local files against the downloaded files. If two
binlog files have the same md5sum, it sets `skip_link_file` so that linking
that file is skipped. However, `skip_link_file` is not reset before the next
file is processed, so once it has been set, all subsequent files are skipped
as well. This leaves some rowsets with their meta present but their segment
files missing.
The resulting error message:
```
[NOT_FOUND]failed to get file size
/mnt/disk3/VEC_ASAN2/doris.HDD/data/900/13372467/1624966973/02000000002e1250744766e3a98c0e213d847daea65158a9_0.dat
```
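The bug pattern can be illustrated with a minimal, self-contained sketch (the
function `link_files` and its parameters are hypothetical stand-ins, not the
actual Doris clone-task code): a per-file flag declared outside the loop keeps
its value across iterations unless it is explicitly reset.

```cpp
#include <string>
#include <utility>
#include <vector>

// Hypothetical model of the clone task's linking loop. Each entry pairs a
// file name with whether its md5sum matches the local copy. Returns the
// files that actually get hard-linked.
std::vector<std::string> link_files(
        const std::vector<std::pair<std::string, bool>>& files,
        bool reset_per_file) {
    std::vector<std::string> linked;
    bool skip_link_file = false;  // declared once, outside the loop
    for (const auto& [name, same_md5] : files) {
        if (reset_per_file) {
            skip_link_file = false;  // the fix: reset the flag for every file
        }
        if (same_md5) {
            skip_link_file = true;  // identical binlog file: no link needed
        }
        if (!skip_link_file) {
            linked.push_back(name);
        }
    }
    return linked;
}
```

With `reset_per_file == false` (the buggy behavior), one md5sum match causes
every later file to be skipped, which matches the "meta exists but segment
file doesn't" symptom described above.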
### Release note
None
### Check List (For Author)
- Test <!-- At least one of them must be included. -->
- [ ] Regression test
- [ ] Unit Test
- [ ] Manual test (add detailed scripts or steps below)
- [ ] No need to test or manual test. Explain why:
- [ ] This is a refactor/code format and no logic has been changed.
- [ ] Previous test can cover this change.
- [ ] No code files have been changed.
- [ ] Other reason <!-- Add your reason? -->
- Behavior changed:
- [ ] No.
- [ ] Yes. <!-- Explain the behavior change -->
- Does this need documentation?
- [ ] No.
- [ ] Yes. <!-- Add document PR link here. eg:
https://github.com/apache/doris-website/pull/1214 -->
### Check List (For Reviewer who merge this PR)
- [ ] Confirm the release note
- [ ] Confirm test cases
- [ ] Confirm document
- [ ] Add branch pick label <!-- Add branch pick label that this PR should
merge into -->
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]