carc-harsh opened a new issue, #14813:
URL: https://github.com/apache/iceberg/issues/14813
### Feature Request / Improvement
### Problem Description
The `rewrite_data_files` Spark SQL procedure does not support operating on
branches. When a table identifier includes a branch name (e.g.,
`table.branch_myBranch`), the procedure:
1. Parses the branch name from the identifier
2. Loads a `SparkTable` with branch information
3. Operates on the main branch anyway, ignoring the specified branch
### Expected Behavior
When calling:
```sql
CALL catalog.system.rewrite_data_files(
  table => 'db.table.branch_myBranch',
  options => map('max-concurrent-file-group-rewrites', '75')
)
```
The procedure should:
- Read files from the snapshot at the head of `myBranch`
- Commit the rewrite operation to `myBranch` (not the main branch)
### Current Behavior
The procedure instead:
- Reads from the main branch snapshot (via `table.currentSnapshot()`)
- Commits to the main branch, never calling `rewriteFiles.toBranch()` (see the sketch below)
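A minimal Java sketch of the gap at the commit level, built from the core-API calls this issue already names (`table.currentSnapshot()`, `newRewrite()`, `toBranch()`); the helper method names and the use of `SnapshotUtil.latestSnapshot` to resolve the branch head are illustrative assumptions, not the procedure's actual code:

```java
import java.util.Set;

import org.apache.iceberg.DataFile;
import org.apache.iceberg.Snapshot;
import org.apache.iceberg.Table;
import org.apache.iceberg.util.SnapshotUtil;

class RewriteBranchSketch {

  // Roughly what happens today: planning starts from the main branch head
  // and the rewrite commits back to main.
  static void rewriteOnMain(Table table, Set<DataFile> toDelete, Set<DataFile> toAdd) {
    Snapshot mainHead = table.currentSnapshot(); // main, even for table.branch_myBranch
    table.newRewrite()
        .rewriteFiles(toDelete, toAdd)
        .validateFromSnapshot(mainHead.snapshotId())
        .commit(); // lands on main
  }

  // What this issue asks for: plan from the branch head and commit to it.
  static void rewriteOnBranch(Table table, String branch,
                              Set<DataFile> toDelete, Set<DataFile> toAdd) {
    Snapshot branchHead = SnapshotUtil.latestSnapshot(table, branch);
    table.newRewrite()
        .rewriteFiles(toDelete, toAdd)
        .validateFromSnapshot(branchHead.snapshotId())
        .toBranch(branch) // supported by the RewriteFiles API
        .commit();        // lands on myBranch
  }
}
```

The procedure drives the rewrite through the Spark actions layer rather than calling `newRewrite()` directly, so an actual fix would need to thread the branch through that layer; the sketch only illustrates the commit-level calls involved.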
### Context
- Other operations support branches (e.g., `INSERT INTO
table.branch_myBranch`, `UPDATE table.branch_myBranch`)
- The `fast_forward` procedure accepts branch parameters
- The `RewriteFiles` API supports the `.toBranch()` method (see
`BaseRewriteFiles.toBranch()`)
- `SparkTable` stores the branch information, but it is not accessible via
the `table()` method (see the sketch after this list)
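On the last bullet, a purely illustrative sketch of the accessor shape being asked for; the class below is invented for illustration and is not the real `SparkTable` (which lives in `org.apache.iceberg.spark.source` and exposes `table()` but not the branch it was loaded with):

```java
import org.apache.iceberg.Table;

// Illustrative only: NOT the real SparkTable. It shows the missing piece,
// a getter that would let procedures recover the branch parsed from the
// identifier instead of silently falling back to main.
class BranchAwareSparkTableSketch {
  private final Table icebergTable;
  private final String branch; // null when the identifier names no branch

  BranchAwareSparkTableSketch(Table icebergTable, String branch) {
    this.icebergTable = icebergTable;
    this.branch = branch;
  }

  Table table() {
    return icebergTable; // what SparkTable already exposes
  }

  String branch() { // hypothetical accessor, not in the current API
    return branch;
  }
}
```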
### Query engine
Spark
### Willingness to contribute
- [ ] I can contribute this improvement/feature independently
- [x] I would be willing to contribute this improvement/feature with
guidance from the Iceberg community
- [ ] I cannot contribute this improvement/feature at this time