Heatao opened a new issue, #15060:
URL: https://github.com/apache/iceberg/issues/15060
### Apache Iceberg version
main (development)
### Query engine
Spark
### Please describe the bug 🐞
### Description
Custom snapshot properties set via Spark session config (e.g.,
`spark.sql.iceberg.snapshot-property.custom-key`) are not applied to DELETE
operations, while they work correctly for INSERT, UPDATE, and MERGE operations.
### Expected Behavior
DELETE should respect custom snapshot properties from session config,
consistent with other DML operations.
### Actual Behavior
DELETE operations silently ignore custom snapshot properties.
### Steps to Reproduce
```java
spark.conf().set("spark.sql.iceberg.snapshot-property.my-key", "my-value");
// INSERT works - snapshot has my-key ✅
sql("INSERT INTO table VALUES (1, 'a')");
// DELETE doesn't work - snapshot missing my-key ❌
sql("DELETE FROM table WHERE id = 1");
```
### Root Cause
In `SparkTable.deleteFromRowFilter()`, the delete path never calls
`SparkWriteConf.extraSnapshotMetadata()` to pick up session-level snapshot
properties, unlike the insert/update paths.
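For illustration, the prefix-stripping that the session-property mechanism implies can be sketched as follows. This is a minimal, self-contained demo, not the actual Iceberg implementation: the helper name `extraSnapshotMetadata` mirrors the method mentioned above, but its body here is an assumption based on the `spark.sql.iceberg.snapshot-property.` prefix shown in the reproduction.

```java
import java.util.HashMap;
import java.util.Map;

public class SnapshotPropertyDemo {
  // Prefix used in the reproduction above; keys under it become snapshot summary entries.
  static final String PREFIX = "spark.sql.iceberg.snapshot-property.";

  // Hypothetical sketch of what extraSnapshotMetadata() is expected to do:
  // collect prefixed session-config entries and strip the prefix.
  static Map<String, String> extraSnapshotMetadata(Map<String, String> sessionConf) {
    Map<String, String> extra = new HashMap<>();
    for (Map.Entry<String, String> e : sessionConf.entrySet()) {
      if (e.getKey().startsWith(PREFIX)) {
        extra.put(e.getKey().substring(PREFIX.length()), e.getValue());
      }
    }
    return extra;
  }

  public static void main(String[] args) {
    Map<String, String> conf = new HashMap<>();
    conf.put("spark.sql.iceberg.snapshot-property.my-key", "my-value");
    conf.put("spark.sql.shuffle.partitions", "200"); // unrelated key, ignored
    System.out.println(extraSnapshotMetadata(conf));
  }
}
```

The bug, under this reading, is simply that the DELETE code path never invokes this collection step before committing, so the resulting snapshot summary lacks the custom keys.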
### Impact
Users relying on custom snapshot properties for tracking (e.g., audit
trails, job metadata) will have incomplete information for DELETE operations.
### Related
- Feature introduced in #14545
- Affects Spark 4.0 and 4.1
### Willingness to contribute
- [x] I can contribute a fix for this bug independently
- [ ] I would be willing to contribute a fix for this bug with guidance from
the Iceberg community
- [ ] I cannot contribute a fix for this bug at this time
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]