gustavoatt commented on code in PR #7120:
URL: https://github.com/apache/iceberg/pull/7120#discussion_r1165618038
##########
spark/v3.1/spark/src/main/java/org/apache/iceberg/spark/source/SparkWrite.java:
##########
@@ -127,6 +129,16 @@ class SparkWrite {
this.dsSchema = dsSchema;
this.extraSnapshotMetadata = writeConf.extraSnapshotMetadata();
this.partitionedFanoutEnabled = writeConf.fanoutWriterEnabled();
+
+ if (writeConf.outputSpecId() == null) {
Review Comment:
Done, I moved this logic to `SparkWriteConf`. The main reason I initially did not put it there was that I did not want to store the specs and the current spec in that class, but I think that should be fine.
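For reference, a minimal sketch of what the defaulting could look like once it lives in `SparkWriteConf` (the `output-spec-id` option name, the standalone helper class, and the validation here are illustrative assumptions, not the code in this PR):

```java
import java.util.Map;
import org.apache.iceberg.Table;
import org.apache.iceberg.relocated.com.google.common.base.Preconditions;

// Sketch only: resolves the output spec id from the write options,
// defaulting to the table's current spec when none is requested.
class OutputSpecIdResolver {
  private final Table table;
  private final Map<String, String> writeOptions;

  OutputSpecIdResolver(Table table, Map<String, String> writeOptions) {
    this.table = table;
    this.writeOptions = writeOptions;
  }

  int outputSpecId() {
    String value = writeOptions.get("output-spec-id"); // hypothetical option name
    if (value == null) {
      // no spec requested: default to the table's current spec
      return table.spec().specId();
    }

    int specId = Integer.parseInt(value);
    Preconditions.checkArgument(
        table.specs().containsKey(specId),
        "Cannot write to unknown spec id %s for table %s", specId, table.name());
    return specId;
  }
}
```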
##########
spark/v3.1/spark/src/main/java/org/apache/iceberg/spark/source/SparkWrite.java:
##########
@@ -586,8 +607,8 @@ public DataWriter<InternalRow> createWriter(int partitionId, long taskId) {
@Override
public DataWriter<InternalRow> createWriter(int partitionId, long taskId, long epochId) {
Table table = tableBroadcast.value();
- PartitionSpec spec = table.spec();
FileIO io = table.io();
+ PartitionSpec outputSpec = table.specs().get(outputSpecId);
Review Comment:
Done.
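For reference, a minimal standalone sketch of the lookup the new line performs (the helper class and the null check are illustrative assumptions, not part of this PR):

```java
import org.apache.iceberg.PartitionSpec;
import org.apache.iceberg.Table;
import org.apache.iceberg.relocated.com.google.common.base.Preconditions;

// Sketch only: mirrors the lookup in the diff above, resolving the requested
// output spec instead of always using the table's current spec.
final class OutputSpecs {
  private OutputSpecs() {}

  static PartitionSpec outputSpec(Table table, int outputSpecId) {
    PartitionSpec outputSpec = table.specs().get(outputSpecId);
    Preconditions.checkNotNull(
        outputSpec,
        "Cannot find partition spec with id %s in table %s", outputSpecId, table.name());
    return outputSpec;
  }
}
```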
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]