rdblue commented on a change in pull request #263: [WIP] Make metrics collection configurable via table properties
URL: https://github.com/apache/incubator-iceberg/pull/263#discussion_r300879667
 
 

 ##########
 File path: spark/src/main/java/org/apache/iceberg/spark/source/Writer.java
 ##########
 @@ -251,12 +252,14 @@ public String toString() {
       @Override
       public FileAppender<InternalRow> newAppender(OutputFile file, FileFormat fileFormat) {
         Schema schema = spec.schema();
+        MetricsSpec metricsSpec = MetricsSpec.fromProperties(properties);
         try {
           switch (fileFormat) {
             case PARQUET:
               return Parquet.write(file)
                   .createWriterFunc(msgType -> SparkParquetWriters.buildWriter(schema, msgType))
                   .setAll(properties)
+                  .metricsSpec(metricsSpec)
 
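The change above builds the metrics configuration once from the table's property map and passes it into the file-format builder, so the writer only needs the properties, not the `Table` itself. As a rough illustration of what a properties-driven factory in the spirit of `MetricsSpec.fromProperties` could look like (the class, the `write.metadata.metrics.default` key, and the `Mode` enum below are assumptions for this sketch, not the PR's actual code):

```java
// Illustrative sketch only; the real MetricsSpec in this PR may differ.
// Derives a default metrics-collection mode from table properties instead
// of requiring access to the Table object.
import java.util.Locale;
import java.util.Map;

public class MetricsSpecSketch {
  public enum Mode { NONE, COUNTS, TRUNCATE, FULL }

  // Hypothetical property key, used only for this sketch.
  private static final String DEFAULT_MODE_KEY = "write.metadata.metrics.default";

  private final Mode defaultMode;

  private MetricsSpecSketch(Mode defaultMode) {
    this.defaultMode = defaultMode;
  }

  public static MetricsSpecSketch fromProperties(Map<String, String> properties) {
    String value = properties.getOrDefault(DEFAULT_MODE_KEY, "full");
    return new MetricsSpecSketch(Mode.valueOf(value.toUpperCase(Locale.ROOT)));
  }

  public Mode defaultMode() {
    return defaultMode;
  }
}
```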
 Review comment:
   You're right, that would require passing the table, which is probably why we don't do it. In that case, this is fine as you have it.
   
   We don't want to serialize the table everywhere. That's difficult, and there is no guarantee that `TableOperations` is serializable.
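   To make the serialization point concrete: the writer factory that Spark ships to executors should capture only state that is guaranteed serializable, such as the table's property map, rather than the `Table` with its `TableOperations`. A minimal sketch of that design choice, with hypothetical class and method names:

```java
// Sketch of the design choice discussed above, not the PR's actual code:
// capture only a serializable copy of the table properties in the factory
// that gets serialized to Spark executors, instead of the Table itself.
import java.io.Serializable;
import java.util.HashMap;
import java.util.Map;

class SerializableWriterFactorySketch implements Serializable {
  private final Map<String, String> properties;

  SerializableWriterFactorySketch(Map<String, String> tableProperties) {
    // Copy into a plain HashMap so the captured state is definitely serializable.
    this.properties = new HashMap<>(tableProperties);
  }

  Map<String, String> properties() {
    return properties;
  }
}
```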
