rymurr commented on pull request #1640:
URL: https://github.com/apache/iceberg/pull/1640#issuecomment-717194528
> I think we also want to pass options from the catalog config in Flink and Spark, where users can pass properties like `uri` and `warehouse`. Could you add a `Map` to this to pass the catalog config options?
I like the Map-over-Configuration suggestion as well. In #1587 I made the constructor take `String name, Map props, Configuration conf`, since it still needs the `Configuration` to create a `HadoopFileIO`.
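
A rough sketch of the constructor shape being discussed, assuming a hypothetical `ExampleCatalog` (the class name and fields are illustrative, not the actual #1587 code):

```java
import java.util.Map;

import org.apache.hadoop.conf.Configuration;
import org.apache.iceberg.hadoop.HadoopFileIO;
import org.apache.iceberg.io.FileIO;

// Illustrative only: a catalog constructor that takes the catalog name,
// a map of catalog properties (e.g. "uri", "warehouse"), and a Hadoop
// Configuration, which is still needed to build a HadoopFileIO.
public class ExampleCatalog {
  private final String name;
  private final Map<String, String> properties;
  private final FileIO io;

  public ExampleCatalog(String name, Map<String, String> properties, Configuration conf) {
    this.name = name;
    this.properties = properties;
    this.io = new HadoopFileIO(conf);
  }
}
```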
Has anyone thought about how to do this for the `IcebergSource`? As far as I understand, `df.write().format("iceberg")` will currently use Hive/HDFS regardless of these settings.
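
For reference, the write path in question looks roughly like the sketch below (the table identifier is illustrative and assumed to already exist); today this path resolves the table through the Hive/HDFS-backed configuration rather than through per-catalog options:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class WriteExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("iceberg-write-example")
        .getOrCreate();

    Dataset<Row> df = spark.range(10).toDF("id");

    // The write path in question: the IcebergSource behind format("iceberg")
    // loads the table via the Hive/HDFS configuration, regardless of any
    // catalog properties set elsewhere. Table name is illustrative.
    df.write()
        .format("iceberg")
        .mode("append")
        .save("db.example_table");
  }
}
```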