rdblue commented on a change in pull request #2664:
URL: https://github.com/apache/iceberg/pull/2664#discussion_r648771172
##########
File path: nessie/src/main/java/org/apache/iceberg/nessie/NessieCatalog.java
##########
@@ -80,12 +79,14 @@
private UpdateableReference reference;
private String name;
private FileIO fileIO;
+ private Map<String, String> catalogOptions;
public NessieCatalog() {
}
@Override
public void initialize(String inputName, Map<String, String> options) {
+ this.catalogOptions = ImmutableMap.copyOf(options);
Review comment:
I agree with @rymurr. The properties passed as `options` in the Spark
catalog API are only those configured under the catalog's name, not general
Spark options (unless Spark added this and I don't know about it).
I think it would be better to ensure that the application ID is taken from
the Spark context in the [`SparkCatalog`
implementation](https://github.com/apache/iceberg/blob/master/spark3/src/main/java/org/apache/iceberg/spark/SparkCatalog.java#L99).
That code has already called `SparkSession.active()`, so you can get the ID from
it directly and add it as an option to the property map. By doing it that way,
and using a common key like `app-id` to pass the app ID, we set a convention
that Flink could also use to pass its application ID.
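As a rough sketch of that suggestion: the catalog plugin would copy the incoming options and add the engine's application ID under a common key before handing them to the Iceberg catalog. The `app-id` key, the `withAppId` helper, and its signature are all hypothetical here; in `SparkCatalog` the ID would come from `SparkSession.active().sparkContext().applicationId()`, but the helper takes it as a parameter so the sketch stays engine-agnostic.

```java
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.Map;

public class AppIdOption {
  // Hypothetical property key; the point of the comment is to agree on one
  // common name that Spark, Flink, etc. would all use.
  static final String APP_ID = "app-id";

  // Returns a copy of the catalog options with the engine's application ID
  // added. In SparkCatalog the caller would pass
  // SparkSession.active().sparkContext().applicationId().
  static Map<String, String> withAppId(Map<String, String> options, String applicationId) {
    Map<String, String> merged = new LinkedHashMap<>(options);
    merged.put(APP_ID, applicationId);
    return Collections.unmodifiableMap(merged);
  }

  public static void main(String[] args) {
    Map<String, String> options = new LinkedHashMap<>();
    options.put("warehouse", "s3://bucket/warehouse");
    Map<String, String> withId = withAppId(options, "application_1623000000000_0001");
    System.out.println(withId.get(APP_ID));
  }
}
```

The Nessie catalog (or any other catalog that wants the context) could then read `app-id` out of the options it receives in `initialize`, without any Spark dependency.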