twalthr commented on a change in pull request #18290:
URL: https://github.com/apache/flink/pull/18290#discussion_r783903468



##########
File path: docs/content/docs/connectors/table/filesystem.md
##########
@@ -451,15 +491,19 @@ The parallelism of writing files into external file system (including Hive) can
 <table class="table table-bordered">
   <thead>
     <tr>
-        <th class="text-left" style="width: 20%">Key</th>
-        <th class="text-left" style="width: 15%">Default</th>
+        <th class="text-left" style="width: 25%">Option</th>
+        <th class="text-left" style="width: 8%">Required</th>
+        <th class="text-left" style="width: 8%">Forwarded</th>
+        <th class="text-left" style="width: 7%">Default</th>
         <th class="text-left" style="width: 10%">Type</th>
-        <th class="text-left" style="width: 55%">Description</th>
+        <th class="text-left" style="width: 42%">Description</th>
     </tr>
   </thead>
   <tbody>
     <tr>
         <td><h5>sink.parallelism</h5></td>
+        <td>optional</td>
+        <td>false</td>

Review comment:
       `no`
   
   side comment: We should generate those tables automatically soon.

##########
File path: docs/content/docs/connectors/table/formats/json.md
##########
@@ -69,29 +69,33 @@ Format Options
       <tr>
         <th class="text-left" style="width: 25%">Option</th>
         <th class="text-center" style="width: 8%">Required</th>
+        <th class="text-center" style="width: 8%">Forwarded</th>
         <th class="text-center" style="width: 7%">Default</th>
         <th class="text-center" style="width: 10%">Type</th>
-        <th class="text-center" style="width: 50%">Description</th>
+        <th class="text-center" style="width: 42%">Description</th>
       </tr>
     </thead>
     <tbody>
     <tr>
       <td><h5>format</h5></td>
       <td>required</td>
+      <td>no</td>
       <td style="word-wrap: break-word;">(none)</td>
       <td>String</td>
       <td>Specify what format to use, here should be <code>'json'</code>.</td>
     </tr>
     <tr>
       <td><h5>json.fail-on-missing-field</h5></td>
       <td>optional</td>
+      <td>no</td>

Review comment:
      why is this `no` but `ignore-parse-errors` is forwarded?

##########
File path: flink-connectors/flink-connector-elasticsearch-base/src/main/java/org/apache/flink/connector/elasticsearch/table/ElasticsearchDynamicSinkFactoryBase.java
##########
@@ -224,6 +215,28 @@ static void validate(boolean condition, Supplier<String> message) {
                 .collect(Collectors.toSet());
     }
 
+    @Override
+    public Set<ConfigOption<?>> forwardOptions() {
+        return Stream.of(
+                        HOSTS_OPTION,
+                        INDEX_OPTION,
+                        PASSWORD_OPTION,
+                        USERNAME_OPTION,
+                        KEY_DELIMITER_OPTION,
+                        BULK_FLUSH_MAX_ACTIONS_OPTION,
+                        BULK_FLUSH_MAX_SIZE_OPTION,
+                        BULK_FLUSH_INTERVAL_OPTION,
+                        BULK_FLUSH_BACKOFF_TYPE_OPTION,
+                        BULK_FLUSH_BACKOFF_MAX_RETRIES_OPTION,
+                        BULK_FLUSH_BACKOFF_DELAY_OPTION,
+                        CONNECTION_PATH_PREFIX_OPTION,
+                        CONNECTION_REQUEST_TIMEOUT,
+                        CONNECTION_TIMEOUT,
+                        SOCKET_TIMEOUT,
+                        DELIVERY_GUARANTEE_OPTION)

Review comment:
       for Kafka I think we disable this?
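
      The pattern under discussion — forwarding every option except the delivery guarantee — can be sketched in plain Java. This is a self-contained illustration, not the real Flink `ConfigOption`/factory API; the stand-in `Option` type and the `sink.delivery-guarantee` key are assumptions made for the sketch:

      ```java
      import java.util.LinkedHashSet;
      import java.util.List;
      import java.util.Set;
      import java.util.stream.Collectors;

      class DeliveryGuaranteeFilterSketch {
          // Stand-in for Flink's ConfigOption<?>: only the key matters here.
          record Option(String key) {}

          static final Option HOSTS = new Option("hosts");
          static final Option INDEX = new Option("index");
          // Key name is an assumption made for this sketch.
          static final Option DELIVERY_GUARANTEE = new Option("sink.delivery-guarantee");

          // Forward everything except the delivery guarantee, mirroring the
          // reviewer's point that it should stay non-forwardable (as, reportedly,
          // the Kafka connector does).
          static Set<Option> forwardOptions() {
              return List.of(HOSTS, INDEX, DELIVERY_GUARANTEE).stream()
                      .filter(o -> !o.key().equals("sink.delivery-guarantee"))
                      .collect(Collectors.toCollection(LinkedHashSet::new));
          }
      }
      ```

      Keeping the exclusion as an explicit filter (or simply omitting the constant from the enumeration) makes the set of non-forwardable options easy to audit.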

##########
File path: flink-connectors/flink-connector-files/src/main/java/org/apache/flink/connector/file/table/AbstractFileSystemTable.java
##########
@@ -43,16 +41,17 @@
 
     List<String> partitionKeys;
 
-    AbstractFileSystemTable(DynamicTableFactory.Context context) {
-        this.context = context;
-        this.tableIdentifier = context.getObjectIdentifier();
-        this.tableOptions = new Configuration();
-        context.getCatalogTable().getOptions().forEach(tableOptions::setString);
-        this.schema = context.getCatalogTable().getResolvedSchema();
+    AbstractFileSystemTable(
+            ObjectIdentifier tableIdentifier,
+            ResolvedSchema schema,
+            List<String> partitionKeys,
+            ReadableConfig config) {
+        this.tableIdentifier = tableIdentifier;
+        this.tableOptions = (Configuration) config;

Review comment:
       name it `tableOptions`

##########
File path: docs/content/docs/connectors/table/kinesis.md
##########
@@ -178,69 +184,79 @@ Connector Options
     <tr>
       <td><h5>aws.credentials.provider</h5></td>
       <td>optional</td>
+      <td>no</td>
       <td style="word-wrap: break-word;">AUTO</td>
       <td>String</td>
      <td>A credentials provider to use when authenticating against the Kinesis endpoint. See <a href="#authentication">Authentication</a> for details.</td>
     </tr>
     <tr>
          <td><h5>aws.credentials.basic.accesskeyid</h5></td>
          <td>optional</td>
+      <td>no</td>
          <td style="word-wrap: break-word;">(none)</td>
          <td>String</td>
         <td>The AWS access key ID to use when setting credentials provider type to BASIC.</td>
     </tr>
     <tr>
          <td><h5>aws.credentials.basic.secretkey</h5></td>
          <td>optional</td>
+      <td>no</td>
          <td style="word-wrap: break-word;">(none)</td>
          <td>String</td>
         <td>The AWS secret key to use when setting credentials provider type to BASIC.</td>
     </tr>
     <tr>
          <td><h5>aws.credentials.profile.path</h5></td>
          <td>optional</td>
+      <td>no</td>
          <td style="word-wrap: break-word;">(none)</td>
          <td>String</td>
         <td>Optional configuration for profile path if credential provider type is set to be PROFILE.</td>
     </tr>
     <tr>
          <td><h5>aws.credentials.profile.name</h5></td>
          <td>optional</td>
+      <td>no</td>
          <td style="word-wrap: break-word;">(none)</td>
          <td>String</td>
         <td>Optional configuration for profile name if credential provider type is set to be PROFILE.</td>
     </tr>
     <tr>
          <td><h5>aws.credentials.role.arn</h5></td>
          <td>optional</td>
+      <td>no</td>
          <td style="word-wrap: break-word;">(none)</td>
          <td>String</td>
         <td>The role ARN to use when credential provider type is set to ASSUME_ROLE or WEB_IDENTITY_TOKEN.</td>
     </tr>
     <tr>
          <td><h5>aws.credentials.role.sessionName</h5></td>
          <td>optional</td>
+      <td>no</td>
          <td style="word-wrap: break-word;">(none)</td>
          <td>String</td>
         <td>The role session name to use when credential provider type is set to ASSUME_ROLE or WEB_IDENTITY_TOKEN.</td>
     </tr>
     <tr>
          <td><h5>aws.credentials.role.externalId</h5></td>
          <td>optional</td>
+      <td>no</td>

Review comment:
       invalid formatting at various locations in this file

##########
File path: flink-formats/flink-avro-confluent-registry/src/main/java/org/apache/flink/formats/avro/registry/confluent/RegistryAvroFormatFactory.java
##########
@@ -175,6 +175,14 @@ public String factoryIdentifier() {
         return options;
     }
 
+    @Override
+    public Set<ConfigOption<?>> forwardOptions() {
+        Set<ConfigOption<?>> options = new HashSet<>();
+        options.addAll(requiredOptions());

Review comment:
      this looks dangerous to me if somebody adds an option above. wouldn't it be better to list them explicitly?
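
      The risk can be made concrete with a small self-contained sketch (plain-Java stand-ins rather than Flink's real `ConfigOption`/factory API; the avro-confluent option keys are used only for flavor). Coupling `forwardOptions()` to `requiredOptions()` means any option added to the required set later is silently forwarded, whereas an explicit enumeration stays fixed:

      ```java
      import java.util.LinkedHashSet;
      import java.util.Set;

      class ForwardOptionsSketch {
          // Stand-in for Flink's ConfigOption<?>; a bare key is enough for the pattern.
          record Option(String key) {}

          static final Option URL = new Option("avro-confluent.url");         // required
          static final Option SUBJECT = new Option("avro-confluent.subject"); // optional

          static Set<Option> requiredOptions() {
              Set<Option> s = new LinkedHashSet<>();
              s.add(URL);
              return s;
          }

          // Coupled variant (as in the PR): whatever later lands in
          // requiredOptions() gets forwarded too, silently.
          static Set<Option> forwardOptionsViaRequired() {
              Set<Option> options = new LinkedHashSet<>(requiredOptions());
              options.add(SUBJECT);
              return options;
          }

          // Explicit variant (what the reviewer suggests): the forwarded set is
          // spelled out and unaffected by future additions to requiredOptions().
          static Set<Option> forwardOptionsExplicit() {
              Set<Option> s = new LinkedHashSet<>();
              s.add(URL);
              s.add(SUBJECT);
              return s;
          }
      }
      ```

      With the current option set the two variants agree; the explicit one simply cannot drift when `requiredOptions()` grows.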

##########
File path: flink-connectors/flink-connector-files/src/main/java/org/apache/flink/connector/file/table/FileSystemTableSink.java
##########
@@ -117,20 +118,22 @@
     @Nullable private Integer configuredParallelism;
 
     FileSystemTableSink(
-            DynamicTableFactory.Context context,
+            ObjectIdentifier tableIdentifier,
+            ResolvedSchema schema,
+            List<String> partitionKeys,
+            ReadableConfig config,

Review comment:
       `tableOptions`



