clintropolis commented on a change in pull request #11362:
URL: https://github.com/apache/druid/pull/11362#discussion_r656054607



##########
File path: docs/development/extensions-core/avro.md
##########
@@ -26,7 +26,7 @@ title: "Apache Avro"
 
 This Apache Druid extension enables Druid to ingest and understand the Apache Avro data format. This extension provides
 two Avro Parsers for stream ingestion and Hadoop batch ingestion.
-See [Avro Hadoop Parser](../../ingestion/data-formats.md#avro-hadoop-parser) and [Avro Stream Parser](../../ingestion/data-formats.md#avro-stream-parser)
+See [Avro Hadoop Parser](../../ingestion/data-formats.md#avro-hadoop-parser) and [Avro Stream Parser](../../ingestion/c.md#avro-stream-parser)

Review comment:
       accidental change?

##########
File path: docs/ingestion/data-formats.md
##########
@@ -380,8 +380,8 @@ For details, see the Schema Registry [documentation](http://docs.confluent.io/cu
 | url | String | Specifies the url endpoint of the Schema Registry. | yes |
 | capacity | Integer | Specifies the max size of the cache (default = Integer.MAX_VALUE). | no |
 | urls | Array<String> | Specifies the url endpoints of the multiple Schema Registry instances. | yes(if `url` is not provided) |
-| config | Json | To send additional configurations, configured for Schema Registry | no |
-| headers | Json | To send headers to the Schema Registry | no |
+| config | Json | To send additional configurations, configured for Schema Registry. User can implement a `DynamicConfigProvider` to supply some properties at runtime, by adding `"druid.dynamic.config.provider"`:`{"type": "<registered_dynamic_config_provider_name>", ...}` in json. | no |

Review comment:
       I suggest linking to `operations/dynamic-config-provider.md` instead of providing an example inline, maybe something like
   ```suggestion
   | config | Json | To send additional configurations, configured for Schema Registry. This can be supplied via a [DynamicConfigProvider](../../operations/dynamic-config-provider.md). | no |
   ```
   or something similar (and for the other sections that have been updated)

##########
File path: extensions-core/avro-extensions/src/main/java/org/apache/druid/data/input/avro/SchemaRegistryBasedAvroBytesDecoder.java
##########
@@ -48,27 +52,30 @@
   private final String url;
   private final int capacity;
   private final List<String> urls;
-  private final Map<String, ?> config;
-  private final Map<String, String> headers;
+  private final Map<String, Object> config;
+  private final Map<String, Object> headers;
+  private final ObjectMapper mapper;
+  public static final String DRUID_DYNAMIC_CONFIG_PROVIDER_KEY = "druid.dynamic.config.provider";
 
   @JsonCreator
   public SchemaRegistryBasedAvroBytesDecoder(
       @JsonProperty("url") @Deprecated String url,
       @JsonProperty("capacity") Integer capacity,
       @JsonProperty("urls") @Nullable List<String> urls,
-      @JsonProperty("config") @Nullable Map<String, ?> config,
-      @JsonProperty("headers") @Nullable Map<String, String> headers
+      @JsonProperty("config") @Nullable Map<String, Object> config,
+      @JsonProperty("headers") @Nullable Map<String, Object> headers
   )
   {
     this.url = url;
     this.capacity = capacity == null ? Integer.MAX_VALUE : capacity;
     this.urls = urls;
     this.config = config;
     this.headers = headers;
+    this.mapper = new ObjectMapper();

Review comment:
       I think instead of creating a new `ObjectMapper`, this should be injected via `@JacksonInject` as a constructor argument.
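   A minimal sketch of what that injection could look like (the `Decoder` class and field names here are illustrative stand-ins, not the PR's actual code):
   ```java
   import com.fasterxml.jackson.annotation.JacksonInject;
   import com.fasterxml.jackson.annotation.JsonCreator;
   import com.fasterxml.jackson.annotation.JsonProperty;
   import com.fasterxml.jackson.databind.InjectableValues;
   import com.fasterxml.jackson.databind.ObjectMapper;

   public class JacksonInjectSketch
   {
     // Illustrative stand-in for the decoder class under review.
     public static class Decoder
     {
       private final String url;
       private final ObjectMapper mapper;

       @JsonCreator
       public Decoder(
           @JsonProperty("url") String url,
           @JacksonInject ObjectMapper mapper  // injected, not newly constructed
       )
       {
         this.url = url;
         this.mapper = mapper;
       }
     }

     // Deserializes a Decoder and checks the injected mapper is the shared instance.
     public static boolean injectedMapperIsShared() throws Exception
     {
       ObjectMapper outer = new ObjectMapper();
       // Register the mapper itself as the injectable value for ObjectMapper-typed arguments.
       outer.setInjectableValues(new InjectableValues.Std().addValue(ObjectMapper.class, outer));
       Decoder decoder = outer.readValue("{\"url\":\"http://localhost:8081\"}", Decoder.class);
       return decoder.mapper == outer && "http://localhost:8081".equals(decoder.url);
     }

     public static void main(String[] args) throws Exception
     {
       System.out.println(injectedMapperIsShared());
     }
   }
   ```
   In Druid the injectable values would come from the process's Guice-configured Jackson setup rather than `setInjectableValues`, so the decoder reuses the process-wide mapper instead of paying for a fresh one per instance.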

##########
File path: extensions-core/protobuf-extensions/src/main/java/org/apache/druid/data/input/protobuf/SchemaRegistryBasedProtobufBytesDecoder.java
##########
@@ -50,27 +53,30 @@
   private final String url;
   private final int capacity;
   private final List<String> urls;
-  private final Map<String, ?> config;
-  private final Map<String, String> headers;
+  private final Map<String, Object> config;
+  private final Map<String, Object> headers;
+  private final ObjectMapper mapper;
+  public static final String DRUID_DYNAMIC_CONFIG_PROVIDER_KEY = "druid.dynamic.config.provider";
 
   @JsonCreator
   public SchemaRegistryBasedProtobufBytesDecoder(
       @JsonProperty("url") @Deprecated String url,
       @JsonProperty("capacity") Integer capacity,
       @JsonProperty("urls") @Nullable List<String> urls,
-      @JsonProperty("config") @Nullable Map<String, ?> config,
-      @JsonProperty("headers") @Nullable Map<String, String> headers
+      @JsonProperty("config") @Nullable Map<String, Object> config,
+      @JsonProperty("headers") @Nullable Map<String, Object> headers
   )
   {
     this.url = url;
     this.capacity = capacity == null ? Integer.MAX_VALUE : capacity;
     this.urls = urls;
     this.config = config;
     this.headers = headers;
+    this.mapper = new ObjectMapper();

Review comment:
       Same comment as above about injecting instead of creating a new `ObjectMapper`.

##########
File path: 
extensions-core/avro-extensions/src/main/java/org/apache/druid/data/input/avro/SchemaRegistryBasedAvroBytesDecoder.java
##########
@@ -91,17 +98,62 @@ public int getCapacity()
   }
 
   @JsonProperty
-  public Map<String, ?> getConfig()
+  public Map<String, Object> getConfig()
   {
     return config;
   }
 
   @JsonProperty
-  public Map<String, String> getHeaders()
+  public Map<String, Object> getHeaders()
   {
     return headers;
   }
 
+  protected Map<String, String> createRegistryHeader()

Review comment:
       These methods all look basically the same for both avro and protobuf (and kafka).
   
   I suggest making a static method or two, perhaps in a new class, maybe `DynamicConfigProviderUtils` or something similar in `druid-core`, that takes a `Map<String, Object>` and performs this logic.
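   
   One hypothetical shape for such a helper (the class name, method name, and the simplified provider interface below are all placeholders; the real `DynamicConfigProvider` lives in druid-core and would be resolved from the map entry via the injected `ObjectMapper`):
   ```java
   import java.util.HashMap;
   import java.util.Map;

   // Hypothetical sketch of the shared helper suggested above; names are placeholders.
   public class DynamicConfigProviderUtils
   {
     public static final String DYNAMIC_CONFIG_PROVIDER_KEY = "druid.dynamic.config.provider";

     // Simplified stand-in for druid-core's DynamicConfigProvider interface.
     public interface DynamicConfigProvider
     {
       Map<String, String> getConfig();
     }

     // Copies all plain entries into a String map, dropping the provider entry itself,
     // then merges in the properties resolved by the provider (if one was configured).
     public static Map<String, String> extraConfigAndSetStringMap(
         Map<String, Object> config,
         DynamicConfigProvider provider  // assumed already deserialized from the provider entry
     )
     {
       Map<String, String> result = new HashMap<>();
       if (config != null) {
         for (Map.Entry<String, Object> entry : config.entrySet()) {
           if (!DYNAMIC_CONFIG_PROVIDER_KEY.equals(entry.getKey())) {
             result.put(entry.getKey(), String.valueOf(entry.getValue()));
           }
         }
       }
       if (provider != null) {
         result.putAll(provider.getConfig());
       }
       return result;
     }
   }
   ```
   With something like this, the avro, protobuf, and kafka decoders could each call the one utility instead of carrying three near-identical copies of the merge logic.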




-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

For queries about this service, please contact Infrastructure at:
[email protected]



---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]
