chia7712 commented on code in PR #19449:
URL: https://github.com/apache/kafka/pull/19449#discussion_r2253560947


##########
connect/json/src/main/java/org/apache/kafka/connect/json/JsonConverter.java:
##########
@@ -340,13 +351,16 @@ public SchemaAndValue toConnectData(String topic, byte[] value) {
             throw new DataException("Converting byte[] to Kafka Connect data failed due to serialization error: ", e);
         }
 
-        if (config.schemasEnabled() && (!jsonValue.isObject() || jsonValue.size() != 2 || !jsonValue.has(JsonSchema.ENVELOPE_SCHEMA_FIELD_NAME) || !jsonValue.has(JsonSchema.ENVELOPE_PAYLOAD_FIELD_NAME)))
-            throw new DataException("JsonConverter with schemas.enable requires \"schema\" and \"payload\" fields and may not contain additional fields." +
+        if (config.schemasEnabled()) {
+            if (schema != null) {

Review Comment:
   Users need to set `schemas.enable=false` in the source connector and then set `schemas.enable=true` along with `schema.content=xxx` in the sink connector to avoid embedding the schema in every message. Is this understanding correct? If so, the `schemas.enable` setting might be confusing, as the same flag has completely different meanings for the source and sink connectors.
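   If that understanding is right, the intended split would look something like the sketch below (the connector property names follow the usual `value.converter.` prefix convention; the schema literal is illustrative, not taken from the PR):

   ```properties
   # Source side: write bare JSON payloads, no embedded schema envelope
   value.converter=org.apache.kafka.connect.json.JsonConverter
   value.converter.schemas.enable=false

   # Sink side: still deserialize with a schema, supplied out-of-band
   # via the new schema.content config proposed in this PR
   value.converter=org.apache.kafka.connect.json.JsonConverter
   value.converter.schemas.enable=true
   value.converter.schema.content={"type":"struct","fields":[{"field":"id","type":"int64"}]}
   ```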



##########
connect/json/src/main/java/org/apache/kafka/connect/json/JsonConverter.java:
##########
@@ -340,13 +351,16 @@ public SchemaAndValue toConnectData(String topic, byte[] value) {
             throw new DataException("Converting byte[] to Kafka Connect data failed due to serialization error: ", e);
         }
 
-        if (config.schemasEnabled() && (!jsonValue.isObject() || jsonValue.size() != 2 || !jsonValue.has(JsonSchema.ENVELOPE_SCHEMA_FIELD_NAME) || !jsonValue.has(JsonSchema.ENVELOPE_PAYLOAD_FIELD_NAME)))
-            throw new DataException("JsonConverter with schemas.enable requires \"schema\" and \"payload\" fields and may not contain additional fields." +
+        if (config.schemasEnabled()) {
+            if (schema != null) {
+                return new SchemaAndValue(schema, convertToConnect(schema, jsonValue, config));

Review Comment:
   What happens if the message itself also contains the `schema` field?
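   For context, with `schemas.enable=true` and no `schema.content` configured, the converter today expects exactly the two-field envelope, roughly like this (field values illustrative):

   ```json
   {
     "schema": {"type": "struct", "fields": [{"field": "id", "type": "int64"}]},
     "payload": {"id": 42}
   }
   ```

   It is unclear from the diff whether such an envelope is honored, ignored, or rejected when `schema.content` is also set.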



##########
connect/json/src/main/java/org/apache/kafka/connect/json/JsonConverterConfig.java:
##########
@@ -35,6 +36,11 @@ public final class JsonConverterConfig extends ConverterConfig {
     private static final String SCHEMAS_ENABLE_DOC = "Include schemas within each of the serialized values and keys.";
     private static final String SCHEMAS_ENABLE_DISPLAY = "Enable Schemas";
 
+    public static final String SCHEMA_CONTENT_CONFIG = "schema.content";
+    public static final String SCHEMA_CONTENT_DEFAULT = null;
+    private static final String SCHEMA_CONTENT_DOC = "When set, this is used as the schema for all messages. Otherwise, the schema will be included in the content of each message.";

Review Comment:
   This configuration is used by the sink connector, not the source connector. Is that correct? If so, could you please state that in the documentation string?



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: jira-unsubscr...@kafka.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org
