anantdamle commented on a change in pull request #14858:
URL: https://github.com/apache/beam/pull/14858#discussion_r645623911
##########
File path:
sdks/java/core/src/main/java/org/apache/beam/sdk/schemas/utils/AvroUtils.java
##########
@@ -1277,4 +1308,17 @@ private static void checkTypeName(Schema.TypeName got,
       Schema.TypeName expected, String label) {
     checkArgument(
         got.equals(expected), "Can't convert '%s' to %s, expected: %s", label, got, expected);
   }
+
+ /** Helper factory to build JDBC Logical types for AVRO Schema. */
+ private static org.apache.avro.Schema makeJdbcLogicalStringAvroType(
+ Schema.LogicalType<?, ?> logicalType) {
+ JDBCType jdbcType = JDBCType.valueOf(logicalType.getIdentifier());
+ Integer size = logicalType.getArgument();
+
+ String schemaJson =
Review comment:
No worries @iemejia. My concern with using lowercase identifiers is that Hive
only provides `varchar` — how would we then deal with the others, like
`longvarchar` etc.? If I used lowercase, converting back to `JDBCType` would
become hard. Do you suggest converting all the string-based logical types to
just `varchar` with an appropriate maxLength? The approach in the JdbcIO
schema is to represent them with the uppercase JDBC logical type name.
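
For context, the round-trip concern can be seen directly with `java.sql.JDBCType`:
the enum lookup used in the patch (`JDBCType.valueOf(...)`) is case-sensitive, so
a lowercase identifier cannot be mapped back without extra normalization. A
minimal sketch (the class name `JdbcTypeCaseDemo` is made up for illustration):

```java
import java.sql.JDBCType;

public class JdbcTypeCaseDemo {
    public static void main(String[] args) {
        // Uppercase identifiers round-trip cleanly through the enum.
        System.out.println(JDBCType.valueOf("LONGVARCHAR")); // prints LONGVARCHAR

        // Enum lookup is case-sensitive, so a lowercase identifier
        // (as a Hive-style `varchar` would be) fails to map back directly.
        try {
            JDBCType.valueOf("longvarchar");
        } catch (IllegalArgumentException e) {
            System.out.println("lowercase lookup fails");
        }
    }
}
```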
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
For queries about this service, please contact Infrastructure at:
[email protected]