[
https://issues.apache.org/jira/browse/FLINK-7451?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16253456#comment-16253456
]
ASF GitHub Bot commented on FLINK-7451:
---------------------------------------
Github user twalthr commented on a diff in the pull request:
https://github.com/apache/flink/pull/4544#discussion_r151128673
--- Diff:
flink-libraries/flink-table/src/main/scala/org/apache/flink/table/calcite/FlinkTypeFactory.scala
---
@@ -269,6 +272,10 @@ class FlinkTypeFactory(typeSystem: RelDataTypeSystem)
extends JavaTypeFactoryImpl
canonize(newType)
}
+
+ override def getDefaultCharset: Charset = {
+ Charset.forName(ConversionUtil.NATIVE_UTF16_CHARSET_NAME)
+ }
--- End diff ---
Calcite only supports UTF-16, I guess because this is the native encoding of
Java Strings anyway.
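A quick way to see why UTF-16 is the natural default: a Java/Scala `String` already stores its text as UTF-16 code units (a minimal standalone sketch, not Flink or Calcite code):

```scala
object Utf16Demo extends App {
  // A Java/Scala String stores text as a sequence of UTF-16 code units.
  val s = "测试"
  // Two BMP characters -> two UTF-16 code units.
  println(s.length) // 2
  // '测' is U+6D4B; charAt exposes the raw UTF-16 unit.
  println(Integer.toHexString(s.charAt(0))) // 6d4b
}
```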
> Query fails when non-ascii characters are used in string literals
> -----------------------------------------------------------------
>
> Key: FLINK-7451
> URL: https://issues.apache.org/jira/browse/FLINK-7451
> Project: Flink
> Issue Type: Bug
> Components: Table API & SQL
> Reporter: Jark Wu
> Assignee: Jark Wu
>
> I found that using non-ascii characters in string literals causes calcite
> planner to throw the following exception:
> {code}
> org.apache.calcite.runtime.CalciteException: Failed to encode '%测试%' in
> character set 'ISO-8859-1'
> {code}
> The query is
> {code}
> SELECT * FROM T WHERE f0 LIKE '%测试%'
> {code}
> The reason for the issue is that Calcite uses the Latin-1 character set
> ('ISO-8859-1') by default. In order to support non-Latin characters we
> should use a Unicode character set as the default.
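The encoding failure can be reproduced with the JDK's charset API alone, independent of Calcite (a minimal sketch; `CharsetEncoder.canEncode` is plain `java.nio`, not a Calcite or Flink API):

```scala
import java.nio.charset.Charset

object EncodeCheck extends App {
  val literal = "%测试%"
  // Latin-1 has no mapping for CJK characters, so encoding fails.
  println(Charset.forName("ISO-8859-1").newEncoder().canEncode(literal)) // false
  // UTF-16 covers all of Unicode, so the literal is representable.
  println(Charset.forName("UTF-16").newEncoder().canEncode(literal)) // true
}
```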
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)