Github user DaveBirdsall commented on a diff in the pull request:
https://github.com/apache/trafodion/pull/1544#discussion_r185953650
--- Diff: docs/sql_reference/src/asciidoc/_chapters/sql_functions_and_expressions.adoc ---
@@ -581,12 +581,12 @@ characters. See
<<character_value_expressions,Character Value Expressions>>.
[[considerations_for_ascii]]
=== Considerations For ASCII
-For a string expression in the UTF8 character set, if the value of the
+If the value of the
--- End diff ---
By "inserting into a <some character set name> column", I meant:
create table t ( x char(4) character set <some character set name>);
insert into t values ('<some string>');
select x from t;
You're doing something different. You are selecting a constant expression
from DUAL.
By default, the character set for your string constant is ISO88591. So, in
your examples, you are executing the ASCII function against an ISO88591 string.
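For concreteness, I'm assuming your examples looked roughly like this (a
sketch; the literal is just a stand-in for whatever values you actually used):

select ascii('ñ') from dual;  -- literal defaults to ISO88591, so ASCII gets an ISO88591 operand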
Now, 'ñ' and 'ÿ' are valid ISO88591 characters. The behavior you got was the
same as the behavior I got when I inserted these into an ISO88591 column. In your
last example, the compiler senses that the literal string is not ISO88591, so
it gives it a datatype of CHAR CHARACTER SET UTF8 (or possibly UCS2). And that
results in the 4106 error.
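If you want to pin down the literal's character set yourself instead of
letting the compiler infer it, the character-set prefix on the literal should
do it (a sketch; I'm writing the prefix spelling from memory):

select ascii(_ISO88591'ñ') from dual;  -- operand explicitly typed ISO88591
select ascii(_UCS2'ñ') from dual;      -- operand explicitly typed UCS2; I'd expect the same 4106 here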
So there is a very subtle difference in behavior here. For literals, the
compiler infers the datatype of the string from its contents. For column
references, it uses the declared datatype of the column.
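Side by side, the two cases look like this (a sketch; the table and values
are invented for illustration):

-- Literal: the compiler infers the datatype from the characters themselves.
select ascii('ñ') from dual;    -- inferred ISO88591; behaves like the ISO88591-column case

-- Column reference: the compiler uses the declared datatype of the column.
create table t2 ( x char(4) character set utf8 );
insert into t2 values ('a');
select ascii(x) from t2;        -- operand is UTF8 by declaration; I'd expect the 4106 here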
---