Araika Singh created HIVE-28703:
-----------------------------------

             Summary: Integral Data Type Overflow Not Enforced for JsonSerDe and UDFs
                 Key: HIVE-28703
                 URL: https://issues.apache.org/jira/browse/HIVE-28703
             Project: Hive
          Issue Type: Bug
            Reporter: Araika Singh
            Assignee: Araika Singh
When executing the following Hive queries with JsonSerDe to create and populate a table, data exceeding the respective type limits (TINYINT, SMALLINT, INT, BIGINT) does not raise an error. Instead, the values are silently coerced back into the defined limits, bypassing the SQL standard requirement of strict data type enforcement.

*Steps to Reproduce:*
{code:sql}
-- sample json: {"tiny_value": 128, "small_value": 32768, "int_value": 2147483648, "big_value": 9223372036854775808}

DROP TABLE IF EXISTS json_serde1_1;

CREATE TABLE json_serde1_1 (
  tiny_value TINYINT,
  small_value SMALLINT,
  int_value INT,
  big_value BIGINT
)
ROW FORMAT SERDE 'org.apache.hadoop.hive.serde2.JsonSerDe';

INSERT INTO TABLE json_serde1_1 VALUES (128, 32768, 2147483648, 9223372036854775808);

LOAD DATA LOCAL INPATH '../../data/files/sampleJson.json' INTO TABLE json_serde1_1;

SELECT * FROM json_serde1_1;

DROP TABLE json_serde1_1;
{code}

*Proposed Resolution:*
Enforce stricter data type validation during data insertion for {{JsonSerDe}} and UDFs, ensuring compliance with the defined column data types and raising errors for out-of-bound values.

--
This message was sent by Atlassian Jira
(v8.20.10#820010)
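A likely mechanism for the silent coercion (an assumption here, not confirmed against the Hive source): the JSON number is parsed into a wider Java type and then narrowed to the column's primitive type with a cast, which truncates the high-order bits and wraps the value instead of failing. A minimal Java sketch of that narrowing behavior:

{code:java}
// Minimal sketch (not Hive code): Java narrowing primitive conversions
// silently wrap out-of-range values, which would explain why 128 in a
// TINYINT column "remains within the defined limits" instead of erroring.
public class NarrowingDemo {
    public static void main(String[] args) {
        long parsed = 128L;               // value as read from JSON
        byte tiny = (byte) parsed;        // TINYINT column: wraps to -128
        short small = (short) 32768L;     // SMALLINT column: wraps to -32768
        int normal = (int) 2147483648L;   // INT column: wraps to -2147483648
        System.out.println(tiny + " " + small + " " + normal);
        // prints: -128 -32768 -2147483648
    }
}
{code}

A range check before the cast (e.g. rejecting values outside Byte.MIN_VALUE..Byte.MAX_VALUE for TINYINT) would turn these silent wraps into the errors the proposed resolution asks for.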