During a migration from Hive to Spark, a problem arose when a view created in Hive was queried from Spark SQL. The original Hive SQL is shown below:
CREATE VIEW myView AS
SELECT CASE WHEN age > 12 THEN CAST(gender * 0.3 - 0.1 AS double) END AS TT,
       gender, age
FROM myTable;

When users query this view from Spark SQL, they hit an up-cast error. The error message is as follows:

Cannot up cast TT from decimal(13, 1) to double.
The type path of the target object is:
You can either add an explicit cast to the input data or choose a higher precision type of the field in the target object

How should we solve this problem?
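The likely cause: Hive stores the view's expanded text, and when Spark re-resolves it, Spark's own decimal-precision rules type `gender * 0.3 - 0.1` as a decimal (with an integer `gender`, the multiply yields decimal(12,1) and the subtract widens to decimal(13,1)), which conflicts with the double recorded in the view schema, and the analyzer refuses the implicit up-cast. A sketch of two workarounds, assuming the table and column names from the post; neither is guaranteed to be the only fix:

```sql
-- Option 1: no view change; add an explicit cast at query time in Spark SQL.
SELECT CAST(TT AS double) AS TT, gender, age
FROM myView;

-- Option 2: recreate the view so the cast is the outermost expression,
-- making the column's resolved type double regardless of the engine's
-- decimal arithmetic rules.
CREATE OR REPLACE VIEW myView AS
SELECT CAST(CASE WHEN age > 12 THEN gender * 0.3 - 0.1 END AS double) AS TT,
       gender, age
FROM myTable;
```

Option 2 is the more durable fix, since the cast then applies to whatever type the arithmetic resolves to, rather than being buried inside the CASE branch where Spark's re-analysis can disagree with Hive's.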