[jira] [Created] (HIVE-25455) Include root cause of error in CopyUtils exception handling usecase.
Haymant Mangla created HIVE-25455:
Summary: Include root cause of error in CopyUtils exception handling usecase.
Key: HIVE-25455
URL: https://issues.apache.org/jira/browse/HIVE-25455
Project: Hive
Issue Type: Bug
Reporter: Haymant Mangla
Assignee: Haymant Mangla

--
This message was sent by Atlassian Jira
(v8.3.4#803005)
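The issue asks that the underlying error be carried along when CopyUtils re-throws. As a language-neutral sketch of the pattern (the names `copy_file` and `ReplicationError` are hypothetical, not Hive's actual CopyUtils API), Python's `raise ... from` chains the low-level failure as the root cause so it survives into the reported traceback:

```python
# Minimal sketch of preserving a root cause when re-raising an error.
# copy_file and ReplicationError are illustrative names only.

class ReplicationError(Exception):
    """High-level error surfaced to the caller."""

def copy_file(src: str, dst: str) -> None:
    try:
        raise OSError(f"cannot read {src}")  # stand-in for a real I/O failure
    except OSError as e:
        # 'from e' attaches the low-level failure as __cause__, so the
        # root cause appears in the traceback instead of being swallowed.
        raise ReplicationError(f"copy {src} -> {dst} failed") from e

try:
    copy_file("/tmp/a", "/tmp/b")
except ReplicationError as err:
    print(type(err.__cause__).__name__)  # OSError
```

Without the `from e`, only the high-level message would remain and the original I/O error would be lost.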
[jira] [Created] (HIVE-25454) Invalid request to metadata catalog for a WITH clause table
Soumyakanti Das created HIVE-25454:
Summary: Invalid request to metadata catalog for a WITH clause table
Key: HIVE-25454
URL: https://issues.apache.org/jira/browse/HIVE-25454
Project: Hive
Issue Type: Bug
Reporter: Soumyakanti Das
Assignee: Soumyakanti Das

For CTEs, there are many calls to get_table_req(), each of which throws a NoSuchObjectException and returns null.
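The implied fix is to resolve WITH-clause names locally before issuing a metastore request, so that a CTE alias never triggers get_table_req() at all. A minimal sketch of that ordering (the resolver and the simulated metastore below are hypothetical, not Hive's planner code):

```python
# Hypothetical name resolver: consult the query's CTE registry first and
# only fall back to a (simulated) metastore lookup for real tables.
metastore_calls = []

def metastore_get_table(name: str):
    metastore_calls.append(name)                 # record the remote request
    raise LookupError(f"NoSuchObject: {name}")   # table does not exist

def resolve_table(name: str, ctes: dict):
    if name in ctes:                   # WITH-clause alias: resolve locally
        return ctes[name]
    return metastore_get_table(name)   # real table: ask the metastore

ctes = {"q1": "<subquery plan for q1>"}
resolve_table("q1", ctes)   # resolved from the registry
print(metastore_calls)      # [] -- no catalog request was sent for 'q1'
```

Checking the CTE registry first means the invalid catalog requests, and the exceptions they raise, never happen for aliases.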
[jira] [Created] (HIVE-25453) Add LLAP IO support for Iceberg ORC tables
Ádám Szita created HIVE-25453:
Summary: Add LLAP IO support for Iceberg ORC tables
Key: HIVE-25453
URL: https://issues.apache.org/jira/browse/HIVE-25453
Project: Hive
Issue Type: New Feature
Reporter: Ádám Szita
Assignee: Ádám Szita
[jira] [Created] (HIVE-25452) CLONE - Hive job fails while closing reducer output - Unable to rename
Nikhil Gupta created HIVE-25452:
Summary: CLONE - Hive job fails while closing reducer output - Unable to rename
Key: HIVE-25452
URL: https://issues.apache.org/jira/browse/HIVE-25452
Project: Hive
Issue Type: Bug
Components: Query Processor
Affects Versions: 0.11.0, 0.12.0, 0.13.0, 0.13.1, 2.3.0
Environment: OS: 2.6.18-194.el5xen #1 SMP Fri Apr 2 15:34:40 EDT 2010 x86_64 x86_64 x86_64 GNU/Linux
Hadoop 1.1.2
Reporter: Nikhil Gupta
Assignee: Oleksiy Sayankin
Attachments: HIVE-4605.2.patch, HIVE-4605.3.patch, HIVE-4605.patch

1. Create a table with the ORC storage format:
{code}
create table iparea_analysis_orc (network int, ip string, ...) stored as ORC;
{code}
2. Run "insert table iparea_analysis_orc select network, ip, ...". The script succeeds, but fails after adding the *OVERWRITE* keyword. The main error log:
{code}
java.lang.RuntimeException: Hive Runtime Error while closing operators: Unable to rename output from: hdfs://qa3hop001.uucun.com:9000/tmp/hive-hadoop/hive_2013-05-24_15-11-06_511_7746839019590922068/_task_tmp.-ext-1/_tmp.00_0 to: hdfs://qa3hop001.uucun.com:9000/tmp/hive-hadoop/hive_2013-05-24_15-11-06_511_7746839019590922068/_tmp.-ext-1/00_0
	at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:317)
	at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:530)
	at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:421)
	at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:396)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1149)
	at org.apache.hadoop.mapred.Child.main(Child.java:249)
Caused by: org.apache.hadoop.hive.ql.metadata.HiveException: Unable to rename output from: hdfs://qa3hop001.uucun.com:9000/tmp/hive-hadoop/hive_2013-05-24_15-11-06_511_7746839019590922068/_task_tmp.-ext-1/_tmp.00_0 to: hdfs://qa3hop001.uucun.com:9000/tmp/hive-hadoop/hive_2013-05-24_15-11-06_511_7746839019590922068/_tmp.-ext-1/00_0
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator$FSPaths.commit(FileSinkOperator.java:197)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator$FSPaths.access$300(FileSinkOperator.java:108)
	at org.apache.hadoop.hive.ql.exec.FileSinkOperator.closeOp(FileSinkOperator.java:867)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:588)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:597)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:597)
	at org.apache.hadoop.hive.ql.exec.Operator.close(Operator.java:597)
	at org.apache.hadoop.hive.ql.exec.ExecReducer.close(ExecReducer.java:309)
	... 7 more
{code}
[jira] [Created] (HIVE-25451) from_utc_timestamp uses wrong timezone offset for dates prior to 1895-12-01
Anurag Shekhar created HIVE-25451:
Summary: from_utc_timestamp uses wrong timezone offset for dates prior to 1895-12-01
Key: HIVE-25451
URL: https://issues.apache.org/jira/browse/HIVE-25451
Project: Hive
Issue Type: Bug
Components: UDF
Environment: From beeline:
{code}
select from_utc_timestamp('1895-12-01', 'Australia/Perth');
+------------------------+
|          _c0           |
+------------------------+
| 1895-12-01 08:00:00.0  |
+------------------------+
{code}
This query uses the correct offset of 8 hours. The same query with the date 1895-11-30:
{code}
select from_utc_timestamp('1895-11-30', 'Australia/Perth');
+------------------------+
|          _c0           |
+------------------------+
| 1895-11-30 07:43:24.0  |
+------------------------+
{code}
Now the offset is 7:43:24.
Reporter: Anurag Shekhar
[jira] [Created] (HIVE-25450) Delta metrics keys should contain database name
Karen Coppage created HIVE-25450:
Summary: Delta metrics keys should contain database name
Key: HIVE-25450
URL: https://issues.apache.org/jira/browse/HIVE-25450
Project: Hive
Issue Type: Sub-task
Reporter: Karen Coppage

Currently, metrics about the number of deltas in a given partition or unpartitioned table include the table name and the partition name (if applicable), but they should also include the database name, since two tables in different databases can share the same name.
[jira] [Created] (HIVE-25449) datediff() gives wrong output when we set tez.task.launch.cmd-opts to some non-UTC timezone
Shubham Chaurasia created HIVE-25449:
Summary: datediff() gives wrong output when we set tez.task.launch.cmd-opts to some non-UTC timezone
Key: HIVE-25449
URL: https://issues.apache.org/jira/browse/HIVE-25449
Project: Hive
Issue Type: Bug
Components: UDF
Reporter: Shubham Chaurasia
Assignee: Shubham Chaurasia

Repro (thanks Qiaosong Dong):
{code}
create external table test_dt(id string, dt date);
insert into test_dt values ('11', '2021-07-06'), ('22', '2021-07-07');
select datediff(dt1.dt, '2021-07-01') from test_dt dt1 left join test_dt dt on dt1.id = dt.id;
+------+
| _c0  |
+------+
| 6    |
| 7    |
+------+
{code}
Expected output:
{code}
+------+
| _c0  |
+------+
| 5    |
| 6    |
+------+
{code}
*Cause*
This happens in the {{VectorUDFDateDiffColScalar}} class because:
1. For the second (scalar) argument, {{java.text.SimpleDateFormat}} is used to parse the date string, which interprets it in the local timezone.
2. For the first argument, we get a column vector which represents the date as an epoch day, which is always in UTC.
*Solution*
We need to check the other variants of the datediff UDFs as well and change the parsing mechanism to always interpret date strings in UTC. A quick change in {{VectorUDFDateDiffColScalar}} which fixes the issue:
{code}
-      date.setTime(formatter.parse(new String(bytesValue, "UTF-8")).getTime());
-      baseDate = DateWritableV2.dateToDays(date);
+      org.apache.hadoop.hive.common.type.Date hiveDate
+          = org.apache.hadoop.hive.common.type.Date.valueOf(new String(bytesValue, "UTF-8"));
+      date.setTime(hiveDate.toEpochMilli());
+      baseDate = hiveDate.toEpochDay();
{code}
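The skew can be reproduced outside Hive. A minimal Python sketch (assuming system tzdata is available; Asia/Kolkata stands in for any zone ahead of UTC): parsing the date string in the local zone and flooring the epoch seconds to days, as the SimpleDateFormat path effectively does, lands on a different epoch day than the UTC interpretation used by the column vector:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo  # requires Python 3.9+ and system tzdata

DAY = 86400
s = "2021-07-06"

# UTC interpretation: midnight UTC -- the epoch day the column vector stores.
utc_day = int(datetime.strptime(s, "%Y-%m-%d")
              .replace(tzinfo=timezone.utc).timestamp()) // DAY

# Local interpretation (SimpleDateFormat-style): midnight in a zone ahead
# of UTC falls on the *previous* UTC day, so the floored epoch day is one less.
local_day = int(datetime.strptime(s, "%Y-%m-%d")
                .replace(tzinfo=ZoneInfo("Asia/Kolkata")).timestamp()) // DAY

print(utc_day - local_day)  # 1 -- the one-day skew behind the wrong datediff
```

With the scalar base date parsed one epoch day too early and the column values in UTC, every difference comes out one too large, matching the 6/7 vs 5/6 output above.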