[
https://issues.apache.org/jira/browse/HIVE-27153?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Aman Raj resolved HIVE-27153.
-----------------------------
Resolution: Fixed
> Revert "HIVE-20182: Backport HIVE-20067 to branch-3"
> ----------------------------------------------------
>
> Key: HIVE-27153
> URL: https://issues.apache.org/jira/browse/HIVE-27153
> Project: Hive
> Issue Type: Sub-task
> Reporter: Aman Raj
> Assignee: Aman Raj
> Priority: Major
> Labels: pull-request-available
> Time Spent: 50m
> Remaining Estimate: 0h
>
> The mm_all.q test is failing because of this commit, which was not validated before being committed.
> There is no stack trace for this exception. Link to the failing run:
> [http://ci.hive.apache.org/blue/organizations/jenkins/hive-precommit/detail/PR-4126/2/tests]
>
> {code:java}
> java.lang.AssertionError: Client execution failed with error code = 1 running "insert into table part_mm_n0 partition(key_mm=455) select key from intermediate_n0" fname=mm_all.q See ./ql/target/tmp/log/hive.log or ./itests/qtest/target/tmp/log/hive.log, or check ./ql/target/surefire-reports or ./itests/qtest/target/surefire-reports/ for specific test cases logs.
>   at org.junit.Assert.fail(Assert.java:88)
>   at org.apache.hadoop.hive.ql.QTestUtil.failed(QTestUtil.java:2232)
>   at org.apache.hadoop.hive.cli.control.CoreCliDriver.runTest(CoreCliDriver.java:180)
>   at org.apache.hadoop.hive.cli.control.CliAdapter.runTest(CliAdapter.java:104)
>   at org.apache.hadoop.hive.cli.split1.TestMiniLlapCliDriver.testCliDriver(TestMiniLlapCliDriver.java:62)
>   at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>   at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>   at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>   at java.lang.reflect.Method.invoke(Method.java:498)
> {code}
>
> Found the actual error:
> {code:java}
> 2023-03-19T15:18:07,705 DEBUG [699603ee-f4a1-43b7-b160-7faf858ca4b4 main] converters.ArrayConverter: Converting 'java.net.URL[]' value '[Ljava.net.URL;@7535f28' to type 'java.net.URL[]'
> 2023-03-19T15:18:07,705 DEBUG [699603ee-f4a1-43b7-b160-7faf858ca4b4 main] converters.ArrayConverter: No conversion required, value is already a java.net.URL[]
> 2023-03-19T15:18:07,819 INFO [699603ee-f4a1-43b7-b160-7faf858ca4b4 main] beanutils.FluentPropertyBeanIntrospector: Error when creating PropertyDescriptor for public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)! Ignoring this property.
> 2023-03-19T15:18:07,819 DEBUG [699603ee-f4a1-43b7-b160-7faf858ca4b4 main] beanutils.FluentPropertyBeanIntrospector: Exception is:
> java.beans.IntrospectionException: bad write method arg count: public final void org.apache.commons.configuration2.AbstractConfiguration.setProperty(java.lang.String,java.lang.Object)
>   at java.beans.PropertyDescriptor.findPropertyType(PropertyDescriptor.java:657) ~[?:1.8.0_342]
>   at java.beans.PropertyDescriptor.setWriteMethod(PropertyDescriptor.java:327) ~[?:1.8.0_342]
>   at java.beans.PropertyDescriptor.<init>(PropertyDescriptor.java:139) ~[?:1.8.0_342]
>   at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.createFluentPropertyDescritor(FluentPropertyBeanIntrospector.java:178) ~[commons-beanutils-1.9.3.jar:1.9.3]
>   at org.apache.commons.beanutils.FluentPropertyBeanIntrospector.introspect(FluentPropertyBeanIntrospector.java:141) [commons-beanutils-1.9.3.jar:1.9.3]
>   at org.apache.commons.beanutils.PropertyUtilsBean.fetchIntrospectionData(PropertyUtilsBean.java:2245) [commons-beanutils-1.9.3.jar:1.9.3]
>   at org.apache.commons.beanutils.PropertyUtilsBean.getIntrospectionData(PropertyUtilsBean.java:2226) [commons-beanutils-1.9.3.jar:1.9.3]
>   at org.apache.commons.beanutils.PropertyUtilsBean.getPropertyDescriptor(PropertyUtilsBean.java:954) [commons-beanutils-1.9.3.jar:1.9.3]
>   at org.apache.commons.beanutils.PropertyUtilsBean.isWriteable(PropertyUtilsBean.java:1478) [commons-beanutils-1.9.3.jar:1.9.3]
>   at org.apache.commons.configuration2.beanutils.BeanHelper.isPropertyWriteable(BeanHelper.java:521) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.beanutils.BeanHelper.initProperty(BeanHelper.java:357) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.beanutils.BeanHelper.initBeanProperties(BeanHelper.java:273) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.beanutils.BeanHelper.initBean(BeanHelper.java:192) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.beanutils.BeanHelper$BeanCreationContextImpl.initBean(BeanHelper.java:669) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.initBeanInstance(DefaultBeanFactory.java:162) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.beanutils.DefaultBeanFactory.createBean(DefaultBeanFactory.java:116) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:459) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:479) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.beanutils.BeanHelper.createBean(BeanHelper.java:492) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResultInstance(BasicConfigurationBuilder.java:447) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.createResult(BasicConfigurationBuilder.java:417) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.commons.configuration2.builder.BasicConfigurationBuilder.getConfiguration(BasicConfigurationBuilder.java:285) [commons-configuration2-2.1.1.jar:2.1.1]
>   at org.apache.hadoop.metrics2.impl.MetricsConfig.loadFirst(MetricsConfig.java:119) [hadoop-common-3.1.0.jar:?]
>   at org.apache.hadoop.metrics2.impl.MetricsConfig.create(MetricsConfig.java:98) [hadoop-common-3.1.0.jar:?]
>   at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.configure(MetricsSystemImpl.java:478) [hadoop-common-3.1.0.jar:?]
>   at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.start(MetricsSystemImpl.java:188) [hadoop-common-3.1.0.jar:?]
>   at org.apache.hadoop.metrics2.impl.MetricsSystemImpl.init(MetricsSystemImpl.java:163) [hadoop-common-3.1.0.jar:?]
>   at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.init(DefaultMetricsSystem.java:62) [hadoop-common-3.1.0.jar:?]
>   at org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.initialize(DefaultMetricsSystem.java:58) [hadoop-common-3.1.0.jar:?]
>   at org.apache.hadoop.mapred.LocalJobRunnerMetrics.create(LocalJobRunnerMetrics.java:45) [hadoop-mapreduce-client-common-3.1.0.jar:?]
>   at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:771) [hadoop-mapreduce-client-common-3.1.0.jar:?]
>   at org.apache.hadoop.mapred.LocalJobRunner.<init>(LocalJobRunner.java:764) [hadoop-mapreduce-client-common-3.1.0.jar:?]
>   at org.apache.hadoop.mapred.LocalClientProtocolProvider.create(LocalClientProtocolProvider.java:42) [hadoop-mapreduce-client-common-3.1.0.jar:?]
>   at org.apache.hadoop.mapreduce.Cluster.initialize(Cluster.java:130) [hadoop-mapreduce-client-core-3.1.0.jar:?]
>   at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:109) [hadoop-mapreduce-client-core-3.1.0.jar:?]
>   at org.apache.hadoop.mapreduce.Cluster.<init>(Cluster.java:102) [hadoop-mapreduce-client-core-3.1.0.jar:?]
>   at org.apache.hadoop.mapred.JobClient.init(JobClient.java:475) [hadoop-mapreduce-client-core-3.1.0.jar:?]
>   at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:454) [hadoop-mapreduce-client-core-3.1.0.jar:?]
>   at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.execute(ExecDriver.java:381) [classes/:?]
>   at org.apache.hadoop.hive.ql.exec.mr.MapRedTask.execute(MapRedTask.java:149) [classes/:?]
>   at org.apache.hadoop.hive.ql.exec.Task.executeTask(Task.java:210) [classes/:?]
>   at org.apache.hadoop.hive.ql.exec.TaskRunner.runSequential(TaskRunner.java:97) [classes/:?]
>   at org.apache.hadoop.hive.ql.Driver.launchTask(Driver.java:2692) [classes/:?]
>   at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2363) [classes/:?]
>   at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2039) [classes/:?]
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1737) [classes/:?]
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1731) [classes/:?]
>   at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157) [classes/:?]
>   at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218) [classes/:?]
>   at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239) [classes/:?]
>   at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188) [classes/:?]
>   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402) [classes/:?]
>   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:335) [classes/:?]
>   at org.apache.hadoop.hive.ql.QTestUtil.initFromScript(QTestUtil.java:1216) [classes/:?]
>   at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:1201) [classes/:?]
>   at org.apache.hadoop.hive.ql.QTestUtil.createSources(QTestUtil.java:1188) [classes/:?]
>   at org.apache.hadoop.hive.cli.control.CoreCliDriver$3.invokeInternal(CoreCliDriver.java:83) [classes/:?]
>   at org.apache.hadoop.hive.cli.control.CoreCliDriver$3.invokeInternal(CoreCliDriver.java:80) [classes/:?]
>   at org.apache.hadoop.hive.util.ElapsedTimeLoggingWrapper.invoke(ElapsedTimeLoggingWrapper.java:33) [classes/:?]
>   at org.apache.hadoop.hive.cli.control.CoreCliDriver.beforeClass(CoreCliDriver.java:86) [classes/:?]
>   at org.apache.hadoop.hive.cli.control.CliAdapter$1$1.evaluate(CliAdapter.java:71) [classes/:?]
>   at org.junit.rules.RunRules.evaluate(RunRules.java:20) [junit-4.11.jar:?]
>   at org.junit.runners.ParentRunner.run(ParentRunner.java:309) [junit-4.11.jar:?]
>   at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:365) [surefire-junit4-2.21.0.jar:2.21.0]
>   at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:273) [surefire-junit4-2.21.0.jar:2.21.0]
>   at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238) [surefire-junit4-2.21.0.jar:2.21.0]
>   at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:159) [surefire-junit4-2.21.0.jar:2.21.0]
>   at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:379) [surefire-booter-2.21.0.jar:2.21.0]
>   at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:340) [surefire-booter-2.21.0.jar:2.21.0]
>   at org.apache.maven.surefire.booter.ForkedBooter.execute(ForkedBooter.java:125) [surefire-booter-2.21.0.jar:2.21.0]
>   at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:413) [surefire-booter-2.21.0.jar:2.21.0]
> {code}
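The IntrospectionException above is the JavaBeans introspector rejecting a setter with two parameters: the spec requires a write method to take exactly one argument, so commons-beanutils' FluentPropertyBeanIntrospector hits this on AbstractConfiguration.setProperty(String, Object), logs it, and (per the log's own "Ignoring this property.") skips it. A minimal, self-contained sketch that reproduces the same message, using a hypothetical class rather than the real commons-configuration2 one:

```java
import java.beans.IntrospectionException;
import java.beans.PropertyDescriptor;
import java.lang.reflect.Method;

public class BadWriteMethodDemo {
    // Hypothetical stand-in for AbstractConfiguration: a fluent-style
    // "setter" with two parameters, which JavaBeans does not allow.
    public static class TwoArgSetterBean {
        public void setProperty(String key, Object value) {
            // no-op; only the signature matters for introspection
        }
    }

    /** Builds a PropertyDescriptor around the two-arg setter and returns
     *  the resulting exception message. */
    public static String introspect() {
        try {
            Method writer = TwoArgSetterBean.class
                    .getMethod("setProperty", String.class, Object.class);
            // Same code path as the trace: PropertyDescriptor.<init> ->
            // setWriteMethod -> findPropertyType, which rejects arg count != 1.
            new PropertyDescriptor("property", null, writer);
            return "no exception";
        } catch (IntrospectionException | NoSuchMethodException e) {
            return e.getMessage();
        }
    }

    public static void main(String[] args) {
        System.out.println(introspect());
    }
}
```

Since this is logged at DEBUG/INFO and the property is simply skipped, it is likely noise rather than the root cause of the test failure.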
>
> This can also be an issue:
> {code:java}
> 2023-03-19T15:19:06,187 ERROR [699603ee-f4a1-43b7-b160-7faf858ca4b4 main] ql.Driver: FAILED: Hive Internal Error: org.apache.hadoop.hive.ql.metadata.HiveException(Error while invoking PreHook. hooks: java.lang.RuntimeException: Cannot overwrite read-only table: src
>   at org.apache.hadoop.hive.ql.hooks.EnforceReadOnlyTables.run(EnforceReadOnlyTables.java:64)
>   at org.apache.hadoop.hive.ql.hooks.EnforceReadOnlyTables.run(EnforceReadOnlyTables.java:44)
>   at org.apache.hadoop.hive.ql.HookRunner.invokeGeneralHook(HookRunner.java:296)
>   at org.apache.hadoop.hive.ql.HookRunner.runPreHooks(HookRunner.java:273)
>   at org.apache.hadoop.hive.ql.Driver.execute(Driver.java:2304)
>   at org.apache.hadoop.hive.ql.Driver.runInternal(Driver.java:2039)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1737)
>   at org.apache.hadoop.hive.ql.Driver.run(Driver.java:1731)
>   at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:157)
>   at org.apache.hadoop.hive.ql.reexec.ReExecDriver.run(ReExecDriver.java:218)
>   at org.apache.hadoop.hive.cli.CliDriver.processLocalCmd(CliDriver.java:239)
>   at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:188)
>   at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:402)
> {code}
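For context, this error comes from a pre-execution hook: EnforceReadOnlyTables runs before the query executes and rejects any statement whose outputs include a table registered as read-only (the qtest source tables, such as src, appear to be registered this way during setup). A simplified, self-contained sketch of that check, with a hypothetical class rather than the actual Hive hook API:

```java
import java.util.Set;

// Hypothetical guard modelling what EnforceReadOnlyTables does: compare a
// query's output tables against a read-only set before execution.
public class ReadOnlyTableGuard {
    private final Set<String> readOnlyTables;

    public ReadOnlyTableGuard(Set<String> readOnlyTables) {
        this.readOnlyTables = readOnlyTables;
    }

    /** Invoked pre-execution with the tables the query would write to;
     *  throws with the same message seen in the log when one is read-only. */
    public void checkOutputs(Set<String> outputTables) {
        for (String table : outputTables) {
            if (readOnlyTables.contains(table)) {
                throw new RuntimeException(
                        "Cannot overwrite read-only table: " + table);
            }
        }
    }
}
```

With this sketch, a write to src fails with exactly the logged message, while writes to other tables (e.g. part_mm_n0) pass, which is why only statements touching the protected source tables trip the hook.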
--
This message was sent by Atlassian Jira
(v8.20.10#820010)