[ https://issues.apache.org/jira/browse/HIVE-17575?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16175490#comment-16175490 ]
mohammed morshed commented on HIVE-17575:
-----------------------------------------
Below is a workaround to resolve the issue:
===================================
On each node of your Hadoop cluster, edit the file
“/etc/hive/conf.dist/hive-exec-log4j2.properties” and set:
property.hive.root.logger = FA
appender.FA.type = RandomAccessFile
appender.FA.name = FA
appender.FA.fileName = ${sys:hive.log.dir}/${sys:hive.log.file1}
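For reference, a minimal sketch of what the surrounding section could look like after the change (the layout and rootLogger lines are assumptions about a typical Hive log4j2 properties file, not part of the original workaround; ${sys:hive.log.file1} resolves from a Java system property, so either pass -Dhive.log.file1=... to the process or give the lookup a default as shown):

# FA appender writing to its own, distinct log file
appender.FA.type = RandomAccessFile
appender.FA.name = FA
appender.FA.fileName = ${sys:hive.log.dir}/${sys:hive.log.file1:-hive-exec-1.log}
appender.FA.layout.type = PatternLayout
appender.FA.layout.pattern = %d{ISO8601} %5p [%t] %c{2}: %m%n

# route the root logger to that appender
rootLogger.level = INFO
rootLogger.appenderRef.root.ref = FA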
Also edit “/etc/hadoop/conf.empty/log4j.properties” on your cluster nodes so that each appender writes to its own file:
hadoop.log.file1=hadoop1.log
hadoop.log.file2=hadoop2.log
hadoop.log.file3=hadoop3.log
log4j.appender.HADOOP.File=${hadoop.log.dir}/${hadoop.log.file1}
log4j.appender.MAPRED.File=${hadoop.log.dir}/${hadoop.log.file2}
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file3}
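For context, a hedged sketch of one full appender definition around those File lines (the RollingFileAppender class and rotation settings are assumptions based on a typical Hadoop log4j.properties; the workaround itself only changes the File properties so that each appender gets its own target):

log4j.appender.RFA=org.apache.log4j.RollingFileAppender
# now points at its own file instead of a file shared with other appenders
log4j.appender.RFA.File=${hadoop.log.dir}/${hadoop.log.file3}
log4j.appender.RFA.MaxFileSize=256MB
log4j.appender.RFA.MaxBackupIndex=20
log4j.appender.RFA.layout=org.apache.log4j.PatternLayout
log4j.appender.RFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n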
+++++++++++++++++++++++++++++++++++++++++++++++++++++++
> main ERROR Unable to locate appender "FA" for logger config "root"
> ------------------------------------------------------------------
>
> Key: HIVE-17575
> URL: https://issues.apache.org/jira/browse/HIVE-17575
> Project: Hive
> Issue Type: Bug
> Components: Beeline, Hive
> Reporter: mohammed morshed
> Priority: Critical
> Labels: usability
>
> Problem: In Hive version 2.3, when running 'INSERT' statements from Beeline,
> the following error is observed:
> 0: jdbc:hive2://localhost:10000>INSERT INTO TABLE
> w2867998436858995169_write_orders_bkt_tgt_tmp_m_orders_updtx_50percent SELECT
> orders_bkt.o_orderkey as a0, orders_bkt.o_custkey as a1,
> orders_bkt.o_totalprice as a2, orders_bkt.o_orderdate as a3,
> orders_bkt.o_orderpriority as a4, orders_bkt.o_clerk as a5,
> orders_bkt.o_shippriority as a6, orders_bkt.o_comment as a7,
> orders_bkt.o_orderstatus as a8 FROM
> w2867998436858995169_write_orders_bkt_src_tmp_m_orders_updtx_50percent JOIN
> TPCH_TEXT_S3_SINGLE_100.orders_bkt ON
> (w2867998436858995169_write_orders_bkt_src_tmp_m_orders_updtx_50percent.a0 =
> orders_bkt.o_orderkey);
> WARN : Hive-on-MR is deprecated in Hive 2 and may not be available in the
> future versions. Consider using a different execution engine (i.e. spark,
> tez) or using Hive 1.X releases.
> INFO : WARNING: Hive-on-MR is deprecated in Hive 2 and may not be available
> in the future versions. Consider using a different execution engine (i.e.
> spark, tez) or using Hive 1.X releases.
> WARNING: Hive-on-MR is deprecated in Hive 2 and may not be available in the
> future versions. Consider using a different execution engine (i.e. spark,
> tez) or using Hive 1.X releases.
> INFO : Query ID = hive_20170921204456_1d837547-fa86-43c5-b57f-e16085abc5d8
> INFO : Total jobs = 3
> INFO : Starting task [Stage-6:CONDITIONAL] in serial mode
> INFO : Stage-7 is selected by condition resolver.
> INFO : Stage-1 is filtered out by condition resolver.
> INFO : Starting task [Stage-7:MAPREDLOCAL] in serial mode
> 2017-09-21 20:46:54,822 main ERROR Unable to invoke factory method in class class org.apache.logging.log4j.core.appender.RandomAccessFileAppender for element RandomAccessFile. java.lang.reflect.InvocationTargetException
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.logging.log4j.core.config.plugins.util.PluginBuilder.build(PluginBuilder.java:132)
>     at org.apache.logging.log4j.core.config.AbstractConfiguration.createPluginObject(AbstractConfiguration.java:918)
>     at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:858)
>     at org.apache.logging.log4j.core.config.AbstractConfiguration.createConfiguration(AbstractConfiguration.java:850)
>     at org.apache.logging.log4j.core.config.AbstractConfiguration.doConfigure(AbstractConfiguration.java:479)
>     at org.apache.logging.log4j.core.config.AbstractConfiguration.initialize(AbstractConfiguration.java:219)
>     at org.apache.logging.log4j.core.config.AbstractConfiguration.start(AbstractConfiguration.java:231)
>     at org.apache.logging.log4j.core.LoggerContext.setConfiguration(LoggerContext.java:496)
>     at org.apache.logging.log4j.core.LoggerContext.reconfigure(LoggerContext.java:566)
>     at org.apache.logging.log4j.core.LoggerContext.setConfigLocation(LoggerContext.java:555)
>     at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.locateContext(ClassLoaderContextSelector.java:157)
>     at org.apache.logging.log4j.core.selector.ClassLoaderContextSelector.getContext(ClassLoaderContextSelector.java:74)
>     at org.apache.logging.log4j.core.impl.Log4jContextFactory.getContext(Log4jContextFactory.java:227)
>     at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:157)
>     at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:130)
>     at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:100)
>     at org.apache.logging.log4j.core.config.Configurator.initialize(Configurator.java:187)
>     at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jDefault(LogUtils.java:154)
>     at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:90)
>     at org.apache.hadoop.hive.common.LogUtils.initHiveLog4jCommon(LogUtils.java:82)
>     at org.apache.hadoop.hive.common.LogUtils.initHiveExecLog4j(LogUtils.java:76)
>     at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.setupChildLog4j(ExecDriver.java:634)
>     at org.apache.hadoop.hive.ql.exec.mr.ExecDriver.main(ExecDriver.java:714)
>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
>     at java.lang.reflect.Method.invoke(Method.java:498)
>     at org.apache.hadoop.util.RunJar.run(RunJar.java:221)
>     at org.apache.hadoop.util.RunJar.main(RunJar.java:136)
> Caused by: java.lang.ClassCastException: org.apache.logging.log4j.core.appender.RandomAccessFileManager$FactoryData cannot be cast to org.apache.logging.log4j.core.appender.rolling.RollingRandomAccessFileManager$FactoryData
>     at org.apache.logging.log4j.core.appender.rolling.RollingRandomAccessFileManager.updateData(RollingRandomAccessFileManager.java:253)
>     at org.apache.logging.log4j.core.appender.AbstractManager.getManager(AbstractManager.java:80)
>     at org.apache.logging.log4j.core.appender.OutputStreamManager.getManager(OutputStreamManager.java:81)
>     at org.apache.logging.log4j.core.appender.RandomAccessFileManager.getFileManager(RandomAccessFileManager.java:70)
>     at org.apache.logging.log4j.core.appender.RandomAccessFileAppender.createAppender(RandomAccessFileAppender.java:166)
>     ... 33 more
> 2017-09-21 20:46:54,825 main ERROR Null object returned for RandomAccessFile in Appenders.
> 2017-09-21 20:46:54,825 main ERROR Unable to locate appender "FA" for logger config "root"
> =======================================================================
> The error is due to multiple log4j appenders trying to write to the same
> file or location.
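> For illustration only, a minimal hypothetical log4j2 configuration that triggers this class of failure (appender names and the file path are made up): two appenders of different types point at the same fileName, so log4j2 hands the second one a cached file manager of the wrong type, which is the ClassCastException in the trace above.
>
> appender.ROLL.type = RollingRandomAccessFile
> appender.ROLL.name = ROLL
> appender.ROLL.fileName = /tmp/hive/shared.log
> appender.ROLL.filePattern = /tmp/hive/shared.log.%i
> appender.ROLL.policies.type = Policies
> appender.ROLL.policies.size.type = SizeBasedTriggeringPolicy
> appender.ROLL.policies.size.size = 256MB
>
> # second appender of a different type reusing the same fileName -> conflict
> appender.FA.type = RandomAccessFile
> appender.FA.name = FA
> appender.FA.fileName = /tmp/hive/shared.log
>
> Giving each appender its own fileName, as in the workaround above, avoids the collision.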
--
This message was sent by Atlassian JIRA
(v6.4.14#64029)