The default log level is WARN. Please change it to INFO in conf/hive-log4j.properties:

hive.root.logger=INFO,DRFA
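
If you'd rather not edit the file, I believe you can also override the logger per-session on the command line (the -hiveconf flag; adjust the path for your install):

bin/hive -hiveconf hive.root.logger=INFO,DRFA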

Of course, you can also use LOG.warn() instead of LOG.info() in your test code; WARN-level messages get logged even at the default level, so they need no config change at all.
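
For example, a minimal sketch (Hive classes get their logger from Apache Commons Logging; the class and method names here are made up):

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;

public class MyPlanTest {
  private static final Log LOG = LogFactory.getLog(MyPlanTest.class);

  public void checkPlan() {
    // WARN-level messages pass the default WARN threshold, so this
    // shows up in ${hive.log.dir}/${hive.log.file} even without the
    // config change above.
    LOG.warn("checkPlan reached");
  }
}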

Zheng

On Sun, Aug 16, 2009 at 11:58 PM, Saurabh Nanda<[email protected]> wrote:
> I still can't find the log output anywhere.
>
> The log file is in /tmp/ct-admin/hive.log for me. The only contents in the
> log file are:
>
> 2009-08-17 11:18:18,018 WARN  mapred.JobClient (JobClient.java:configureCommandLineOptions(510)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
> 2009-08-17 11:26:45,380 WARN  mapred.JobClient (JobClient.java:configureCommandLineOptions(510)) - Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
>
> Here's the exact change I made in SemanticAnalyzer.java:
>
>   Operator output = putOpInsertMap(
>       OperatorFactory.getAndMakeChild(
>           new fileSinkDesc(queryTmpdir, table_desc,
>               conf.getBoolVar(HiveConf.ConfVars.COMPRESSRESULT),
>               currentTableId),
>           fsRS, input), inputRR);
>
>   LOG.info("Created FileSink Plan for clause: " + dest + " dest_path: "
>            + dest_path + " row schema: " + inputRR.toString()
>            + ". HiveConf.ConfVars.COMPRESSRESULT="
>            + conf.getBoolVar(HiveConf.ConfVars.COMPRESSRESULT));
>
> Here's what conf/hive-log4j.properties looks like:
>
> # Define some default values that can be overridden by system properties
> hive.root.logger=WARN,DRFA
> hive.log.dir=/tmp/${user.name}
> hive.log.file=hive.log
>
> # Define the root logger to the system property "hive.root.logger".
> log4j.rootLogger=${hive.root.logger}, EventCounter
>
> # Logging Threshold
> log4j.threshold=ALL
>
> #
> # Daily Rolling File Appender
> #
>
> log4j.appender.DRFA=org.apache.log4j.DailyRollingFileAppender
> log4j.appender.DRFA.File=${hive.log.dir}/${hive.log.file}
>
> # Rollover at midnight
> log4j.appender.DRFA.DatePattern=.yyyy-MM-dd
>
> # 30-day backup
> #log4j.appender.DRFA.MaxBackupIndex=30
> log4j.appender.DRFA.layout=org.apache.log4j.PatternLayout
>
> # Pattern format: Date LogLevel LoggerName LogMessage
> #log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %p %c: %m%n
> # Debugging Pattern format
> log4j.appender.DRFA.layout.ConversionPattern=%d{ISO8601} %-5p %c{2} (%F:%M(%L)) - %m%n
>
>
> #
> # console
> # Add "console" to rootLogger above if you want to use this
> #
>
> log4j.appender.console=org.apache.log4j.ConsoleAppender
> log4j.appender.console.target=System.err
> log4j.appender.console.layout=org.apache.log4j.PatternLayout
> log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{2}: %m%n
>
> #custom logging levels
> # log4j.logger.root=DEBUG
>
> #
> # Event Counter Appender
> # Sends counts of logging messages at different severity levels to Hadoop Metrics.
> #
> log4j.appender.EventCounter=org.apache.hadoop.metrics.jvm.EventCounter
>
>
> log4j.category.DataNucleus=ERROR,DRFA
> log4j.category.Datastore=ERROR,DRFA
> log4j.category.Datastore.Schema=ERROR,DRFA
> log4j.category.JPOX.Datastore=ERROR,DRFA
> log4j.category.JPOX.Plugin=ERROR,DRFA
> log4j.category.JPOX.MetaData=ERROR,DRFA
> log4j.category.JPOX.Query=ERROR,DRFA
> log4j.category.JPOX.General=ERROR,DRFA
> log4j.category.JPOX.Enhancer=ERROR,DRFA
>
> What is going wrong?
>
> Saurabh.
> --
> http://nandz.blogspot.com
> http://foodieforlife.blogspot.com
>



-- 
Yours,
Zheng
