[ http://issues.apache.org/jira/browse/HADOOP-292?page=comments#action_12415465 ]

Owen O'Malley commented on HADOOP-292:
--------------------------------------

It is also arguable whether the default logger for applications should be 
"WARN,console" or "INFO,console". The default is currently INFO, which is why 
you're seeing those messages. It can be changed via the HADOOP_ROOT_LOGGER 
environment variable or the log4j.properties file in the config directory.
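
As a sketch of the workaround (assuming a standard install where bin/hadoop 
honours HADOOP_ROOT_LOGGER; the fallback shown mirrors that behaviour, and the 
log4j property name follows the usual log4j convention):

```shell
# One-off override: run a single dfs command with WARN-level console
# logging so the INFO startup lines are suppressed (requires a Hadoop
# install; shown for illustration):
#   HADOOP_ROOT_LOGGER="WARN,console" hadoop dfs -ls /data

# The launcher's fall-back to the INFO default when the variable is
# unset can be expressed with standard shell parameter expansion:
HADOOP_ROOT_LOGGER=${HADOOP_ROOT_LOGGER:-"INFO,console"}
echo "$HADOOP_ROOT_LOGGER"   # "INFO,console" when nothing overrides it

# Permanent alternative: set the root logger in conf/log4j.properties,
# e.g.  log4j.rootLogger=WARN,console
```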

> hadoop dfs commands should not output superfluous data to stdout
> ----------------------------------------------------------------
>
>          Key: HADOOP-292
>          URL: http://issues.apache.org/jira/browse/HADOOP-292
>      Project: Hadoop
>         Type: Bug
>   Components: dfs
>     Reporter: Yoram Arnon
>     Priority: Minor
>  Attachments: stderr-log.patch
>
> Running a command such as hadoop dfs -ls /data
> produces output such as the following:
> 06/06/08 17:42:32 INFO conf.Configuration: parsing jar:file:/hadoop/hadoop-0.4-dev/hadoop-0.4-dev.jar!/hadoop-default.xml
> 06/06/08 17:42:32 INFO conf.Configuration: parsing file:hadoop/hadoop-site.xml
> 06/06/08 17:42:32 INFO dfs.DistributedFileSystem: No FS indicated, using default:kry1200:8020
> 06/06/08 17:42:32 INFO ipc.Client: Client connection to 172.30.111.134:8020: starting
> Found 2 items
> /data/a <dir>
> /data/b     <dir>
> The first few lines shouldn't be there. It's especially annoying when piping 
> -cat output into a file or a post-processing program, but in general, the 
> output should be clean.

-- 
This message is automatically generated by JIRA.
-
If you think it was sent incorrectly contact one of the administrators:
   http://issues.apache.org/jira/secure/Administrators.jspa
-
For more information on JIRA, see:
   http://www.atlassian.com/software/jira
