Hi, I changed chukwaCollector.outputDir in chukwa-collector-conf.xml back from /home/futureha/chukwa/logs/ to /chukwa/logs/, and now I see the correct folders:
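(For reference, a sketch of the fragment I changed in chukwa-collector-conf.xml; the description text is my own wording:)

```xml
<property>
  <name>chukwaCollector.outputDir</name>
  <value>/chukwa/logs/</value>
  <!-- This must be an HDFS path, not a local filesystem path,
       or the collector writes its data sink outside the cluster. -->
  <description>Chukwa data sink directory</description>
</property>
```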
Browsing /chukwa in the namenode web UI (http://s2.idfs.cn:50075) shows:

Name                Type  Modification Time  Permission  Owner     Group
archivesProcessing  dir   2010-11-20 12:15   rwxr-xr-x   futureha  supergroup
dataSinkArchives    dir   2010-11-20 15:23   rwxr-xr-x   futureha  supergroup
demuxProcessing     dir   2010-11-20 17:08   rwxr-xr-x   futureha  supergroup
finalArchives       dir   2010-11-20 12:15   rwxr-xr-x   futureha  supergroup
logs                dir   2010-11-20 17:08   rwxr-xr-x   futureha  supergroup
postProcess         dir   2010-11-20 17:08   rwxr-xr-x   futureha  supergroup
repos               dir   2010-11-20 15:23   rwxr-xr-x   futureha  supergroup
rolling             dir   2010-11-20 15:23   rwxr-xr-x   futureha  supergroup

But I still don't know how to use the data; there isn't much documentation for me.

2. How can I get the data out?
3. How can I read it and process it with my own MapReduce job, or something similar?

Thanks.
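To answer my own question partway: the collector writes its sink files under the outputDir as Hadoop SequenceFiles of Chukwa chunks, so one way to read them is with SequenceFile.Reader. This is only a sketch under my assumptions about the Chukwa 0.4-era API (ChukwaArchiveKey / ChunkImpl are the key/value classes, ChunkImpl.getBlankChunk() gives a reusable value); it needs the Hadoop and Chukwa jars on the classpath, and the path argument is a placeholder:

```java
// Sketch (untested): dump the chunks in one Chukwa data-sink file.
// Assumes Chukwa 0.4-era classes; run with hadoop + chukwa jars on the classpath.
import org.apache.hadoop.chukwa.ChukwaArchiveKey;
import org.apache.hadoop.chukwa.ChunkImpl;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.SequenceFile;

public class DumpSinkFile {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    FileSystem fs = FileSystem.get(conf);
    // e.g. a closed sink file such as /chukwa/logs/<something>.done
    Path sinkFile = new Path(args[0]);

    SequenceFile.Reader reader = new SequenceFile.Reader(fs, sinkFile, conf);
    ChukwaArchiveKey key = new ChukwaArchiveKey();
    ChunkImpl chunk = ChunkImpl.getBlankChunk();
    long count = 0;
    while (reader.next(key, chunk)) {
      count++;
      // Each chunk carries its data type, source host, and raw log bytes.
      System.out.println(chunk.getDataType() + " from " + chunk.getSource()
          + ": " + new String(chunk.getData()));
    }
    reader.close();
    System.out.println(count + " chunks read");
  }
}
```

The same reader loop could be the body of a Mapper for counting: emit chunk.getDataType() as the key and 1 as the value, then sum in the reducer, word-count style.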
2010/11/20 梁景明 <[email protected]>
> Hi, following the Chukwa Agent Setup Guide, I got the agents and the
> collector running successfully.
>
> From the collector status page
> http://192.168.1.123:10080/chukwa?ping=true
> I see:
>
> Date:1290227294961
> Now:1290227353478
> numberHTTPConnection in time window:4
> numberchunks in time window:4
> lifetimechunks:18
>
> So it seems to be receiving logs fine, but the next step puzzled me.
>
> I ran bin/start-data-processors.sh and found some new directories in
> my HDFS:
>
> /chukwa
> /chukwa/archivesProcessing
> /chukwa/dataSinkArchives
> /chukwa/finalArchives
>
> But they are all empty. Where is the data? Perhaps it wasn't collected
> into Hadoop at all and is still only in my local files.
>
> 1. Where is the data?
> 2. How can I send it to Hadoop?
> 2. How can I get it back out?
> 3. How can I read it and count it with my own MapReduce job, or
>    something similar?
>
> Thanks for any help.
