measure the time taken to complete map and reduce phase

2011-07-04 Thread sangroya
Hi, I am trying to monitor the time taken to complete the map phase and the reduce phase in Hadoop. Is there any way to measure the time taken to complete the map and reduce phases in a cluster? Thanks, Amit -- View this message in context: http://lucene.472066.n3.nabble.com/measure-the-time-taken-to-complete-m
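One approach, assuming you can collect per-task start/finish timestamps (for example from the JobTracker web UI on port 50030 mentioned later in this thread, or from JobHistory logs), is to take each phase's wall-clock time as the span from the earliest task start to the latest task finish of that type. A minimal sketch; the data layout is illustrative, not a Hadoop API:

```python
# Hypothetical sketch: compute wall-clock phase durations from per-task
# (type, start_ms, finish_ms) records gathered out of band.
def phase_durations(tasks):
    """tasks: iterable of (task_type, start_ms, finish_ms) tuples.
    Returns {task_type: wall_clock_duration_ms}."""
    spans = {}
    for task_type, start, finish in tasks:
        lo, hi = spans.get(task_type, (start, finish))
        spans[task_type] = (min(lo, start), max(hi, finish))
    return {t: hi - lo for t, (lo, hi) in spans.items()}

tasks = [
    ("MAP",    1000, 4000),
    ("MAP",    1500, 5000),
    ("REDUCE", 4200, 9000),
]
print(phase_durations(tasks))  # {'MAP': 4000, 'REDUCE': 4800}
```

Note that this measures wall-clock phase span, not summed task time; overlapping map and reduce slots (slow-start) will make the two phases overlap.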

Re: measure the time taken to complete map and reduce phase

2011-07-07 Thread sangroya
wondering: what is the exact pattern used to name every file like this? Best Regards, Amit On Tue, Jul 5, 2011 at 6:53 AM, Hailong [via Lucene] wrote: > Hi sangroya, > > You can look at the job administration portal at port 50030 on your > JobTracker, such as http://
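For reference, in Hadoop 0.20.x the JobHistory file name is built from several underscore-separated fields (see JobHistory.java). A rough, version-dependent sketch of parsing such a name; the exact field set is an assumption to verify against your Hadoop version's source:

```python
import re

# Hedged sketch: the 0.20.x history file name roughly follows
#   <jobtracker-host>_<jobtracker-start-time>_<job-id>_<user>_<jobname>
# where <job-id> itself is job_<cluster-timestamp>_<sequence>.
# This regex is an illustration, not the canonical specification.
PATTERN = re.compile(
    r"(?P<host>[^_]+)_(?P<tracker_start>\d+)_"
    r"(?P<job_id>job_\d+_\d+)_(?P<user>[^_]+)_(?P<jobname>.+)"
)

name = "localhost_1309842004091_job_201107050920_0001_sangroya_wordcount"
m = PATTERN.match(name)
print(m.group("job_id"), m.group("user"))  # job_201107050920_0001 sangroya
```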

Re: Re: measure the time taken to complete map and reduce phase

2011-07-07 Thread sangroya
executed maps. I have the same question for REDUCE tasks. Thanks, Amit On Thu, Jul 7, 2011 at 10:58 AM, Hailong [via Lucene] wrote: > Hi sangroya, > > I think you may be interested in reading the following piece of code from > JobHistory.java in Hadoop. > > /** > * Gene
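The JobHistory log itself records task events as lines of KEY="VALUE" pairs, which can be parsed to pull out REDUCE task timings the same way as map timings. A hedged sketch; the event and field names below are illustrative, so check the Keys enum in JobHistory.java for the exact set in your version:

```python
import re

# Hedged sketch: parse one JobHistory-style event line of KEY="VALUE" pairs,
# e.g.  ReduceAttempt TASK_TYPE="REDUCE" TASKID="..." START_TIME="..." ...
KV = re.compile(r'(\w+)="([^"]*)"')

def parse_event(line):
    """Split a history line into (event_name, {field: value})."""
    event, _, rest = line.partition(" ")
    return event, dict(KV.findall(rest))

line = ('ReduceAttempt TASK_TYPE="REDUCE" TASKID="task_201107050920_0001_r_000000" '
        'START_TIME="1000" FINISH_TIME="6000"')
event, fields = parse_event(line)
print(event, int(fields["FINISH_TIME"]) - int(fields["START_TIME"]))  # ReduceAttempt 5000
```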

Re: Some question about fault Injection

2011-12-29 Thread sangroya
Hi, Is there any good documentation to start with fault injection? Please share a link to any examples that demonstrate the use of fault injection. Thanks, Amit -- View this message in context: http://lucene.472066.n3.nabble.com/Re-Some-question-about-fault-Injection-tp25559

Invitation to connect on LinkedIn

2012-01-08 Thread sangroya
LinkedIn Hailong, I'd like to add you to my professional network on LinkedIn. - Amit Amit Sangroya PhD Student at INRIA Lyon Area, France Confirm that you know Amit Sangroya: https://www.linkedin.com/e/7jf1zj-gx6d2jri-5i/isd/5441531746/oGgka5tN/?hs=false&tok=0WqBp

Sorting text data

2012-01-30 Thread sangroya
Hello, I have a large text file (1 GB) that I want to sort. So far, I know of Hadoop examples that take a sequence file as input to the sort program. Does anyone know of any implementation that uses text data as input? Thanks, Amit - Sangroya -- View this message in context: http
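For comparison, the same problem can be handled on a single node with an external merge sort, which is essentially what the Hadoop sort example parallelizes: sort bounded-size chunks, then merge the sorted runs. A minimal sketch (not Hadoop), assuming newline-terminated input lines:

```python
import heapq
import itertools
import tempfile

def external_sort(in_path, out_path, chunk_lines=100_000):
    """Sort the lines of a text file using bounded memory:
    sort fixed-size chunks to temp files, then k-way merge them."""
    chunk_paths = []
    with open(in_path) as src:
        while True:
            chunk = sorted(itertools.islice(src, chunk_lines))
            if not chunk:
                break
            tmp = tempfile.NamedTemporaryFile("w", delete=False, suffix=".chunk")
            tmp.writelines(chunk)
            tmp.close()
            chunk_paths.append(tmp.name)
    with open(out_path, "w") as dst:
        runs = [open(p) for p in chunk_paths]
        dst.writelines(heapq.merge(*runs))  # streaming k-way merge
        for f in runs:
            f.close()
```

Usage: `external_sort("big.txt", "big.sorted", chunk_lines=1_000_000)`. Like the Hadoop example, this sorts by raw line bytes; a custom key would go into both `sorted()` and `heapq.merge(key=...)`.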

Re: Sorting text data

2012-02-08 Thread sangroya
Hi, I tried to run the sort example by specifying the input format, but I got the following error while running it: bin/hadoop jar hadoop-0.20.2-examples.jar sort -inFormat org.apache.hadoop.mapred.TextInputFormat /user/sangroya/test1 outtest16 Running on 1 nodes to sort from hdfs

mapred.map.tasks and mapred.reduce.tasks parameter meaning

2012-02-22 Thread sangroya
parameters clearly. Thanks in advance, Amit - Sangroya -- View this message in context: http://lucene.472066.n3.nabble.com/mapred-map-tasks-and-mapred-reduce-tasks-parameter-meaning-tp3766224p3766224.html Sent from the Hadoop lucene-users mailing list archive at Nabble.com.
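For context on the question above: mapred.reduce.tasks is honored as given, while mapred.map.tasks is only a hint; the actual number of map tasks is determined by the number of input splits, which for FileInputFormat is roughly one per HDFS block of each input file. A toy model of that split count (the fixed block size and simple ceiling rule are simplifications of the real split logic):

```python
# Illustrative model only: approximate map-task count as one split per
# full-or-partial HDFS block per input file. Real InputFormats also
# consider min/max split size and split slop.
def num_map_tasks(file_sizes, block_size=64 * 1024 * 1024):
    """file_sizes: iterable of input file sizes in bytes."""
    return sum(-(-size // block_size) for size in file_sizes)  # ceil division

# A 100 MB file (2 blocks) plus a 10 MB file (1 block) -> 3 map tasks.
print(num_map_tasks([100 * 1024 * 1024, 10 * 1024 * 1024]))  # 3
```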

How to use GridMix3

2012-06-13 Thread sangroya
? Thanks in advance, Amit - Sangroya -- View this message in context: http://lucene.472066.n3.nabble.com/How-to-use-GridMix3-tp3989438.html Sent from the Hadoop lucene-users mailing list archive at Nabble.com.

Re: How to use GridMix3

2012-06-18 Thread sangroya
at org.apache.hadoop.mapred.gridmix.Gridmix.run(Gridmix.java:215) at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65) at org.apache.hadoop.mapred.gridmix.Gridmix.main(Gridmix.java:390) Thanks a lot! Amit - Sangroya -- View this message in context: http://lucene

Re: How to use GridMix3

2012-06-19 Thread sangroya
/sangroya/TestGridmix/gridmix to 0777 This seems to be a file permission issue. I granted 777 on this folder with: bin/hadoop dfs -chmod -R 777 /user/sangroya But I still get the same error. Searching the web for this error, it seems to be a known issue in some Hadoop versions. Do you or

Hadoop MapReduce dependability and performance benchmarking

2012-07-06 Thread Amit Sangroya
Dear Hadoop users, MRBS, the Hadoop MapReduce dependability and performance benchmarking system is available at: http://sardes.inrialpes.fr/research/mrbs/ It can be automatically deployed on Amazon EC2 and other cloud environments. For further information, please feel free to contact us. Best