Hello Harsh,
   Thanks for the quick info. I will go through the specifics, and I hope you
can help with any further doubts I have on the issue.

Regards,
Ranjan

On 03/06/12, Harsh J wrote:
> Ranjan,
> 
> Schedulers do not apply per-job. You need to change it at the JobTracker.
> 
> Follow the instructions at
> http://hadoop.apache.org/common/docs/r1.0.0/fair_scheduler.html to
> switch the scheduler to the FairScheduler.
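A minimal sketch of what that switch looks like, per the linked Hadoop 1.0 fair scheduler docs (the fair scheduler jar built from contrib must also be on the JobTracker's classpath, e.g. in $HADOOP_HOME/lib):

```xml
<!-- mapred-site.xml on the JobTracker node -->
<property>
  <name>mapred.jobtracker.taskScheduler</name>
  <value>org.apache.hadoop.mapred.FairScheduler</value>
</property>
```

A custom scheduler would be plugged in the same way: point this property at the fully qualified name of your own class (which, in MR1, extends org.apache.hadoop.mapred.TaskScheduler) and drop its jar into the JobTracker's lib directory, then restart the JobTracker.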
> 
> On Wed, Mar 7, 2012 at 4:08 AM, Ranjan Banerjee <rbanerj...@wisc.edu> wrote:
> >
> > Hello,
> >    I am relatively new to Hadoop; I started playing with it around two
> > weeks ago and have finished running the canonical word count MapReduce
> > example. My class project involves coming up with a different scheduler
> > for Hadoop. I know that Hadoop uses FIFO scheduling by default and that it
> > also ships with the fair scheduler. Can someone suggest where exactly I
> > should write my scheduler code, and what change I need to make to the Conf
> > object so that Hadoop uses my scheduler instead of the default one when
> > scheduling MapReduce jobs?
> >
> > Regards,
> > Ranjan
> >
> >
> >
> >
> > On 03/02/12, mohammed elsaeedy wrote:
> >> Dear Mailing list,
> >>
> >>    I've been trying to build the Hadoop source code on my MacBook Pro,
> >> following the tutorial at
> >> <http://wiki.apache.org/hadoop/EclipseEnvironment>.
> >> I grabbed the source code from GitHub and then tried to run Maven:
> >>
> >> mvn install -DskipTests
> >>
> >> mvn clean package -DskipTests -Pdist -Dtar -Dmaven.javadoc.skip=true
> >>
> >>
> >> but I always get the following error:
> >>
> >> [INFO]
> >> ------------------------------------------------------------------------
> >> [INFO] Total time: 14:48.283s
> >> [INFO] Finished at: Fri Mar 02 13:29:20 EET 2012
> >> [INFO] Final Memory: 68M/123M
> >> [INFO]
> >> ------------------------------------------------------------------------
> >> [ERROR] Failed to execute goal
> >> org.apache.maven.plugins:maven-pdf-plugin:1.1:pdf (pdf) on project
> >> hadoop-distcp: Error during document generation: Error parsing
> >> /Users/SaSa/Desktop/Hadoop/src/hadoop-common/hadoop-tools/hadoop-distcp/target/pdf/site.tmp/xdoc/index.xml:
> >> Error validating the model: Fatal error:
> >> [ERROR] Public ID: null
> >> [ERROR] System ID: http://maven.apache.org/xsd/xdoc-2.0.xsd
> >> [ERROR] Line number: 2699
> >> [ERROR] Column number: 5
> >> [ERROR] Message: The element type "xs:element" must be terminated by the
> >> matching end-tag "</xs:element>".
> >> [ERROR] -> [Help 1]
> >> [ERROR]
> >> [ERROR] To see the full stack trace of the errors, re-run Maven with the
> >> -e switch.
> >> [ERROR] Re-run Maven using the -X switch to enable full debug logging.
> >> [ERROR]
> >> [ERROR] For more information about the errors and possible solutions,
> >> please read the following articles:
> >> [ERROR] [Help 1]
> >> http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
> >> [ERROR]
> >> [ERROR] After correcting the problems, you can resume the build with the
> >> command
> >> [ERROR]   mvn <goals> -rf :hadoop-distcp
> >>
> >> I googled for a similar problem; all I found was that some people cleared
> >> the m2 and ivy2 caches and re-ran the command, but unfortunately that
> >> didn't work for me. I've been stuck here, and I really hope someone can
> >> help me with this issue.
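For reference, the cache-clearing step mentioned above usually means deleting the local Maven and Ivy artifact caches so that all dependencies are fetched fresh on the next run. A sketch of that step (destructive: every dependency will be re-downloaded afterwards):

```shell
# Remove the local Maven repository and Ivy cache; both are rebuilt
# automatically on the next build.
rm -rf ~/.m2/repository
rm -rf ~/.ivy2/cache
```

After this, the same `mvn clean package -DskipTests -Pdist -Dtar -Dmaven.javadoc.skip=true` command is re-run from the source root.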
> >>
> >> Thank you all.
> >> Regards,
> >> SaSa
> >>
> >>
> >>
> >> --
> >> Mohammed El Sayed
> >> Computer Science Department
> >> King Abdullah University of Science and Technology
> >> 2351 - 4700 KAUST, Saudi Arabia
> >> Home Page <http://cloud.kaust.edu.sa/SiteCollectionDocuments/melsayed.aspx>
> >
> 
> 
> 
> -- 
> Harsh J
