Hi Harsh

You were right - it was executed via the LocalJobRunner. What settings do I
need to change to ensure it goes through the RM/NM? I did specify the yarn
framework in mapred-site.xml. The eclipse console even says to check
http://localhost:8080 to track job progress.
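
For reference, this is the relevant entry as I have it in my mapred-site.xml
(just this one property; the rest of the file is omitted):

```xml
<!-- mapred-site.xml: selects the YARN framework instead of the local runner -->
<property>
  <name>mapreduce.framework.name</name>
  <value>yarn</value>
</property>
```

I am wondering whether my JUnit runs inside eclipse are actually picking this
file up from the classpath, or whether the Configuration object is falling
back to the default (local) framework.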

I will again cross-check all the parameters today against the ones specified
in the URLs you gave me. By the way, I found this link
<http://hadoop.apache.org/common/docs/r2.0.0-alpha/hadoop-project-dist/hadoop-common/DeprecatedProperties.html>
which gives a list of deprecated properties. I have, in fact, specified a
lot of old property names in my core-site.xml, hdfs-site.xml, and
mapred-site.xml files. Will that cause a problem?
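
For example (hypothetically, just as an illustration of what I mean), if
core-site.xml still uses the old filesystem key, the mapping on that page
would be (the value shown is only a placeholder for my local setting):

```xml
<!-- Deprecated name I may currently be using -->
<property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
</property>

<!-- Replacement name per the deprecated-properties list -->
<property>
  <name>fs.defaultFS</name>
  <value>hdfs://localhost:9000</value>
</property>
```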

Regards,
Prajakta



On Wed, Jun 13, 2012 at 8:18 PM, Harsh J <ha...@cloudera.com> wrote:

> Good to know your progress Prajakta!
>
> Did your submission surely go via the RM/NM or did it execute via the
> LocalJobRunner (logs show this classname)?
>
> You would ideally want to set the config "mapreduce.framework.name" to
> value "yarn" (in either config object before you use it, or in local
> mapred-site.xml), for it to use the YARN framework. A set of general
> configs for YARN deployment may be found at http://bit.ly/M2Eobz or at
> http://bit.ly/LW3Var.
>
> Does this help?
>
> On Wed, Jun 13, 2012 at 6:39 PM, Prajakta Kalmegh <pkalm...@gmail.com>
> wrote:
> > Hi Harsh
> >
> > Appreciate the response. I was able to configure and implement basic
> > JUnits within eclipse and get some code running. Still getting familiar
> > with the new YARN and federation architecture.
> >
> > I was, however, not able to check the MR jobs submitted within eclipse
> > for a sample WordCount program on the http://localhost:8088/ page. I am
> > starting my
> > namenode/datanode/resourcemanager/nodemanager/historyserver as instructed
> > on the wiki page. And then executing JUnit tests from eclipse.
> >
> > I believe a single MR job will be submitted as a single application in
> > the new framework, right? The eclipse console shows a successful
> > execution (the details are pretty neat). However, the webpage shows 'No
> > applications submitted'. Do I have to tweak any config properties to get
> > this done?
> >
> > Please let me know.
> >
> > Regards,
> > Prajakta
> >
> >
> >
> >
> >
> > On Wed, Jun 13, 2012 at 6:09 PM, Harsh J <ha...@cloudera.com> wrote:
> >
> >> Hi Prajakta,
> >>
> >> I have Eclipse set up with the M2E plugins. And once that's done, I merely
> >> clone a repo and import projects in via M2E's "Import existing maven
> >> projects" feature. This seems to work just fine for apache/hadoop's
> >> trunk.
> >>
> >> On Thu, Jun 7, 2012 at 5:18 PM, Prajakta Kalmegh <prkal...@in.ibm.com>
> >> wrote:
> >> > Hi
> >> >
> >> > I have done MapReduce programming using Eclipse before, but now I
> >> > need to learn the Hadoop code internals for one of my projects.
> >> >
> >> > I have forked Hadoop from github
> >> > (https://github.com/apache/hadoop-common) and need to configure it to
> >> > work with Eclipse. All the links I could find list steps for earlier
> >> > versions of Hadoop. I am right now following instructions given in
> >> > these links:
> >> > - http://wiki.apache.org/hadoop/GitAndHadoop
> >> > - http://wiki.apache.org/hadoop/EclipseEnvironment
> >> > - http://wiki.apache.org/hadoop/HowToContribute
> >> >
> >> > Can someone please give me a link to the steps to be followed for
> >> > getting Hadoop (latest from trunk) started in Eclipse? I need to be
> >> > able to commit changes to my forked repository on github.
> >> >
> >> > Thanks in advance.
> >> > Regards,
> >> > Prajakta
> >>
> >>
> >>
> >> --
> >> Harsh J
> >>
>
>
>
> --
> Harsh J
>
