The option should be passed to the child JVM environment when it is started. You can set most of the environment variables to garbage with no side effect. A more important question is what LD_LIBRARY_PATH actually is in your JVM environment.
Once again, check the job.xml file in the mapred.local.dir (should be /tmp/cache/${user.name}/... or something like this in the pseudo-config environment), or try to print out the environment variables directly in your map/reduce task.

Alex K

On Fri, Apr 29, 2011 at 10:37 AM, Donatella Firmani <donatellafirm...@yahoo.com> wrote:

> I just tried giving the option -Dmapred.child.env="LD_LIBRARY_PATH=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3", and then tried writing nonsense environment variables like -Dmapred.child.env="blahblablah".
>
> It continues working... so I think that the option is completely ignored by the bin/hadoop script.
>
> Do you think this is expected behavior?
>
> Cheers,
> DF
>
> ------------------------------
> *From:* Alex Kozlov <ale...@cloudera.com>
> *To:* mapreduce-user@hadoop.apache.org
> *Sent:* Fri, April 29, 2011 7:03:50 PM
> *Subject:* Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>
> You only need to edit the config files on the client, or give the option with a -Dmapred.child.env="LD_LIBRARY_PATH=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3" flag (if you implement Tool). You can check the job.xml file via the JT UI to verify that the parameters have the correct values for the job.
>
> On Fri, Apr 29, 2011 at 9:05 AM, Donatella Firmani <donatellafirm...@yahoo.com> wrote:
>
>> Dear Yin,
>>
>> Good point: I can try to install 0.19 and reproduce the problem. I'll let you know ASAP.
>>
>> Thanks,
>> DF
>>
>> ------------------------------
>> *From:* Yin Lou <yin.lou...@gmail.com>
>> *To:* mapreduce-user@hadoop.apache.org
>> *Sent:* Fri, April 29, 2011 5:59:14 PM
>> *Subject:* Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>>
>> Just curious, can we do this in 0.19?
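[Editor's note: the "if you implement Tool" caveat above is the key to why the -D flag appeared to be silently ignored. A plain main() never sees generic options; only drivers run through ToolRunner get them parsed. The sketch below imitates, under stated assumptions, roughly what Hadoop's GenericOptionsParser does with -D pairs; the class and method names here are illustrative, not Hadoop APIs.]

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative sketch of -D handling as done by Hadoop's GenericOptionsParser
// (invoked by ToolRunner.run). If your job's driver does NOT implement Tool,
// nothing ever parses these flags -- which is why even a nonsense
// -Dmapred.child.env=... is silently ignored by bin/hadoop.
public class MiniGenericOptions {

    // Collects key=value pairs given as "-Dkey=value" or "-D key=value".
    // Anything else would be left for the application's own arguments.
    static Map<String, String> parseDefines(String[] args) {
        Map<String, String> conf = new LinkedHashMap<>();
        for (int i = 0; i < args.length; i++) {
            String pair = null;
            if (args[i].equals("-D") && i + 1 < args.length) {
                pair = args[++i];                 // "-D key=value" form
            } else if (args[i].startsWith("-D")) {
                pair = args[i].substring(2);      // "-Dkey=value" form
            }
            if (pair != null) {
                // Split on the FIRST '=' only, so values like
                // LD_LIBRARY_PATH=/some/path survive intact.
                String[] kv = pair.split("=", 2);
                if (kv.length == 2) conf.put(kv[0], kv[1]);
            }
        }
        return conf;
    }

    public static void main(String[] args) {
        System.out.println(parseDefines(new String[] {
            "-Dmapred.child.env=LD_LIBRARY_PATH=/home/mylibpath1/lib1"
        }));
        // -> {mapred.child.env=LD_LIBRARY_PATH=/home/mylibpath1/lib1}
    }
}
```

In a real job the equivalent is `ToolRunner.run(new Configuration(), new MyJob(), args)`, which runs this parsing and hands the populated Configuration to your Tool.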
>>
>> Thanks,
>> Yin
>>
>> On Fri, Apr 29, 2011 at 10:29 AM, Robert Evans <ev...@yahoo-inc.com> wrote:
>>
>>> DF,
>>>
>>> You can set mapred.child.java.opts to set Java options, but you can also set mapred.child.env to set environment variables. Be careful, because they are space separated with an = in between them.
>>>
>>> <property>
>>>   <name>mapred.child.env</name>
>>>   <value>LD_LIBRARY_PATH=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3</value>
>>> </property>
>>>
>>> --Bobby
>>>
>>> On 4/29/11 5:58 AM, "Donatella Firmani" <donatellafirm...@yahoo.com> wrote:
>>>
>>> To solve the issue addressed in my previous message, I tried setting the property mapred.child.java.opts in mapred-site.xml. But, even if it seems the right approach given what is said in blogs & forums, there is a big problem with it.
>>>
>>> Following the tutorial (Hadoop website), section Task Execution & Environment, my xml looks like:
>>>
>>> <configuration>
>>>   <property>
>>>     <name>mapred.job.tracker</name>
>>>     <value>localhost:9001</value>
>>>   </property>
>>>   <property>
>>>     <name>mapred.child.java.opts</name>
>>>     <value>
>>>       -Djava.library.path=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3
>>>     </value>
>>>   </property>
>>> </configuration>
>>>
>>> The problem arises when executing the job, because an exception is thrown:
>>>
>>> Exception in thread "main" java.lang.NoClassDefFoundError: -Djava/library/path=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3
>>>
>>> Any help would be appreciated.
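[Editor's note: the NoClassDefFoundError naming the -D string (with dots turned into slashes) means a JVM was launched with that string in the position of the main class. One plausible reading is that the newlines and indentation inside the <value> element above ended up in the option string and broke the java command line apart. A minimal sketch of the same property, assuming that reading: keep the value on a single line with no surrounding whitespace. Note also that on Linux the entries of java.library.path and LD_LIBRARY_PATH are separated by ':', not ';'.]

```xml
<!-- Hedged sketch, not a verified fix: single-line value, no leading or
     trailing whitespace, colon-separated entries (Linux convention). -->
<property>
  <name>mapred.child.java.opts</name>
  <value>-Djava.library.path=/home/mylibpath1/lib1:/home/mylibpath2/lib2:/home/mylibpath3/lib3</value>
</property>
```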
>>> Thanks in advance,
>>>
>>> DF
>>>
>>> ----- Original Message ----
>>> From: Donatella Firmani <donatellafirm...@yahoo.com>
>>> To: mapreduce-user@hadoop.apache.org
>>> Sent: Fri, April 29, 2011 12:57:52 PM
>>> Subject: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>>>
>>> Hi to all,
>>>
>>> I just subscribed to this mailing list and I'd like to ask if anyone knows how to deal with LD_LIBRARY_PATH. I have a Java application that needs a proper setting of this environment variable to work under Linux-Ubuntu. I want to use this application from a mapreduce job; unfortunately, I could not find a way to make things work with the LD_LIBRARY_PATH environment variable.
>>>
>>> I tried so many different strategies and I am stuck. Maybe someone of you can help.
>>>
>>> Thanks in advance,
>>> Cheers.
>>>
>>> DF
>>>
>>> PS: I use hadoop-0.20.2
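[Editor's note: Alex's suggestion near the top of the thread, to print the environment variables directly from the map/reduce task, can be sketched as below. It is shown as a standalone main() so it runs anywhere; in a real job the same two lookups would go in the mapper's configure()/setup() method, with the output landing in the task logs.]

```java
// Minimal sketch: dump what the child JVM actually received, both the
// process environment (LD_LIBRARY_PATH) and the JVM system property
// (java.library.path) that System.loadLibrary actually consults.
public class EnvDump {

    static String describe() {
        String ld  = System.getenv("LD_LIBRARY_PATH");       // may be unset
        String jlp = System.getProperty("java.library.path"); // always set by the JVM
        return "LD_LIBRARY_PATH=" + (ld == null ? "<unset>" : ld)
             + " java.library.path=" + jlp;
    }

    public static void main(String[] args) {
        System.out.println(describe());
    }
}
```

Comparing this output from inside a task against the values in job.xml shows immediately whether mapred.child.env / mapred.child.java.opts reached the child JVM or were dropped somewhere along the way.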