Just to confirm: the libraries aren't in HDFS but in the node's local FS. Since there are a lot of them, I'd like to point the job at a location on the node FS. This means that the path I described with "/home/mylibpath1/lib1" is local.

I think it is possible, isn't it? Thanks for your assistance,
DF
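For the local-FS case, a minimal mapred-site.xml sketch (the directory names are this thread's placeholders, the libraries are assumed to sit at the same local path on every task node, and LD_LIBRARY_PATH entries are colon-separated as usual on Linux; whether a vanilla hadoop-0.20.2 honors mapred.child.env at all is worth verifying):

  <property>
    <name>mapred.child.env</name>
    <!-- Paths are local to each task node, not HDFS. Entries use the
         Linux ':' separator inside the LD_LIBRARY_PATH value. -->
    <value>LD_LIBRARY_PATH=/home/mylibpath1/lib1:/home/mylibpath2/lib2:/home/mylibpath3/lib3</value>
  </property>

Checking the generated job.xml on a slave, as Alex suggests below, is the quickest way to confirm the value survives as written.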
----- Original Message ----
From: Donatella Firmani <donatellafirm...@yahoo.com>
To: mapreduce-user@hadoop.apache.org
Sent: Fri, April 29, 2011 5:58:37 PM
Subject: Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH

Yes, I stopped, removed the temp files in /tmp/hadoop (just to be sure :-)), then started again.

DF

----- Original Message ----
From: Joey Echeverria <j...@cloudera.com>
To: mapreduce-user@hadoop.apache.org
Sent: Fri, April 29, 2011 5:55:54 PM
Subject: Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH

Just to confirm, you restarted Hadoop after making the changes to mapred-site.xml?

-Joey

On Fri, Apr 29, 2011 at 11:53 AM, Donatella Firmani <donatellafirm...@yahoo.com> wrote:
> Hi Alex,
>
> I'm just editing mapred-site.xml in the conf directory of my Hadoop
> installation root. I'm running in pseudo-distributed mode.
>
> Should I edit something else?
>
> Thanks for your quick reply (I'm going crazy with this problem :-P)
>
> Regards,
> DF
>
> ________________________________
> From: Alex Kozlov <ale...@cloudera.com>
> To: mapreduce-user@hadoop.apache.org
> Sent: Fri, April 29, 2011 5:45:55 PM
> Subject: Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>
> Hi Donatella,
>
> Are you sure you are passing the parameters the correct way? Do you see
> these set in your job.xml file (on a slave)?
>
> On Fri, Apr 29, 2011 at 8:41 AM, Donatella Firmani
> <donatellafirm...@yahoo.com> wrote:
>>
>> In any case - I tried different syntaxes - I get the same result.
>> I made my mapper process dump to its log files the results of
>>
>> System.getenv("LD_LIBRARY_PATH")
>> System.getProperty("java.library.path")
>>
>> and neither value seems to be affected by setting mapred.child.java.opts
>> or mapred.child.env. :-(
>>
>> Maybe Hadoop overrides something at run time or at start time?
>>
>> Thanks in advance,
>> DF
>>
>> ________________________________
>> From: Donatella Firmani <donatellafirm...@yahoo.com>
>> To: mapreduce-user@hadoop.apache.org
>> Sent: Fri, April 29, 2011 5:17:00 PM
>> Subject: Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>>
>> Dear Robert,
>>
>> thanks for your quick reply. So you are saying that I can add a property
>> to hadoop-0.20.2 by adding an item to mapred-site.xml. I have two
>> questions about the syntax:
>>
>> 1) are different paths separated by a ":" as on Unix systems, or by a ";"?
>> 2) should there be blank spaces, as in LD_LIBRARY_PATH = /home/mylibpath1/lib1
>> (so that LD_LIBRARY_PATH=/home/mylibpath1/lib1 would be wrong)?
>>
>> Thanks in advance,
>> DF
>>
>> ________________________________
>> From: Robert Evans <ev...@yahoo-inc.com>
>> To: "mapreduce-user@hadoop.apache.org" <mapreduce-user@hadoop.apache.org>
>> Sent: Fri, April 29, 2011 4:29:51 PM
>> Subject: Re: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>>
>> DF,
>>
>> You can set mapred.child.java.opts to set Java options, but you can also
>> set mapred.child.env to set environment variables. Be careful, because
>> they are space separated with an = in between them.
>>
>> <property>
>>   <name>mapred.child.env</name>
>>   <value>LD_LIBRARY_PATH=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3</value>
>> </property>
>>
>> --Bobby
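As an aside, the environment dump DF describes above can live in the mapper itself. A minimal sketch against the old mapred API of hadoop-0.20.2 (the class name is hypothetical; the output lands in the task's stderr log on the slave):

  import java.io.IOException;

  import org.apache.hadoop.io.LongWritable;
  import org.apache.hadoop.io.Text;
  import org.apache.hadoop.mapred.MapReduceBase;
  import org.apache.hadoop.mapred.Mapper;
  import org.apache.hadoop.mapred.OutputCollector;
  import org.apache.hadoop.mapred.Reporter;

  // Identity-style mapper that logs the two values under discussion,
  // so the task logs show what the child JVM actually received.
  public class EnvDumpMapper extends MapReduceBase
      implements Mapper<LongWritable, Text, LongWritable, Text> {

    private boolean dumped = false;

    public void map(LongWritable key, Text value,
                    OutputCollector<LongWritable, Text> output,
                    Reporter reporter) throws IOException {
      if (!dumped) {  // log once per task, not once per record
        System.err.println("LD_LIBRARY_PATH="
            + System.getenv("LD_LIBRARY_PATH"));
        System.err.println("java.library.path="
            + System.getProperty("java.library.path"));
        dumped = true;
      }
      output.collect(key, value);  // pass records through unchanged
    }
  }

Wired in with JobConf.setMapperClass(EnvDumpMapper.class), the values show up in the per-task logs under the JobTracker web UI.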
>> On 4/29/11 5:58 AM, "Donatella Firmani" <donatellafirm...@yahoo.com>
>> wrote:
>>
>> To solve the issue addressed in my previous message, I tried setting the
>> property mapred.child.java.opts in mapred-site.xml. But - even if it
>> seems the right approach, given what is said in blogs & forums - there is
>> a big problem with it.
>>
>> Following the tutorial (Hadoop website), section Task Execution &
>> Environment, my XML looks like:
>>
>> <configuration>
>>   <property>
>>     <name>mapred.job.tracker</name>
>>     <value>localhost:9001</value>
>>   </property>
>>   <property>
>>     <name>mapred.child.java.opts</name>
>>     <value>
>>       -Djava.library.path=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3
>>     </value>
>>   </property>
>> </configuration>
>>
>> The problem arises when executing the job, because an exception is thrown:
>>
>> Exception in thread "main" java.lang.NoClassDefFoundError:
>> -Djava/library/path=/home/mylibpath1/lib1;home/mylibpath2/lib2;home/mylibpath3/lib3
>>
>> Any help would be appreciated.
>> Thanks in advance,
>>
>> DF
>>
>> ----- Original Message ----
>> From: Donatella Firmani <donatellafirm...@yahoo.com>
>> To: mapreduce-user@hadoop.apache.org
>> Sent: Fri, April 29, 2011 12:57:52 PM
>> Subject: Hadoop Mapreduce jobs and LD_LIBRARY_PATH
>>
>> Hi to all,
>>
>> I just subscribed to this mailing list and I'd like to ask if anyone
>> knows how to deal with LD_LIBRARY_PATH.
>> I have a Java application that needs a proper setting of this environment
>> variable to work under Linux-Ubuntu.
>> I want to use this application from a MapReduce job; unfortunately, I
>> could not find a way to make things work with the LD_LIBRARY_PATH
>> environment variable.
>>
>> I tried so many different strategies and I am stuck. Maybe one of you
>> can help.
>>
>> Thanks in advance,
>> Cheers.
>>
>> DF
>>
>> PS: I use hadoop-0.20.2

--
Joseph Echeverria
Cloudera, Inc.
443.305.9434
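A closing note on the NoClassDefFoundError quoted above: the <value> element in that snippet spans several lines, and the child JVM's command line is built from the raw configuration string, so a token that starts with embedded whitespace likely reaches the JVM where it expects a main class name - which would explain the option string itself being reported as a missing class (with '.' shown as '/'). A plausible fix, sketched here but not verified against this exact setup, is to keep the whole value on one line and use the Unix ':' separator (note also that the second and third entries in the thread's snippets are missing a leading '/'):

  <property>
    <name>mapred.child.java.opts</name>
    <!-- Keep the value on a single line so no stray whitespace tokens
         reach the child JVM; java.library.path entries on Linux are
         colon-separated. -->
    <value>-Djava.library.path=/home/mylibpath1/lib1:/home/mylibpath2/lib2:/home/mylibpath3/lib3</value>
  </property>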