RE: Running in cluster mode causes native library linking to fail

2015-10-26 Thread prajod.vettiyattil
Hello guys, after lots of time trying to make things work, I finally found what was causing the issue: I was calling the function from

Re: Running in cluster mode causes native library linking to fail

2015-10-26 Thread Bernardo Vecchia Stein
found out that I did not reply to the group in my original reply. > From: Prajod S Vettiyattil (WT01 - BAS) > Sent: 15 October 2015 11:45 > To: 'Bernardo Vecchia Stein' <bernardovst...@gmail.com> > Subject: RE: Running in cluster mode causes native library linking to fail

RE: Running in cluster mode causes native library linking to fail

2015-10-15 Thread prajod.vettiyattil
Regards, Prajod From: Bernardo Vecchia Stein <bernardovst...@gmail.com> Sent: 15 October 2015 00:36 To: Prajod S Vettiyattil (WT01 - BAS) <prajod.vettiyat...@wipro.com> Subject: Re: Running in cluster mode causes native library linking to fail Hello Prajod, Thanks for your reply

Re: Running in cluster mode causes native library linking to fail

2015-10-14 Thread Bernardo Vecchia Stein
Hi Deenar, Yes, the native library is installed on all machines of the cluster. I tried a simpler approach by just using System.load() and passing the exact path of the library, and things still won't work (I get exactly the same error and message). Any ideas of what might be failing? Thank
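For reference, a minimal sketch of the System.load() approach described above (the path is a placeholder, not the original poster's): System.load() takes the absolute path of the .so file itself and bypasses java.library.path, so the file must exist at that exact path on the driver and on every executor, since each JVM links the library independently.

    // Hypothetical sketch, not the original poster's code.
    // System.load() needs the full path to the shared object; the path
    // must be valid on every node that runs an executor.
    System.load("/opt/native/libmynative.so")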

Re: Running in cluster mode causes native library linking to fail

2015-10-14 Thread Renato Marroquín Mogrovejo
Hi Bernardo, So is this in distributed mode, or single node? Maybe fix the issue with a single node first ;) You are right that Spark finds the library but not the *.so file. I also use System.load() with LD_LIBRARY_PATH set, and I am able to execute without issues. Maybe you'd like to double

Re: Running in cluster mode causes native library linking to fail

2015-10-14 Thread Bernardo Vecchia Stein
Hi Renato, I am using a single master and a single worker node, both on the same machine, to simplify everything. I have tested with System.loadLibrary() as well (setting all the necessary paths) and get the same error. Just double-checked everything and the parameters are fine. Bernardo

Re: Running in cluster mode causes native library linking to fail

2015-10-14 Thread Renato Marroquín Mogrovejo
Sorry Bernardo, I just double-checked. I use: System.loadLibrary(); Could you also try that? Renato M.
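A minimal sketch of the variant Renato suggests, with a placeholder library name: System.loadLibrary() takes the bare name (no "lib" prefix, no ".so" suffix) and resolves it against java.library.path.

    // Hypothetical sketch: "mynative" resolves to libmynative.so on Linux,
    // searched along java.library.path, which the JVM seeds from
    // LD_LIBRARY_PATH (and which Spark's extraLibraryPath settings extend).
    System.loadLibrary("mynative")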

Re: Running in cluster mode causes native library linking to fail

2015-10-14 Thread Renato Marroquín Mogrovejo
You can also try setting the env variable LD_LIBRARY_PATH to point to where your compiled libraries are. Renato M.
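One common way to do this is through conf/spark-env.sh on each node, or per application via spark.executorEnv; the directory below is a placeholder, and this is only a sketch of the suggestion, not a confirmed fix for the thread's problem.

    # conf/spark-env.sh on every node (placeholder directory):
    export LD_LIBRARY_PATH=/opt/native:$LD_LIBRARY_PATH

    # or per job, for the executor processes only:
    spark-submit --conf spark.executorEnv.LD_LIBRARY_PATH=/opt/native ...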

Re: Running in cluster mode causes native library linking to fail

2015-10-14 Thread Bernardo Vecchia Stein
Hi Renato, I have done that as well, but so far no luck. I believe Spark is finding the library correctly; otherwise the error message would be "no libraryname found" or something like that. The problem seems to be something else, and I'm not sure how to find it. Thanks, Bernardo

Re: Running in cluster mode causes native library linking to fail

2015-10-13 Thread Deenar Toraskar
Hi Bernardo, Is the native library installed on all machines of your cluster, and are you setting both spark.driver.extraLibraryPath and spark.executor.extraLibraryPath? Deenar
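For reference, a sketch of how these two properties are usually passed at submit time; the library directory, application class, and jar name are placeholders.

    spark-submit \
      --conf spark.driver.extraLibraryPath=/opt/native \
      --conf spark.executor.extraLibraryPath=/opt/native \
      --class com.example.NativeApp nativeapp.jar

Both settings add the given directory to the native library search path of the corresponding JVM, which is what System.loadLibrary() relies on.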

Running in cluster mode causes native library linking to fail

2015-10-13 Thread Bernardo Vecchia Stein
Hello, I am trying to run some Scala code in cluster mode using spark-submit. This code uses addLibrary to link with a .so that exists on the machine, and this library has a function to be called natively (there's a native definition as needed in the code). The problem I'm facing is: whenever I
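For readers following the thread, a minimal Scala sketch of this kind of setup; the library name, method name, and the use of System.loadLibrary are placeholders rather than the original poster's code.

    object NativeBridge {
      // Load the shared object once per JVM. In cluster mode this also runs
      // inside every executor JVM, which is where the linking failure
      // discussed in this thread shows up.
      System.loadLibrary("mynative")  // placeholder name -> libmynative.so

      // Placeholder declaration of a function implemented in the .so.
      @native def compute(x: Int): Int
    }

Calling NativeBridge.compute inside, say, rdd.map(...) initializes the object lazily on each executor, so the library has to be loadable there as well as on the driver.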