Hello guys,

After a lot of time spent trying to make things work, I finally found what
was causing the issue:

I was calling the function from the library inside a map function, which
caused the code inside it to run on the executors instead of on the
driver. Since only the driver had loaded the library, the executors would
then raise an error. The tricky part is that the same error message
appeared in both the driver's and the executors' logs, which led me to
believe it was a global error. Only after testing things *only* on the
driver did I discover that everything worked.

For future reference: if you are running into this issue, check whether
you are also loading the library on the executors! In the case of my map
example, the fix was to create a wrapper function that 1) loads the
library and then 2) calls the function within it, and then to map over
this wrapper function. This way, every executor is guaranteed to load the
library as well. A minimal sketch of the pattern is below.
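
Something along these lines (all names here are hypothetical placeholders:
NativeLib, "mylib" and the byte-array signature stand in for my actual
code, and the .so must export the JNI symbol matching the generated class
name):

  import org.apache.spark.{SparkConf, SparkContext}

  object NativeLib {
    // A Scala object's body runs once per JVM, on first use -- the
    // analogue of a Java static { } initializer.
    System.loadLibrary("mylib") // expects libmylib.so on java.library.path
    def ensureLoaded(): Unit = () // referencing the object forces the load
    @native def nativeMethod(a: Array[Byte], b: Array[Byte]): Array[Byte]
  }

  object Demo {
    // The wrapper: 1) load the library, then 2) call into it.
    def wrapper(a: Array[Byte], b: Array[Byte]): Array[Byte] = {
      NativeLib.ensureLoaded()
      NativeLib.nativeMethod(a, b)
    }

    def main(args: Array[String]): Unit = {
      val sc = new SparkContext(new SparkConf().setAppName("native-demo"))
      val data = sc.parallelize(Seq((Array[Byte](1), Array[Byte](2))))
      // Mapping over the wrapper, not the raw native call, ensures every
      // executor JVM loads the library before invoking it.
      val results = data.map { case (a, b) => wrapper(a, b) }.collect()
      println(results.length)
      sc.stop()
    }
  }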

I'd like to thank Prajod, Renato and Deenar for the help.

Bernardo

On 15 October 2015 at 03:27, <prajod.vettiyat...@wipro.com> wrote:

> Forwarding to the group, in case someone else has the same error. I just
> realized that I did not reply to the group in my original message.
>
>
>
> *From:* Prajod S Vettiyattil (WT01 - BAS)
> *Sent:* 15 October 2015 11:45
> *To:* 'Bernardo Vecchia Stein' <bernardovst...@gmail.com>
> *Subject:* RE: Running in cluster mode causes native library linking to
> fail
>
>
>
> Hi,
>
>
>
> Also try the path settings given here:
> http://stackoverflow.com/questions/12279833/videocapture-opencv-2-4-2-error-in-windows/29278322#29278322
>
>
>
> I forgot to add these links in my earlier response:
>
> https://blogs.oracle.com/darcy/entry/purging_ld_library_path
>
> http://www.oracle.com/technetwork/java/javase/jdk7-relnotes-418459.html
>
>
>
> So from Java 7 onward, LD_LIBRARY_PATH is ignored. This applies to Linux
> and Solaris, and probably to all other Unix derivatives.
>
>
>
> Also check: System.loadLibrary() should be inside a static { } block. The
> loadLibrary function has to be called at class-load time; that is why the
> static block is required. A rough sketch of the idea is below.
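>
> In Scala (which this thread uses), the body of an object plays the role
> of the Java static { } block: it runs once, when the class is
> initialized. A hypothetical sketch, with a placeholder library name:
>
> object NativeLoader {
>   // Runs once per JVM, when this object is first referenced -- the
>   // Scala counterpart of a Java static { } initializer.
>   System.loadLibrary("nativelib") // placeholder name for your .so
> }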
>
>
>
> What is your:
>
> 1. Spark version
>
> 2. OS type and version
>
> 3. Library that you are trying to load?
>
>
>
> [I was using OpenCV. I had to go through many trials to get it working
> consistently. Initially, it would work only on the dev environment
> (Windows) but not on Ubuntu. It's been a few months. There is a Stack
> Overflow answer I have given regarding this:
> http://stackoverflow.com/questions/12279833/videocapture-opencv-2-4-2-error-in-windows/29278322#29278322
> ]
>
>
>
> Regards,
>
> Prajod
>
>
>
> *From:* Bernardo Vecchia Stein [mailto:bernardovst...@gmail.com]
> *Sent:* 15 October 2015 00:36
> *To:* Prajod S Vettiyattil (WT01 - BAS) <prajod.vettiyat...@wipro.com>
> *Subject:* Re: Running in cluster mode causes native library linking to
> fail
>
>
>
> Hello Prajod,
>
> Thanks for your reply! I am also using the standalone cluster manager. I
> do not build the jars in Eclipse, nor do I use Maven; they are built by
> hand with sbt.
>
> I was setting LD_LIBRARY_PATH and LIBRARY_PATH to point to the paths with
> the library. When I didn't set them and set only PATH instead, Spark would
> simply not find the libraries (a different error). I'm not sure what
> version you are using, but it appears I do have to set LD_LIBRARY_PATH in
> order to make things work.
>
> I tried a simpler approach using System.load() with a specific path to the
> library, so I don't have to deal with these path variables. However, I
> still get exactly the same error when executing in cluster mode. Do you
> have any idea why that might be failing?
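>
> For reference, the simpler approach was of this shape (the path here is a
> hypothetical placeholder):
>
> // Absolute path: java.library.path / PATH are not consulted at all
> System.load("/full/path/to/libmylib.so")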
>
> Thank you again for your attention,
>
> Bernardo
>
>
>
> On 14 October 2015 at 03:30, <prajod.vettiyat...@wipro.com> wrote:
>
> Hi,
>
>
>
> I have successfully made this work using the “standalone” cluster
> manager. I have not tried Mesos or YARN.
>
>
>
> Which of these cluster managers are you using?
> https://spark.apache.org/docs/1.1.0/cluster-overview.html
>
> · Standalone <https://spark.apache.org/docs/1.1.0/spark-standalone.html> –
> a simple cluster manager included with Spark that makes it easy to set up
> a cluster.
>
> · Apache Mesos <https://spark.apache.org/docs/1.1.0/running-on-mesos.html> –
> a general cluster manager that can also run Hadoop MapReduce and service
> applications.
>
> · Hadoop YARN <https://spark.apache.org/docs/1.1.0/running-on-yarn.html> –
> the resource manager in Hadoop 2.
>
>
>
> I have run Spark using Scala in cluster mode, using the standalone cluster
> manager. It took a lot of effort. Also, I think that “UnsatisfiedLinkError”
> means that your .so (or a symbol within it) could not be found.
>
>
>
> There are two settings to make this work:
>
> 1. “native library location” in the Eclipse configuration (my jar for
> spark_submit() was built using the Maven build from Eclipse). I think that
> value (native library location) goes into the jar manifest.
>
> 2. The PATH environment variable has to be set to include the full path of
> the directory where your .so is located. LD_LIBRARY_PATH was where this
> was done in earlier versions of Java (prior to Java 6), but it now seems
> to be deprecated in favor of PATH. You will therefore find a lot of
> answers on the internet telling you to set LD_LIBRARY_PATH, but that
> variable is ignored now. A quick way to check what each JVM actually sees
> is sketched below.
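>
> For example (hypothetical, assuming rdd is any existing RDD), a quick
> check of where each JVM looks for native libraries -- the println inside
> the map goes to the executors' stdout logs, not the driver's:
>
> println(System.getProperty("java.library.path")) // the driver's view
> rdd.map { x =>
>   println(System.getProperty("java.library.path")) // each executor's view
>   x
> }.count()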
>
>
>
> Also note that the environment variable settings have to be made on each
> machine where a Spark Worker is running. This also requires that you
> understand where your app code is executed: on the driver (master machine)
> or on the executors (worker machines).
>
>
>
> Prajod
>
>
>
> *From:* Bernardo Vecchia Stein [mailto:bernardovst...@gmail.com]
> *Sent:* 14 October 2015 10:15
> *To:* user@spark.apache.org
> *Subject:* Running in cluster mode causes native library linking to fail
>
>
>
> Hello,
>
> I am trying to run some Scala code in cluster mode using spark-submit.
> This code uses addLibrary to link against a .so that exists on the
> machine, and this library has a function that is called natively (there is
> a corresponding native definition in the code, roughly as sketched below).
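>
> (A hypothetical sketch of such a declaration -- the class and method names
> are placeholders; the real ones appear in the error message below:)
>
> class ClassName {
>   // JNI signature ([B[B)[B: takes two byte arrays, returns a byte array
>   @native def nativeMethod(a: Array[Byte], b: Array[Byte]): Array[Byte]
> }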
>
> The problem I'm facing is: whenever I try to run this code in cluster
> mode, Spark fails with the following message when trying to execute the
> native function:
>
> java.lang.UnsatisfiedLinkError:
> org.name.othername.ClassName.nativeMethod([B[B)[B
>
>
> Apparently, the library is being found by Spark, but the required function
> isn't.
>
> When trying to run in client mode, however, this doesn't fail and
> everything works as expected.
>
> Does anybody have any idea of what might be the problem here? Is there any
> bug that could be related to this when running in cluster mode?
>
> I appreciate any help.
>
> Thanks,
>
> Bernardo
>
