I guess you didn't install the R package `genalg` on all worker nodes. It is
not a built-in package in base R, so you need to install it on all worker
nodes manually, or run `install.packages` inside your SparkR UDF.
Regarding how to download third-party packages and install them inside a
SparkR UDF, please refer to this test case:
https://github.com/apache/spark/blob/master/R/pkg/tests/fulltests/test_context.R#L171
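
A minimal sketch of the second approach (installing the package inside the
UDF) might look like the following. The function name `run_ga`, the CRAN
mirror URL, and the `rbga` arguments here are illustrative, not from the
original script:

```r
library(SparkR)
sparkR.session()

run_ga <- function(seed) {
  # Install 'genalg' into the worker's library if it is missing; each
  # worker evaluates this, so the package ends up on every node.
  if (!requireNamespace("genalg", quietly = TRUE)) {
    install.packages("genalg", repos = "https://cloud.r-project.org")
  }
  library(genalg)
  set.seed(seed)
  # Toy rbga call: minimize the sum of two real-valued genes in [0, 1].
  result <- rbga(stringMin = c(0, 0), stringMax = c(1, 1),
                 evalFunc = function(x) sum(x),
                 iters = 20)
  min(result$best)
}

results <- spark.lapply(seq_len(4), run_ga)
```

Note that calling `library(genalg)` only in the driver is not enough: the
body of the UDF is executed in fresh worker R processes, which is why the
load (and, if needed, the install) has to happen inside the function itself.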

Thanks
Yanbo

On Tue, Sep 5, 2017 at 6:42 AM, Felix Cheung <felixcheun...@hotmail.com>
wrote:

> Can you include the code you call spark.lapply?
>
>
> ------------------------------
> *From:* patcharee <patcharee.thong...@uni.no>
> *Sent:* Sunday, September 3, 2017 11:46:40 PM
> *To:* user@spark.apache.org
> *Subject:* sparkR 3rd library
>
> Hi,
>
> I am using spark.lapply to execute an existing R script in standalone
> mode. This script calls a function 'rbga' from a 3rd library 'genalg'.
> This rbga function works fine in sparkR env when I call it directly, but
> when I apply this to spark.lapply I get the error
>
> could not find function "rbga"
>      at org.apache.spark.api.r.RRunner.compute(RRunner.scala:108)
>      at org.apache.spark.api.r.BaseRRDD.compute(RRDD.scala:51)
>      at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
>      at org.apache.spark.rdd.RDD.iterator(RDD.scala
>
> Any ideas/suggestions?
>
> BR, Patcharee
>
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
