[ https://issues.apache.org/jira/browse/SPARK-12019?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shivaram Venkataraman resolved SPARK-12019.
-------------------------------------------
       Resolution: Fixed
    Fix Version/s: 1.6.1
                   2.0.0

Issue resolved by pull request 10034
[https://github.com/apache/spark/pull/10034]

> SparkR.init does not support character vector for sparkJars and sparkPackages
> -----------------------------------------------------------------------------
>
>                 Key: SPARK-12019
>                 URL: https://issues.apache.org/jira/browse/SPARK-12019
>             Project: Spark
>          Issue Type: Bug
>          Components: R, SparkR
>    Affects Versions: 1.5.0, 1.5.1, 1.5.2
>            Reporter: liushiqi9
>            Priority: Minor
>             Fix For: 2.0.0, 1.6.1
>
>
> https://spark.apache.org/docs/1.5.2/api/R/sparkR.init.html
> The example says to initialize the sparkJars argument with
> sparkJars=c("jarfile1.jar","jarfile2.jar")
> But when I try this in RStudio, it actually gives me a warning:
> Warning message:
> In if (jars != "") { :
>   the condition has length > 1 and only the first element will be used
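> The warning comes from plain R semantics: if() expects a length-one
> logical, but comparing a character vector against "" yields one logical
> per element, so only the first jar is consulted. A minimal sketch in
> plain R (illustrative only, not the actual SparkR source):
>
>     jars <- c("jarfile1.jar", "jarfile2.jar")
>     jars != ""                     # logical vector of length 2: TRUE TRUE
>     if (jars != "") message("ok")  # warns; only jars[1] is used
>     paste(jars, collapse = ",")    # "jarfile1.jar,jarfile2.jar"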
> You can also see it in the logs:
> Launching java with spark-submit command 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/phemi-spark/spark/bin/spark-submit
>  --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-datasource.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-spark-lib-1.0-all.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd 
> So it seems to be passing the two jars to two separate sparkr-shell
> invocations. And on the Spark UI Environment page I only see the first
> jar.
> The workaround that works is to pass a single comma-separated string:
> sparkJars=c("jarfile1.jar,jarfile2.jar")


