[jira] [Updated] (SPARK-12019) SparkR.init does not support character vector for sparkJars and sparkPackages

2015-12-24 Thread Sean Owen (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-12019?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-12019:
--
Fix Version/s: (was: 1.6.1)
   (was: 2.0.0)
   1.6.0

> SparkR.init does not support character vector for sparkJars and sparkPackages
> -
>
> Key: SPARK-12019
> URL: https://issues.apache.org/jira/browse/SPARK-12019
> Project: Spark
>  Issue Type: Bug
>  Components: R, SparkR
>Affects Versions: 1.4.0, 1.4.1, 1.5.0, 1.5.1, 1.5.2
>Reporter: liushiqi9
>Assignee: Felix Cheung
>Priority: Minor
> Fix For: 1.6.0
>
>
> https://spark.apache.org/docs/1.5.2/api/R/sparkR.init.html
> The example says to initialize the sparkJars variable with
> sparkJars=c("jarfile1.jar","jarfile2.jar")
> But when I try this in RStudio, it actually gives me a warning:
> Warning message:
> In if (jars != "") { :
>   the condition has length > 1 and only the first element will be used
> and you can see in the logs:
> Launching java with spark-submit command 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/phemi-spark/spark/bin/spark-submit
>  --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-datasource.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-spark-lib-1.0-all.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd 
> So I think it tries to upload these two jars into two different shells. And on 
> the Spark UI environment page I only see the first jar.
> The right way to do it is:
> sparkJars=c("jarfile1.jar,jarfile2.jar")






[jira] [Updated] (SPARK-12019) SparkR.init does not support character vector for sparkJars and sparkPackages

2015-12-24 Thread Sean Owen (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-12019?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-12019:
--
Fix Version/s: 2.0.0

> SparkR.init does not support character vector for sparkJars and sparkPackages
> -
>
> Key: SPARK-12019
> URL: https://issues.apache.org/jira/browse/SPARK-12019
> Project: Spark
>  Issue Type: Bug
>  Components: R, SparkR
>Affects Versions: 1.4.0, 1.4.1, 1.5.0, 1.5.1, 1.5.2
>Reporter: liushiqi9
>Assignee: Felix Cheung
>Priority: Minor
> Fix For: 1.6.0, 2.0.0
>
>
> https://spark.apache.org/docs/1.5.2/api/R/sparkR.init.html
> The example says to initialize the sparkJars variable with
> sparkJars=c("jarfile1.jar","jarfile2.jar")
> But when I try this in RStudio, it actually gives me a warning:
> Warning message:
> In if (jars != "") { :
>   the condition has length > 1 and only the first element will be used
> and you can see in the logs:
> Launching java with spark-submit command 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/phemi-spark/spark/bin/spark-submit
>  --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-datasource.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-spark-lib-1.0-all.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd 
> So I think it tries to upload these two jars into two different shells. And on 
> the Spark UI environment page I only see the first jar.
> The right way to do it is:
> sparkJars=c("jarfile1.jar,jarfile2.jar")






[jira] [Updated] (SPARK-12019) SparkR.init does not support character vector for sparkJars and sparkPackages

2015-12-03 Thread Shivaram Venkataraman (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-12019?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Shivaram Venkataraman updated SPARK-12019:
--
Assignee: Felix Cheung

> SparkR.init does not support character vector for sparkJars and sparkPackages
> -
>
> Key: SPARK-12019
> URL: https://issues.apache.org/jira/browse/SPARK-12019
> Project: Spark
>  Issue Type: Bug
>  Components: R, SparkR
>Affects Versions: 1.5.0, 1.5.1, 1.5.2
>Reporter: liushiqi9
>Assignee: Felix Cheung
>Priority: Minor
> Fix For: 1.6.1, 2.0.0
>
>
> https://spark.apache.org/docs/1.5.2/api/R/sparkR.init.html
> The example says to initialize the sparkJars variable with
> sparkJars=c("jarfile1.jar","jarfile2.jar")
> But when I try this in RStudio, it actually gives me a warning:
> Warning message:
> In if (jars != "") { :
>   the condition has length > 1 and only the first element will be used
> and you can see in the logs:
> Launching java with spark-submit command 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/phemi-spark/spark/bin/spark-submit
>  --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-datasource.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-spark-lib-1.0-all.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd 
> So I think it tries to upload these two jars into two different shells. And on 
> the Spark UI environment page I only see the first jar.
> The right way to do it is:
> sparkJars=c("jarfile1.jar,jarfile2.jar")






[jira] [Updated] (SPARK-12019) SparkR.init does not support character vector for sparkJars and sparkPackages

2015-12-03 Thread Felix Cheung (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-12019?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Felix Cheung updated SPARK-12019:
-
Affects Version/s: 1.4.0
   1.4.1

> SparkR.init does not support character vector for sparkJars and sparkPackages
> -
>
> Key: SPARK-12019
> URL: https://issues.apache.org/jira/browse/SPARK-12019
> Project: Spark
>  Issue Type: Bug
>  Components: R, SparkR
>Affects Versions: 1.4.0, 1.4.1, 1.5.0, 1.5.1, 1.5.2
>Reporter: liushiqi9
>Assignee: Felix Cheung
>Priority: Minor
> Fix For: 1.6.1, 2.0.0
>
>
> https://spark.apache.org/docs/1.5.2/api/R/sparkR.init.html
> The example says to initialize the sparkJars variable with
> sparkJars=c("jarfile1.jar","jarfile2.jar")
> But when I try this in RStudio, it actually gives me a warning:
> Warning message:
> In if (jars != "") { :
>   the condition has length > 1 and only the first element will be used
> and you can see in the logs:
> Launching java with spark-submit command 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/phemi-spark/spark/bin/spark-submit
>  --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-datasource.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-spark-lib-1.0-all.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd 
> So I think it tries to upload these two jars into two different shells. And on 
> the Spark UI environment page I only see the first jar.
> The right way to do it is:
> sparkJars=c("jarfile1.jar,jarfile2.jar")






[jira] [Updated] (SPARK-12019) SparkR.init does not support character vector for sparkJars and sparkPackages

2015-12-02 Thread Felix Cheung (JIRA)

 [ https://issues.apache.org/jira/browse/SPARK-12019?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Felix Cheung updated SPARK-12019:
-
Summary: SparkR.init does not support character vector for sparkJars and 
sparkPackages  (was: SparkR.init have wrong example)

> SparkR.init does not support character vector for sparkJars and sparkPackages
> -
>
> Key: SPARK-12019
> URL: https://issues.apache.org/jira/browse/SPARK-12019
> Project: Spark
>  Issue Type: Bug
>  Components: R, SparkR
>Affects Versions: 1.5.0, 1.5.1, 1.5.2
>Reporter: liushiqi9
>Priority: Minor
>
> https://spark.apache.org/docs/1.5.2/api/R/sparkR.init.html
> The example says to initialize the sparkJars variable with
> sparkJars=c("jarfile1.jar","jarfile2.jar")
> But when I try this in RStudio, it actually gives me a warning:
> Warning message:
> In if (jars != "") { :
>   the condition has length > 1 and only the first element will be used
> and you can see in the logs:
> Launching java with spark-submit command 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/phemi-spark/spark/bin/spark-submit
>  --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-datasource.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd --jars 
> /home/liushiqi9/dev/data-science/tools/phemi-spark-zeppelin-client/phemi/libs/phemi-spark-lib-1.0-all.jar
>   sparkr-shell /tmp/RtmpThLAQn/backend_port39cd33f76fcd 
> So I think it tries to upload these two jars into two different shells. And on 
> the Spark UI environment page I only see the first jar.
> The right way to do it is:
> sparkJars=c("jarfile1.jar,jarfile2.jar")


