[jira] [Commented] (SPARK-18810) SparkR install.spark does not work for RCs, snapshots
[ https://issues.apache.org/jira/browse/SPARK-18810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15739257#comment-15739257 ]

Apache Spark commented on SPARK-18810:
--------------------------------------

User 'felixcheung' has created a pull request for this issue:
https://github.com/apache/spark/pull/16248

> SparkR install.spark does not work for RCs, snapshots
> -----------------------------------------------------
>
>                 Key: SPARK-18810
>                 URL: https://issues.apache.org/jira/browse/SPARK-18810
>             Project: Spark
>          Issue Type: Bug
>          Components: SparkR
>    Affects Versions: 2.0.2, 2.1.0
>            Reporter: Shivaram Venkataraman
>            Assignee: Felix Cheung
>
> We publish source archives of the SparkR package now in RCs and in nightly
> snapshot builds. One of the problems that still remains is that
> `install.spark` does not work for these as it looks for the final Spark
> version to be present in the apache download mirrors.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
[ https://issues.apache.org/jira/browse/SPARK-18810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15736230#comment-15736230 ]

Felix Cheung commented on SPARK-18810:
--------------------------------------

Also, to expand on the earlier note above, I think the main thing is to be able to run existing tests, build vignettes, and so on:
- without having to change any code, and
- without having to manually call install.spark in a separate session first to cache the Spark jar.

This is why I think it makes sense to have an environment variable override instead of an API parameter switch.
[ https://issues.apache.org/jira/browse/SPARK-18810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15736233#comment-15736233 ]

Shivaram Venkataraman commented on SPARK-18810:
-----------------------------------------------

Yeah, I think that sounds good. This need not be an advertised feature that we tell users about, but more of a flag we use for testing.
[ https://issues.apache.org/jira/browse/SPARK-18810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15736217#comment-15736217 ]

Felix Cheung commented on SPARK-18810:
--------------------------------------

For an RC, install.spark actually expects a subdirectory named `spark-2.1.0` (i.e. exactly the version), so it doesn't match `spark-2.1.0-rc2-bin`:
https://github.com/apache/spark/blob/39e2bad6a866d27c3ca594d15e574a1da3ee84cc/R/pkg/R/install.R#L71
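The path construction behind that line can be sketched roughly as follows. This is a simplified illustration, not the actual install.R code; the variable names and the Hadoop version here are assumptions for the sake of the example:

```r
# Simplified sketch of how install.spark assembles the download URL
# (illustrative only; see R/pkg/R/install.R for the real logic).
version <- "2.1.0"
hadoopVersion <- "2.7"
mirrorUrl <- "https://archive.apache.org/dist/spark"

packageName <- paste0("spark-", version, "-bin-hadoop", hadoopVersion)
# The version-only subdirectory is the sticking point: an RC is published
# under "spark-2.1.0-rc2-bin", not "spark-2.1.0", so this URL misses it.
packageRemotePath <- paste0(
  file.path(mirrorUrl, paste0("spark-", version), packageName), ".tgz")
cat(packageRemotePath, "\n")
```

For the RC discussed above, this scheme yields a URL under a `spark-2.1.0/` directory, while the RC artifacts actually sit under `spark-2.1.0-rc2-bin/`, hence the mismatch.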
[ https://issues.apache.org/jira/browse/SPARK-18810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15736182#comment-15736182 ]

Shivaram Venkataraman commented on SPARK-18810:
-----------------------------------------------

I think the snapshot case and the RC case are probably a bit different.
- In the case of RCs, the artifact name matches what would be the final release (for example http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc2-bin/), so we only need to change the base URL (an environment variable could work for this).
- For nightly builds, the artifact name also changes, and this probably needs some more thought.

I guess having a way to override the entire URL would solve both cases?
[ https://issues.apache.org/jira/browse/SPARK-18810?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15736161#comment-15736161 ]

Felix Cheung commented on SPARK-18810:
--------------------------------------

I've found the same issue while testing as well, and was going to propose a change to support this. Essentially, for snapshot and RC builds the jar is not on the Apache mirrors, so install.spark is unable to download it, and we need a way to override the URL. (Details: because install.spark constructs the URL from a base URL and a version path, it expects the release to follow a certain directory structure. That structure does not match how the snapshot and RC builds are published, so the override needs to cover the entire URL.)

I propose an environment variable rather than a parameter, since we want to be able to run everything the same way without having to make code changes.
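The proposal could look roughly like the sketch below. This is a hedged illustration only; the environment variable name `SPARKR_RELEASE_DOWNLOAD_URL` and the helper function are hypothetical, not the API that was actually merged:

```r
# Hypothetical sketch of an environment-variable override for the full
# download URL; the variable name is illustrative, not a confirmed API.
resolveDownloadUrl <- function(defaultUrl) {
  override <- Sys.getenv("SPARKR_RELEASE_DOWNLOAD_URL", unset = "")
  if (nzchar(override)) {
    # Use the override URL as-is, so RC and snapshot layouts need
    # no special-casing in the path-construction logic.
    override
  } else {
    defaultUrl
  }
}
```

A test run could then set the variable once, e.g. `Sys.setenv("SPARKR_RELEASE_DOWNLOAD_URL" = "http://people.apache.org/~pwendell/spark-releases/spark-2.1.0-rc2-bin")`, and run the existing tests and vignette builds with no code changes.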