GitHub user felixcheung opened a pull request:

    https://github.com/apache/spark/pull/16077

    [SPARK-18643][SPARKR] SparkR hangs at session start when installed as a package without Spark

    ## What changes were proposed in this pull request?
    
    If SparkR is running as a package and has previously downloaded the Spark JAR, it should be able to run as before without SPARK_HOME being set. With this bug, the auto-installed Spark only works in the first session.
    
    This appears to be a regression from the earlier behavior.
    The fix is to always install Spark, or check for the cached copy, when running in an interactive session.
    As discussed before, Spark should probably be auto-installed only when running in an interactive session (R shell, RStudio, etc.)
    
    ## How was this patch tested?
    
    Manually


You can merge this pull request into a Git repository by running:

    $ git pull https://github.com/felixcheung/spark rsessioninteractive

Alternatively you can review and apply these changes as the patch at:

    https://github.com/apache/spark/pull/16077.patch

To close this pull request, make a commit to your master/trunk branch
with (at least) the following in the commit message:

    This closes #16077
    
----
commit 866727d775c45bc8f2f6891ab685f3b6e20109b3
Author: Felix Cheung <[email protected]>
Date:   2016-11-30T06:16:54Z

    install or check for cached installation if interactive

----

