Github user sun-rui commented on the pull request:

    https://github.com/apache/spark/pull/7139#issuecomment-121884026
  
    @brkyvz, more comments :)
    3. Is it possible to run the R-specific tests only when -Psparkr (the
sparkr profile) is specified? If the sparkr profile is specified when building
Spark, it is reasonable to assume that the user has R installed on the machine.
(See the sketch after these comments for one way to gate such tests.)
    
    4. Just FYI, SparkR now supports distributing the binary SparkR package to
the driver and worker nodes; you can take a look at
https://issues.apache.org/jira/browse/SPARK-6797. Is it possible to base your
work on this?
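
    Regarding 3., a minimal sketch of how the JVM-side R tests could skip
themselves when the profile is not active, assuming the sparkr profile passes a
system property to the test JVM. The property name spark.test.sparkr and the
suite below are purely illustrative, not Spark's actual build or test code.

```scala
import org.scalatest.FunSuite

// Illustrative suite: the "spark.test.sparkr" property name is hypothetical;
// the idea is that the Maven sparkr profile would set it for the test JVM.
class RPackagingSuite extends FunSuite {

  // True only when the build opted in to R tests, e.g. via -Psparkr.
  private val rTestsEnabled = sys.props.get("spark.test.sparkr").contains("true")

  test("build and install an R package") {
    // assume() cancels (rather than fails) the test when the condition does
    // not hold, so machines without R installed are not reported as broken.
    assume(rTestsEnabled, "R-specific tests run only with the sparkr profile")
    // ... the actual test body that shells out to R would go here ...
  }
}
```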


