Github user felixcheung commented on a diff in the pull request:
https://github.com/apache/spark/pull/14179#discussion_r70904015
--- Diff: R/pkg/R/sparkR.R ---
@@ -155,6 +155,9 @@ sparkR.sparkContext <- function(
existingPort <- Sys.getenv("EXISTING_SPARKR_BACKEND_PORT", "")
if (existingPort != "") {
+ if (sparkPackages != "") {
+ warning("--packages flag should be used with spark-submit")
--- End diff ---
@shivaram maybe it should, but sparkR.session() is already called in the sparkR
shell, and calling sparkR.session() again with sparkPackages does nothing:
```
> sparkR.session(sparkPackages = "com.databricks:spark-avro_2.10:2.0.1")
Java ref type org.apache.spark.sql.SparkSession id 1
> read.df("", source = "avro")
16/07/14 23:55:43 ERROR RBackendHandler: loadDF on
org.apache.spark.sql.api.r.SQLUtils failed
Error in invokeJava(isStatic = TRUE, className, methodName, ...) :
org.apache.spark.sql.AnalysisException: Failed to find data source: avro.
Please use Spark package
http://spark-packages.org/package/databricks/spark-avro;
```
@krishnakalyan3 something like "sparkPackages has no effect when using
spark-submit or the sparkR shell; please use the --packages command-line option
instead"
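For context, a minimal self-contained R sketch of the check being discussed, with the message wording proposed above. The placement follows the diff's check in sparkR.sparkContext; the surrounding function and the example package value are not reproduced from the PR and are shown here only for illustration:
```
# Sketch only: mirrors the diff's check against the existing backend port.
# When SparkR is launched via spark-submit or the sparkR shell, the backend
# already exists, so sparkPackages passed to sparkR.session() is ignored.
sparkPackages <- "com.databricks:spark-avro_2.10:2.0.1"  # example value
existingPort <- Sys.getenv("EXISTING_SPARKR_BACKEND_PORT", "")
if (existingPort != "" && sparkPackages != "") {
  warning("sparkPackages has no effect when using spark-submit or the ",
          "sparkR shell; please use the --packages command-line option instead")
}
```
Note that warning() concatenates its character arguments, so the message is emitted as a single string.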