Hi, Joseph,
This is a known issue but not a bug.
This issue does not occur in an interactive SparkR session, but it does occur when you execute an R file.
The reason is that when you execute an R file, the R backend launches before the R interpreter, so there is no opportunity for the sparkPackages argument passed to sparkR.init() to take effect.
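In other words, when submitting an R file the package has to be supplied on the command line, not inside the script; a minimal sketch of the two modes (using the spark-csv coordinates from the question):

```shell
# Non-interactive: the backend JVM is launched by spark-submit itself,
# so the package must be passed here, on the command line:
${SPARK_HOME}/bin/spark-submit --packages com.databricks:spark-csv_2.11:1.4.0 example.R

# Interactive sparkR shell: sparkR.init() launches the backend JVM,
# so sparkPackages works there:
#   sc <- sparkR.init(sparkPackages = "com.databricks:spark-csv_2.11:1.4.0")
```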
Hi all,
I found an issue in SparkR; maybe it's a bug:
When reading a CSV file, the following way works fine:
${SPARK_HOME}/bin/spark-submit --packages com.databricks:spark-csv_2.11:1.4.0
example.R
But the following way gives an error:
sc <- sparkR.init(sparkPackages = "com.databricks:spark-csv_2.11:1.4.0")