Github user srowen commented on a diff in the pull request:

    https://github.com/apache/spark/pull/23098#discussion_r236714704
  
    --- Diff: R/pkg/R/sparkR.R ---
    @@ -269,7 +269,7 @@ sparkR.sparkContext <- function(
     #' sparkR.session("yarn-client", "SparkR", "/home/spark",
     #'                list(spark.executor.memory="4g"),
     #'                c("one.jar", "two.jar", "three.jar"),
    -#'                c("com.databricks:spark-avro_2.11:2.0.1"))
    +#'                c("com.databricks:spark-avro_2.12:2.0.1"))
    --- End diff ---
    
    @felixcheung was the conclusion that we can make this a dummy package? I just want to avoid showing _2.11 usage here.
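
    For context, the roxygen example being edited documents sparkR.session() and its sparkPackages argument. A minimal sketch of the equivalent call, with the package coordinate left as a dummy placeholder (the names of the arguments are spelled out here for clarity; the coordinate string is illustrative only, not a recommendation):

        library(SparkR)

        # Start a SparkR session, supplying extra jars and a Maven package coordinate.
        # The coordinate below is a placeholder, per the dummy-package suggestion above;
        # the current docs show "com.databricks:spark-avro_2.11:2.0.1" at this spot.
        sparkR.session(master = "yarn-client",
                       appName = "SparkR",
                       sparkHome = "/home/spark",
                       sparkConfig = list(spark.executor.memory = "4g"),
                       sparkJars = c("one.jar", "two.jar", "three.jar"),
                       sparkPackages = c("groupId:artifactId:version"))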


---
