Github user felixcheung commented on the issue:

https://github.com/apache/spark/pull/14179

A couple of test files reuse an existing SparkSession/SparkContext by calling `sparkR.sparkContext`, and I think they are now somehow hitting the warning statement added in this PR. I'm not sure which ones are calling `sparkR.sparkContext` with `sparkPackages` set; if you find them, you can add `suppressWarnings()` around the call.

```
SerDe functionality : Error in sparkR.sparkContext(master, appName, sparkHome, sparkConfigMap, :
  (converted from warning) sparkPackages has no effect when using spark-submit or sparkR shell,please use the --packages commandline instead
Calls: test_package ... eval -> eval -> sparkR.session -> sparkR.sparkContext
```
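A minimal sketch of the suggested fix, assuming a test file that starts a session with `sparkPackages` set (the package coordinate below is a hypothetical placeholder, not taken from the PR):

```r
# Hypothetical test-setup snippet: because the test suite is run under
# spark-submit, passing sparkPackages triggers the new warning, which
# testthat converts into an error. Wrapping the call in suppressWarnings()
# keeps the test run from failing while leaving the behavior unchanged.
suppressWarnings(
  sparkR.session(
    master = "local[1]",
    sparkPackages = "com.example:some-package_2.11:1.0.0"  # placeholder coordinate
  )
)
```

Note that `suppressWarnings()` must wrap the call site itself; suppressing warnings globally would hide unrelated problems in the tests.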