[ https://issues.apache.org/jira/browse/SPARK-21752?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16138948#comment-16138948 ]
Jakub Nowacki commented on SPARK-21752:
---------------------------------------
OK, I did one more test and, indeed, on the newest version 2.2.0 (and also
2.1.1) all three configs work fine; though I'm pretty sure one did not work at
least once, maybe that was a coincidence. I investigated further: when I
rolled back to 2.0.2, which I have on a different setup, only
{{PYSPARK_SUBMIT_ARGS}} worked reliably and the other ones didn't; perhaps on
this version the {{config}}-based approaches work non-deterministically. Thus,
this seems to be an issue for versions up to 2.0.2; on the newer ones it
seems to work, but I'm not sure it does all the time.
First question: is there a way to check whether the setup that works on 2.1.1
and 2.2.0 can still stop working on occasion? Also, should we still have some
form of documentation for the safer way of configuration?
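For reference, this is roughly the check I've been using (a rough sketch; {{com.mongodb.spark.MongoSpark}} is just the connector class I happen to test with): it shows whether the key reached the session's conf and whether the driver JVM can actually see a class from the package. It only confirms the current run, so it can't prove the builder path won't fail on another run.
{code}
# Assumes `spark` was built as in the snippets quoted below.
# 1) Did the key reach the session's conf?
print(spark.sparkContext.getConf().get("spark.jars.packages", "<not set>"))

# 2) Can the driver JVM see a class from the package?
#    (com.mongodb.spark.MongoSpark is the class I test with; adjust per package.)
try:
    spark._jvm.java.lang.Class.forName("com.mongodb.spark.MongoSpark")
    print("connector class visible")
except Exception as e:
    print("connector class missing:", e)
{code}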
> Config spark.jars.packages is ignored in SparkSession config
> ------------------------------------------------------------
>
> Key: SPARK-21752
> URL: https://issues.apache.org/jira/browse/SPARK-21752
> Project: Spark
> Issue Type: Bug
> Components: SQL
> Affects Versions: 2.2.0
> Reporter: Jakub Nowacki
>
> If I put the config key {{spark.jars.packages}} using the {{SparkSession}}
> builder as follows:
> {code}
> spark = pyspark.sql.SparkSession.builder \
>     .appName('test-mongo') \
>     .master('local[*]') \
>     .config("spark.jars.packages", "org.mongodb.spark:mongo-spark-connector_2.11:2.2.0") \
>     .config("spark.mongodb.input.uri", "mongodb://mongo/test.coll") \
>     .config("spark.mongodb.output.uri", "mongodb://mongo/test.coll") \
>     .getOrCreate()
> {code}
> the SparkSession gets created, but no package download logs are printed, and
> when I try to use the classes the package should provide (the Mongo connector
> in this case, but it's the same for other packages), I get
> {{java.lang.ClassNotFoundException}} for the missing classes.
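> For illustration, the call that surfaces the error looks like this (assuming
> the connector's data source class, {{com.mongodb.spark.sql.DefaultSource}}):
> {code}
> # With the builder-only config above the package is never downloaded,
> # so this read fails with java.lang.ClassNotFoundException.
> df = spark.read.format("com.mongodb.spark.sql.DefaultSource").load()
> {code}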
> If I use the config file {{conf/spark-defaults.conf}} or the command-line
> option {{--packages}}, e.g.:
> {code}
> import os
> os.environ['PYSPARK_SUBMIT_ARGS'] = \
>     '--packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0 pyspark-shell'
> {code}
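> For completeness, the equivalent {{conf/spark-defaults.conf}} entry is a
> single key-value line, separated by whitespace:
> {code}
> spark.jars.packages    org.mongodb.spark:mongo-spark-connector_2.11:2.2.0
> {code}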
> it works fine. Interestingly, using a {{SparkConf}} object works fine as
> well, e.g.:
> {code}
> conf = pyspark.SparkConf()
> conf.set("spark.jars.packages", "org.mongodb.spark:mongo-spark-connector_2.11:2.2.0")
> conf.set("spark.mongodb.input.uri", "mongodb://mongo/test.coll")
> conf.set("spark.mongodb.output.uri", "mongodb://mongo/test.coll")
> spark = pyspark.sql.SparkSession.builder \
>     .appName('test-mongo') \
>     .master('local[*]') \
>     .config(conf=conf) \
>     .getOrCreate()
> {code}
> The above is in Python, but I've seen the same behavior in other languages,
> though I didn't check R. I have also seen it in older Spark versions.
> It seems that this is the only config key that doesn't work for me via the
> {{SparkSession}} builder config.
> Note that this concerns creating a new {{SparkSession}}, as pulling new
> packages into an existing {{SparkSession}} indeed doesn't make sense. Thus,
> setting the key via the builder would only work in bare Python, Scala or
> Java, and not in {{pyspark}} or {{spark-shell}}, as they create the session
> automatically; in that case one needs to use the {{--packages}} option.
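> For example, with PySpark that would be:
> {code}
> pyspark --packages org.mongodb.spark:mongo-spark-connector_2.11:2.2.0
> {code}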