Hi all,
I have the following snippet, which loads a DataFrame from a CSV file and
tries to save it to MongoDB.
For some reason, the MongoSpark.save method raises the following exception:

Exception in thread "main" java.lang.IllegalArgumentException: Missing database name. Set via the 'spark.mongodb.output.uri' or 'spark.mongodb.output.database' property
    at com.mongodb.spark.config.MongoCompanionConfig$class.databaseName(MongoCompanionConfig.scala:260)
    at com.mongodb.spark.config.WriteConfig$.databaseName(WriteConfig.scala:36)

This is bizarre, as I am pretty sure I am setting all the necessary
properties in the SparkConf.
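
(Or does the exception mean the database has to be embedded in the output
URI itself? I am only guessing at the expected form here, something like

    spark.conf.set("spark.mongodb.output.uri", "mongodb://127.0.0.1:27017/test.spark")

with "test" as the database and "spark" as the collection.)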

Could you kindly assist?

I am running Spark 2.0.1 locally, with a local MongoDB instance running at
127.0.0.1:27017.
I am using version 2.0.0 of mongo-spark-connector.
I am running on Scala 2.11.

Kind regards

    import com.mongodb.spark.MongoSpark
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession
      .builder()
      .master("local")
      .appName("Spark Mongo Example")
      .getOrCreate()

    spark.conf.set("spark.mongodb.input.uri", "mongodb://127.0.0.1:27017/")
    spark.conf.set("spark.mongodb.output.uri", "mongodb://127.0.0.1:27017/")
    spark.conf.set("spark.mongodb.output.database", "test")

    println(s"Spark properties: ${spark.conf.getAll}")


    val df = getDataFrame(spark) // Loading any dataframe from a file

    df.printSchema()

    println(s"Head:${df.head()}")
    println(s"Count:${df.count()}")
    println("##################  SAVING TO MONGODB #####################")
    import com.mongodb.spark.config._

    val writeConfig = WriteConfig(
      Map("collection" -> "spark", "writeConcern.w" -> "majority"),
      Some(WriteConfig(spark.sparkContext)))
    MongoSpark.save(df, writeConfig)
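
One thing I am not sure about: does it matter that I call spark.conf.set only
after getOrCreate()? If WriteConfig(spark.sparkContext) only sees the SparkConf
the context was created with, would the properties need to go on the builder
instead? Roughly what I have in mind (just a sketch of the idea, not something
I have verified, reusing the same "test" database and "spark" collection):

    val sparkWithConf = SparkSession
      .builder()
      .master("local")
      .appName("Spark Mongo Example")
      .config("spark.mongodb.input.uri", "mongodb://127.0.0.1:27017/")
      .config("spark.mongodb.output.uri", "mongodb://127.0.0.1:27017/")
      .config("spark.mongodb.output.database", "test")
      .getOrCreate()

    // The properties would then be part of the SparkContext conf, which is
    // where I assume WriteConfig(sparkWithConf.sparkContext) looks them up:
    // MongoSpark.save(df, WriteConfig(Map("collection" -> "spark"),
    //   Some(WriteConfig(sparkWithConf.sparkContext))))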
