Maher Hattabi created SPARK-20388:
-------------------------------------
Summary: problem loading data from mongoDB 3.0.5 to apache spark
Key: SPARK-20388
URL: https://issues.apache.org/jira/browse/SPARK-20388
Project: Spark
Issue Type: Bug
Components: Spark Core
Affects Versions: 2.0.0
Environment: Windows 7 64-bit, using Scala IDE, MongoDB 3.0.5
Reporter: Maher Hattabi
Priority: Critical
Hi support,
I am using Apache Spark 2.0.0 alongside MongoDB 3.0.5, and I added
mongo-spark-connector_2.11-2.0.0 in order to load data from MongoDB into a
Spark DataFrame. Here is the code I used:
import org.apache.spark.sql.SparkSession
import com.mongodb.spark.sql._
import com.mongodb.spark.config._

object Mongo extends App {
  try {
    val sparkSession = SparkSession.builder().master("local").getOrCreate()

    // Builds a connection URI of the form mongodb://host:port/database.collection
    def makeMongoURI(uri: String, database: String, collection: String): String =
      s"$uri/$database.$collection"

    val mongoURI = "mongodb://127.0.0.1:27017"
    val conf = makeMongoURI(mongoURI, "io", "thing")
    val readConfig: ReadConfig = ReadConfig(Map("uri" -> conf))

    // Uses the ReadConfig to load the collection into a DataFrame
    val df = sparkSession.sqlContext.loadFromMongoDB(readConfig)
    df.printSchema()
  } catch {
    case t: Throwable =>
      t.printStackTrace()
      println(t.getMessage)
  }
}
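The NoClassDefFoundError for com.mongodb.ConnectionString usually means the MongoDB Java driver is not on the classpath; adding only the connector jar by hand leaves its transitive dependencies unresolved. A minimal sketch of a build.sbt that lets sbt resolve them, assuming an sbt build is used (the Scala and Spark versions below are taken from this report, the rest are assumptions):

```scala
// build.sbt -- a minimal sketch, assuming an sbt-managed build
// (Scala IDE can import sbt projects; versions match those named in the report).
scalaVersion := "2.11.8"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % "2.0.0",
  "org.apache.spark" %% "spark-sql"  % "2.0.0",
  // mongo-spark-connector declares the MongoDB Java driver (which provides
  // com.mongodb.ConnectionString) as a transitive dependency, so a dependency
  // manager such as sbt or Maven pulls it in automatically.
  "org.mongodb.spark" %% "mongo-spark-connector" % "2.0.0"
)
```

With this in place the driver jar no longer needs to be added to the project manually.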
and I got the exception below:
7/04/19 10:21:54 INFO SharedState: Warehouse path is 'file:D:\KNet Big Data Analytics Workspace\TestMongoDB/spark-warehouse'.
java.lang.NoClassDefFoundError: com/mongodb/ConnectionString
	at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
	at com.mongodb.spark.config.MongoCompanionConfig$$anonfun$4.apply(MongoCompanionConfig.scala:278)
	at scala.util.Try$.apply(Try.scala:192)
	at com.mongodb.spark.config.MongoCompanionConfig$class.connectionString(MongoCompanionConfig.scala:278)
	at com.mongodb.spark.config.ReadConfig$.connectionString(ReadConfig.scala:39)
	at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:51)
	at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
	at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:124)
	at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:39)
	at TestMongoDB.Mongo$.delayedEndpoint$TestMongoDB$Mongo$1(Mongo.scala:18)
	at TestMongoDB.Mongo$delayedInit$body.apply(Mongo.scala:11)
	at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
	at scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.App$$anonfun$main$1.apply(App.scala:76)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
	at scala.App$class.main(App.scala:76)
	at TestMongoDB.Mongo$.main(Mongo.scala:11)
	at TestMongoDB.Mongo.main(Mongo.scala)
Caused by: java.lang.ClassNotFoundException: com.mongodb.ConnectionString
	at java.net.URLClassLoader.findClass(Unknown Source)
	at java.lang.ClassLoader.loadClass(Unknown Source)
	at sun.misc.Launcher$AppClassLoader.loadClass(Unknown Source)
I put a screenshot of the error at the link below:
http://www.mediafire.com/view/k6x2x0urxr7k5y9/mongodberror.PNG
Thanks in advance.
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)