Ajay Chaudhary created ZEPPELIN-1565:
----------------------------------------

             Summary: Unable to query from Mongodb from Zeppelin using spark
                 Key: ZEPPELIN-1565
                 URL: https://issues.apache.org/jira/browse/ZEPPELIN-1565
             Project: Zeppelin
          Issue Type: Bug
          Components: zeppelin-server
    Affects Versions: 0.6.0
            Reporter: Ajay Chaudhary
            Priority: Critical


Hi Team,

We are trying to query data from a MongoDB database from Zeppelin using 
Spark, and we are getting the exception below.

Can you please look into this and advise what the problem could be? We are 
running the query from a Zeppelin notebook.

Query:

%spark

val options1 = Map(
  "spark.mongodb.input.uri" -> "mongodb://user/password@serverip:37017",
  "spark.mongodb.input.database" -> "dataing",
  "spark.mongodb.input.collection" -> "MEM",
  "spark.mongodb.input.readPreference.name" -> "primaryPreferred")
val df1 = sqlContext.read.format("com.mongodb.spark.sql").options(options1).load()
// MongoSpark.load(sqlContext, options1)

println("df1 Schema:")
df1.printSchema()

df1.registerTempTable("MEM")

val sql1 = "SELECT DB_NAME FROM MEM"
val results1 = sqlContext.sql(sql1)
results1.show()
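
Note: the URI in the query above uses '/' between the user and the password, 
whereas the standard MongoDB connection-string form uses ':' and may also 
carry the database name in its path. A possible alternative form to try (the 
credentials, host, and database below are placeholders, not values from this 
report) would be:

```scala
// Sketch of the standard MongoDB connection-string form:
// mongodb://[user:password@]host[:port][/database]
// All values here are illustrative placeholders.
val uri = "mongodb://user:password@serverip:37017/dataing"

// Equivalent reader options, assuming the same Mongo Spark connector;
// with the database in the URI path, the separate database key may be
// redundant but is kept for clarity.
val options1 = Map(
  "spark.mongodb.input.uri" -> uri,
  "spark.mongodb.input.database" -> "dataing",
  "spark.mongodb.input.collection" -> "MEM",
  "spark.mongodb.input.readPreference.name" -> "primaryPreferred")
```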


Error details:
-------------

options1: scala.collection.immutable.Map[String,String] = Map(spark.mongodb.input.uri -> mongodb://rsinha/rsinha123@10.11.5.78:37017, spark.mongodb.input.database -> dataing, spark.mongodb.input.collection -> MEM, spark.mongodb.input.readPreference.name -> primaryPreferred)
java.lang.IllegalArgumentException: Missing database name. Set via the 'spark.mongodb.input.uri' or 'spark.mongodb.input.database' property
        at com.mongodb.spark.config.MongoCompanionConfig$class.databaseName(MongoCompanionConfig.scala:169)
        at com.mongodb.spark.config.ReadConfig$.databaseName(ReadConfig.scala:35)
        at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:46)
        at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:35)
        at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:83)
        at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:35)
        at com.mongodb.spark.config.MongoCompanionConfig$class.apply(MongoCompanionConfig.scala:73)
        at com.mongodb.spark.config.ReadConfig$.apply(ReadConfig.scala:35)
        at com.mongodb.spark.sql.DefaultSource.connectorAndReadConfig(DefaultSource.scala:127)
        at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:66)
        at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:52)
        at com.mongodb.spark.sql.DefaultSource.createRelation(DefaultSource.scala:37)
        at org.apache.spark.sql.execution.datasources.ResolvedDataSource$.apply(ResolvedDataSource.scala:158)
        at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:119)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:29)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:35)
        at $iwC$$iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:37)
        at $iwC$$iwC$$iwC$$iwC$$iwC.<init>(<console>:39)
        at $iwC$$iwC$$iwC$$iwC.<init>(<console>:41)
        at $iwC$$iwC$$iwC.<init>(<console>:43)
        at $iwC$$iwC.<init>(<console>:45)
        at $iwC.<init>(<console>:47)
        at <init>(<console>:49)
        at .<init>(<console>:53)
        at .<clinit>(<console>)
        at .<init>(<console>:7)
        at .<clinit>(<console>)
        at $print(<console>)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1045)
        at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1326)
        at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:821)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:852)
        at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:800)
        at org.apache.zeppelin.spark.SparkInterpreter.interpretInput(SparkInterpreter.java:810)
        at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:753)
        at org.apache.zeppelin.spark.SparkInterpreter.interpret(SparkInterpreter.java:746)
        at org.apache.zeppelin.interpreter.LazyOpenInterpreter.interpret(LazyOpenInterpreter.java:94)
        at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:341)
        at org.apache.zeppelin.scheduler.Job.run(Job.java:176)
        at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:139)
        at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
        at java.util.concurrent.FutureTask.run(FutureTask.java:262)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:178)
        at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:292)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)




--
This message was sent by Atlassian JIRA
(v6.3.4#6332)