Your tables were registered in the SQLContext, whereas the Thrift server works
with a HiveContext. Today those are two different worlds, so a table registered
in one is not visible from the other.
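
If the goal is for those tables to live in the Hive world, the registration
would go against a HiveContext rather than a plain SQLContext. A minimal sketch
against the Spark 1.1-era Scala API (the Mutation case class and sample rows
below are made up for illustration):

    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    // Illustrative stand-in for your mutation rows.
    case class Mutation(chromosomeid: Int, pos: Long, mrnafeatureid: String, mappedid: String)

    val sc = new SparkContext(new SparkConf().setAppName("mutation-loader").setMaster("local[*]"))

    // The Thrift server runs HiveQL against a HiveContext, not a plain SQLContext.
    val hiveContext = new HiveContext(sc)
    import hiveContext.createSchemaRDD   // implicit RDD[Product] -> SchemaRDD conversion

    val mutations = sc.parallelize(Seq(
      Mutation(1, 10620L, "mrna1", "map1"),
      Mutation(1, 10630L, "mrna2", "map2")))

    // Registers "mutation" as a temporary table in this HiveContext.
    mutations.registerTempTable("mutation")
    hiveContext.sql("SELECT COUNT(DISTINCT pos) FROM mutation").collect().foreach(println)

Even then, a temporary table is only visible to the context (and JVM) that
registered it, so a separately started Thrift server still won't see it; for
that, the data would have to land in the Hive metastore (for example via
SchemaRDD.saveAsTable, or a CREATE TABLE done through Hive).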
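
As a quick way to see what the Thrift server itself knows about, a plain JDBC
check along these lines will list its tables (a sketch, assuming the Hive JDBC
driver is on the classpath and the server is on the default port 10000):

    import java.sql.DriverManager

    // Connect to the Spark SQL Thrift server over the HiveServer2 protocol.
    Class.forName("org.apache.hive.jdbc.HiveDriver")
    val conn = DriverManager.getConnection("jdbc:hive2://localhost:10000/default", "", "")
    val stmt = conn.createStatement()

    // List the tables the server can actually see.
    val rs = stmt.executeQuery("SHOW TABLES")
    while (rs.next()) println(rs.getString(1))
    conn.close()

If "mutation" doesn't show up there, the query from your app has nowhere to go,
which matches the InvalidTableException below.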



On 9/9/14, 5:16 PM, "alexandria1101" <alexandria.shea...@gmail.com> wrote:

>Hi,
>
>I want to use the Spark SQL Thrift server in my application and make sure
>everything is loading and working. I built a Spark 1.1 SNAPSHOT and started
>the Thrift server with ./sbin/start-thriftserver.sh. In my application I load
>tables into SchemaRDDs and expect the Thrift server to pick them up. The app
>then runs SQL queries against a table called mutation (the same name as the
>table I registered from the SchemaRDD).
>
>I set the driver to "org.apache.hive.jdbc.HiveDriver" and the URL to
>"jdbc:hive2://localhost:10000/mutation?zeroDateTimeBehavior=convertToNull".
>
>When I check the terminal output of the Thrift server, it receives the query.
>However, I cannot use a JDBC console to connect to it and list the databases
>and tables to see whether mutation is loaded.
>
>
>I get the following errors:
>
>14/09/09 16:51:02 WARN component.AbstractLifeCycle: FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
>java.net.BindException: Address already in use
>       at sun.nio.ch.Net.bind0(Native Method)
>       at sun.nio.ch.Net.bind(Net.java:444)
>       at sun.nio.ch.Net.bind(Net.java:436)
>       at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
>       at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
>       at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.eclipse.jetty.server.Server.doStart(Server.java:293)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:192)
>       at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
>       at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
>       at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
>       at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>       at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
>       at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:202)
>       at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:224)
>       at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
>       at com.illumina.phoenix.util.Runner.createSparkContext(Runner.java:144)
>       at com.illumina.phoenix.etl.EtlPipelineRunner.main(EtlPipelineRunner.java:116)
>1053 [main] WARN org.eclipse.jetty.util.component.AbstractLifeCycle  - FAILED SelectChannelConnector@0.0.0.0:4040: java.net.BindException: Address already in use
>java.net.BindException: Address already in use
>       at sun.nio.ch.Net.bind0(Native Method)
>       at sun.nio.ch.Net.bind(Net.java:444)
>       at sun.nio.ch.Net.bind(Net.java:436)
>       at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
>       at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
>       at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.eclipse.jetty.server.Server.doStart(Server.java:293)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:192)
>       at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
>       at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
>       at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
>       at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>       at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
>       at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:202)
>       at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:224)
>       at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
>       at com.illumina.phoenix.util.Runner.createSparkContext(Runner.java:144)
>       at com.illumina.phoenix.etl.EtlPipelineRunner.main(EtlPipelineRunner.java:116)
>14/09/09 16:51:02 WARN component.AbstractLifeCycle: FAILED org.eclipse.jetty.server.Server@35241119: java.net.BindException: Address already in use
>java.net.BindException: Address already in use
>       at sun.nio.ch.Net.bind0(Native Method)
>       at sun.nio.ch.Net.bind(Net.java:444)
>       at sun.nio.ch.Net.bind(Net.java:436)
>       at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
>       at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
>       at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.eclipse.jetty.server.Server.doStart(Server.java:293)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:192)
>       at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
>       at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
>       at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
>       at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>       at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
>       at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:202)
>       at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:224)
>       at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
>       at com.illumina.phoenix.util.Runner.createSparkContext(Runner.java:144)
>       at com.illumina.phoenix.etl.EtlPipelineRunner.main(EtlPipelineRunner.java:116)
>1055 [main] WARN org.eclipse.jetty.util.component.AbstractLifeCycle  - FAILED org.eclipse.jetty.server.Server@35241119: java.net.BindException: Address already in use
>java.net.BindException: Address already in use
>       at sun.nio.ch.Net.bind0(Native Method)
>       at sun.nio.ch.Net.bind(Net.java:444)
>       at sun.nio.ch.Net.bind(Net.java:436)
>       at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
>       at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.open(SelectChannelConnector.java:187)
>       at org.eclipse.jetty.server.AbstractConnector.doStart(AbstractConnector.java:316)
>       at org.eclipse.jetty.server.nio.SelectChannelConnector.doStart(SelectChannelConnector.java:265)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.eclipse.jetty.server.Server.doStart(Server.java:293)
>       at org.eclipse.jetty.util.component.AbstractLifeCycle.start(AbstractLifeCycle.java:64)
>       at org.apache.spark.ui.JettyUtils$.org$apache$spark$ui$JettyUtils$$connect$1(JettyUtils.scala:192)
>       at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
>       at org.apache.spark.ui.JettyUtils$$anonfun$3.apply(JettyUtils.scala:202)
>       at org.apache.spark.util.Utils$$anonfun$startServiceOnPort$1.apply$mcVI$sp(Utils.scala:1446)
>       at scala.collection.immutable.Range.foreach$mVc$sp(Range.scala:141)
>       at org.apache.spark.util.Utils$.startServiceOnPort(Utils.scala:1442)
>       at org.apache.spark.ui.JettyUtils$.startJettyServer(JettyUtils.scala:202)
>       at org.apache.spark.ui.WebUI.bind(WebUI.scala:102)
>       at org.apache.spark.SparkContext.<init>(SparkContext.scala:224)
>       at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:53)
>       at com.illumina.phoenix.util.Runner.createSparkContext(Runner.java:144)
>       at com.illumina.phoenix.etl.EtlPipelineRunner.main(EtlPipelineRunner.java:116)
>
>org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 2.0 failed 1 times, most recent failure: Lost task 0.0 in stage 2.0 (TID 17, localhost): org.springframework.jdbc.UncategorizedSQLException: StatementCallback; uncategorized SQLException for SQL [SELECT mrnafeatureid, mappedid, COUNT(DISTINCT pos) FROM mutation WHERE chromosomeid = 1 AND pos BETWEEN 10617 AND 10637 GROUP BY mrnafeatureid, mappedid]; SQL state [null]; error code [0]; org.apache.hadoop.hive.ql.metadata.InvalidTableException: Table not found mutation; nested exception is java.sql.SQLException: org.apache.hadoop.hive.ql.metadata.InvalidTableException: Table not found mutation
>        org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:84)
>        org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
>        org.springframework.jdbc.support.AbstractFallbackSQLExceptionTranslator.translate(AbstractFallbackSQLExceptionTranslator.java:81)
>        org.springframework.jdbc.core.JdbcTemplate.execute(JdbcTemplate.java:413)
>        org.springframework.jdbc.core.JdbcTemplate.query(JdbcTemplate.java:468)
>        com.illumina.phoenix.genomedb.jdbc.MutationDAOJdbc.getMutationEntriesBetween(MutationDAOJdbc.java:143)
>        com.illumina.phoenix.etl.ClassificationService.assignMutationClassIndel(ClassificationService.java:342)
>        com.illumina.phoenix.etl.ClassificationService.call(ClassificationService.java:663)
>        com.illumina.phoenix.etl.Classifier.call(Classifier.java:72)
>        com.illumina.phoenix.etl.Classifier.call(Classifier.java:19)
>        org.apache.spark.api.java.JavaPairRDD$$anonfun$toScalaFunction$1.apply(JavaPairRDD.scala:923)
>        org.apache.spark.rdd.MappedValuesRDD$$anonfun$compute$1.apply(MappedValuesRDD.scala:31)
>        org.apache.spark.rdd.MappedValuesRDD$$anonfun$compute$1.apply(MappedValuesRDD.scala:31)
>        scala.collection.Iterator$$anon$11.next(Iterator.scala:328)
>        org.apache.spark.storage.MemoryStore.unrollSafely(MemoryStore.scala:236)
>        org.apache.spark.CacheManager.putInBlockManager(CacheManager.scala:163)
>        org.apache.spark.CacheManager.getOrCompute(CacheManager.scala:70)
>        org.apache.spark.rdd.RDD.iterator(RDD.scala:227)
>        org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:31)
>        org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
>        org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
>        org.apache.spark.rdd.FlatMappedRDD.compute(FlatMappedRDD.scala:33)
>        org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
>        org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
>        org.apache.spark.rdd.MappedRDD.compute(MappedRDD.scala:31)
>        org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:262)
>        org.apache.spark.rdd.RDD.iterator(RDD.scala:229)
>        org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:62)
>        org.apache.spark.scheduler.Task.run(Task.scala:54)
>        org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:177)
>        java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
>        java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
>        java.lang.Thread.run(Thread.java:745)
>Driver stacktrace:
>       at org.apache.spark.scheduler.DAGScheduler.org$apache$spark$scheduler$DAGScheduler$$failJobAndIndependentStages(DAGScheduler.scala:1185)
>       at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1174)
>       at org.apache.spark.scheduler.DAGScheduler$$anonfun$abortStage$1.apply(DAGScheduler.scala:1173)
>       at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
>       at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:47)
>       at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:1173)
>       at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:688)
>       at org.apache.spark.scheduler.DAGScheduler$$anonfun$handleTaskSetFailed$1.apply(DAGScheduler.scala:688)
>       at scala.Option.foreach(Option.scala:236)
>       at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:688)
>       at org.apache.spark.scheduler.DAGSchedulerEventProcessActor$$anonfun$receive$2.applyOrElse(DAGScheduler.scala:1391)
>       at akka.actor.ActorCell.receiveMessage(ActorCell.scala:498)
>       at akka.actor.ActorCell.invoke(ActorCell.scala:456)
>       at akka.dispatch.Mailbox.processMailbox(Mailbox.scala:237)
>       at akka.dispatch.Mailbox.run(Mailbox.scala:219)
>       at akka.dispatch.ForkJoinExecutorConfigurator$AkkaForkJoinTask.exec(AbstractDispatcher.scala:386)
>       at scala.concurrent.forkjoin.ForkJoinTask.doExec(ForkJoinTask.java:260)
>       at scala.concurrent.forkjoin.ForkJoinPool$WorkQueue.runTask(ForkJoinPool.java:1339)
>       at scala.concurrent.forkjoin.ForkJoinPool.runWorker(ForkJoinPool.java:1979)
>       at scala.concurrent.forkjoin.ForkJoinWorkerThread.run(ForkJoinWorkerThread.java:107)
>
>
>--
>View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Table-not-found-using-jdbc-console-to-query-sparksql-hive-thriftserver-tp13840.html
>Sent from the Apache Spark User List mailing list archive at Nabble.com.
>
>---------------------------------------------------------------------
>To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
>For additional commands, e-mail: user-h...@spark.apache.org
>


---------------------------------------------------------------------
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org
