[ https://issues.apache.org/jira/browse/SPARK-6042?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]

Sean Owen updated SPARK-6042:
-----------------------------
          Component/s:     (was: Spark Submit)
                       SQL
     Target Version/s:   (was: 1.3.0)
    Affects Version/s:     (was: 1.3.0)
               Labels: hive  (was: hive scala spark)

This is probably better as a question on the mailing list. It sounds like you 
have mismatched Spark versions. You should not be bundling Spark with your app. 
Check that first.
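
For example, a minimal sketch of an sbt build that keeps Spark out of the 
application jar (the module list and version numbers below are illustrative 
assumptions, not taken from this report):

// build.sbt (sketch): compile against the same Spark version that runs on
// the cluster, and mark the Spark modules as "provided" so they are not
// bundled into the assembled application jar.
scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core"            % "1.2.1" % "provided",
  "org.apache.spark" %% "spark-streaming"       % "1.2.1" % "provided",
  "org.apache.spark" %% "spark-hive"            % "1.2.1" % "provided",
  // The Flume integration is not part of the Spark assembly, so it does
  // need to ship with the app (or be passed to spark-submit via --jars).
  "org.apache.spark" %% "spark-streaming-flume" % "1.2.1"
)

You can also print sc.version at runtime and compare it with the version you 
compiled against to confirm whether the two match.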

> spark-submit giving Exception in thread "main" java.lang.NoSuchMethodError: 
> org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;
> -------------------------------------------------------------------------------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-6042
>                 URL: https://issues.apache.org/jira/browse/SPARK-6042
>             Project: Spark
>          Issue Type: Question
>          Components: SQL
>            Reporter: Tarek Abouzeid
>              Labels: hive
>
> I am trying to create a table in Hive using Spark. I tried the code in 
> spark-shell and it worked and created the table, but when I use spark-submit 
> it gives this error:
> Exception in thread "main" java.lang.NoSuchMethodError: 
> org.apache.spark.sql.hive.HiveContext.sql(Ljava/lang/String;)Lorg/apache/spark/sql/SchemaRDD;
> at this line: sqlContext.sql("CREATE TABLE IF NOT EXISTS Test123 (key INT, 
> value STRING)")
> The code I submit is:
> import org.apache.spark.SparkContext
> import org.apache.spark.SparkContext._
> import org.apache.spark.SparkConf
> import org.apache.spark._
> import org.apache.spark.streaming._
> import org.apache.spark.streaming.StreamingContext._
> import org.apache.spark.storage.StorageLevel
> import org.apache.spark.streaming.flume._
> import org.apache.spark.util.IntParam
> import org.apache.spark.sql._
> import org.apache.spark.sql.hive.HiveContext
>
> object WordCount {
>   def main(args: Array[String]) {
>     if (args.length < 2) {
>       System.err.println("Usage: WordCount <host> <port>")
>       System.exit(1)
>     }
>     val Array(host, port) = args
>     val batchInterval = Milliseconds(2000)
>
>     // Create the context and set the batch size
>     val sparkConf = new SparkConf().setAppName("WordCount")
>     val sc = new SparkContext(sparkConf)
>     val ssc = new StreamingContext(sc, batchInterval)
>
>     // Create a flume stream
>     val stream = FlumeUtils.createStream(ssc, host, port.toInt)
>
>     // Print out the count of events received from this server in each batch
>     stream.count().map(cnt => "Received !!!:::::" + cnt + " flume events.").print()
>
>     // It holds the string stream (event body array converted into a string)
>     val body = stream.map(e => new String(e.event.getBody.array))
>
>     val counts = body.flatMap(line => line.toLowerCase.replaceAll("[^a-zA-Z0-9\\s]", "").split("\\s+"))
>       .map(word => (word, 1))
>       .reduceByKey(_ + _)
>
>     val sqlContext = new org.apache.spark.sql.hive.HiveContext(sc)
>     sqlContext.sql("CREATE TABLE IF NOT EXISTS tarek (key INT, value STRING)")
>
>     ssc.start()
>     ssc.awaitTermination()
>   }
> }
> I tried to submit this code with local[*] and with yarn as the master, and 
> both gave the same error. The error is at this specific line:
> sqlContext.sql("CREATE TABLE IF NOT EXISTS tarek (key INT, value STRING)")
> but when I executed the exact same create-table line in spark-shell, it 
> succeeded. I found a somewhat similar issue here:
> https://issues.apache.org/jira/browse/SPARK-6018
> Can anyone help, please?



